APM adaptive flying/video analysis waypoint help

I have a quick question on sending waypoints through the APM Planner. I am working on a project using OpenCV to do live video analysis of FPV feeds in order to look for "targets" for a competition, i.e. post-processing the video on a computer and determining a course of action from the results. From there, I have to modify the flight plan in-flight to acquire the next target, retrieve data from the planner (such as the GPS position at the point where the plane identified the target), and print the data out to a txt document. My question is: is there any way I can retrieve this data from the planner and send waypoints through the planner, such as an API? I would like to keep the APM Planner in the loop, as it works really well, and I would rather not cut it out. I know I COULD make my own custom program/GUI that gets the data I need over serial, but if I can avoid it, please let me know. Does anyone know of a way to get the values I need out of the planner, and a way to pass waypoints to the plane through the planner? I really only want to interface the video-processing program and the planner, basically just to send commands and get values. Let me know what you think I should do! Thanks!


Replies

  • Hi,

    Thank you so much, it was a useful discussion. But what happened after that? Were you able to retrieve and send the GPS values through the planner software?

    • As it turns out, using MAVProxy is a much simpler option. I just created my own "module" (plugin) for MAVProxy and use it to send any values that come through MAVLink from my copter out onto the network. MAVProxy also has a ton of output capability, so you can keep Mission Planner as a front end and just put MAVProxy in the middle. MAVProxy also allows you to send commands: right now my custom MAVProxy module lets me send my copters to a specific waypoint or location, or even move them over a few meters in a given direction, and I can also tell them to land and what action to take when they arrive at a location. If you need help or example code for MAVProxy, let me know (a minimal module sketch is shown below)! It is fairly confusing and took me quite a few hours to figure out, but it is definitely worth the work. It has become a great tool, and it can be run on an onboard Raspberry Pi, which is a huge plus. Let me know if this helps!

      • Hi,

        I was very interested in you work,can you give me some Mavproxy example code?

        Thank you!
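The example code requested above never made it into the thread. As a rough illustration of what such a module could look like, here is a minimal sketch: the overall shape (an MPModule subclass, an init(mpstate) entry point, and a mavlink_packet hook) follows MAVProxy's module convention, while the module name, the UDP destination for the vision program, and the gowp console command are assumptions made for this sketch, not the original poster's code.

```python
# mavproxy_targetrelay.py -- illustrative sketch only.
# Drop into MAVProxy/modules/ and load with "module load targetrelay".
import json
import socket

from MAVProxy.modules.lib import mp_module
from pymavlink import mavutil


class TargetRelayModule(mp_module.MPModule):
    def __init__(self, mpstate):
        super(TargetRelayModule, self).__init__(mpstate, "targetrelay",
                                                "forward GPS, accept waypoints")
        # UDP socket used to push vehicle position out to the vision program
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.dest = ("127.0.0.1", 9000)          # assumed listener address
        self.add_command('gowp', self.cmd_gowp,
                         "send guided waypoint: gowp lat lon alt")

    def mavlink_packet(self, m):
        """Called by MAVProxy for every incoming MAVLink message."""
        if m.get_type() == 'GLOBAL_POSITION_INT':
            fix = {"lat": m.lat / 1.0e7, "lon": m.lon / 1.0e7,
                   "alt": m.relative_alt / 1000.0}
            self.sock.sendto(json.dumps(fix).encode('utf-8'), self.dest)

    def cmd_gowp(self, args):
        """Console command: push a guided-mode target to the vehicle."""
        if len(args) != 3:
            print("usage: gowp lat lon alt")
            return
        lat, lon, alt = float(args[0]), float(args[1]), float(args[2])
        # A MISSION_ITEM with current=2 requests a guided-mode waypoint
        self.master.mav.mission_item_send(
            self.target_system, self.target_component, 0,
            mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
            mavutil.mavlink.MAV_CMD_NAV_WAYPOINT,
            2, 0, 0, 0, 0, 0, lat, lon, alt)


def init(mpstate):
    """MAVProxy entry point: return the module instance."""
    return TargetRelayModule(mpstate)
```

Loaded this way, MAVProxy can keep Mission Planner (or any other ground station) attached through its normal output forwarding while the module streams position fixes to the vision program and accepts waypoint requests, which matches the "planner as front end, MAVProxy in the middle" setup described above.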

  • Thank you all for the interesting work. I am at the beginning of a similar project and will post and share my experiences as soon as I am testing out in the field.

  • Thank you all very much! I took a break from this project over spring break and just returned to it, and getting the values out of the planner through a Python UDP script works perfectly :) Just make sure you use Python 2.7.3, not 3.0 or higher; I had some problems with the newer versions not working right. Thanks again, I'll keep you posted on how it goes!
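The thread does not show the poster's actual UDP script, but a minimal sketch of that approach might look like the following, assuming MAVLink telemetry is being mirrored or forwarded to a local UDP port (for example via MAVProxy's --out option) and that pymavlink is installed; the port number and log file name are illustrative:

```python
# Illustrative Python 2.7 listener: decode MAVLink arriving on a local UDP port
# and append GPS fixes to a text file. Port 14550 and gps_log.txt are assumptions.
from pymavlink import mavutil

conn = mavutil.mavlink_connection('udpin:127.0.0.1:14550')
conn.wait_heartbeat()                          # block until the vehicle is heard
print("Heartbeat from system %u" % conn.target_system)

with open('gps_log.txt', 'a') as log:
    while True:
        msg = conn.recv_match(type='GLOBAL_POSITION_INT', blocking=True)
        line = "%.7f,%.7f,%.2f" % (msg.lat / 1.0e7, msg.lon / 1.0e7,
                                   msg.relative_alt / 1000.0)
        print(line)                            # live feedback
        log.write(line + "\n")
        log.flush()
```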

  • Developer

    APM Planner supports a very simple guided-mode function via the web (a local HTTP request), e.g.:

    http://127.0.0.1:56781/guided?lat=-34&lng=117.8&alt=30

    This will send a guided-mode waypoint to the connected MAV.
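For reference, issuing that request from a script is trivial; this minimal Python 2 sketch just performs the GET call shown above (the port and parameters are taken verbatim from the reply, and whether anything is listening on 56781 depends on your planner setup):

```python
# Minimal example of the guided-mode web call described above.
# URL and parameters come straight from the reply; adjust lat/lng/alt as needed.
import urllib2

url = "http://127.0.0.1:56781/guided?lat=-34&lng=117.8&alt=30"
response = urllib2.urlopen(url)   # the planner answers the HTTP request locally
print(response.read())
```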

