MAVLink Protocol for an Onboard Computer

Hi,

I am looking into hosting an external computer/microcontroller on board a copter with an APM. The goal is to provide vision assistance for the SparkFun AVC competition. I have looked into the commands and some Arduino MAVLink examples, but I am still struggling with the implementation.

The idea is to have the APM run an autonomous mission, but if a waypoint is specified by the user as a vision-assisted waypoint, it will look for an object while flying towards it. If an object is detected, the external computer will request control of the APM. If given control, it will enter altitude-hold mode, emulate RC controls, and fly the copter until the object is lost. Once the object is lost (hopefully popped, in this case), the computer will resume the mission where it left off (increment a waypoint) and give control back to the APM.

The external computer needs to handle the following MAVLink commands:

Send a heart beat

Request pitch of craft

Request altitude of craft

Request current mission status (the current action the APM is autonomously executing)

Request control

Change flight mode

Send RC Commands

Start mission at specified point

If I read the documentation properly, all of these commands are supported. I am just not sure how to implement them in Arduino. I only understand the heartbeat.
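For anyone else stuck at this point: on the wire, each of those commands is just a framed MAVLink message, and the Arduino library ultimately writes those framed bytes out over serial. A minimal sketch of MAVLink v1 framing for a HEARTBEAT, in Python for readability (the CRC_EXTRA value of 50 and the illustrative field values are taken from the common dialect and should be double-checked against your generated headers):

```python
import struct

def x25_crc(data: bytes, crc: int = 0xFFFF) -> int:
    """X.25 / CRC-16-MCRF4XX checksum used by MAVLink."""
    for byte in data:
        tmp = (byte ^ (crc & 0xFF))
        tmp = (tmp ^ (tmp << 4)) & 0xFF
        crc = ((crc >> 8) ^ (tmp << 8) ^ (tmp << 3) ^ (tmp >> 4)) & 0xFFFF
    return crc

def mavlink1_frame(seq, sysid, compid, msgid, payload, crc_extra):
    """Build a MAVLink v1 frame: 0xFE, len, seq, sysid, compid, msgid, payload, crc."""
    header = struct.pack('<BBBBBB', 0xFE, len(payload), seq, sysid, compid, msgid)
    # The CRC covers everything after the start byte, plus the
    # per-message CRC_EXTRA byte defined by the dialect.
    crc = x25_crc(header[1:] + payload + bytes([crc_extra]))
    return header + payload + struct.pack('<H', crc)

# HEARTBEAT (msgid 0). MAVLink v1 sorts payload fields by size, so on
# the wire it is: custom_mode u32, type u8, autopilot u8, base_mode u8,
# system_status u8, mavlink_version u8 -> a 9-byte payload.
# Field values here (type=6 GCS, autopilot=8 invalid) are illustrative.
HEARTBEAT_CRC_EXTRA = 50  # assumption: value from the common dialect
payload = struct.pack('<IBBBBB', 0, 6, 8, 0, 0, 3)
frame = mavlink1_frame(seq=0, sysid=255, compid=190, msgid=0,
                       payload=payload, crc_extra=HEARTBEAT_CRC_EXTRA)
# On an Arduino you would Serial.write() these 17 bytes about once a second.
```

The other requests in the list above (mode change, RC override, mission set-current) follow the same pattern with different msgids, payloads, and CRC_EXTRA bytes; the generated C headers from the MAVLink generator do this packing for you.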

Does anyone have any advice or some example code involving using an external computer to control the APM/Pixhawk over MAVLink? Or is there a way to see the MAVLink packets that are being sent between the ground station and the APM?
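On the second question: MAVProxy can mirror the telemetry stream to another endpoint with its `--out` option, and once you have the raw bytes, splitting them into frames for inspection is simple. A sketch of a MAVLink v1 frame splitter (no CRC validation, inspection only; the example frame is hand-built, not a captured packet):

```python
def iter_mavlink1_frames(stream: bytes):
    """Yield (seq, sysid, compid, msgid, payload) for each MAVLink v1
    frame found in a raw byte stream. No CRC check -- inspection only."""
    i = 0
    while i < len(stream):
        if stream[i] != 0xFE:       # hunt for the v1 start byte
            i += 1
            continue
        if i + 8 > len(stream):     # too short for header(6) + crc(2)
            break
        length = stream[i + 1]
        end = i + 6 + length + 2    # header + payload + crc
        if end > len(stream):
            break
        seq, sysid, compid, msgid = stream[i + 2:i + 6]
        yield seq, sysid, compid, msgid, bytes(stream[i + 6:i + 6 + length])
        i = end

# Example: a hand-built 17-byte HEARTBEAT-shaped frame (msgid 0, 9-byte payload).
raw = bytes([0xFE, 9, 0, 1, 1, 0]) + bytes(9) + bytes([0x12, 0x34])
frames = list(iter_mavlink1_frames(raw))
# frames[0] -> (0, 1, 1, 0, b'\x00' * 9)
```

In practice you would point this at a serial port or a UDP socket fed by MAVProxy, then look up each msgid in the message definitions.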

In case you're wondering: the computer can already track objects and has the logic to emulate RC control.

Thanks,

Daniel


Replies

  • Hey Daniel.
    I noticed that in the meantime you implemented the precision landing on another type of companion computer.

    I am working on a precision landing procedure as well (unfortunately I only noticed your project this week). I am currently working on some issues you might have fixed before.

    The Arduino sends the RC commands to the Pixhawk using the MAVLink message rc_channels_override. I read that the override is only possible in Guided mode, but my tests with ArduCopter V3.2.1 show that the Pixhawk is also controlled by these commands in Stabilize or Alt_Hold mode. I want to switch between a "controlled-by-Arduino" mode and a "controlled-by-remote-control" mode. I don't want to change the Pixhawk flight control stack. Is it possible to simply parameterize the Pixhawk / the flight modes for this task?

    Thanks,
    Stephan
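For reference, the rc_channels_override message discussed above (RC_CHANNELS_OVERRIDE, msgid 70) packs eight channel values plus the target IDs into an 18-byte payload. A sketch of just the payload packing (framing is the same as for any other MAVLink v1 message); the field ordering follows MAVLink v1 size-sorting, and the 0 / 0xFFFF semantics described are from the common message definition, so behavior may vary by ArduCopter version:

```python
import struct

def rc_override_payload(target_sys, target_comp, channels):
    """Pack the 18-byte RC_CHANNELS_OVERRIDE (msgid 70) payload.
    MAVLink v1 orders fields by size: eight uint16 channels (PWM in us),
    then target_system and target_component (one uint8 each).
    Per the message definition, 0 releases a channel back to the RC
    radio and 0xFFFF means 'leave this channel unchanged'."""
    assert len(channels) == 8
    return struct.pack('<8H2B', *channels, target_sys, target_comp)

# Example: push pitch on channel 2, leave everything else untouched.
NO_CHANGE = 0xFFFF
chans = [NO_CHANGE] * 8
chans[1] = 1600   # channel 2, slightly forward of the ~1500 us center
payload = rc_override_payload(target_sys=1, target_comp=1, channels=chans)
# Wrap this in a frame with msgid 70 and resend it continuously (a few Hz),
# since ArduPilot falls back to the radio if overrides stop arriving.
```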

  • Is there a way for me to get sample code from you?

  • Yeah, I went with a higher-level piece of software called DroneAPI and used it with MAVProxy and Python scripting. Hope that helps.
    • Raspberry Pi. I would recommend something more powerful if you plan to do vision tracking. I maxed out my Raspberry Pi while it was overclocked to 1 GHz, and it was only able to achieve an image-processing rate of 2 Hz. This also caused DroneAPI to malfunction. I didn't have time to find out why; I only know it happened under heavy processor and memory usage.

      Note: my code wasn't optimized, but it was definitely pushing the Pi's limits.

      For something more powerful, look at the ODROID-U3, Banana Pi, or Hummingbird.
    • What kind of onboard computer do you use?

  • We are working on the same topic. Any progress?

