Pixhawk & Raspberry Pi

Hello guys!

For my final year project I am building a quadcopter that will detect an object at a specific longitude and latitude. I am using a Raspberry Pi for the image processing and a Pixhawk for flight control.
As you know, GPS is not accurate enough and can be off by a meter or more. The logic behind the project is therefore to fly the quadcopter to the predefined lon/lat of the object first, then search for the target.
My question is the following: I know that I can interface the Raspberry Pi with the Pixhawk using MAVLink, but is there a MAVLink command that lets the Raspberry Pi tell the Pixhawk to move left by x meters?
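
One possibility, sketched minimally here assuming pymavlink, GUIDED mode, and an already-armed vehicle (the serial port, baud rate, and the 2 m offset are placeholders), is the SET_POSITION_TARGET_LOCAL_NED message with a body-relative frame:

    # Minimal sketch, assuming pymavlink and a vehicle armed in GUIDED mode.
    # The port, baud rate, and 2 m offset are placeholders.
    from pymavlink import mavutil

    master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
    master.wait_heartbeat()

    # In MAV_FRAME_BODY_OFFSET_NED, x is forward, y is right, z is down,
    # all relative to the vehicle; the type_mask keeps only the position fields.
    master.mav.set_position_target_local_ned_send(
        0,                                  # time_boot_ms (unused)
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,
        0b110111111000,                     # ignore velocity, accel, yaw
        0, -2, 0,                           # 2 m to the left (negative y)
        0, 0, 0, 0, 0, 0, 0, 0)             # velocity/accel/yaw (ignored)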



  • Hi,

    I am doing a similar project. Have you found anything related to it? I have been successful in setting up the connection between the Raspberry Pi and the Pixhawk. How would I get GPS data from the Pixhawk to the Raspberry Pi? (A sketch of one approach follows below.)

    Any help appreciated

    Thanks & Regards
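
    One approach, a minimal sketch assuming pymavlink (the serial port and baud rate are placeholders for whatever your telemetry link uses), is to read the GLOBAL_POSITION_INT message that the Pixhawk streams:

        # Minimal sketch, assuming pymavlink; port and baud are placeholders.
        from pymavlink import mavutil

        master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
        master.wait_heartbeat()

        while True:
            # GLOBAL_POSITION_INT carries the fused GPS position estimate.
            msg = master.recv_match(type='GLOBAL_POSITION_INT', blocking=True)
            print(msg.lat / 1e7,              # latitude, degrees
                  msg.lon / 1e7,              # longitude, degrees
                  msg.relative_alt / 1000.0)  # metres above home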

  • Nicolas-Michael El Jamal said:

    Hey Sergey, 

    It's good to know that I am not the only one stuck with this problem.

    I managed to successfully send commands to the Pixhawk via MAVProxy.

    This is my sequence of commands, following this thread:

    switch 0

    rc 3 1500 //(throttle)

    arm throttle

    These commands will arm and spin the motors, so be careful and remove your propellers first. You can also send roll, pitch, and yaw commands by changing the channel number in the second command, e.g. rc i 1600 with i = 1, 2, 3, 4. (A scripted equivalent is sketched below.)

    Thank you & good luck!


    Hey, did you get this to work? I'm also looking for a quick fix like this, without having to get into the whole DroneKit documentation, for my line-follower project. Please have a look at what I'm trying to do here:
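
    For reference, the RC override that MAVProxy's rc command performs can also be scripted directly. A minimal sketch assuming pymavlink (port and baud are placeholders); as above, this can spin the motors, so remove the propellers first:

        # Minimal RC-override sketch, assuming pymavlink; propellers OFF.
        from pymavlink import mavutil

        master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
        master.wait_heartbeat()

        # Channels 1-4 are roll, pitch, throttle, yaw on a standard ArduCopter
        # setup; 0 means "do not override" that channel.
        master.mav.rc_channels_override_send(
            master.target_system, master.target_component,
            0, 0, 1500, 0, 0, 0, 0, 0)   # hold throttle (channel 3) at mid-stick

        master.arducopter_arm()          # equivalent to MAVProxy's "arm throttle"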

  • Hi Nicolas!

    How is your project going now? I am working on a similar one. I need to know how to send commands to the Pixhawk to move x meters forward or sideways. How were you able to do this? By the way, I am also using Python.
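
    One way, sketched here assuming DroneKit and GUIDED flight (the connection string and the 5 m offset are placeholders, and the metres-to-degrees helper is the usual small-distance approximation):

        # Minimal sketch, assuming DroneKit and an already-flying GUIDED vehicle.
        import math
        from dronekit import connect, LocationGlobalRelative

        vehicle = connect('/dev/ttyAMA0', baud=57600, wait_ready=True)

        def location_offset_metres(origin, d_north, d_east):
            # Small-distance approximation: metres to degrees of lat/lon.
            earth_radius = 6378137.0
            d_lat = d_north / earth_radius
            d_lon = d_east / (earth_radius * math.cos(math.radians(origin.lat)))
            return LocationGlobalRelative(origin.lat + math.degrees(d_lat),
                                          origin.lon + math.degrees(d_lon),
                                          origin.alt)

        # Move 5 m north; rotate by vehicle.heading for body-relative moves.
        target = location_offset_metres(vehicle.location.global_relative_frame, 5, 0)
        vehicle.simple_goto(target)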

  • Hi All

    Apologies for my absence. I had to take a month out from the project, but I can now get back into it.

    I have got as far as successfully sending some limited commands from a C++ program to the PixHawk (rather than from the command screen/monitor).

    Next steps are to expand the repertoire of commands able to be sent and to receive messages back from the PixHawk.

    Has anyone else made progress?

    • Mike,

      I believe that the best and simplest solution is to run the ground-station software MAVProxy on the Raspberry Pi and write a script that sends emulated RC input to the Pixhawk over the serial/telemetry link, driven by the OpenCV image processing. MAVProxy has a relatively simple way of sending commands from Python to manipulate the variables seen in the "Status" tab of Mission Planner, which is a lot easier to build than C/C++. (A rough sketch of this pipeline follows below.)

      Seems like the way I may go.
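
      A rough sketch of that pipeline, assuming pymavlink and OpenCV (the camera index, threshold, and gain are placeholders, and the propellers should be off for bench testing):

          # Rough OpenCV -> emulated-RC sketch; thresholds and gains are
          # placeholders, propellers off for bench testing.
          import cv2
          from pymavlink import mavutil

          master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
          master.wait_heartbeat()
          cap = cv2.VideoCapture(0)

          while True:
              ok, frame = cap.read()
              if not ok:
                  break
              gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
              _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
              m = cv2.moments(mask)
              if m['m00'] > 0:
                  # Horizontal error of the blob centroid from image centre, -1..1.
                  err = (m['m10'] / m['m00'] - frame.shape[1] / 2) / (frame.shape[1] / 2)
                  master.mav.rc_channels_override_send(
                      master.target_system, master.target_component,
                      int(1500 + 100 * err),   # nudge roll (channel 1) toward target
                      0, 0, 0, 0, 0, 0, 0)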


      • Hi Logan,

        Any progress with the route above?

        I am very interested to know how MAVProxy performs on the Pi 2 while it is also analyzing the images with OpenCV.

        And how do you receive the video input from the drone? Using GStreamer, or a simple video transmitter (5.8 GHz, etc.) feeding into the Pi 2?

        Thanks for sharing.


      • Hey All,

        I also apologize for my absence; I too took a few months away from the project.

        I finally decided to use Python to send the movement commands to the Pixhawk (via MAVProxy), and I will extend my Python module with C++ (OpenCV) using Boost.

        I finished testing my modules individually. Today, however, I connected the Raspberry Pi to the Pixhawk according to the link below:

        I followed the tutorial while powering my Pixhawk from my laptop's USB port. When I ran MAVProxy on the Raspberry Pi, I got garbage. Do you think that is because I am powering the Pixhawk from my laptop's USB? (See the baud-rate check sketched below.)

        I recommend that you use Python and extend it with C++ because it is much easier. This is a good starting point:

        Thank you!
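
        On the garbage output: that symptom is often a baud-rate mismatch rather than a power problem (USB and the TELEM ports typically run at different rates). A quick sanity check, assuming pymavlink and a placeholder port name:

            # Quick baud-rate check; the port name is a placeholder. Telemetry
            # ports are commonly 57600 baud; USB links often use 115200.
            from pymavlink import mavutil

            for baud in (57600, 115200, 921600):
                m = mavutil.mavlink_connection('/dev/ttyAMA0', baud=baud)
                hb = m.wait_heartbeat(timeout=5)  # None if nothing valid arrives
                m.close()
                print(baud, 'heartbeat' if hb else 'no heartbeat')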


  • Hey Nicolas-Michael!

    I am working on pretty much the same project as you, and I am also trying to deal with the same problem of controlling the Pixhawk from an on-board computer.

    The solution I am trying to implement now is this:
    I connect to the Pixhawk via MAVProxy and send the MAVLink MANUAL_CONTROL message, which overrides the RC input (a sketch of sending it follows below). Here is the description of this message:

    I would be grateful if somebody would share his experience using this method.
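
    For anyone trying the same route, a minimal MANUAL_CONTROL sketch assuming pymavlink (port and baud are placeholders; the axis values use the message's -1000..1000 scaling, and the propellers should be off):

        # Minimal MANUAL_CONTROL sketch, assuming pymavlink; propellers off.
        from pymavlink import mavutil

        master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
        master.wait_heartbeat()

        # Fields: x = pitch, y = roll, z = thrust, r = yaw, each scaled
        # -1000..1000 (z is often 0..1000); buttons is a bitmask.
        master.mav.manual_control_send(
            master.target_system,
            0,      # x: pitch centred
            -200,   # y: roll left
            500,    # z: mid thrust
            0,      # r: no yaw
            0)      # buttons: none pressed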

    • Hi Sergey

      It may be that the answer lies with the instructions here:

      With this, it should be possible to build some software on the Pi to send MAVLink commands directly (a minimal sketch follows below). The Pi can then also run OpenCV for tracking etc. and provide the 'intelligence' required.

      Unless someone has a better idea (which would be very welcome!), I'm going to take that route for now...
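
      As a first taste of that route, a minimal direct-MAVLink sketch assuming pymavlink (port and baud are placeholders) that sends a COMMAND_LONG to arm and then reads the acknowledgement:

          # Minimal direct-MAVLink sketch: arm via COMMAND_LONG, read the ACK.
          # Port and baud are placeholders; propellers off.
          from pymavlink import mavutil

          master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
          master.wait_heartbeat()

          master.mav.command_long_send(
              master.target_system, master.target_component,
              mavutil.mavlink.MAV_CMD_COMPONENT_ARM_DISARM,
              0,                    # confirmation
              1, 0, 0, 0, 0, 0, 0)  # param1: 1 = arm, 0 = disarm

          ack = master.recv_match(type='COMMAND_ACK', blocking=True, timeout=5)
          print(ack)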



    • Hi Everyone

      I am working on a Master's project doing something similar, so I want to control the Pixhawk using MAVLink commands sent from a Raspberry Pi.

      I have the two boards communicating, but I need to decide what to do next. As MAVProxy is a command-line application, I wonder whether it is really suited to being adapted to send a continuous stream of messages to the Pixhawk. Is it best to build a program on the Raspberry Pi from scratch?

      Any thoughts anyone?


