Hello guys!

For my final year project I am building a quad-copter that will detect an object at a specific longitude and latitude. I am using a Raspberry Pi for image processing and a Pixhawk for control.
As you know, GPS is not accurate enough and can be off by a meter or more. Because of that, the plan is to have the quad-copter fly first to the predefined lon/lat of the object and then search for the target.
My question is the following: I know I can interface the Raspberry Pi and Pixhawk using MAVLink, but is there a MAVLink command that lets the Raspberry Pi tell the Pixhawk to move left (or right, forward, back) by x meters?


Replies to This Discussion

Hi!

If you plan to make an autonomous quad controlled by a Raspberry Pi, you have several options; one of them is using the RPi itself as the stabilization board. Of course it is simpler just to "control a controller", using another board as the interface. I suggest getting a KK Multicopter board, which is the simplest multicopter board, and wiring its channel pins to four output pins on the RPi, or controlling it via USB serial.
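For instance, the pigpio library can generate the 1000 to 2000 microsecond pulses an RC channel input expects. Here is a rough, untested sketch of that idea; GPIO 18 and the pulse widths are only examples:

/* Sketch: drive one RC channel input of a KK board from a Raspberry Pi GPIO pin.
 * Assumes the pigpio library is installed; build with: gcc rc_out.c -lpigpio -lrt -lpthread
 * and run as root. GPIO 18 and the pulse widths are just examples. */
#include <pigpio.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    if (gpioInitialise() < 0) {
        fprintf(stderr, "pigpio init failed\n");
        return 1;
    }
    gpioServo(18, 1500);   /* 1500 us pulse = stick centred */
    sleep(2);
    gpioServo(18, 1600);   /* a little above centre */
    sleep(2);
    gpioServo(18, 0);      /* stop sending pulses */
    gpioTerminate();
    return 0;
}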

If you are still interested in using the RPi and Pixhawk together (in my opinion it is redundant), you could connect them via Bluetooth, or even use the GPIO pins as a receiver, or connect them via serial.

Peluzza,

I am trying to use my RPi to transmit video over IP and the Pixhawk for the autopilot, but my question is: how can I connect the RPi and Pixhawk with the GPIO pins or via serial? What do I need for it (cables, converters, etc.)? Any guidelines or a tutorial you recommend?

I've heard of an Ethernet-to-serial converter, but I don't know how that works.

Any thoughts?

Thanks!!

For FPV video you won't need any adapter, just a video transmitter like this one: http://store.3drobotics.com/products/3dr-fpv-osd-kit . The advantage is pretty clear: long-distance video streaming.

If you simply plan to transmit video from the RPi to the base station, there is no need to connect it to the PxH; just follow this tutorial about streaming over wireless: http://blog.miguelgrinberg.com/post/stream-video-from-the-raspberry...

Connecting the RPi and PxH via serial is well documented here: http://dev.ardupilot.com/wiki/raspberry-pi-via-mavlink/
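To give a rough idea of what the C side looks like once the serial link from that tutorial is up, here is an untested sketch that just waits for a heartbeat. It assumes the generated MAVLink C headers for the common message set are on your include path; /dev/ttyAMA0 at 57600 baud follows the tutorial:

/* Sketch: open the Pi's UART and wait for a Pixhawk heartbeat. Untested.
 * Assumes the generated MAVLink C headers (common message set) are available. */
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>
#include "common/mavlink.h"

static int open_serial(const char *dev)
{
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0)
        return -1;
    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                 /* raw 8N1, no echo */
    cfsetispeed(&tio, B57600);
    cfsetospeed(&tio, B57600);
    tio.c_cflag |= CLOCAL | CREAD;
    tcsetattr(fd, TCSANOW, &tio);
    return fd;
}

int main(void)
{
    int fd = open_serial("/dev/ttyAMA0");
    if (fd < 0) { perror("open"); return 1; }

    mavlink_message_t msg;
    mavlink_status_t status;
    uint8_t byte;
    while (read(fd, &byte, 1) == 1) {
        if (mavlink_parse_char(MAVLINK_COMM_0, byte, &msg, &status) &&
            msg.msgid == MAVLINK_MSG_ID_HEARTBEAT) {
            printf("heartbeat from system %u\n", msg.sysid);
            break;
        }
    }
    close(fd);
    return 0;
}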

As I said, the Pixhawk is powerful enough unless you need to make a fully autonomous long-distance drone. Why? Because nowadays there is no need for a full brain on board; weight is a critical factor in quadcopters. You can program a powerful base station and avoid carrying that processing power on board.

Hope it helps.

Hi!

Thanks for your response!

I am connecting the Raspberry Pi and Pixhawk according to the following tutorial:

http://dev.ardupilot.com/wiki/raspberry-pi-via-mavlink/

GPIO --> Telemetry

The main issue is that I am using C/C++ for image processing and trying to track an object. The Raspberry Pi takes care of the image processing, of course, and should send movement commands to the Pixhawk (such as move left, right, ...) according to the processed image.

What I don't understand is how I should send these movement commands to the Pixhawk. I know that I should use either MAVLink or MAVProxy, but I don't know how...

Appreciate your help

Hey Nicolas-Michael!

I am working on pretty much the same project as you, and I am also trying to deal with the same problem of controlling the Pixhawk from an on-board computer.

The solution I am trying to implement now is this:
I connect to the Pixhawk via MAVProxy and send the MAVLink message MANUAL_CONTROL, which overrides the RC input. Here is the description of this message:

https://pixhawk.ethz.ch/mavlink/#MANUAL_CONTROL
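Roughly, what I have in mind on the C side is something like the following untested sketch. The serial settings follow the Raspberry Pi wiki tutorial, and all the ids and stick values are just placeholders:

/* Sketch: send one MANUAL_CONTROL message to the Pixhawk. Untested.
 * Assumes the generated MAVLink C headers (common set) and the UART wiring
 * from the ardupilot Raspberry Pi tutorial. Ids and values are placeholders. */
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include "common/mavlink.h"

int main(void)
{
    int fd = open("/dev/ttyAMA0", O_RDWR | O_NOCTTY);
    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B57600);
    cfsetospeed(&tio, B57600);
    tcsetattr(fd, TCSANOW, &tio);

    mavlink_message_t msg;
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];

    /* x (pitch), y (roll), r (yaw) are -1000..1000; z (throttle) is 0..1000 */
    mavlink_msg_manual_control_pack(255, 1, &msg,
                                    1,    /* target system id */
                                    0,    /* x: pitch centred */
                                    0,    /* y: roll centred  */
                                    500,  /* z: half throttle */
                                    0,    /* r: yaw centred   */
                                    0);   /* buttons bitmask  */
    uint16_t len = mavlink_msg_to_send_buffer(buf, &msg);
    write(fd, buf, len);

    close(fd);
    return 0;
}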

I would be grateful if somebody would share their experience with this method.

Hey Sergey, 

It's good to know that I am not the only one stuck with this problem.

I managed to successfully send commands to the Pixhawk via MAVProxy.

This is my sequence of commands after starting mavproxy.py according to that tutorial (http://dev.ardupilot.com/wiki/raspberry-pi-via-mavlink/):

switch 0

rc 3 1500 //(throttle)

arm throttle

These commands will activate the motors, so be careful and remove your propellers. You can also send yaw, pitch and roll commands by changing the channel number in the second command, e.g. rc i 1600 with i = 1, 2, 3, 4.
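For reference, the same sequence can also be sent straight over MAVLink from a program instead of typing it into MAVProxy. A rough, untested sketch (again: props off; the serial setup follows the wiki tutorial, channel 3 is throttle here, and a value of 0 leaves a channel alone):

/* Sketch: mirror "rc 3 1500" and "arm throttle" with raw MAVLink messages. Untested.
 * Assumes the generated MAVLink C headers and the UART setup from the wiki tutorial. */
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include "common/mavlink.h"

static void send_msg(int fd, const mavlink_message_t *msg)
{
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];
    uint16_t len = mavlink_msg_to_send_buffer(buf, msg);
    write(fd, buf, len);
}

int main(void)
{
    int fd = open("/dev/ttyAMA0", O_RDWR | O_NOCTTY);
    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B57600);
    cfsetospeed(&tio, B57600);
    tcsetattr(fd, TCSANOW, &tio);

    mavlink_message_t msg;

    /* "rc 3 1500": override channel 3 (throttle) to 1500 us; 0 = don't touch that channel */
    mavlink_msg_rc_channels_override_pack(255, 1, &msg, 1, 0,
                                          0, 0, 1500, 0, 0, 0, 0, 0);
    send_msg(fd, &msg);

    /* "arm throttle": COMMAND_LONG with MAV_CMD_COMPONENT_ARM_DISARM, param1 = 1 (arm) */
    mavlink_msg_command_long_pack(255, 1, &msg, 1, 0,
                                  MAV_CMD_COMPONENT_ARM_DISARM, 0,
                                  1, 0, 0, 0, 0, 0, 0);
    send_msg(fd, &msg);

    close(fd);
    return 0;
}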

By the way, do you know anything about the vibration problem I am facing? (video)

Thank you & good luck!

NJ

Hey Nicolas-Michael,

Thank you for your response!

I managed to send those commands, and it works!

But my aim is to control the quad at a lower level via MAVLink messages (like MANUAL_CONTROL, RC_CHANNELS_OVERRIDE, ...).

About the vibration in the video: how do you mount your camera?

You could put some foam material between your quad frame and the camera; it would absorb some of the vibrations. Or you could use a more professional camera gimbal.

Good luck!

Sergey

Hi Everyone

I am working on a Master's project doing something similar, so I also want to control the Pixhawk using MAVLink commands sent from a Raspberry Pi.

I have the two boards communicating but need to decide what to do next. As MAVProxy is a command-line application, I wonder whether it is really suited to being adapted to send a continuous stream of messages to the Pixhawk. Is it best to build a program on the Raspberry Pi from scratch?

Any thoughts anyone?

Cheers

Mike

Hi Peluzza

I see your point, but I am taking the same approach of using a PxH and Pi together.  Partly this is to keep the overall system modular and partly because I prefer to adapt the APM code as little as possible - it's quite complex and I do not want to introduce errors through my own ineptitude!

The issue for me is whether MAVProxy can be adapted to send the required commands on a continual basis, or whether it's best to program from scratch, using the MAVLink libraries and header generators provided.

Thanks for your input,

Mike

Hi Sergey

It may be that the answer lies in the instructions here: http://www.qgroundcontrol.org/dev/mavlink_onboard_integration_tutorial

With this, it should be possible to build software on the Pi that sends MAVLink commands directly. The Pi can then also run OpenCV for tracking etc. and provide the 'intelligence' required.
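The rough structure I have in mind is just a loop: read the tracking result from OpenCV, translate it into a MAVLink message, send it, repeat. An untested sketch of that shape follows; get_target_offset() is only a stub standing in for the vision code, and the gains, channels and serial settings are all assumptions:

/* Sketch of the Pi-side loop: the vision code supplies a pixel offset of the target,
 * which is mapped crudely onto roll/pitch RC overrides and sent to the Pixhawk.
 * Untested; assumes the generated MAVLink C headers and the usual UART wiring. */
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include "common/mavlink.h"

/* Stub standing in for the OpenCV tracker: pretends the target is 40 px right, 10 px up. */
static int get_target_offset(int *dx, int *dy)
{
    *dx = 40;
    *dy = -10;
    return 1;
}

int main(void)
{
    int fd = open("/dev/ttyAMA0", O_RDWR | O_NOCTTY);
    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B57600);
    cfsetospeed(&tio, B57600);
    tcsetattr(fd, TCSANOW, &tio);

    mavlink_message_t msg;
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];

    for (;;) {
        int dx, dy;
        if (get_target_offset(&dx, &dy)) {
            /* crude proportional mapping from pixel offset to 1000-2000 us pulse widths */
            uint16_t roll  = 1500 + dx / 4;
            uint16_t pitch = 1500 + dy / 4;
            mavlink_msg_rc_channels_override_pack(255, 1, &msg, 1, 0,
                                                  roll, pitch, 0, 0, 0, 0, 0, 0);
            uint16_t len = mavlink_msg_to_send_buffer(buf, &msg);
            write(fd, buf, len);
        }
        usleep(100000);   /* roughly 10 Hz command rate */
    }
    return 0;
}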

Unless someone has a better idea (which would be very welcome!), I'm going to take that route for now...

Cheers

Mike

Hey Mike, 

Well, as you replied to Sergey, you can use the MAVLink onboard integration tutorial to achieve your goal. However, since I am on a tight schedule, I am going to send commands via MAVProxy. From my OpenCV C/C++ image-processing code I will execute shell commands using the "system" function of C. With this, after getting the results of the image processing, I will run mavproxy.py and send commands to the Pixhawk accordingly (such as rc 1 2000, etc.).
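Concretely, I am picturing something like this (not tried yet; MAVProxy's --cmd option is supposed to run a command at startup, so check that your version supports it, and the channel value is just a placeholder):

/* Sketch: fire a MAVProxy command from the C/OpenCV program via system(). Untested.
 * Each call starts a fresh mavproxy.py, which is slow, so this really is only a quick fix.
 * --master/--baudrate follow the wiki tutorial; --cmd runs the given MAVProxy command at startup. */
#include <stdlib.h>

int main(void)
{
    /* e.g. after the image processing decides to roll right: push channel 1 high */
    system("mavproxy.py --master=/dev/ttyAMA0 --baudrate 57600 --cmd=\"rc 1 2000\"");
    return 0;
}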

I know this might be a bad idea. I haven't tried it yet, but I am sure it will work. It is just a quick fix since I have to present my project at the end of May.

Good luck with your Project!

Nicolas

Hi Nicolas

I see what you mean. I have a little more time, so I will try to build on the other tutorial at http://qgroundcontrol.org/dev/mavlink_linux_integration_tutorial.

I am also interested in integrating OpenCV at some stage, but that is probably six months or so away; by then I hope to have in place a robust, if simple, program on the Pi to request data from various sources, process it and send a variety of MAVLink messages to the Pixhawk.

Good luck and keep us updated!

Regards

Mike
