Hello guys!
For my final year project I am building a quad-copter that will detect an object at a specific longitude and latitude. I am using a Raspberry Pi for image processing and a Pixhawk for flight control.
As you know, GPS is not accurate enough and might give an error of a meter or more. So the logic behind the project is to let the quad-copter first fly to the predefined lon/lat of the object and then try to find the target.
My question is the following: I know that I can interface the Raspberry Pi and the Pixhawk using MAVLink. But is there any command in MAVLink that allows the Raspberry Pi to send a "move left for x meters" command to the Pixhawk?
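For what it's worth, one MAVLink message that can express "move left x meters" is SET_POSITION_TARGET_LOCAL_NED with the MAV_FRAME_BODY_OFFSET_NED frame, whose offsets are interpreted relative to the vehicle's current position and heading (the vehicle needs to be in GUIDED mode). A minimal pymavlink sketch; the serial port, baud rate and target IDs are assumptions for illustration:

```python
# MAV_FRAME_BODY_OFFSET_NED interprets x/y/z as offsets relative to the
# vehicle's current position and heading (enum value 9 in common.xml).
MAV_FRAME_BODY_OFFSET_NED = 9

# type_mask with only the x/y/z position bits enabled (all other fields ignored)
POSITION_ONLY_MASK = 0b0000111111111000

def relative_move_args(forward_m, right_m, down_m, sysid=1, compid=1):
    """Argument tuple for mav.set_position_target_local_ned_send():
    time_boot_ms, target ids, frame, mask, position, velocity,
    acceleration, yaw, yaw_rate."""
    return (0, sysid, compid,
            MAV_FRAME_BODY_OFFSET_NED, POSITION_ONLY_MASK,
            forward_m, right_m, down_m,   # position offsets (m)
            0, 0, 0,                      # velocity (ignored by mask)
            0, 0, 0,                      # acceleration (ignored by mask)
            0, 0)                         # yaw, yaw rate (ignored by mask)

if __name__ == '__main__':
    from pymavlink import mavutil
    master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
    master.wait_heartbeat()
    # "Move left 5 m" is a negative offset on the body-frame y axis:
    master.mav.set_position_target_local_ned_send(
        *relative_move_args(0, -5, 0, master.target_system,
                            master.target_component))
```

The type mask tells the autopilot which fields to honor; here only the three position offsets are active, so the velocity, acceleration and yaw fields are placeholders.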
Replies
Hi,
I am doing a similar project. Have you found anything related to it? I have been successful in setting up the connection between the Raspberry Pi and the Pixhawk. How would I get GPS data from the Pixhawk to the Raspberry Pi?
Any help appreciated
Thanks & Regards
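On the GPS question: the Pixhawk streams its fused position as GLOBAL_POSITION_INT messages, which pymavlink can read directly; note the integer scaling of the fields. A sketch, with the port and baud rate as assumptions:

```python
def decode_global_position(lat_e7, lon_e7, alt_mm):
    """GLOBAL_POSITION_INT carries lat/lon as degrees * 1e7 and altitude in mm."""
    return lat_e7 / 1e7, lon_e7 / 1e7, alt_mm / 1000.0

if __name__ == '__main__':
    from pymavlink import mavutil
    master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
    master.wait_heartbeat()
    while True:
        msg = master.recv_match(type='GLOBAL_POSITION_INT', blocking=True)
        lat, lon, alt = decode_global_position(msg.lat, msg.lon, msg.alt)
        print('lat=%.7f lon=%.7f alt=%.1fm' % (lat, lon, alt))
```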
Nicolas-Michael El Jamal said:
Hey, did you get this to work? I'm also looking for a quick fix like this without having to get into the whole DroneKit documentation for my line-follower project. Please have a look at what I'm trying to do here: http://diydrones.com/forum/topics/line-follower-project-using-a-mac...
Hi Nicolas!
How's this project of yours going now? I am working on a similar one. I need to know how to send commands to the Pixhawk to move "x" meters forward or sideways. How were you able to do this? By the way, I am also using Python.
Hi All
Apologies for my absence. I have had to take a month out of the project, but now can get back into it.
I have got as far as successfully sending some limited commands from a C++ program to the Pixhawk (rather than from the command screen/monitor).
Next steps are to expand the repertoire of commands that can be sent, and to receive messages back from the Pixhawk.
Has anyone else made progress?
Mike,
I believe that the best and simplest solution is to run the ground-station software MAVProxy on the Raspberry Pi and write a Python script that sends emulated RC input to the Pixhawk over the serial/telemetry link, based on the OpenCV image processing. MAVProxy has a relatively simple way of sending commands from Python to manipulate the variables seen in the "Status" tab of Mission Planner, which is a lot easier to build than the C/C++ route.
Seems like the way I may go,
Logan
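A hedged sketch of the emulated-RC idea Logan describes, using pymavlink's RC_CHANNELS_OVERRIDE directly rather than going through MAVProxy; the channel mapping and connection settings are assumptions:

```python
def clamp_pwm(value, lo=1000, hi=2000):
    """Keep an RC override value inside the usual 1000-2000 us PWM range."""
    return max(lo, min(hi, int(value)))

if __name__ == '__main__':
    from pymavlink import mavutil
    master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
    master.wait_heartbeat()
    # Channels 1-4 are roll, pitch, throttle, yaw on a default ArduPilot
    # setup; a value of 0 releases a channel back to the real transmitter.
    master.mav.rc_channels_override_send(
        master.target_system, master.target_component,
        clamp_pwm(1600),  # nudge roll
        1500, 1500, 1500, 0, 0, 0, 0)
```

One caveat with RC overrides: if the script stops sending them, the autopilot's failsafe behavior takes over, so they need to be refreshed continuously.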
Hi Logan,
Any progress that you made with above route?
Very interested to know how Mission Planner performs running on the Pi 2 while it is also analyzing the image with OpenCV.
And how do you receive the video input from the drone? Using GStreamer? Or a simple video transmitter (5.8 GHz, etc.) fed into the Pi 2?
Thanks for sharing.
Waladi
Hey All,
I also apologize for my absence; I too took a few months out of the project.
I finally decided to use Python to send the movement commands to the Pixhawk (via MAVProxy), and I will extend my Python module with C++ (OpenCV) using Boost.
I finished testing my modules individually. However, today I connected the Raspberry Pi to the Pixhawk according to the link below:
http://dev.ardupilot.com/wiki/companion-computers/raspberry-pi-via-...
I followed the tutorial while powering my Pixhawk from my laptop's USB port. When I ran MAVProxy on the Raspberry Pi I got garbage. Do you think that's because I am powering the Pixhawk from my laptop's USB?
I recommend that you use Python and extend it with C++, because it is much easier. dronekit.io is a good starting point.
Thank you!
Cheers
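Since DroneKit came up, here is a hedged sketch of a relative move done the DroneKit way: compute a nearby lat/lon from a meter offset (a small-offset flat-earth approximation) and fly there with simple_goto(). The connection string and the 5 m offset are illustrative assumptions:

```python
import math

EARTH_RADIUS = 6378137.0  # WGS-84 equatorial radius, meters

def offset_location(lat, lon, d_north_m, d_east_m):
    """Approximate lat/lon reached after moving d_north_m/d_east_m meters.
    Good enough for small offsets away from the poles."""
    d_lat = d_north_m / EARTH_RADIUS
    d_lon = d_east_m / (EARTH_RADIUS * math.cos(math.radians(lat)))
    return lat + math.degrees(d_lat), lon + math.degrees(d_lon)

if __name__ == '__main__':
    from dronekit import connect, LocationGlobalRelative, VehicleMode
    vehicle = connect('/dev/ttyAMA0', baud=57600, wait_ready=True)
    vehicle.mode = VehicleMode('GUIDED')
    here = vehicle.location.global_relative_frame
    lat, lon = offset_location(here.lat, here.lon, 5, 0)  # 5 m north
    vehicle.simple_goto(LocationGlobalRelative(lat, lon, here.alt))
```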
Hey Nicolas-Michael!
I am making pretty much the same project as you, and I am also trying to deal with the same problem of controlling the Pixhawk from an on-board computer.
The solution I am trying to implement now is this:
I connect to the Pixhawk via MAVProxy and send the MAVLink command MANUAL_CONTROL, which overrides the RC input. Here is the description of this command:
https://pixhawk.ethz.ch/mavlink/#MANUAL_CONTROL
I would be grateful if somebody would share their experience with this method.
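For anyone trying this route, a minimal pymavlink sketch of sending MANUAL_CONTROL; the axis conventions in the comments follow the message description, but the connection settings and stick values are assumptions:

```python
def to_manual_axis(normalized):
    """Scale a -1.0..1.0 stick value to the -1000..1000 ints MANUAL_CONTROL uses."""
    return int(round(max(-1.0, min(1.0, normalized)) * 1000))

if __name__ == '__main__':
    from pymavlink import mavutil
    master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
    master.wait_heartbeat()
    # x=pitch, y=roll, z=throttle (0..1000), r=yaw, last field is a button bitmask
    master.mav.manual_control_send(
        master.target_system,
        to_manual_axis(0.2),   # gentle forward pitch
        to_manual_axis(0.0),   # roll centered
        500,                   # mid throttle
        to_manual_axis(0.0),   # yaw centered
        0)                     # no buttons pressed
```

Like RC overrides, MANUAL_CONTROL stands in for the pilot's sticks, so it has to be sent continuously or the autopilot will treat the input as lost.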
Hi Sergey
It may be that the answer lies in the instructions here: http://www.qgroundcontrol.org/dev/mavlink_onboard_integration_tutorial
With this, it should be possible to build some software on the Pi to send MAVLink commands directly. The Pi can then also run OpenCV for tracking etc. and provide the 'intelligence' required.
Unless someone has a better idea (which would be very welcome!), I'm going to take that route for now...
Cheers
Mike
Hi Everyone
I am working on a Masters project doing something similar, so I also want to control the Pixhawk using MAVLink commands sent from a Raspberry Pi.
I have the two boards communicating, but need to decide what to do next. As MAVProxy is a command-line application, I'm wondering whether it is really suited to feeding a continuous stream of messages to the Pixhawk. Is it best to build a program on the Raspberry Pi from scratch?
Any thoughts anyone?
Cheers
Mike
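On the continuous-stream question: one common pattern is to skip MAVProxy and run a plain Python loop with pymavlink that re-sends the setpoint at a fixed rate. A sketch, with the 4 Hz rate and connection settings as assumptions:

```python
import time

def stream_times(rate_hz, count):
    """Send times (seconds from start) for a fixed-rate message stream."""
    period = 1.0 / rate_hz
    return [i * period for i in range(count)]

if __name__ == '__main__':
    from pymavlink import mavutil
    master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
    master.wait_heartbeat()
    period = 1.0 / 4  # 4 Hz refresh
    while True:
        # Re-send the override every cycle so the autopilot never
        # sees a stale input and falls back to its RC failsafe.
        master.mav.rc_channels_override_send(
            master.target_system, master.target_component,
            1500, 1500, 1500, 1500, 0, 0, 0, 0)
        time.sleep(period)
```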