I was recently fiddling around with the code for a while and decided to turn it into a tutorial that others may find useful. Had I known all this long back, I am sure I would have saved a lot of time since then! I see a lot of beginners getting swayed, lacking know-how on MAVLink with respect to APM/PX4. The information on the internet is scattered and not much use!!
This will be an entertaining, "step by step" Part I of the series I plan to write on:
What it Covers:
- MAVLink, starting from scratch: what the hell it is, and how it works with APM/PX4
- Learn how developers think -> how ArduCopter communicates with Mission Planner and vice versa.
- Get a feel for 'how stuff works'.
Too much hype:) Well, this information has been collated from my experience and from the internet. I know there is information on the new Wiki; it tells you 'what', I plan to tell you 'how'! :)
Please let me know if you found it interesting. If there is enough response, I will make another tutorial with more 'step by step' know-how!!
- More on MavLink
- Learning Arducopter source code, Step by Step
- Making a swarm of (multiple) copters work with your 3DR telemetry radio! I am working on it.
See attachment for [MAVLink Tutorial for Absolute Dummies (Part –I)]
Edit: Please post your queries on the forum directly, as it is not possible for me to address all the queries I get by email!
Hi Shyam, nice post. Really good work. I'm learning the ArduCopter Arduino code.
If you could help me with a step-by-step understanding of the ArduCopter code, it would be great.
If you have any links to your blog, please share those too.
@Ronello, I am glad that the article I wrote over 3 years ago is still being found useful.
I received your message. There are two software stacks commonly in use today: a) the ArduPilot (APM) software stack and b) the PX4 software stack.
Stack a) was initially made for the APM flight controller, but it can now be ported to the Pixhawk flight controller as well. All the firmware from the DIYDrones community runs on Pixhawk hardware. Stack b), however, is made only for dedicated Pixhawk hardware.
Now, to answer your question:
1) GCS_Mavlink.pde and ArduCopter.pde are C++ files that are part of the ArduPilot software stack. So, if you are using a Pixhawk with the ArduPilot software stack (which I believe you are), you can find them in the link below. PDE and CPP are pretty much the same; don't mind the extension:
Everything you need in the firmware is in the "ArduCopter" folder structure, and the support libraries are one directory up, in the "libraries" folder.
2) Real-time x-y-z, roll-pitch-yaw, altitude, etc. are continuously sent from the Pixhawk to the ground control station (e.g. Mission Planner). Focus on the HEARTBEAT messages, which each node sends to the other every 1 second to keep the connection alive; start from there. The 'attitude (roll/pitch/yaw)' messages are sent at a much higher frequency, say 50 times a second (higher because, when you move your Pixhawk, you can see the orientation moving in the HUD (Head-Up Display) in Mission Planner).
3) "Is it possible for me to use these data as input variables in MATLAB?": YES. You can read whatever comes in (attitude, GPS data, or anything else from your Pixhawk) in the code and do anything with it. Otherwise, if you don't wish to do that, there is already some built-in MATLAB support in Mission Planner, but you must check this in the ArduPilot dev guide. AND/OR I found something here:
There is also something called MATMAV for the Pixhawk/Mission Planner combo that you can use (though I have never used it). Google is your friend!
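To make point (2) concrete, here is a small Python sketch of what a MAVLink v1 HEARTBEAT frame looks like on the wire. The field ordering, the X.25 checksum, and the HEARTBEAT CRC_EXTRA value (50) are as I recall them from the common message set, so verify against the official MAVLink docs; the payload values (quadrotor, ArduPilot, etc.) are just example numbers.

```python
import struct

def x25_crc(data: bytes, crc: int = 0xFFFF) -> int:
    # The X.25 checksum (CRC-16/MCRF4XX) used by MAVLink
    for byte in data:
        tmp = byte ^ (crc & 0xFF)
        tmp = (tmp ^ (tmp << 4)) & 0xFF
        crc = ((crc >> 8) ^ (tmp << 8) ^ (tmp << 3) ^ (tmp >> 4)) & 0xFFFF
    return crc

def pack_heartbeat(seq: int, sysid: int, compid: int) -> bytes:
    # HEARTBEAT (msgid 0) wire payload, largest fields first:
    # custom_mode (u32), type, autopilot, base_mode, system_status, mavlink_version
    payload = struct.pack('<IBBBBB', 0, 2, 3, 81, 3, 3)  # example values only
    # v1 header: 0xFE magic, payload length, sequence, sysid, compid, msgid
    header = struct.pack('<BBBBBB', 0xFE, len(payload), seq, sysid, compid, 0)
    crc = x25_crc(header[1:] + payload)       # checksum skips the magic byte
    crc = x25_crc(bytes([50]), crc)           # CRC_EXTRA seed for HEARTBEAT
    return header + payload + struct.pack('<H', crc)

frame = pack_heartbeat(seq=0, sysid=1, compid=1)
print(len(frame), hex(frame[0]))  # 6 header + 9 payload + 2 checksum bytes
```

Watching these 17-byte frames tick by once a second on the telemetry link is a good first exercise with a serial sniffer.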
Hope my answers help you a bit!
Ronello Ninettu said:
OMG THANK YOU FOR THIS! I'M A COMPLETE DUMMY AND REALLY APPRECIATE THIS! THANKS!
hopefully a dummy only for now lol
I am using 'jmavsim'. Yes, I have set different system ids for the drones using 'MAV_SYS_ID'.
My actual task is to get the UAV to follow a car. There is a Car-to-X interface which sends the GPS coordinates of the car to the UAV. My idea in simulating two drones was to emulate the behaviour of the car and the UAV: the first drone will be the car following a set mission, and the second drone will be the UAV which follows it based on the GPS data it receives. So basically the simulation would be a test platform to check the implementation before flying the actual UAV.
I am not sure if there is a better way to test this.
Shyam Balasubramanian said:
That's a cool project. Well, this is what I think:
1) When you are doing swarming, are you communicating directly between the two vehicles (two virtual Pixhawks), OR do you want the qGCS to be the master, commanding the two vehicles?
2) May I ask, which SITL platform are you using, Gazebo?
3) I assume you have assigned different 'System Ids' to the two vehicles, am I right?
1) The qGCS is only a display that shows you what is happening within the vehicle (Pixhawk). It can also send commands to the vehicle to change its state (e.g. ARMING, or sending mission WPs). But once the mission is on, it only receives data from the Pixhawk and shows you what is happening in the flight.
2) There are many approaches to do this, one of the good ones I see is:
The master vehicle may need to send the MAV_CMD_DO_FOLLOW MAVLink command to the slave vehicle, and the slave merely follows what the master says.
Master: send the MAV_CMD_DO_FOLLOW MAVLink message, after encoding the data into the FOLLOW_TARGET struct.
Slave: receive the MAV_CMD_DO_FOLLOW message, and decode the data from the FOLLOW_TARGET struct.
The slave will publish the follow_target object (in the Navigator module) and will need to do some magic with it. I don't think anyone is using it yet.
https://github.com/PX4/Firmware/tree/master/src/modules/navigator (see follow_target files).
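As a small illustration of the encode/decode step above: MAVLink carries global positions as int32 latitude/longitude in degrees scaled by 1e7, plus a float altitude in metres. The helper names below are hypothetical, and this sketch covers only that scaling convention, not the full FOLLOW_TARGET payload layout:

```python
import struct

def encode_target_position(lat_deg: float, lon_deg: float, alt_m: float) -> bytes:
    # Pack lat/lon as int32 degE7 (the usual MAVLink convention) and
    # altitude as a float32, little-endian like all MAVLink payloads.
    return struct.pack('<iif', int(lat_deg * 1e7), int(lon_deg * 1e7), alt_m)

def decode_target_position(buf: bytes):
    # Inverse of the above: unscale back to degrees and metres.
    lat_e7, lon_e7, alt = struct.unpack('<iif', buf)
    return lat_e7 / 1e7, lon_e7 / 1e7, alt

buf = encode_target_position(52.3740, 4.8897, 30.0)
print(decode_target_position(buf))
```

The integer scaling avoids float32 precision loss: a float32 latitude is only good to a few metres, while degE7 resolves roughly a centimetre.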
See all MAVLink messages at:
Focus on the Mavlink module and see how other messages are sent and received.
3) You will see a file:
This file has certain parameters that QGC receives from the Pixhawk (and that the user can set in QGC). See QGC -> Settings tab -> Parameters, and search for any of the listed parameters (e.g. NAV_FT_DST, the distance to keep from the follow target). These parameters can be constant (set and forget).
4) GCS will merely show what is happening at the two vehicles.
Disclaimer: the above was a quick idea I came up with in about 30 minutes, so I may be wrong. Please analyze it well before you start to implement. I am only showing you one possible way to do this smoothly ;)
Anjum Shariff said:
I am currently trying to simulate two drones using PX4 SITL. I am able to get two drones on a map in qGCS, plan missions, and fly.
What I would like to achieve is to plan a mission for the first drone and have the second one follow it. Do I need to make changes on the PX4 side or in qGCS? Could you please help me with this?
Thanks for your comments. You would probably need to implement a vision-based (or other-sensor-based) algorithm which detects the movement of the car, converts the detection into certain 'custom' MAVLink commands, and sends them to your Pixhawk. A companion computer like a Raspberry Pi can do this job for you (vision algorithm + custom MAVLink sending). The Pixhawk will then need to interpret this MAVLink message and do something with it (like moving the vehicle).
In Pixhawk, there are two files you need to focus on:
1) mavlink_messages.cpp (to send MAVLink messages to the GCS or a companion computer like an RPi), and
2) mavlink_receiver.cpp (to receive messages coming in from the GCS or a companion computer like an RPi).
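Conceptually, the first thing the receive side has to do is split the incoming byte stream into frames. Here is a stripped-down Python sketch of just that framing step for MAVLink v1 (real code, like mavlink_receiver.cpp, would also verify checksums and handle MAVLink v2 frames):

```python
def parse_frames(stream: bytes):
    # Minimal MAVLink v1 frame splitter: scan for the 0xFE start byte,
    # read the declared payload length from the header, and slice out
    # header (6 bytes) + payload (n bytes) + checksum (2 bytes).
    frames, i = [], 0
    while i < len(stream):
        if stream[i] != 0xFE or i + 6 > len(stream):
            i += 1          # not a start byte (or truncated header): resync
            continue
        n = stream[i + 1]   # payload length field
        end = i + 6 + n + 2
        if end > len(stream):
            break           # incomplete frame at the tail: wait for more bytes
        frames.append(stream[i:end])
        i = end
    return frames

# A fake 1-byte-payload frame (the two checksum bytes here are dummies):
frame = bytes([0xFE, 1, 0, 1, 1, 0, 0xAA, 0x12, 0x34])
print(parse_frames(b'\x00' + frame + b'\xFF' + frame))
```

Once a frame is isolated, the receiver dispatches on the msgid byte (header byte 5 in v1) to the right decoder, which is essentially what the big switch statement in mavlink_receiver.cpp does.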
In PX4, everything is module based, i.e. you have a number of applications (one of them is Mavlink) that run in parallel in the background on the NuttX (real-time) OS. All modules are built so that they can communicate with each other in a "publish and subscribe" fashion. Any module interested in another can subscribe to it (in reality, a module subscribes to a topic), but this is just to lay out an understanding for you. It's fairly easy once you get the gist of it. In each module, you will find an entry point, 'main'; that is where you start.
For example, the Commander module subscribes to sensor data (and regularly checks whether new data is available to consume) in its (so-called) never-ending while loop. Each module (application) runs at a particular loop frequency, based on how important that module is (its priority in the NuttX OS). E.g. the 'Commander' module (as the name suggests) has the MAX priority ;)
Look up, https://github.com/PX4/Firmware/tree/master/src/modules
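The publish/subscribe idea above can be sketched in a few lines. This is a toy model for intuition only, not the actual uORB API; the class and method names are made up:

```python
from collections import defaultdict

class Bus:
    # Toy publish/subscribe bus mimicking the uORB idea: publishers
    # overwrite the latest sample per topic, subscribers poll for it.
    def __init__(self):
        self.latest = {}
        self.generation = defaultdict(int)  # bumped on every publish

    def publish(self, topic, msg):
        self.latest[topic] = msg
        self.generation[topic] += 1

    def subscribe(self, topic):
        return Subscription(self, topic)

class Subscription:
    def __init__(self, bus, topic):
        self.bus, self.topic, self.seen = bus, topic, 0

    def updated(self):
        # Has anything been published since we last copied?
        return self.bus.generation[self.topic] > self.seen

    def copy(self):
        # Take the latest sample and mark it as consumed.
        self.seen = self.bus.generation[self.topic]
        return self.bus.latest[self.topic]

bus = Bus()
sub = bus.subscribe('sensor_combined')
bus.publish('sensor_combined', {'gyro': (0.0, 0.01, 0.0)})
if sub.updated():            # what a module's while loop does each tick
    sample = sub.copy()
    print(sample)
```

A module like Commander is essentially this polling loop wrapped around many such subscriptions, running at its own rate and priority.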
Thank you for such an amazing tutorial. I am pretty new to UAVs and everything related to them. I am working on a project where I plan to implement a follow algorithm for a UAV to follow a car. I will be working on the PX4 firmware. The communication between the GCS and the UAV is via MAVLink. It would be really helpful if you could give any pointers on how to go about this. Currently I am just looking at the PX4 source code and not really progressing much.
Shyam Balasubramanian said:
Hello Shyam, et al.,
I would simply like to create Arduino code to emulate MAVLink messages sent to a FLIR Vue Pro camera, to write EXIF metadata. My understanding is that the camera does not send MAVLink messages; rather, it just listens for them and processes them when they arrive. I'm hoping not to have to reinvent the wheel, as they say. Has anyone done this, or does anyone know how the FLIR Vue Pro gets MAVLink messages from the Pixhawk?
Sorry guys, I have been away from the forums for a while. I know most of your questions date back months. Now that I am active again, I will try to answer the questions that come up :)