Formation Flying Mesh Network Demo

https://www.youtube.com/watch?v=oH9C43To3Dk

A few co-workers and I at NASA's Marshall Space Flight Center (MSFC) have been working on a small formation flying project for the last 9 months.  The goal is to demonstrate formation flying using a decentralized mesh network: first by flight testing the concept with small UASes (nearly standard 3DR quads), and then by simulating the same system on satellites.  We had a very successful demo flight last week with 5 quads in the air, and I thought now would be a great time to show our results and share our work with the sUAS community.

We've been using 3DR UAS systems for several years on a number of projects, including an internal NASA search and rescue unmanned systems competition and aerial footage of test flights of the Mighty Eagle lunar lander testbed at MSFC.

For this new project, we took the standard quad frame and electronics, with a Pixhawk as the flight computer, and added a BeagleBone Black to run our formation logic plus a pair of XBee radios to provide the formation communication between our vehicles (which we call nodes).  Each node communicates over its two XBees, each running a separate mesh network for redundancy.  The nodes exchange the state information they receive from their Pixhawks (GPS position, etc.) and use that information to determine where they are in the formation and where they need to go.  Since they all share their information, there is no single master or leader of the formation.
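
To make that concrete, here's a minimal sketch of the kind of fixed-size state packet each node could hand to both XBee meshes.  The field layout, start marker, and checksum are illustrative assumptions, not our actual wire format.

```cpp
// Illustrative only: this layout is an assumption, not our actual wire
// format. It shows the idea of one fixed-size state frame that gets
// written to both XBee radios so the data survives the loss of either mesh.
#include <cstddef>
#include <cstdint>
#include <cstring>

#pragma pack(push, 1)            // fixed wire layout, no padding
struct NodeState {
    uint8_t  node_id;            // which vehicle this state came from
    uint32_t gps_time_ms;        // time of the fix, for staleness checks
    int32_t  lat_e7;             // latitude  in degrees * 1e7
    int32_t  lon_e7;             // longitude in degrees * 1e7
    int32_t  alt_cm;             // altitude in centimeters
};
#pragma pack(pop)

// Simple additive checksum (an assumption; any CRC would do).
static uint8_t checksum8(const uint8_t* p, size_t n) {
    uint8_t sum = 0;
    while (n--) sum += *p++;
    return sum;
}

// Build one frame; the caller writes the same bytes to both radios.
size_t pack_state(const NodeState& s, uint8_t* buf) {
    buf[0] = 0xFA;                         // start-of-frame marker (assumed)
    std::memcpy(buf + 1, &s, sizeof s);
    buf[1 + sizeof s] = checksum8(buf + 1, sizeof s);
    return 2 + sizeof s;
}
```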

We have two primary modes.  In the first mode, the vehicles are provided a GPS location and a formation shape which they then fly to and establish a formation.  We can change the formation position and shape by sending an update from our ground control station (GCS) using some custom GUIs we developed. In the second mode, we can designate one vehicle as the leader of the formation, and the other nodes will automatically begin following that leader as we move it either via RC or from our GCS.  
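
Because every node sees the same shared state, slot assignment can be a deterministic rule that each vehicle evaluates on its own.  Here's a minimal sketch assuming a simple line-abreast shape with fixed spacing; our actual shapes and spacing logic are more general.

```cpp
// Leaderless slot assignment: every node runs the same deterministic rule
// over the shared node list, so all vehicles agree on who flies where
// without electing a master. The line-abreast geometry is an assumption
// made for illustration.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

struct Offset { double north_m, east_m; };   // offset from formation center

// Assumes ids is non-empty and contains my_id (both learned off the mesh).
Offset my_slot(std::vector<uint8_t> ids, uint8_t my_id, double spacing_m) {
    std::sort(ids.begin(), ids.end());       // identical order on every node
    const size_t i =
        std::find(ids.begin(), ids.end(), my_id) - ids.begin();
    // Spread the vehicles east-west, centered on the commanded GPS point.
    const double east =
        (static_cast<double>(i) - (ids.size() - 1) / 2.0) * spacing_m;
    return Offset{0.0, east};
}
```

In leader-follow mode, the same idea applies with the offsets taken relative to the leader's last reported position instead of a fixed GPS point.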

We've always gotten great help and feedback from this open source community, so we just wanted to return the favor and share the work we've been doing. Here's a video of our latest flight of 5 vehicles.  The video has overlays of two of our custom GUIs as well as the audio from our ground control team.  

Attachment: UserCode.pde

Comments

  • What's the max number of nodes your network supports? Would this work with 3DR radios? I've read that XBees have some sort of mesh support, and I was curious whether you're using any of that.

    Thanks,

    Daniel

  • Exactly.

  • So if I understand this correctly, you just pipe the information you need in and out of the extra UART port using custom code compiled into ArduCopter?

  • UAV_Enthusiast,

    The BeagleBone interfaces to the Pixhawk using one of the extra UARTs (Serial 4/5).  Currently we just parse the serial data directly in some custom interface code we compiled into ArduCopter, and we implement the commands in the copter code by updating the position command for Guided mode.  We aren't currently using MAVLink, but it would probably be a good idea to define a custom MAVLink message to formalize the communication a little more.
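
    For anyone wanting to try a similar hookup, here's a rough sketch of what the receiving side could look like: a byte-at-a-time parser that a UserCode.pde hook could feed from the spare UART.  The frame format mirrors the packet idea above and is an assumption, as is the notion that the caller then pushes the result into the Guided-mode target; treat it as a starting point, not our exact code.

    ```cpp
    // Sketch only: assumed frame (0xFA marker + payload + 8-bit checksum).
    // A UserCode.pde hook would feed this from the spare UART (Serial 4/5
    // in our setup); the real interface code differs from this.
    #include <cstddef>
    #include <cstdint>
    #include <cstring>

    #pragma pack(push, 1)
    struct GuidedCmd {
        int32_t lat_e7;   // target latitude  * 1e7
        int32_t lon_e7;   // target longitude * 1e7
        int32_t alt_cm;   // target altitude in centimeters
    };
    #pragma pack(pop)

    // Feed one received byte at a time; returns true when a complete,
    // checksum-valid command has been assembled into 'out'.
    bool parse_byte(uint8_t b, GuidedCmd& out) {
        static uint8_t buf[2 + sizeof(GuidedCmd)];
        static size_t  n = 0;
        if (n == 0 && b != 0xFA) return false;   // hunt for the start marker
        buf[n++] = b;
        if (n < sizeof buf) return false;        // frame not complete yet
        n = 0;
        uint8_t sum = 0;
        for (size_t i = 1; i <= sizeof(GuidedCmd); i++) sum += buf[i];
        if (sum != buf[1 + sizeof(GuidedCmd)]) return false;  // bad checksum
        std::memcpy(&out, buf + 1, sizeof out);
        return true;  // caller would then update the Guided-mode target
    }
    ```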

  • Out of curiosity, how are you guys issuing your commands? It appears that you reference horizontal movement and WP modes, but what functionality is the BeagleBone providing to issue those commands? Are you just overriding RC commands, or sending a custom MAVLink message through what would be the radio port?

    Thanks.

  • This UTM-30LX unit offers a 270° field of view with a 0.1–30 m range. I believe it's also what Alex Kushleyev has used. I don't know what power and weight allowance you have left, but at another 12 V and 370 g, I imagine it will be a challenge to fit, let alone program.

  • Yes, all position data processing is done onboard the vehicles.  It also gets sent to the ground, but just for monitoring purposes.  We could definitely incorporate some sensor for proximity sensing; we just haven't had the opportunity to explore that yet.  We'd also like to find a sensor that provides very wide coverage around the vehicle (i.e., one that can detect objects approaching from any angle).

  • Since it looks like all of the processing is done on each unit's Pixhawk, could some lidar- or sonar-based proximity sensing be incorporated?

  • Hclmed,

    Yes, currently all the positioning is based on GPS, so as you guessed, we keep the spacing among the vehicles greater than the tolerance to which we trust the GPS, roughly 1 to 3 meters.  We've discussed using another sensor for finer-grained relative distancing between the vehicles but haven't yet explored that option.
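
    As a concrete example of what that tolerance check implies, here's a minimal sketch; the flat-earth distance and the 3 m default are illustrative, taken from the figure above.

    ```cpp
    // Sketch of a spacing check against GPS accuracy. A flat-earth distance
    // is adequate at formation ranges; the 3 m default is just the upper
    // end of the tolerance mentioned above.
    #include <cmath>

    double separation_m(double lat1, double lon1, double lat2, double lon2) {
        const double R = 6371000.0;                    // Earth radius, m
        const double d2r = 3.14159265358979323846 / 180.0;
        const double dlat = (lat2 - lat1) * d2r;
        const double dlon = (lon2 - lon1) * d2r;
        const double mlat = 0.5 * (lat1 + lat2) * d2r; // mid-latitude
        const double x = dlon * std::cos(mlat);        // shrink east-west
        return R * std::sqrt(dlat * dlat + x * x);
    }

    bool spacing_ok(double sep_m, double gps_tol_m = 3.0) {
        return sep_m > gps_tol_m;  // demand more room than GPS can resolve
    }
    ```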

  • Is this based entirely on GPS? If so, is its tolerance limited by the 1-meter-or-so accuracy of GPS?
