
Red Balloon Finder

As many probably know, this year's SparkFun Autonomous Vehicle Competition (AVC) is similar to last year's except that a new challenge has been added in which we can gain extra points by finding and popping 1m red balloons.  It's pretty difficult to do (see failed attempts #1, #2, #3) but, as you can see from the video above, it's finally getting the balloons.

The setup is detailed on this page on the developer wiki (scroll down to Red Balloon in the Example Projects section) but in any case, it uses a Pixhawk running a modified version of AC3.2 (code here, AC3.2 testing thread here).  The Pixhawk retains all the regular ArduCopter functionality but in addition accepts commands from an external navigation computer, an Odroid U3, when it is in the Guided flight mode or when it is in AUTO and running a newly created MAVLink NAV_GUIDED mission command.  This NAV_GUIDED command takes some special arguments including a timeout (in seconds), a minimum and maximum altitude, and a maximum distance; if the Odroid takes too long to find the balloon, or gives crazy commands that would cause the copter to fly away, the Pixhawk retakes control.
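
For anyone reproducing this, here is a minimal sketch of what building such a mission could look like, written with today's DroneKit-Python and pymavlink naming rather than the code from the time: in current MAVLink, the limits described above live in the MAV_CMD_DO_GUIDED_LIMITS command, paired with MAV_CMD_NAV_GUIDED_ENABLE.  The connection string and limit values are assumptions, and a real mission would also need takeoff and waypoint items around these.

```python
# Sketch only: hand an AUTO mission over to the companion computer with
# safety limits, using DroneKit-Python.  Values are illustrative.
from dronekit import connect, Command
from pymavlink import mavutil

vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)  # assumed endpoint

cmds = vehicle.commands
cmds.clear()

# Limits: give the Odroid 90 seconds, keep the copter between 2m and 20m
# altitude and within 100m horizontally, otherwise the Pixhawk retakes control.
cmds.add(Command(0, 0, 0, mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
                 mavutil.mavlink.MAV_CMD_DO_GUIDED_LIMITS, 0, 1,
                 90, 2, 20, 100, 0, 0, 0))

# Hand control to the external navigation computer (param1 = 1 enables).
cmds.add(Command(0, 0, 0, mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT,
                 mavutil.mavlink.MAV_CMD_NAV_GUIDED_ENABLE, 0, 1,
                 1, 0, 0, 0, 0, 0, 0))
cmds.upload()
```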

So while the Odroid is in control, it first requests that the Pixhawk slowly rotate the copter while it pulls images from the Logitech C920 webcam and uses OpenCV to search for blobs of red in the images.  During the search it keeps track of the largest red blob it has seen, so after the copter has rotated 360 degrees it sends commands to point the vehicle in the direction of that blob and then sends 3D velocity requests 5 times per second to guide the copter towards the top of the balloon.  Leonard Hall helped me greatly in improving the control algorithm, both for the new velocity controller that was added to ArduCopter and for the strategy the Odroid should use to best get to the balloon.
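
The real code is linked in the next paragraph, but as a rough sketch of the colour search, the standard OpenCV approach looks something like this (the HSV thresholds are illustrative guesses, and the two-value findContours return assumes OpenCV 4.x):

```python
# Sketch of a red-blob search: threshold red in HSV (red wraps around the
# hue axis, so two bands are combined) and keep the largest contour.
import cv2
import numpy as np

def find_largest_red_blob(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))
    upper = cv2.inRange(hsv, np.array([170, 120, 70]), np.array([180, 255, 255]))
    mask = cv2.bitwise_or(lower, upper)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    best = max(contours, key=cv2.contourArea)
    (x, y), radius = cv2.minEnclosingCircle(best)
    return (x, y), radius  # pixel centre and apparent size of the blob
```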

All of the communication between the Pixhawk and the Odroid uses Kevin Hester's Drone API and Tridge's MAVProxy.  It's relatively easy to use and allowed me to write all the code in Python.  That code is all here.
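
For a flavour of what those 5Hz velocity requests look like in Python, here is a minimal sketch using today's DroneKit-Python naming (the project itself used the first-generation Drone API, so the exact calls here are an assumption):

```python
# Sketch: stream velocity targets to the flight controller in the NED frame.
import time
from dronekit import connect
from pymavlink import mavutil

vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)  # assumed endpoint

def send_ned_velocity(vx, vy, vz):
    """Request a velocity (m/s, NED frame); only velocity fields are used."""
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,                          # time_boot_ms, target sys/comp
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,
        0b0000111111000111,               # type_mask: ignore all but velocity
        0, 0, 0,                          # position (ignored)
        vx, vy, vz,                       # velocity
        0, 0, 0,                          # acceleration (ignored)
        0, 0)                             # yaw, yaw_rate (ignored)
    vehicle.send_mavlink(msg)

# e.g. nudge north at 1 m/s, 5 times per second, for two seconds;
# vz = 0 asks the controller to hold the current height.
for _ in range(10):
    send_ned_velocity(1.0, 0.0, 0.0)
    time.sleep(0.2)
```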


A couple of extra things came out of this.  One was that I managed to use Python's multiprocessing module to run the image capture and image processing in separate processes (so they run in parallel).  It seems that unless images are pulled from the camera at 15~30Hz they become extremely laggy; in one test I measured a 1.3 second lag in the images from the camera!  With multiprocessing that's down to less than 0.2 seconds, and there are two more cores still free on the Odroid, so there's room for more sophisticated work to be done!
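
A minimal sketch of that capture/processing split, assuming a depth-one queue so the processing loop always sees the freshest frame:

```python
# Sketch: drain the camera in a dedicated process so frames never go stale,
# and pass only the newest frame to the (slower) processing loop.
import multiprocessing as mp
import queue
import cv2

def capture_proc(q):
    cap = cv2.VideoCapture(0)        # assumes the C920 is /dev/video0
    while True:
        ok, frame = cap.read()       # pull continuously at the camera's rate
        if not ok:
            continue
        try:
            q.get_nowait()           # drop the stale frame, if any
        except queue.Empty:
            pass
        try:
            q.put_nowait(frame)
        except queue.Full:
            pass

if __name__ == '__main__':
    q = mp.Queue(maxsize=1)
    mp.Process(target=capture_proc, args=(q,), daemon=True).start()
    while True:
        frame = q.get()              # always close to the latest image
        # ... run the balloon search on `frame` here ...
```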

The other was the testing setup: at Tridge's suggestion, I set up a balloon simulator that works within SITL.  It uses the simulated copter's position and attitude to create a simulated image of the horizon and the balloon (if it is within the field of view of the simulated camera).  This fake image is then fed into the image recognition Python code, making it an end-to-end test.  This was all possible because both the Odroid and our simulator run on Ubuntu.  Having the simulator allowed a lot of bench testing before any real flights happened.
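
As a simplified stand-in for the idea (not the actual SITL balloon simulator: this assumes a level camera with a fixed heading and ignores attitude entirely), the projection step could look like this:

```python
# Sketch: render a crude horizon plus a red balloon at a known NED position
# into a synthetic camera image, then feed that to the real detection code.
import numpy as np
import cv2

def render_balloon(cam_pos_ned, balloon_pos_ned, w=640, h=480, fov_deg=70):
    img = np.zeros((h, w, 3), np.uint8)
    img[:h // 2] = (235, 206, 135)            # sky (BGR)
    img[h // 2:] = (40, 100, 40)              # ground
    rel = np.asarray(balloon_pos_ned, float) - np.asarray(cam_pos_ned, float)
    if rel[0] <= 0:                           # balloon behind the camera
        return img
    f = (w / 2) / np.tan(np.radians(fov_deg) / 2)   # focal length in pixels
    u = int(w / 2 + f * rel[1] / rel[0])      # east offset -> image right
    v = int(h / 2 + f * rel[2] / rel[0])      # down offset -> image down
    radius_px = int(f * 0.5 / np.linalg.norm(rel))  # 1m balloon, 0.5m radius
    if radius_px > 0:
        cv2.circle(img, (u, v), radius_px, (0, 0, 255), -1)
    return img

# e.g. camera at the origin, balloon 30m north and 8m up (NED: z is down)
fake = render_balloon([0, 0, 0], [30, 0, -8])
```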

If you want to see a really impressive use of image processing with ArduPlane, check out Michael Darling's video here of one ArduPlane following another.  Michael's set-up is similar in that he uses the same camera, an external nav computer (a BeagleBone Black) and OpenCV.  I'm not sure what communication method he used between the BBB and the Pixhawk.

I also hope that over the next year we will enable more image processing computers to work with ArduCopter (and ArduPlane and ArduRover), perhaps leading to the safe precision landings Jesper Andersen suggested in this blog post.

All feedback welcome!


Comments

  • Hi Randy,

    I am still rather new to the field of computer vision and am currently doing a similar project involving object tracking on a drone with an Odroid XU4. I would like to know more about your balloon finder simulation. It would be of great help if you could explain how the simulated horizon and balloon were created and how the fake image was fed into the Python code. Thank you!

  • OK I got it running on the simulator :-)

  • Great, thanks for your help Randy!

    I already have the RPi2 running OpenCV so I'll give it a try. Based on the result, I might get a NAVIO2, provided there is enough processing power left (tweaking the multi-threading and the RT kernel). My concern with this kind of setup is the USB bottleneck when streaming a camera feed while also running the downlink to the GC Station over USB WiFi. It happened a few times when I was streaming MJPEG from a cheap Logitech C170 (no hardware compression) with my BBBMINI... fortunately it was on the bench. I will get a C920 this time.

    Alternatively, I might use the BBBMINI as the autopilot and keep the USB downlink on the BBB side, so guided mode would run over Ethernet for the "Balloon Finder" and over WiFi for the GC Station.

    Anyway, this is an amazing project (kudos to you) and I look forward to keeping you updated on Gitter.

  • @Patrick, great that you're going to give it a try.  Feel free to ask me questions here (or on Gitter) if you get stuck.  If I were going to do this again I would consider using a NAVIO2 on an RPi2 because the vision and flight control could be done on a single board, making the overall setup smaller.  There's some danger the vision processing would interfere with the flight code unless care is taken to set the priority of the flight controller above the DroneKit stuff (see the sketch below).

    If space is not a big concern then I think the Odroid is still a very good choice.
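
    A minimal sketch of that priority idea (the call and value are illustrative assumptions, not from the actual setup): the vision/DroneKit process can simply de-prioritise itself so the flight code always wins the CPU.

    ```python
    # Sketch: lower this (vision/DroneKit) process's scheduling priority on
    # Linux so it cannot starve a flight controller on the same board.
    import os
    os.nice(10)   # positive increment = lower priority; value is illustrative
    ```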

  • Hello Randy,

    I am planning to start this project and would like to know: do you still recommend using the Odroid (now the XU4), or would you now choose another platform in the same price range?

    Best Regards

  • @Randy,

    Thank you, Sir :)

  • @dronebohne,

    Yes, that looks like a good way to power it.

  • Would that be how to power the Odroid? (photo attached)

  • Hi @Randy,

    I'm coding a project similar to your red balloon finder, with a similar setup (quad + Pixhawk + Odroid U2 + cam).

    I wanted to ask whether it's better to drive the drone via RC override messages with direct channel PWM assignments (i.e. channel 3 at 1500, channel 1 at 1600, channel 2 at 1300 while in ALT_HOLD/LOITER) or to use Guided mode.
    I've checked your Python code and I've seen that you used Guided mode. However, I haven't found enough docs about it and haven't understood well how to use that mode. On the other hand, I've already implemented the RC override functions, even though I'm having some difficulty defining a good relationship between PWM value and duration at low speed/short distance (i.e. whether it's better to set RC 1 to 1600 for 2 seconds or RC 1 to 1900 for half a second).


    In Guided mode, how can you keep a steady height?
    Thanks

  • @Adam,

    You could use a ground station to switch the vehicle into Guided mode if the pilot can't flip the flight mode switch directly for some reason.  By the way, we always recommend having a transmitter available to retake control in a manual flight mode (like Stabilize or AltHold) because we're still not at the point where the autopilot can get itself out of every situation on its own.

    If you're going to use AUTO mode (instead of Guided) you'll need to use Mission Planner or APM Planner 2 before take-off to load a mission containing the NAV_GUIDED_ENABLE command.

    Best of luck.
