Drone color tracking

This post is about some experiments I'm doing using Python and OpenCV and, of course, one of my drones. In this case the drone used for taking the videos was my hexacopter (Alduxhexa).

The point is to explore different techniques and algorithms to track colours and objects, and then follow them using a multicopter. In the video you will see my hexacopter flying with a GoPro always pointing down at the ground. I want to be able to identify colours on the ground.

drone-color-tracking.png?width=600

There are now lots of similar examples and big projects already doing computer vision with drones, but I just wanted to show you my experiments, share my code with everyone, get some pointers and of course collaborate with people who are interested.

I'm using:

  • Hexacopter
  • GoPro Hero 3
  • Pixhawk
  • Tarot gimbal
  • Python
  • OpenCV

This is a picture of my hexacopter used to get those videos:

P1000626.jpg?width=600

My GitHub repository with the code is here. It contains several more examples, like face detection and so forth. The code works on Raspberry Pi and Mac.

The next step is to make the hexacopter centre on and follow a specific target using an extra onboard computer, perhaps an RPi or similar.
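As a rough sketch of what that centring step could look like: a hypothetical proportional controller mapping the target's pixel offset to velocity commands. The gain, the speed limit and the camera axis mapping are all assumptions for illustration, not my actual onboard code:

```python
def centre_velocity(cx, cy, width, height, gain=0.005, v_max=1.0):
    """Map the tracked target's pixel position to (forward, right) velocities in m/s.

    Assumes a downward-facing camera with image 'up' aligned to the
    vehicle's nose (a mounting assumption; flip signs if yours differs).
    """
    ex = cx - width / 2.0    # +ve: target is to the right of centre
    ey = cy - height / 2.0   # +ve: target is below centre (behind the vehicle)
    clamp = lambda v: max(-v_max, min(v_max, v))
    vx = clamp(-gain * ey)   # forward velocity: fly toward the target
    vy = clamp(gain * ex)    # rightward velocity
    return vx, vy
```

A target dead-centre yields (0.0, 0.0); a real controller would add a deadband around the centre and probably integral/derivative terms to avoid oscillating over the target.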

If you have questions or suggestions, don't hesitate to contact me.

Check my blog http://aldux.net/blog


Comments

  • @Aldo,

    I have been reading your blogs and GitHub with great interest. We have created a group to build and develop video tracking and navigation systems using companion computers, and I invite all interested parties to take part in the discussion on the companion computer working group.

    I have added a proposal for a companion system architecture here; hopefully you will join us.

  • @Randy Thanks Randy! I have seen all the code in that project and other similar ones; very helpful, and I will use some parts to write the control for this one. I will control using velocity and perhaps acceleration vectors, sending SET_POSITION_TARGET_LOCAL_NED, but I cannot seem to find more info about that, just from the red-balloon popper and the smart camera of Daniel Nugent... Is there a plan to include that in DroneAPI? In the past I've done RC override to control a MultiWii quadcopter with a mocap system; take a look here: http://aldux.net/blog/tego-position-control/ and also watch this video: https://www.youtube.com/watch?v=suD0DdpGi8k

    @Vinicius thanks! Show some results! I'll also post my performance comparison ;)

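    For reference, the SET_POSITION_TARGET_LOCAL_NED velocity command mentioned above can be sent with pymavlink roughly like this. This is a sketch assuming an already-connected `master` link; the type mask tells the autopilot to honour only the velocity fields of the message:

    ```python
    try:
        from pymavlink import mavutil
        MAV_FRAME_LOCAL_NED = mavutil.mavlink.MAV_FRAME_LOCAL_NED
    except ImportError:
        MAV_FRAME_LOCAL_NED = 1  # MAVLink common value for MAV_FRAME_LOCAL_NED

    def send_velocity(master, vx, vy, vz):
        """Send a LOCAL_NED velocity setpoint (m/s) to the autopilot."""
        master.mav.set_position_target_local_ned_send(
            0,                         # time_boot_ms (not used)
            master.target_system,
            master.target_component,
            MAV_FRAME_LOCAL_NED,
            0b0000111111000111,        # type_mask: enable only the velocity fields
            0, 0, 0,                   # x, y, z position (ignored)
            vx, vy, vz,                # velocities in m/s
            0, 0, 0,                   # accelerations (ignored)
            0, 0)                      # yaw, yaw_rate (ignored)
    ```

    Resent at a few Hz while the copter is in GUIDED mode, this keeps it moving at the requested velocity.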
  • Looks good! I have an ODROID-XU3 and will take a look at your code!

  •

    Looking good. We've got some MAVLink messages available that you can use to allow the companion computer (as we call it) to control the vehicle. In particular I used the velocity controller for the red-balloon-popper.

  • Thanks Gary!

    I indeed have an Odroid U3 with the eMMC module. I'm waiting for the parts to arrive in Scotland, so maybe next week I'll do the testing...

    I already tested some of my algorithms on the RPi B+ and on the new RPi 2, and as expected, the RPi 2 is more powerful! :)

    Look at some of my companion computers: Companion-computers.jpg?width=400

    Cheers!!

  • Hi Aldo,

    This is really great, excellent results. 

    A few thoughts for the real time on board version.

    The little Odroid U3 quad core is an excellent small computer for this, and it might be worth looking at their newer XU3 eight core as well.

    I have a U3 myself; make sure you get all the stuff you need with it (like the Linux eMMC module and maybe the USB-UART module), and the IO shield is handy for the U3.

    http://www.hardkernel.com/main/products/prdt_info.php?g_code=G14044...

    The Odroid U3 and XU3 are available with either a preloaded Linux or Android eMMC module.

    Also, the Nvidia Tegra Jetson, with its 192 CUDA GPU cores, is more than powerful enough for this, but probably a bear to program.

    https://developer.nvidia.com/jetson-tk1

    Best Regards,

    Gary

  • The video was post processed after the flight...

    I'm currently waiting for a couple of carbon arms, since mine tragically died due to a very strong gust in Scotland... hehehe

    But after getting those arms, I'll do a real-time video using an RPi 2 and an Odroid, and of course I will post both tests.

  •

    Nice!

    Does the image processing happen in real time on the drone during flight, or is it post processed after the flight?
