Visual Tracking Gimbal Control

Hey peeps,

So I've been playing around with drone videography lately, and I find it really hard to film while also controlling the camera gimbal to keep the things I'm interested in in frame.

Being a roboticist, I hacked together a working prototype that uses the video stream to track anything I want, using computer vision algorithms:
https://www.youtube.com/watch?v=QJW9ReKU-0I&list=UUy6SYEB_Gk40j9NOy8RI5aQ

Currently, I have video tracking software running on a PC that controls a gimbal to keep the region of interest I select in frame. I'm also planning to develop an Android/iPad app so you can define a subject with a simple two-finger multi-touch gesture.
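
For the curious, the core loop is conceptually simple: grab a frame, update the tracker on the selected region, and nudge the gimbal toward the region's centre. A stripped-down sketch of the idea (not my actual code; the OpenCV KCF tracker, the serial protocol, and the gain below are just placeholders):

    import cv2
    import serial

    cap = cv2.VideoCapture(0)                        # incoming video stream
    ok, frame = cap.read()
    roi = cv2.selectROI("select subject", frame)     # drag a box around the subject
    tracker = cv2.TrackerKCF_create()
    tracker.init(frame, roi)

    gimbal = serial.Serial("/dev/ttyUSB0", 115200)   # hypothetical gimbal link
    KP = 0.05                                        # proportional gain, degrees per pixel

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        found, (x, y, w, h) = tracker.update(frame)
        if found:
            # pixel error between the tracked box centre and the image centre
            err_x = (x + w / 2) - frame.shape[1] / 2
            err_y = (y + h / 2) - frame.shape[0] / 2
            # simple proportional pan/tilt correction (placeholder protocol)
            gimbal.write(f"P{-KP * err_x:.2f} T{KP * err_y:.2f}\n".encode())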

Is anyone out there into something like this? I'd love to hear your feedback!

Thanks in advance

Replies

  • MR60

    This is absolutely great! You apparently created this as easily as breathing. Are you a yet-to-be-discovered genius?

    Anyhow, we need this as an add-on kit we can use on all of our APM vehicles: rover, plane, copter. For a rover we can use an i7, no problem. For a copter in particular, though, it would be good to know whether this can run on a BeagleBone Black (with the RT kernel, which has a bit less processing headroom than the plain vanilla kernel). Do you intend to test in this ARM environment to verify it holds a real-time 100 Hz cycle rhythm, ideally with no lag?

    • I have not tested it on the BBB, but I would love to try. Other potential platforms with a bit more juice are the NVIDIA TK1 and the ODROID boards.

      Yes, that's definitely on the roadmap! Based on the feedback I'm getting from this thread, it seems an ARM port is bumping up in priority. Thanks a lot for the feedback! I plan to drop this on my Tegra board this week and will let you know how it goes.
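
      For reference, a quick way to check whether the pipeline holds a 100 Hz budget on a given board is just to time each frame on the board itself. Rough sketch (the process_frame hook is a stand-in, not my actual code):

          import time

          def benchmark(process_frame, frames, budget_s=0.01):    # 100 Hz -> 10 ms per frame
              worst = 0.0
              for frame in frames:
                  t0 = time.perf_counter()
                  process_frame(frame)                            # stand-in for the tracking step
                  worst = max(worst, time.perf_counter() - t0)
              verdict = "meets" if worst <= budget_s else "misses"
              print(f"worst frame time: {worst * 1000:.1f} ms ({verdict} the 10 ms budget)")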

  • Fantastic project, Yan,

    We would be keen to learn how you finish up with this. Have you seen these guys? http://www.airdog.com They recently ran a Kickstarter campaign, which is how we discovered them: https://www.kickstarter.com/projects/airdog/airdog-worlds-first-aut...

    Looks like they possibly utilise a combination of sensors, communicating with the RPAS in order to keep the subject in frame.

    • Thank you!

      Yeah, I'm aware of Airdog, and more similarly HEXO+, which uses vision processing. They are drone makers trying to sell us a packaged solution. But I already have a nice drone and I want to keep using it! I figured there are many others in a similar situation.

      In terms of taking advantage of other sensors, yes, it's in the plan. In particular, the IMU is a great way to predict your own motion, and so is visual odometry from the camera itself; it's all on the table.
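
      To give a flavour of the IMU idea: the camera's angular rate from a gyro tells you roughly how many pixels a distant subject will appear to shift before the next frame, so you can pre-seed the tracker's search window with that offset. A very rough sketch (the focal length and the function name are made-up placeholders):

          import math

          FOCAL_PX = 800.0    # focal length in pixels, from camera calibration (placeholder value)

          def predict_shift(yaw_rate_rad_s, pitch_rate_rad_s, dt):
              """Approximate pixel shift of a distant subject caused purely by camera rotation."""
              dx = FOCAL_PX * math.tan(yaw_rate_rad_s * dt)
              dy = FOCAL_PX * math.tan(pitch_rate_rad_s * dt)
              return dx, dy

          # e.g. offset the previous bounding box by (dx, dy) before the tracker searches the new frame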

  • I have been trying to get something like this going, too. I am planning/trying to use an IR beacon on the target.

    Unfortunately, I am just a mechanical engineer and struggle with the computer/code side of things...

    http://www.tedweerts.com/category/blog/

    • The problem with IR is the light saturation in daylight. If you take an IR camera and look at a scene with sunlight, everything will look white and washed out.

      People have used modulated IR to distinguish the beacon's varying light from constant ambient light, but this requires a special camera and for the user to wear an IR tag.

      That being said, if the users don't care about wearing something extra, it might make more sense.
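
      To illustrate the modulated-IR trick: if the beacon blinks in sync with alternating frames, subtracting a beacon-off frame from a beacon-on frame cancels the constant ambient/sunlight component and leaves mostly the beacon. A rough sketch (assumes greyscale frames; the names are placeholders, not code from this project):

          import cv2

          def find_beacon(frame_on, frame_off, thresh=40):
              diff = cv2.subtract(frame_on, frame_off)              # ambient light cancels out
              _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
              m = cv2.moments(mask)
              if m["m00"] == 0:
                  return None                                       # beacon not visible this pair
              return (m["m10"] / m["m00"], m["m01"] / m["m00"])     # beacon centroid (x, y)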

  • Coolness factor over 9000. Useful too!

  • This looks great! Good job! Are you going to release the code? Looks like fun to take a look at. Did you write it in OpenCV?
    • Thanks man! Yeah, I do hope to release it at some point, and yes, I take advantage of OpenCV a lot.

      • Can't wait to take a look at it! There are the obvious uses for it, but also some other fun things to apply it to. Anxiously waiting! Are you using a brushless gimbal (it seems so; very smooth movement)? Also, what type of CPU are you running it on, and is it CPU intensive?

