Nice work! Jump ahead to 0:41 for the good stuff. From the video description:

In this video we demonstrate the capabilities of our monocular pose tracking system. It consists of infrared LEDs on the quadrotor and an infrared camera on the arm of a ground robot. Since we know the positions of the LEDs on the quadrotor, we can precisely estimate its position and orientation. We are able to stabilize the quadrotor under illumination changes, down to complete darkness. Our algorithm is robust to false detections, which we demonstrate with an additional infrared LED that we move close to the other LEDs. To estimate the pose of the quadrotor we need at least four LEDs; however, five are currently mounted, so our system can handle the occlusion of an LED and is still precise enough to stabilize the quadrotor. Since we track the quadrotor relative to the camera, the quadrotor stays with the ground robot even when it is moving. Our system also works outdoors.

The system is described in our ICRA 2014 paper:
Matthias Faessler, Elias Mueggler, Karl Schwabe and Davide Scaramuzza: A Monocular Pose Estimation System based on Infrared LEDs. IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, 2014.
http://rpg.ifi.uzh.ch/docs/ICRA14_Fae...

The code can be downloaded from our website:
http://rpg.ifi.uzh.ch/software_datase...


Comment by Joe Thompson on April 9, 2014 at 6:16am

Very cool.

If you had fixed infrared LEDs on an APM copter and a static camera on the ground, how would you transmit positional data to the APM? Via some kind of MAVLink script? (I'm not exactly sure what that means; I've just seen the term thrown around. This means you would be using the telemetry system, right? Would you also be able to connect via Mission Planner at the same time?)

Comment by Martin on April 9, 2014 at 7:40am

Somebody call the FAA. Part of that video was flown close to buildings and people ;p . Try that in bright sunlight.

Sorry, but it's a waste of time. Every week some university comes out with another way to control a multirotor with a camera and a computer.


Comment by Randy (Developer) on April 9, 2014 at 6:37pm

@Joe,

Yes, if someone wanted to try this with ArduCopter you'd likely need to create a new MAVLink message, although there are already a lot of MAVLink messages defined. The SET_ROLL_PITCH_YAW_THRUST message in particular looks like it could be used to let the ground station tell the copter which way to lean or turn to keep it in the right place. There would need to be a bit of work on the ArduCopter side to decode and use those messages.
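To make Randy's suggestion concrete, here is a minimal, hedged sketch of hand-packing a MAVLink v1 frame carrying a SET_ROLL_PITCH_YAW_THRUST-style setpoint. The message id and CRC_EXTRA values below are placeholders (assumed, not checked against common.xml), and the payload layout follows the MAVLink v1 convention of sorting fields largest-first; verify the real message definition before sending anything to a live copter.

```python
import struct

def x25_crc(data: bytes, crc: int = 0xFFFF) -> int:
    # Standard X.25 / CRC-16/MCRF4XX accumulator used by MAVLink.
    for byte in data:
        tmp = byte ^ (crc & 0xFF)
        tmp = (tmp ^ (tmp << 4)) & 0xFF
        crc = ((crc >> 8) ^ (tmp << 8) ^ (tmp << 3) ^ (tmp >> 4)) & 0xFFFF
    return crc

# Placeholder values -- look these up in the MAVLink common.xml message
# definitions before relying on them.
MSG_ID = 56       # assumed id for SET_ROLL_PITCH_YAW_THRUST
CRC_EXTRA = 100   # assumed per-message CRC seed byte

def pack_set_rpy_thrust(seq, sysid, compid, target_sys, target_comp,
                        roll, pitch, yaw, thrust):
    """Build a MAVLink v1 frame with roll/pitch/yaw/thrust setpoints."""
    # MAVLink v1 orders payload fields largest-first: 4 floats, then 2 uint8s.
    payload = struct.pack('<ffffBB', roll, pitch, yaw, thrust,
                          target_sys, target_comp)
    header = struct.pack('<BBBBBB', 0xFE, len(payload), seq,
                         sysid, compid, MSG_ID)
    crc = x25_crc(header[1:] + payload + bytes([CRC_EXTRA]))
    return header + payload + struct.pack('<H', crc)

frame = pack_set_rpy_thrust(0, 255, 0, 1, 1, 0.05, -0.02, 0.0, 0.5)
print(len(frame))  # 6-byte header + 18-byte payload + 2-byte CRC = 26
```

In practice you wouldn't hand-pack frames like this: the pymavlink generator produces the packing/unpacking code for every defined message, and the ground station would just call the generated sender.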

Comment by Darrell Burkey on April 9, 2014 at 7:22pm

I'm always amazed by people who feel comfortable walking up to a copter hovering in the air and grabbing it. I was asked to do a test recently while holding my copter over my head to create some logs of its performance. Although it looked stupid, I actually put on my motorcycle helmet and gloves before doing it. I've had one trip to the hospital from a small incident, and that was more than enough warning for me.

Comment by Kenn Sebesta on April 9, 2014 at 11:32pm

@Martin,

I'd give them a bit more credit. UZH is across the street from ETHZ, and there's a lot of cross-pollination that goes on. In addition, Davide Scaramuzza is a rising star in the robotics vision world, so my ears always perk up when I see research out of his lab.

Science is a continuous refinement of ideas, and even if UZH hasn't found the magical formula yet, they've gotten us further along the path. If you have the time, you might give their paper a look; it's well written and accessible.

Comment by Antoine LECESTRE on April 10, 2014 at 12:52am

Very interesting project! I think it would be even more interesting if the vision system were on the drone and the IR LEDs on the ground. That would open up new possibilities and the system wouldn't be "blind" after direct exposure to the sun. The main issue is of course computational power.
Maybe a CMUCam Pixy with an IR passband filter could do that. I know it has a rolling shutter and that 100 Hz would be better, but considering its price it sounds good. Connected over I2C or SPI, it should be able to detect the IR LED blobs and send the marker positions to a Pixhawk at 50 Hz. That would be only a small amount of data, since the vision processing would already have been done onboard the Pixy. By implementing a P3P algorithm on the Pixhawk and fusing its pose estimates with the IMU, we should be able to control the copter and make it hover relative to the markers.

If that sounds possible to you, I would give it a try, as I will receive my Pixy in a few days. I understand most of the ArduCopter code, but I will definitely need some advice from you to do it the right way.

© 2019 Created by Chris Anderson.