Find more videos like this on DIY Drones
Some of you may have seen Centeye's old website showing our earlier work flying optical flow sensors on small RC-class aircraft. Much of this work was sponsored by DARPA and the U.S. Air Force. More recently we have been hacking an E-flite Blade mCX, a very stable small 7" contra-rotating coaxial helicopter.

The helicopter is a basic mCX airframe, minus the front canopy, with the out-of-box green receiver/controller board replaced by one of our own design. Our board sports an Atmel AVR32 microcontroller and an AT86RF230 wireless chip, as well as carbon resistor strips and transistor circuits to implement the swashplate servos and rotor speed controllers. We have also integrated a 6-DOF IMU onto the board using standard MEMS components.

In front of our controller board is a sensor ring: eight custom-designed vision sensors mounted on a flexible circuit board, plus a processor board carrying another AVR32. The boards are stacked vertically via 0.8 mm board-to-board connectors (thank you, cell phone industry!). The processor board operates the eight vision sensors (which form a nice parallel system), acquires the imagery, computes optical flow, and sends high-level control signals to the controller board. The whole sensor ring, including processor, flexible ring, vision sensors, and optics, weighs about 3 grams.

Using a variation of control algorithms developed by Sean Humbert's lab at the University of Maryland, College Park, we were able to have this helicopter "hover in place" for up to six minutes straight. We could even perturb the helicopter slightly by moving it manually, and it would attempt to return to its original position. We have been able to get this demonstration working in a variety of room sizes and illumination levels. For these experiments we did not use the IMU; the helicopter held its position (including yaw angle) using purely visual information. The man in the videos above is Travis Young, who has been executing the control aspects of this project at Centeye.

Just to make it clear: all sensing, processing, and control is performed on the helicopter. There is no human in the loop in these videos.

Centeye is participating in the NSF-funded Harvard RoboBees project, led by Harvard EECS professor Rob Wood. As part of this project, we will be building vision sensors weighing on the order of tens of milligrams. If all goes well, we should have our first prototype at this scale by this summer!

The RoboBee project will also let us do something that I personally have been wanting to do for a long time: develop a portfolio of purely consumer/academic/hobbyist vision sensors that I can get into the hands of people like the members of this community. I'll be starting a thread soon in the "Sensors and IMUs" forum where I'd enjoy discussing this with everyone.
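For readers curious about what a loop like this can look like in code, below is a minimal sketch in C. To be clear, this is not Centeye's firmware and not the Humbert-lab controller: the pixel counts, the 1-D block-matching flow estimator, the sin/cos matched-filter weights (a simple form of wide-field integration), the PD gains, and the driver stubs `read_sensor`/`send_commands` are all illustrative assumptions.

```c
/*
 * Illustrative sketch of an optical-flow "hover in place" loop for a ring
 * of outward-facing 1-D vision sensors. NOT actual Centeye firmware; all
 * constants, the flow estimator, and the driver stubs are placeholders.
 */
#include <stdint.h>
#include <string.h>
#include <limits.h>
#include <stdlib.h>
#include <math.h>

#define NUM_SENSORS 8        /* vision sensors around the ring            */
#define NUM_PIXELS  16       /* 1-D pixel strip per sensor (assumed)      */
#define MAX_SHIFT   3        /* block-matching search range, in pixels    */
#define PI          3.14159265f

/* Platform stubs: replace with real drivers on the target board. */
static void read_sensor(int id, uint8_t px[NUM_PIXELS]) { (void)id; (void)px; }
static void send_commands(float roll, float pitch, float yaw)
{ (void)roll; (void)pitch; (void)yaw; }

/* Estimate 1-D optical flow between two frames by block matching: try
 * integer shifts of the previous frame and keep the one with the minimum
 * sum of absolute differences (SAD). Returns flow in pixels per frame,
 * positive when the image moved toward higher pixel indices. */
static int flow_1d(const uint8_t *prev, const uint8_t *curr)
{
    int best_shift = 0;
    long best_sad = LONG_MAX;
    for (int s = -MAX_SHIFT; s <= MAX_SHIFT; s++) {
        long sad = 0;
        for (int i = MAX_SHIFT; i < NUM_PIXELS - MAX_SHIFT; i++)
            sad += labs((long)curr[i] - (long)prev[i + s]);
        if (sad < best_sad) { best_sad = sad; best_shift = s; }
    }
    return -best_shift;   /* image motion is opposite the matching shift */
}

int main(void)
{
    static uint8_t prev[NUM_SENSORS][NUM_PIXELS];
    static uint8_t curr[NUM_SENSORS][NUM_PIXELS];
    float x = 0.f, y = 0.f, yaw = 0.f;   /* drift integrated since hover-on */
    const float kp = 0.8f, kd = 0.3f;    /* PD gains: pure placeholders     */

    for (;;) {                           /* one iteration per camera frame  */
        float vx = 0.f, vy = 0.f, vyaw = 0.f;
        for (int i = 0; i < NUM_SENSORS; i++) {
            read_sensor(i, curr[i]);
            float f  = (float)flow_1d(prev[i], curr[i]);
            float th = 2.f * PI * (float)i / NUM_SENSORS; /* sensor azimuth */
            /* Wide-field integration: project the ring of flow readings
             * onto the patterns that each motion component would produce. */
            vx   += -f * sinf(th);  /* x translation -> sin-weighted flow  */
            vy   +=  f * cosf(th);  /* y translation -> cos-weighted flow  */
            vyaw +=  f;             /* yaw rotation  -> uniform flow       */
            memcpy(prev[i], curr[i], NUM_PIXELS);
        }
        vx /= NUM_SENSORS; vy /= NUM_SENSORS; vyaw /= NUM_SENSORS;

        x += vx; y += vy; yaw += vyaw;   /* accumulate drift estimates */

        /* PD servo back toward the pose held when hover was engaged. */
        send_commands(-kp * y   - kd * vy,    /* roll  fights y drift */
                      -kp * x   - kd * vx,    /* pitch fights x drift */
                      -kp * yaw - kd * vyaw); /* yaw   fights heading drift */
    }
    return 0;
}
```

The idea the sketch tries to capture is that translation and rotation produce distinct flow patterns around the ring, so projecting the eight flow measurements onto those patterns separates x/y drift from yaw drift using vision alone, with no IMU in the loop.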
Comments
Hi,
Really cool setup, thanks for sharing it.
However, could you explain in more detail how the 8 vision sensors are used to stabilize the system close to a set point?
Do you first need to record a given spatial configuration from the 8 sensors, and then let the system seek it, perhaps by correlating against the recorded position?
We got it working in a larger hallway, say 10-20 m wide. We tried it in a sunlit atrium, but that didn't work because the sun caused thermal currents that carried the helicopter away. Outdoors doesn't work for the same reason, though we could try it one day when there is a perfect calm.
We did get it working in low-light environments (1-2 lux), and recently in a dark room using LEDs for illumination.
How come I can't see any optics in front of the sensors?