

Find more videos like this on DIY Drones

Some of you may have seen Centeye's old website showing our earlier work flying optical flow sensors on small RC-class aircraft. Much of this work was sponsored by DARPA and the U.S. Air Force. More recently we have been hacking an eFlite Blade mCX, a very stable small 7" contra-rotating coaxial helicopter.

The helicopter is a basic mCX airframe, minus the front canopy, with the out-of-the-box green receiver/controller board replaced with one of our own design. Our board sports an Atmel AVR32 microcontroller and an AT86RF230 wireless chip, as well as carbon resistor strips and transistor circuits to implement the swashplate servos and rotor speed controllers. We have also integrated a 6-DOF IMU into the board using standard MEMS components.

In front of our controller board is a sensor ring with eight custom-designed vision sensors mounted on a flexible circuit board, and a processor board carrying another AVR32. They are stacked vertically via 0.8 mm board-to-board connectors (thank you, cell phone industry!). The processor board operates the eight vision sensors (which form a nice parallel system), acquires the imagery, computes optical flow, and sends high-level control signals to the controller board. The whole sensor ring, including processor, flexible ring, vision sensors, and optics, weighs about 3 grams.

Using a variation of control algorithms developed by Sean Humbert's lab at the University of Maryland, College Park, we were able to have this helicopter "hover in place" for up to six minutes straight. We could even perturb the helicopter slightly by moving it manually, and it would attempt to return to its original position. We have been able to get this demonstration working in a variety of room sizes and illumination levels. For these experiments we did not use the IMU; the helicopter held its position (including yaw angle) using purely visual information.
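As a rough illustration of how a ring of optical flow sensors can drive station-keeping, here is a minimal sketch. This is my own toy example, not Centeye's or the Humbert lab's actual algorithm: the 1-D block-matching flow estimator, the common-mode/cosine decomposition of the ring flows, and the gain `k_p` are all assumptions made for illustration.

```python
import numpy as np

def optical_flow_1d(prev, curr, max_shift=3):
    """Estimate the pixel displacement between two 1-D image rows by
    finding the shift that minimizes the mean absolute difference
    (a simple block-matching flow estimate)."""
    n = len(prev)
    best_shift, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)          # overlap region for shift s
        err = np.abs(prev[lo - s:hi - s] - curr[lo:hi]).mean()
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

def hover_correction(flows, angles, k_p=0.05):
    """Turn per-sensor flow readings from an outward-facing ring into
    corrective commands. Common-mode flow is read as yaw drift; the
    cosine/sine-weighted residuals stand in for lateral drift components
    (a hypothetical decomposition, loosely in the spirit of wide-field
    integration). Returns proportional corrections (cx, cy, cyaw)."""
    flows = np.asarray(flows, dtype=float)
    angles = np.asarray(angles, dtype=float)
    yaw = flows.mean()                              # common-mode component
    residual = flows - yaw
    dx = (residual * np.cos(angles)).mean()         # drift along x (assumed geometry)
    dy = (residual * np.sin(angles)).mean()         # drift along y (assumed geometry)
    return -k_p * dx, -k_p * dy, -k_p * yaw
```

With eight sensors spaced every 45 degrees, a uniform flow across the ring produces only a yaw correction, while a cosine-patterned flow produces only a lateral one; nulling both is what holds the hover.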
The man in the videos above is Travis Young, who has been executing the control aspects of this project at Centeye.

Just to make it clear: all sensing, processing, and control is being performed on the helicopter. There is no human in the loop in these videos.

Centeye is participating in the NSF-funded Harvard RoboBees project, led by Harvard EECS professor Rob Wood. As part of this project, we will be building vision sensors weighing on the order of tens of milligrams. If all goes well, we should have our first prototype at this scale by this summer!

The RoboBee project will also let us do something that I personally have been wanting to do for a long time: develop a portfolio of purely consumer/academic/hobbyist vision sensors that I can get into the hands of people like the members of this community! I'll be starting a thread soon in the "Sensors and IMUs" forum where I'd enjoy discussing this with everyone.
Comments

  • Hi,

    Really cool set-up, thanks for sharing it.

    However, could you explain in more detail how the 8 vision sensors are used to stabilize the system close to a set point?

    Do you first need to record a given spatial configuration from the 8 sensors, and then let the system look for it, perhaps by correlation with the recorded position?


  • Developer
    This should be perfect for rescue jobs in wreckage.
  • We got it working in a larger hallway say 10m-20m wide. We tried it in a sun-lit atrium but that didn't work because the sun caused thermal currents that carried the helicopter away. Outside doesn't work for the same reason, though we could try it one day when there is a perfect calm.

    We did get it working in low light environments (1-2lux) and recently in a dark room using LEDs for illumination.

  • Developer
    That's freaking awesome. It will not work so well outdoors (empty space), right?
  • Adrien- They are there, just very small.
  • Wow, impressive results!
    How come I can't see any optics in front of the sensors, actually?
  • Thanks for your comments, everyone. Jianquo- we'd like to, but I don't know about cost right now. Right now it takes us a few hours to assemble one in-house and it is a tedious process (lesson learned- don't drink coffee before using the wire bonder!), but there is clearly room for simplifying this process. Jack- both have their application- in a "perfect world" we'd add as many different sensor modalities as possible- insects do the same. Howard- yes I do remember you- congratulations on your own progress! We do have one of your Blackfin board sets and I am impressed with what you've done.
  • Geoffrey - Very nice. I would love to learn more. We met at the DARPA LANdroids briefing a few years back and I think you have one of our Blackfin board sets.
  • Building custom electronics on that scale is super expensive & really hard if it's not your day job. The resolution must be real low to handle 8 cameras. The pendulum is swinging back & forth between machine vision & lidar. This week it's machine vision. Howard Gordon is probably your biggest customer from this group.
  • Very nice! Do you plan to sell the sensor ring? How much will be the cost?