As part of Centeye's participation in the Harvard University Robobee project, we are trying to see just how small we can make a vision system that can control a small flying vehicle. For the Robobee project our weight budget will be on the order of 25 milligrams. The vision system on our previous helicopter hovering platform weighed about 3 to 5 grams (two orders of magnitude more!), so we have a ways to go!

We recently showed that we can control the yaw and height (heave) of a helicopter using just a single sensor. This is an improvement over the eight-sensor version used previously. The above video gives an overview of the helicopter (a hacked eFlite Blade mCX2) and the vision system, along with two sample flights in my living room. Basically, a human pilot (Travis Young in this video) is able to fly the helicopter around with standard control sticks (left stick = yaw and heave, right stick = swash plate servos) and, upon letting go of the sticks, the helicopter with the vision system holds yaw and heave. Note that there was no sensing on this helicopter other than vision: there was no IMU or gyro, and all sensing and image processing was performed on board the helicopter. (The laptop is for setup and diagnostics only.)
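To make the control idea concrete, here is a minimal sketch of the kind of loop involved: drive the average optical flow seen by a wide-field, side-looking sensor toward zero. The driver functions, gains, and mixing below are hypothetical placeholders for illustration, not Centeye's actual firmware.

```c
/* Minimal sketch of vision-only yaw/heave hold from one wide-field
 * sensor.  get_flow_x()/get_flow_y() and the command functions are
 * hypothetical placeholders, not Centeye's actual code. */

extern float get_flow_x(void);          /* mean flow, + = scene moves right */
extern float get_flow_y(void);          /* mean flow, + = scene moves down  */
extern void  set_yaw_cmd(float u);      /* yaw (tail rotor) mixing          */
extern void  set_throttle_cmd(float u); /* heave (main rotor speed) trim    */

#define KP_YAW    0.8f   /* illustrative gains, not tuned values */
#define KP_HEAVE  0.6f
#define KI_HEAVE  0.02f

void hover_loop_step(void)
{
    static float heave_i = 0.0f;

    /* Horizontal flow averaged over a wide FOV is dominated by yaw
       rotation, so regulating it to zero holds heading. */
    float wx = get_flow_x();
    set_yaw_cmd(-KP_YAW * wx);

    /* Vertical flow from a side-looking sensor indicates climb or
       sink; an integral term trims the hover throttle. */
    float wy = get_flow_y();
    heave_i += KI_HEAVE * wy;
    set_throttle_cmd(-(KP_HEAVE * wy + heave_i));
}
```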

The picture below shows the vision sensor itself: the image sensor and the optics weigh about 0.2 g total. Image processing was performed on another board with an Atmel AVR32 processor; that was overkill, and an 8-bit device could have been used.

[Image: the vision sensor, with printed optics on the image sensor chip]

A bit more about optics: In 2009 we developed a technique for "printing" optics on a thin plastic sheet, using the same photoplot process used to make masks for, say, printed circuit boards. We can print thousands of optics on a standard letter-size sheet of plastic for about $50. The simplest version is a plain pinhole, which can be cut out of the plastic and glued directly onto an image sensor chip; pretty much any clear adhesive should work. The picture below shows a close-up of a piece of printed optics next to an image sensor (the one below is a different sensor, the 125 milligram TinyTam we demonstrated last year).
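How good can a bare pinhole be? The geometry works out on the back of an envelope; all the dimensions below are illustrative assumptions, not measurements of our actual optics.

```c
/* Back-of-envelope pinhole geometry.  The sheet thickness acts as the
   focal distance.  All numbers are illustrative assumptions, not the
   dimensions of Centeye's actual printed optics. */
#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

int main(void)
{
    double t = 0.18; /* sheet thickness = focal distance, mm (assumed) */
    double d = 0.05; /* pinhole diameter, mm (assumed)                 */
    double s = 1.0;  /* width of the pixel array on the chip, mm (assumed) */

    /* A distant point blurs to roughly the pinhole diameter, so the
       angular resolution is about d/t radians. */
    double blur_deg = (d / t) * 180.0 / PI;

    /* Field of view subtended by the array, ignoring refraction
       (which widens it further - see the cross section below). */
    double fov_deg = 2.0 * atan2(s / 2.0, t) * 180.0 / PI;

    printf("angular blur   ~ %4.1f deg\n", blur_deg); /* ~15.9 deg */
    printf("field of view  ~ %4.1f deg\n", fov_deg);  /* ~140 deg  */
    return 0;
}
```

Coarse by camera standards, but consistent with the hundreds-of-pixels regime these hovering systems operate in.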

[Image: close-up of printed optics next to the TinyTam image sensor]

The principle of the optics is quite simple; a cross section is shown below. The plastic sheet has a higher index of refraction than air, so light from a near-hemisphere field of view can be focused onto a confined region of the image sensor chip. You won't grab megapixel images this way, but it works well for the hundreds of pixels needed for hovering systems like this.

[Image: cross section of the printed optics]
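The field-of-view compression in the cross section follows directly from Snell's law. The index value below (n ≈ 1.5, typical for clear plastic film) is an assumption for illustration, not a measured property of our sheets.

```latex
% Snell's law at the air/plastic interface: \sin\theta_{air} = n \sin\theta_{plastic}.
% A ray arriving at grazing incidence (\theta_{air} -> 90 deg) refracts to
% the critical angle, so the whole +/-90 deg hemisphere outside the sheet
% is squeezed into a cone of half-angle:
\theta_{\max} \;=\; \arcsin\!\left(\frac{1}{n}\right)
  \;\approx\; \arcsin\!\left(\frac{1}{1.5}\right)
  \;\approx\; 41.8^{\circ}
```

In other words, a nearly 180 deg view outside the sheet maps into a cone of roughly 84 deg inside it, which is what lets so wide a field land on a small patch of the chip.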

We are actually working on a new ArduEye system, using our newer Stonyman vision chips, to allow others to hack together sensors using this type of optics. A number of variations are possible, including using slits to sense 1D motion or pinhole arrays to make a compound eye sensor. If you want more details on this optics technique, you can visit this post, or you can pull up US patent application 12/710,073 on Google Patents. (Note: We are planning to give a blanket license of the patent for use in open hardware systems.)
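As an example of the 1D-motion variant mentioned above, a slit aperture reduces the problem to tracking the shift of a single line of pixels. The sketch below is a generic gradient-based estimate of that shift, offered as one plausible approach rather than the algorithm ArduEye actually ships.

```c
/* One-dimensional optical flow for a slit-aperture line image: a
   generic Lucas-Kanade-style gradient estimate, not necessarily the
   algorithm used in ArduEye. */
#include <stddef.h>
#include <stdint.h>

/* Returns the displacement in pixels between two line images of
   length n (positive = scene shifted toward higher indices). */
float slit_flow_1d(const uint8_t *prev, const uint8_t *curr, size_t n)
{
    float num = 0.0f, den = 0.0f;
    for (size_t i = 1; i + 1 < n; i++) {
        /* Spatial gradient (central difference) and temporal change. */
        float ix = 0.5f * ((float)prev[i + 1] - (float)prev[i - 1]);
        float it = (float)curr[i] - (float)prev[i];
        num += it * ix;
        den += ix * ix;
    }
    /* Brightness constancy gives it + v*ix = 0, so the least-squares
       shift is v = -sum(it*ix)/sum(ix*ix); guard against flat images. */
    return (den > 1e-6f) ? -(num / den) : 0.0f;
}
```

Gradient estimates like this are only valid for shifts of about a pixel per frame, which suits high-frame-rate, low-resolution sensors.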

(Sponsor Credit: "This work was partially supported by the National Science Foundation (award # CCF-0926148). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.")

Comments

  • Yes, the same principle works for lateral control. With just this one sensor we were able to control not just yaw and height but horizontal motion as well, though it was a bit tricky to do so. I'd show the video, but it was taken in an embarrassingly messy lab. :) With two of these sensors back to back, which gives you almost a 360 deg field of view, horizontal control is easier and the system performs much better as a whole, since there is more visual texture to lock onto.

  • Thanks for the answer, Geoffrey!

    And what about lateral control? Here I assume that it's the passive stability of the coax that keeps it roughly in the same position (?). Could you use a similar vision-based technique to control the lateral motion of a platform?

    Best,
    Adrien 

  • Hi Adrien,

    Thank you for your response. For all of our micro helicopter experiments using the Blade mCX, we did not use any inertial sensing. It is true, though, that some assumptions about the environment need to be made: there needs to be some light, and there needs to be some texture for the system to lock onto. One of the benefits of a wide FOV, though, is that there is an increased chance of finding something to lock onto. As for distance to walls etc.: yes, this matters, and in a larger room the helicopter would obviously wiggle around more than in a smaller room, but generally that would be acceptable; you are in a larger room, after all. In practice we have not found this to be an issue.

    This technique, used alone, is best for when you have very little payload capability and absolutely need to control height, or when cost is an issue (if manufacturing in the millions, say for toys). On larger helicopters that are able to carry inertial sensing, I'd say go ahead and use both inertial sensing and vision to maximize robustness. (Insects rely on multiple sensing modes, so why not robots?)

    Geof

  • Hi Geoffrey,
    Thanks for sharing! Your work is very interesting. I'm impressed by the stabilization and how it only uses the small vision sensor. No inertial sensor is even required? I wonder if you have to make assumptions about the environment (like flat walls, average distance to obstacles, etc.)?
    Best, and good luck for the future developments,

    Adrien

  • PS I hope all this talk of patents is not a turn-off to people. I'm more excited about the technology.

  • @Alex- Good question! This is something we'll have to draft up with our legal counsel, since there really isn't anything we can draw upon (as far as I know), and we don't want to screw this up (for us or anyone). We want to address not just variations of designs we make (which is the domain of copyright law), but entirely new designs (where copyright law no longer applies). Basically, we want to allow free experimental and research use; for commercial use, the blanket license would cover fully OSHW-definition-compatible designs (CAD drawings and source code released, etc.) that practice the invention. Attribution would also be nice. :) So if company XYZ made or sold a gizmo that incorporated this invention, that entire gizmo would have to comply with the OSHW definition. Any other use would be a separate arrangement.

    Also, I should clarify that the patent application is still "pending". But it would be good to address this sooner rather than later.

    Actually, I'd be very curious to hear what others here think of such a blanket license, or if there are other people facing this issue.

    In the meantime, if other people here want to try experimenting with such optics, please go ahead and do so and let me know if you have questions!

  • That is really cool, and I think it's great that you are creating a blanket OSHW license. Do you know when/where it will be released?

  • Wow