A 350mg, Omnidirectional 6DOF Optical IMU/Optical Flow Sensor

 The ArduEye prototype (left) and the finished sensor (right)

 

 

In a previous post, I demonstrated that the ArduEye platform could be used to prototype a 6DOF vision system for optical flow odometry. The goal is to make a vision system for the Harvard University RoboBees project.

 

After the success of the prototype, the next step was to design a board that was as small and light as possible. The result is shown below:

 

Main components of vision system

 

The vision system consists of two back-to-back Stonyman vision chips, an Atmel ATmega328P microcontroller, a 16 MHz oscillator, and a voltage regulator. The chips have flat printed optics (as described previously) with slits so that each chip takes one-dimensional images of the environment. Even better, the ATmega328P carries the Arduino bootloader, so the sensor is an Arduino clone and can be programmed through the Arduino IDE. The entire system weighs approximately 300 to 350 milligrams and measures 8 x 11 millimeters.
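For anyone curious what the readout and processing might look like in the Arduino environment, here is a minimal illustrative sketch for a single chip: grab a 1-D pixel line through the slit optics and estimate flow by shifting successive lines against each other. The pixel count, pin usage, and the readLinePixels() helper are my own placeholders for illustration, not the actual ArduEye/Stonyman driver code.

```cpp
// Hypothetical sketch: read a 1-D pixel line from one Stonyman chip and
// estimate optical flow by finding the shift that best aligns successive
// lines. readLinePixels() and the analog pin are placeholders, not
// Centeye's actual register/pointer-based Stonyman readout.

const int NUM_PIX = 32;          // number of pixels exposed through the slit
int curLine[NUM_PIX];
int prevLine[NUM_PIX];

// Placeholder readout: the real chip requires selecting each pixel via its
// register interface before sampling the analog output.
void readLinePixels(int *buf) {
  for (int i = 0; i < NUM_PIX; i++) {
    buf[i] = analogRead(A0);
  }
}

// Brute-force 1-D flow: find the integer shift (in pixels) that minimizes
// the sum of absolute differences between the previous and current lines.
int estimateShift(const int *prev, const int *cur) {
  const int MAX_SHIFT = 4;
  long bestErr = 2147483647L;
  int bestShift = 0;
  for (int s = -MAX_SHIFT; s <= MAX_SHIFT; s++) {
    long err = 0;
    for (int i = MAX_SHIFT; i < NUM_PIX - MAX_SHIFT; i++) {
      err += abs((long)prev[i] - (long)cur[i + s]);
    }
    if (err < bestErr) { bestErr = err; bestShift = s; }
  }
  return bestShift;
}

void setup() {
  Serial.begin(115200);
  readLinePixels(prevLine);
}

void loop() {
  readLinePixels(curLine);
  Serial.println(estimateShift(prevLine, curLine));  // pixels moved per frame
  memcpy(prevLine, curLine, sizeof(curLine));
}
```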

 

The following video shows that motion along all six axes can be distinguished. Some axes produce a stronger response than others; the Y translation in particular is weak. However, the results are promising, and with a little optimization this could be a useful addition to a sensor suite.
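A quick note on why two back-to-back chips help here: to first order, rotation and translation affect the two opposite-facing slit images with different relative signs, so sums and differences of the paired 1-D flow readings can separate the rotational and translational components. The sketch below only illustrates that decomposition; the sign conventions and scaling are assumptions that depend on how the chips and their image axes are mounted, and the real sensor would need calibration.

```cpp
// Illustrative decomposition of paired 1-D flow readings from two
// opposite-facing chips into rotational and translational components.
// The signs and scaling are placeholders that depend on chip mounting
// and optics; they are not calibrated values from the real sensor.

struct FlowPair {
  float front;   // 1-D flow from the chip facing one direction
  float back;    // 1-D flow from the chip facing the opposite direction
};

struct Motion {
  float rotation;     // rotational component, up to a calibration constant
  float translation;  // translational component, up to a calibration constant
};

// Under one possible sign convention, rotation appears as common-mode flow
// on the two chips and translation as differential flow (or the reverse,
// depending on how the image axes are oriented).
Motion decompose(const FlowPair &f) {
  Motion m;
  m.rotation    = 0.5f * (f.front + f.back);   // common-mode component
  m.translation = 0.5f * (f.front - f.back);   // differential component
  return m;
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  // In the real system these values would come from the two Stonyman readouts.
  FlowPair f = { 1.2f, -0.8f };
  Motion m = decompose(f);
  Serial.print("rot: ");
  Serial.print(m.rotation);
  Serial.print("  trans: ");
  Serial.println(m.translation);
  delay(100);
}
```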

 

I'd like to gauge interest in an integrated Arduino-clone vision sensor similar to this one, though perhaps not as compact and minimal. It would most likely be a single (one-sided) vision chip with optics and an Arduino-clone processor integrated on one small board, about the size of a penny and weighing about half a gram. Through the Arduino environment, the user would have control over which pixels are read and how they are processed.

 

 


Comment by Jack Crossfire on April 27, 2012 at 2:07pm

Since you oriented the sensor differently for each axis, you showed that the different sensors gave the same reading when tracking the same features.

Comment by Geoffrey L. Barrows on April 27, 2012 at 2:26pm

Jack- You are correct that each orientation provides the same general visual stimuli. However, the pixels themselves are not exactly the same between orientations. It would be nice to have a proper gantry to do a full 6DOF test without having to reorient the sensor three times.

Comment by Fedor Korsakov on April 27, 2012 at 5:01pm

Some of my research deals with HCI, and I'd love to see a way to step away from using cameras to track a user's head and hand movements. Do you plan to incorporate an IMU into the system next? An integrated IMU + optical sensor module of this kind could be very useful for a range of applications if it were sufficiently accurate and robust in changing environments.

Comment by microuav on April 28, 2012 at 12:46am

If you want to test it on the DelFly II, or even on the DelFly Micro, let us know: www.delfly.nl

Comment by Geoffrey L. Barrows on April 30, 2012 at 2:13pm

Fedor- Yes, we do plan to incorporate a MEMS IMU in an upcoming project. I don't think vision should replace a mechanical IMU, just complement it.

Monroe- This one most likely won't track stars since we're using pinhole optics and the power supply is a bit noisy, but I bet the same basic principle *could* be used to track stars, provided the image sensor is sensitive enough.
