The ArduEye prototype (left) and the finished sensor (right)

In a previous post, I demonstrated that the ArduEye platform could be used to prototype a 6DOF vision system for optical flow odometry. The goal is to make a vision system for the Harvard University RoboBees project.

After the success of the prototype, the next step was to design a board that was as small and light as possible. The result is shown below:

Main components of the vision system

The vision system consists of two back-to-back Stonyman vision chips, an Atmel ATmega328P microcontroller, a 16 MHz oscillator, and a voltage regulator. The chips use flat printed optics (as described previously) with slits, so each chip captures one-dimensional images of the environment. Even better, the ATmega runs the Arduino bootloader, so the sensor is an Arduino clone and can be programmed through the Arduino IDE. The entire system weighs approximately 300-350 milligrams and measures 8 x 11 millimeters.
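
Since the sensor is programmed through the Arduino IDE, grabbing a raw one-dimensional image comes down to stepping the chip's pixel pointer and sampling its analog output with the ATmega's ADC. The sketch below is only an illustration of that pattern: the pin assignments and the reset/increment pulse scheme are assumptions, not the actual Stonyman interface (Centeye documents the real one).

```cpp
// Illustrative sketch: grab one 1D image line from the vision chip and
// print it over serial. PIN_* assignments and the pulse protocol are
// hypothetical; consult the Stonyman documentation for the real interface.

const int PIN_RESET_PTR = 2;   // resets the chip's pixel pointer (assumed wiring)
const int PIN_INC_PTR   = 3;   // advances the pointer one pixel (assumed wiring)
const int PIN_ANALOG    = A0;  // chip's analog pixel output

const int NUM_PIXELS = 112;    // Stonyman arrays are 112x112; a slit exposes one line
unsigned short line[NUM_PIXELS];

void pulse(int pin) {
  digitalWrite(pin, HIGH);
  digitalWrite(pin, LOW);
}

void setup() {
  pinMode(PIN_RESET_PTR, OUTPUT);
  pinMode(PIN_INC_PTR, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  pulse(PIN_RESET_PTR);                   // point at the first pixel
  for (int i = 0; i < NUM_PIXELS; i++) {
    line[i] = analogRead(PIN_ANALOG);     // 10-bit sample of the current pixel
    pulse(PIN_INC_PTR);                   // move to the next pixel
  }
  for (int i = 0; i < NUM_PIXELS; i++) {  // dump the line over serial
    Serial.print(line[i]);
    Serial.print(i == NUM_PIXELS - 1 ? '\n' : ' ');
  }
}
```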

The following video shows that motion along all six axes can be distinguished. Some axes produce a stronger response than others; Y translation, in particular, is weak. Still, the results are promising, and with a little optimization this could be a useful addition to a sensor suite.
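
The post doesn't say which optical flow algorithm runs on the ATmega, but for one-dimensional images on a small processor a common choice is Srinivasan's image interpolation algorithm, which estimates the shift between two successive lines in closed form. Here is a sketch of the 1D version, assuming 10-bit pixel values as read above:

```cpp
// 1D image interpolation algorithm (Srinivasan): estimate the sub-pixel
// displacement d between the previous and current image lines by modeling
// the new line as the old one shifted along its spatial gradient.
// Returns d in pixels; positive means the scene moved toward higher indices.
float flow1D(const unsigned short *prev, const unsigned short *curr, int n) {
  long num = 0, den = 0;
  for (int i = 1; i < n - 1; i++) {
    long dt = (long)curr[i] - (long)prev[i];          // temporal difference
    long dx = (long)prev[i - 1] - (long)prev[i + 1];  // spatial difference
    num += dt * dx;
    den += dx * dx;
  }
  if (den == 0) return 0.0f;  // textureless line: flow is undefined
  return 2.0f * (float)num / (float)den;
}
```

With the slits oriented differently on the two back-to-back chips, each 1D flow measurement like this constrains a different combination of the six motion axes, which is how translations and rotations can be told apart.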

I'd like to gauge interest in an integrated Arduino-clone vision sensor similar to this one, though perhaps not as compact and minimal. It would most likely be a single-sided vision chip with optics and an Arduino-clone processor integrated on a small, single board. It would be about the size of a penny and weigh half a gram. The user would control, through the Arduino environment, which pixels are read and how they are processed.

Comments

  • Fedor- Yes, we do plan to incorporate a MEMS IMU in an upcoming project. I don't think vision should replace a mechanical IMU, just complement it.

    Monroe- This one most likely won't track stars since we're using pinhole optics and the power supply is a bit noisy, but I bet the same basic principle *could* be used to track stars, provided the image sensor is sensitive enough.

  • If you want to test it on the DelFly II, or even on the DelFly Micro, let us know: www.delfly.nl

  • Some of my research deals with HCI, and I'd love to see a way to step away from using cameras to track a user's head and hand movements. Do you plan to incorporate an IMU into the system next? An integrated IMU + optical sensor module of this kind could be very useful for a range of applications if it is sufficiently accurate and robust in changing environments.

  • Jack- You are correct that each orientation presents the same general visual stimuli. However, the pixels themselves are not exactly the same between orientations. It would be nice to have a proper gantry to run a full 6DOF test without having to reorient the sensor three times.

  • Since you oriented the sensor differently for each axis, you really showed that the different orientations give the same reading when tracking the same features.
