Hi,
Some people asked for a few more details on the mouse sensor position hold I am experimenting with on my quad (original post here), so here we go:
As I mentioned in the original post, the sensor is an ADNS 2610 made by Avago. This is a fairly ubiquitous sensor found in many cheap mice - the reason I chose this one is two-fold: there is good documentation available for it (e.g. the datasheet, or just google for it, there are plenty of people who have hacked it), and it provides a mode in which the 'raw' 18x18 pixel image can be read out, which significantly simplifies setting up appropriate optics.
I ended up not finding a mouse with that particular sensor, so I ordered the raw sensor from Digikey instead. According to the datasheet, 3 caps and 1 oscillator are required for a minimal circuit - all of these are cheaply available at Digikey.
Above is an image of my prototype circuit: on the left you see a lens with a screw-type lens barrel from a cheap (around $10 at Walmart, I think) keychain camera, in the center you can see a raw ADNS 2610, and on the right is the circuit itself. The lens I ended up using is not this exact one, but one from a different camera which looks very similar - either one would have worked.

On the second image you can see the sensor again (this time upright), as well as the bottom of the PCB (never mind the sloppy soldering). In the prototype, I basically hot-glued the lens barrel to the bottom of the board, lined up with the sensor, and used the screw mechanism to adjust the focal length (more on that below). My final design looks very similar, except that I used a single-piece perforated circuit board and drilled a 1/4" hole in the center for the sensor aperture - I did not include a picture because it is wrapped in a small, hot-glued foam box and mounted on the quad at the moment, and I am too lazy to take everything apart.

The image above shows the raw image data output of the mouse sensor with the optics adjusted. You can see my hand in front of a bright wall, making the hand itself appear dark. The distance from sensor to hand is approximately 0.5 meters (give or take). The image is read out from the sensor by an ATmega, which passes it on to a Perl/Tk script via serial. The serial protocol used by the ADNS 2610 is documented in the datasheet and fairly easy to emulate on a microcontroller.

Since I knew that the keychain camera lens must have a focal length of a few millimeters, I basically determined the correct focal length by trial and error (I pointed it at a dark shelf on a bright wall and adjusted until it was clearly visible). Because the focal length is short compared to the object distance (~5mm versus something of the order of 1m), this setup does a reasonable job of focusing anything further away than 0.5-1m. On the actual quad, I do not look at the raw image data, but let the optical flow implementation on the mouse sensor do the hard work for me and just read the x- and y-displacements.
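For anyone who wants to talk to the sensor directly, the read-out boils down to bit-banging its two-wire serial interface (a clock line plus a bidirectional data line). The sketch below is just an illustration, not my actual firmware: the pin helpers (sck_high, sdio_in, delay_us, etc.) are placeholders for whatever GPIO/delay routines you use, and the register addresses and the ~100 us address-to-data delay are from my reading of the datasheet, so double-check them against your copy.

    #include <stdint.h>

    /* ADNS-2610 register addresses (verify against the datasheet) */
    #define REG_DELTA_Y  0x02
    #define REG_DELTA_X  0x03
    #define REG_PIXEL    0x08

    static void write_bit(uint8_t b)
    {
        sck_low();
        sdio_out(b);              /* sensor latches SDIO on the rising clock edge */
        delay_us(1);
        sck_high();
        delay_us(1);
    }

    static uint8_t read_bit(void)
    {
        sck_low();
        delay_us(1);
        sck_high();
        return sdio_in();         /* sample SDIO after the rising edge */
    }

    /* Read one register: clock out the 8-bit address (MSB first, MSB = 0
     * for a read), tristate SDIO, wait the address-to-data hold time,
     * then clock in 8 data bits. */
    uint8_t adns2610_read(uint8_t addr)
    {
        uint8_t i, val = 0;

        sdio_dir_out();
        for (i = 0; i < 8; i++)
            write_bit((addr >> (7 - i)) & 1);

        sdio_dir_in();
        delay_us(100);            /* address-to-data hold time */

        for (i = 0; i < 8; i++)
            val = (uint8_t)((val << 1) | read_bit());

        return val;
    }

    /* Delta registers are signed 8-bit counts accumulated since the last read. */
    void adns2610_read_motion(int8_t *dx, int8_t *dy)
    {
        *dy = (int8_t)adns2610_read(REG_DELTA_Y);
        *dx = (int8_t)adns2610_read(REG_DELTA_X);
    }

Grabbing the raw 18x18 image works the same way, except that you repeatedly read the Pixel_Data register; the datasheet describes the exact sequencing and the start-of-frame flag.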
I use the altitude data from the sonar range finder together with tilt information from the IMU to get an absolute position reference.
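To give an idea of the geometry, the conversion is conceptually something like the sketch below (simplified and not my exact code: K_FLOW is a placeholder that has to be calibrated for the lens, the axis and sign conventions are arbitrary, the gyro terms only remove the part of the flow caused by tilting rather than translating, and the increase in slant range when the quad is tilted is ignored here).

    #include <stdint.h>

    /* Convert mouse-sensor counts into ground displacement (simplified).
     * K_FLOW = sensor counts per radian of optical angle (placeholder,
     * must be calibrated for the actual lens).  gyro_droll/gyro_dpitch
     * are the attitude changes (rad) over the same interval; altitude_m
     * comes from the sonar. */
    #define K_FLOW 250.0f

    void flow_to_ground(int8_t dx, int8_t dy,
                        float gyro_droll, float gyro_dpitch,
                        float altitude_m,
                        float *x_m, float *y_m)
    {
        /* subtract the rotation-induced part of the flow */
        float fx = (float)dx - K_FLOW * gyro_dpitch;
        float fy = (float)dy - K_FLOW * gyro_droll;

        /* small-angle approximation: optical angle times height above
         * ground gives the distance travelled over the ground */
        *x_m = altitude_m * (fx / K_FLOW);
        *y_m = altitude_m * (fy / K_FLOW);
    }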

The mouse sensor rigged in this way appears to work quite well in daylight conditions outside; I have not tried to use it indoors yet (I would imagine one needs a reasonably well-lit environment). I did notice once that the quad seemed to get slightly confused by its own shadow in bright sunlight around noon, but I have not systematically tested this. You can see a screenshot of my GCS during a typical flight above. The window on the left just shows the attitude and other assorted information about the quad (the software is written in Perl/Tk - I mostly use Linux). The window on the right shows the ground track in position-hold mode as detected by the quad. The blue bar indicates one meter, and the trace starts in the center of the cross-hair. It did a reasonable job holding its position; it was a tad windy that day, I think. I am sure that with some patience one could fine-tune the PID parameters to tighten the position hold considerably.
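The position-hold loop itself is nothing exotic; conceptually it is along the lines of the sketch below (placeholder gains and mappings, not my actual controller): integrate the ground displacement into a position estimate and feed the error into a PD loop whose output becomes small pitch/roll targets for the inner attitude loop.

    /* Generic position-hold sketch: hold the position where the mode was
     * engaged (the origin) by commanding small tilt angles.  Gains, output
     * limit and the mapping of x/y to pitch/roll are placeholders. */
    typedef struct { float kp, kd, limit; } pd_t;

    static float pd_step(const pd_t *c, float pos, float vel)
    {
        float out = -c->kp * pos - c->kd * vel;   /* drive position and velocity to zero */
        if (out >  c->limit) out =  c->limit;
        if (out < -c->limit) out = -c->limit;
        return out;
    }

    void position_hold(float dx_m, float dy_m, float dt,
                       float *pitch_cmd, float *roll_cmd)
    {
        static float pos_x, pos_y;                      /* integrated position (m)   */
        static const pd_t pd = { 0.15f, 0.08f, 0.20f }; /* placeholder gains, rad cap */

        pos_x += dx_m;
        pos_y += dy_m;

        *pitch_cmd = pd_step(&pd, pos_x, dx_m / dt);
        *roll_cmd  = pd_step(&pd, pos_y, dy_m / dt);
    }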
Hope this was interesting,
Marko

Comments

  • @Öncü: You are welcome. Here's a survey paper I wrote back in 2002 that talks more about bio-inspired use of OF in UAVs. Note: There is one glaring error- Equation 1 should read "sin" not "cos".

    http://www.centeye.com/downloads/UAVS02.pdf

    Regarding FPGAs: I recall having seen such things at various times, but nothing really flown. The challenge is not so much making the best optical flow measurement possible, but tuning it for the specific application. A good design with a microcontroller will outdo a poor one with an FPGA or GPU.
  • @Geoffrey - Thanks for pointing out the work of Javaan Chahl and Sean Humbert. That may be the way to go in the future. Did you see any work using FPGA chips for optical flow?
  • @Marko: Having access to the code could allow one to pursue a wider variety of algorithms than the fixed set included in those chips. As for plans, they involve releasing more things under open hardware licenses. I should ask: what would you like my plans to be?

    @Öncü: Good selections! Check out also the work of Javaan Chahl and more recently Sean Humbert.
  • Öncü, master of diggers :)

    Great papers, I must say.
  • @Geoffrey: Very interesting. I suppose one could dabble with one of the lower-resolution ones and an ATmega-class microcontroller and create an open-source version of an optical-flow sensor. Having full control over the dynamic range, etc. might make it easier to optimize it for UAV purposes. What are your plans for these sensors?
  • @Marko (and everyone else): Regarding your comment on "other sensors"- I have about 4000 image sensor die of various resolutions ranging from 8x8 to 480x256. Some are very simple- analog pixel out, and pulsing a clock advances to the next pixel (a rough read-out sketch for these is at the end of this comment). Others are more complex and have on-chip ADCs and other bells and whistles. I've even been dabbling with "opening" the sensor design- we actually did a "soft" release of such a sensor. As for the image sensor chips- I personally designed them so I can actually tell you how they work (though they still sometimes surprise me- analog chips are kinda like that). I am all for putting these in some sort of open vision sensor design.

    The chips are all in bare die form so they'd have to be wirebonded to a board or package. No problem- we have a manual wirebonding machine- which is good for prototyping / small qtys.
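    For the simplest of the sensors mentioned above, the read-out is essentially just this kind of loop (a rough sketch with placeholder pin/ADC helpers and made-up frame size, not code for any specific chip):

      #include <stdint.h>

      /* "Analog pixel out" read-out sketch: each clock pulse advances the
       * sensor to the next pixel, whose value is sampled with the micro's
       * ADC.  reset_pulse(), clock_pulse() and adc_read() are placeholders;
       * the real sequencing depends on the particular chip. */
      #define ROWS 16
      #define COLS 16

      void read_frame(uint8_t frame[ROWS][COLS])
      {
          reset_pulse();                                    /* back to pixel (0,0)    */
          for (int r = 0; r < ROWS; r++)
              for (int c = 0; c < COLS; c++) {
                  frame[r][c] = (uint8_t)(adc_read() >> 2); /* 10-bit ADC to 8 bits   */
                  clock_pulse();                            /* advance to next pixel  */
              }
      }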
  • @ionut: The native resolution of this particular sensor is 18x18; I have looked at some others, and most of them are in the 16x16 to 18x18 range.
    In principle you could write your own optical flow routines, but in practice you would just replicate the same functionality that the mouse sensor gives you for free - I am giving the engineers who designed this sensor the benefit of the doubt and assume they tuned the algorithm quite well (in fact, that is probably the most proprietary part of the sensor - anybody can make CCDs these days). Arguably they tuned the sensor for mouse applications, so maybe you are right and something could be gained with a home-brew algorithm tuned for larger-range sensing. At this point, I just tried to keep things as simple as possible.
    On the practical side of things: this particular sensor will only give you a full "raw" image at a frame rate of about 4.5 frames/second (internal frame rate of 1500 fps divided by the number of pixels, i.e. 1500/(18x18)) - I believe this is really meant as a debugging feature; it would be too slow to be practical for running your own optical flow algorithm. Of course, there might be other sensors out there that don't have this kind of restriction...
  • In the long run: the mouse sensor has a low resolution (40x40?). Simply computing the optical flow on the main microcontroller would deliver better performance, and this way you don't have to deal with proprietary hardware. Optical flow basically treats the change in intensity between two frames as a change in distance, so in an iterative mode it moves the deformation field towards the destination, applying a Gaussian smoothing at each step.
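    For example, on frames this small even a brute-force block-matching search for a single global displacement is feasible on a microcontroller (a rough sketch with arbitrary sizes and search range; a real implementation would at least add sub-pixel interpolation and the smoothing mentioned above):

      #include <stdint.h>

      #define N     18        /* frame size (pixels)             */
      #define RANGE  2        /* search shifts of -RANGE..+RANGE */

      /* Try every integer shift in a small window and keep the one with the
       * lowest sum of absolute differences between the two frames. */
      void flow_estimate(const uint8_t prev[N][N], const uint8_t cur[N][N],
                         int8_t *best_dx, int8_t *best_dy)
      {
          uint32_t best = UINT32_MAX;

          for (int8_t dy = -RANGE; dy <= RANGE; dy++)
              for (int8_t dx = -RANGE; dx <= RANGE; dx++) {
                  uint32_t sad = 0;
                  for (int y = RANGE; y < N - RANGE; y++)
                      for (int x = RANGE; x < N - RANGE; x++) {
                          int d = (int)cur[y][x] - (int)prev[y + dy][x + dx];
                          sad += (uint32_t)((d < 0) ? -d : d);
                      }
                  if (sad < best) { best = sad; *best_dx = dx; *best_dy = dy; }
              }
      }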
  • Hi everybody,

    Some time ago I did similar experiments with an ADNS 2610 + Mega 8 + different "optics" (mostly cannibalized camera lenses), the intent being optical-flow-assisted autonomous landing of a fixed-wing UAV.

    Results were acceptable only within a narrow range of distances from the ground, probably because of the focal length of the lenses.

    In any case, there is a correlation between the value of the SQUAL register of the ADNS 2610 and the quality of the optical flow output. Here's what I'd try if I had time:

    - different lenses: use lenses with greater depth of field, i.e. able to focus far objects as well as near objects.

    - multiple sensors: use several sensors with lenses of different focal lengths, or with different orientations; use the optical flow from the sensor with the highest SQUAL value, or do some sort of "voting" / plausibility check on the optical flows returned by all sensors (rough sketch after this list).

    - autofocus: this is the most expensive and complicated option. Automatically change the focus of the lens until the SQUAL register reaches a (local) maximum. This does not necessarily mean that the ground is in focus from a picture point of view.
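    Here is roughly what I mean by the "highest SQUAL wins" selection in the second point (just a sketch: register addresses as I remember them from the ADNS 2610 datasheet, adns_read(i, reg) is a placeholder for a per-sensor read routine, and flows from lenses with different focal lengths would of course still need different scale factors):

      #include <stdint.h>

      #define N_SENSORS   2
      #define REG_DELTA_Y 0x02
      #define REG_DELTA_X 0x03
      #define REG_SQUAL   0x04

      /* Read motion and surface quality from every sensor (the delta
       * registers accumulate since the last read, so they must be read
       * anyway) and keep the flow from the sensor with the best SQUAL. */
      void best_flow(int8_t *dx, int8_t *dy)
      {
          uint8_t best_squal = 0;

          for (uint8_t i = 0; i < N_SENSORS; i++) {
              uint8_t squal = adns_read(i, REG_SQUAL);
              int8_t  sdy   = (int8_t)adns_read(i, REG_DELTA_Y);
              int8_t  sdx   = (int8_t)adns_read(i, REG_DELTA_X);

              if (squal >= best_squal) {    /* keep the most trustworthy flow */
                  best_squal = squal;
                  *dx = sdx;
                  *dy = sdy;
              }
          }
      }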

    What do you think? Anybody willing to try?