Hi,
Some people asked for a few more details on the mouse sensor position hold I am experimenting with on my quad (original post here), so here we go:
As I mentioned in the original post, the sensor is an ADNS 2610 made by Avago. This is a fairly ubiquitous sensor found in many cheap mice - the reason I chose this one is two-fold: there is good documentation available for it (e.g. the datasheet, or just google for it, there are plenty of people who hacked it), and it provides a mode in which the 'raw' 18x18 pixel image can be read out of it, significantly simplifying the setup of appropriate optics.
I ended up not finding a mouse with that particular sensor, so I ordered the raw sensor on Digikey instead. According to the datasheet, 3 caps and 1 oscillator are required in a minimal circuit - all of these are cheaply available at digikey.
Above is an image of my prototype circuit: on the left you see a lens with a screw-type lens barrel from a cheap (around $10 at Walmart, I think) keychain camera, in the center you can see a raw ADNS 2610, and on the right is the circuit itself. The lens I ended up using is not this exact one, but one from a different camera which looks very similar - either one would have worked.

On the second image you can see the sensor again (this time upright), as well as the bottom of the PCB (never mind the sloppy soldering). In the prototype, I basically hot-glued the lens barrel to the bottom of the board, lined up with the sensor, and used the screw mechanism to adjust the focal length (more on that below). My final design looks very similar, except that I used a single-piece perforated circuit board and drilled a 1/4" hole in the center for the sensor aperture - I did not include a picture because it is wrapped in a small, hot-glued foam box and mounted on the quad at the moment, and I am too lazy to take everything apart.

The image above shows the raw image data output of the mouse sensor with the optics adjusted. You can see my hand in front of a bright wall, making the hand itself appear dark. The sensor-to-hand distance is approximately 0.5 meters (give or take). The image is read out from the sensor via an Atmega, which passes it on to a Perl/Tk script via serial. The serial protocol used by the ADNS 2610 is documented in the datasheet and fairly easy to emulate on a microcontroller. Since I knew that the keychain camera lens must have a focal length of a few millimeters, I basically determined the correct focus by trial-and-error (I pointed it at a dark shelf on a bright wall and adjusted until it was clearly visible). Because the focal length is short compared to the object distance (~5mm versus something of the order of 1m), this setup does a reasonable job of focusing anything further away than 0.5-1m. On the actual quad, I do not look at the raw image data, but let the optical flow implementation on the mouse sensor do the hard work for me and just read x- and y-displacements.
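For anyone who wants to try this, here is a rough C sketch of the decoding side. The register addresses (Delta_X, Delta_Y, Pixel_Data) and the pixel-dump bit layout are my reading of the datasheet and should be double-checked against your copy; the actual bit-banged SCK/SDIO transfer is hardware-specific and left out.

```c
/* Host-side decoding helpers for the ADNS-2610. Register addresses
 * and bit positions below are assumptions taken from the datasheet;
 * verify against your copy. The bit-banged serial transfer itself
 * (clock + bidirectional data line) is not shown. */
#include <stdint.h>

#define REG_DELTA_Y    0x02  /* assumed register map */
#define REG_DELTA_X    0x03
#define REG_PIXEL_DATA 0x08

/* The delta registers hold an 8-bit two's-complement displacement,
 * so a plain signed cast recovers the value. */
static int8_t decode_delta(uint8_t raw)
{
    return (int8_t)raw;
}

/* Pixel-dump bytes (assumed layout): bits 5:0 carry the 6-bit pixel
 * value, bit 6 flags the byte as valid, bit 7 marks the start of a
 * new 18x18 frame. Returns 1 if the byte carried a valid pixel. */
static int decode_pixel(uint8_t raw, uint8_t *pixel, int *start_of_frame)
{
    *start_of_frame = (raw & 0x80) != 0;
    *pixel = raw & 0x3F;
    return (raw & 0x40) != 0;
}
```

To grab a frame, you would write once to the Pixel_Data register to restart the dump, then read it 324 times, feeding each byte through `decode_pixel`.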
I use the altitude data from the sonar range finder together with tilt information from the IMU to get an absolute position reference.
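The scaling step can be sketched as follows - this is an illustrative C fragment, not the code running on my quad, and `RAD_PER_COUNT` is a hypothetical calibration constant you would measure for your own optics. The idea is that a flow count corresponds to a fixed angular shift, rotation of the airframe is subtracted using the IMU, and the remaining angle times altitude gives ground displacement.

```c
/* Turn raw mouse-sensor counts into ground displacement (sketch).
 * Assumes a calibrated angle-per-count factor, flat ground and
 * small tilt angles. RAD_PER_COUNT is a hypothetical placeholder. */

#define RAD_PER_COUNT 0.003  /* radians of view shift per flow count */

/* altitude_m : height above ground from the sonar (meters)
 * counts     : raw x (or y) flow counts since the last update
 * dtilt_rad  : change of the corresponding tilt angle over the same
 *              interval, from the IMU (radians)
 * returns    : estimated ground displacement in meters             */
double ground_displacement(double altitude_m, int counts, double dtilt_rad)
{
    /* Rotating the airframe shifts the image even when the quad is
     * not translating, so remove the tilt-induced part first. */
    double translational_angle = counts * RAD_PER_COUNT - dtilt_rad;
    /* Small-angle approximation: arc length = height * angle. */
    return altitude_m * translational_angle;
}
```

One such computation per horizontal axis gives the absolute position reference.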

The mouse sensor rigged in this way appears to work quite well in daylight conditions outside; I have not tried to use it indoors yet (I would imagine one needs a reasonably well-lit environment). I did notice once that the quad seemed to get slightly confused by its own shadow in bright sunlight around noon, but I have not systematically tested this. You can see a screenshot of my GCS during a typical flight above. The window on the left just shows the attitude and other assorted information about the quad (the software is written in Perl/Tk - I mostly use Linux). The window on the right shows the ground-track in position-hold mode as detected by the quad. The blue bar indicates one meter; the trace starts in the center of the cross-hair. It did a reasonable job holding its position; it was a tad windy that day, I think. I am sure with some patience one could fine-tune the PID parameters to tighten the position-hold considerably.
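For reference, the kind of PID loop being tuned can be sketched like this - a minimal, hypothetical C illustration rather than the controller actually flying on the quad, with placeholder gains:

```c
/* Minimal PID sketch for one axis of position hold. Hypothetical
 * illustration; the gains are placeholders to be tuned in flight. */
typedef struct {
    double kp, ki, kd;   /* proportional, integral, derivative gains */
    double integral;     /* accumulated error */
    double prev_error;   /* error from the previous step */
} pid_state;

/* error: position error in meters (setpoint minus estimate)
 * dt:    time step in seconds
 * returns a commanded correction (e.g. a tilt angle)          */
double pid_step(pid_state *s, double error, double dt)
{
    s->integral += error * dt;
    double derivative = (error - s->prev_error) / dt;
    s->prev_error = error;
    return s->kp * error + s->ki * s->integral + s->kd * derivative;
}
```

In position-hold mode the error would come from the displacement integrated out of the mouse-sensor flow, with one such loop per horizontal axis.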
Hope this was interesting,
Marko

Comments

  • Developer
    Öncü that looks great but Noooooooooooo I don't want to have Microsoft products on our ArduCopters. It would just crash it.. "Your ArduCopter made illegal device call and it will be terminated..." Ouch ;)

    Interesting document anyways.
  • Mostly because sunlight covers basically all possible light wavelengths (with a few small gaps due to elemental absorption lines).

    Again, it has nothing to do with seeing the laser instead of sunlight; it's all about giving the sensor as even a level of light as possible. This sensor looks for movement of features in its focal plane (which is only about 6 inches / 15 cm across at 0.5 meters). A lens can be used to diffuse the laser beam enough to fill that area and help even out shadows.
  • @Geoffrey - Why not use a laser beam and detector at a wavelength other than the sun's?
  • We're not trying to overpower the sun; all you need to do is fill in the shadow caused by the airframe, which takes very little power. If you lessen the shadow even by 50% you can get better performance from the sensor.
  • @Mathew: I know intuitively it seems like the laser would work, but sunlight is really really bright. There is on the order of a kilowatt (I think) of sunlight per square meter- think of solar energy. A 5mW laser won't compete. Try turning on your car's headlights to illuminate something the next time it is sunny- you'll be shocked just how dim they are.
  • Using a laser to draw a pattern would not work, but you could use a laser diode with a lens to fill the focal plane of the sensor with light very efficiently. That way you could wash out the shadow caused by your quad.
  • Thanks for the suggestions, I am always open to advice; I will look into those optics suppliers... When I built this thing, I pretty much used whatever I could find in my "random parts" drawer. I am sure performance could be improved with more carefully chosen optics and such.
    I have not at all systematically tested the brightness required to make the setup work, I just happened to always use it outdoors during the daytime (more space - don't want to ruin my nicely painted walls indoors ;-) ). I think it would be very difficult to find LEDs bright enough that fit on a quadcopter, though.
    Drawing a pattern onto the ground via laser might not work so well as the pattern itself would move with the quad.
  • Couldn't you pulse an IR laser with a certain pattern, right next to the camera sensor? If you'd only pick up the movement of the ground illuminated by the dot, maybe that could resolve the shadow problem?
  • Congratulations! I've never played with these optical mouse sensors myself but some people have and have obtained some nice results.

    If you don't mind suggestions: For the lens - for small quantities you can try Sunex (www.optics-online.com), which has lenses with focal lengths ranging from very small to large, and you might be able to find one with a better focal length. Largan Precision (www.largan.com.tw) sells a more limited range of lenses but with an emphasis on volume (think iPhone). Both are based in Asia, but Sunex has an office in the Bay Area and I've found them pretty helpful.

    It would be interesting to see how this works in lower light. The data sheet states a minimum intensity of 80mW/m^2, which (if my calculations are correct) corresponds to about 50-100 lux or so- with a fast enough lens this might work in a bright indoor environment but not when the lights are out.

    @GR0B: You will need very bright LEDs to overcome the shadow. A better approach may be a wider field of view so that the shadow is smaller, though this will blur out tiny features. LEDs *might* work for low light environments if you are hovering close to the ground.

    @Marco: Regarding ILS- Yes, as long as there is no shadow from the air vehicle (overcast or if sun is low).
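The irradiance-to-lux conversion in the comment above can be sanity-checked with a one-liner. Assuming a luminous efficacy somewhere between roughly 250 lm/W (broadband white light) and the 683 lm/W photopic peak at 555 nm, the 80 mW/m^2 datasheet minimum works out to roughly 20-55 lux - the same order of magnitude as the 50-100 lux estimate.

```c
/* Back-of-the-envelope check of the 80 mW/m^2 datasheet minimum.
 * Illuminance (lux) = irradiance (W/m^2) x luminous efficacy (lm/W).
 * Efficacy depends on the spectrum: 683 lm/W at the 555 nm photopic
 * peak, very roughly 250 lm/W assumed for broadband white light. */
double irradiance_to_lux(double watts_per_m2, double efficacy_lm_per_w)
{
    return watts_per_m2 * efficacy_lm_per_w;
}
```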