Some people asked for a few more details on the mouse sensor position hold I am experimenting with on my quad (original post here), so here we go:
As I mentioned in the original post, the sensor is an ADNS 2610 made by Avago. This is a fairly ubiquitous sensor found in many cheap mice - the reason I chose this one is two-fold: there is good documentation available for it (e.g. the datasheet, or just google for it - plenty of people have hacked it), and it provides a mode in which the 'raw' 18x18 pixel image can be read out of it, which significantly simplifies setting up appropriate optics.
I ended up not finding a mouse with that particular sensor, so I ordered the raw sensor from Digikey instead. According to the datasheet, 3 caps and 1 oscillator are required for a minimal circuit - all of these are cheaply available at Digikey as well.
Above is an image of my prototype circuit: on the right you see a lens with a screw-type lens barrel from a cheap (around $10 at Walmart, I think) keychain camera, in the center you can see a raw ADNS 2610, and on the left is the circuit itself. The lens I ended up using is not this exact one, but one from a different camera which looks very similar - either one would have worked.

On the second image you can see the sensor again (this time upright), as well as the bottom of the pcb (never mind the sloppy soldering). In the prototype, I basically hot-glued the lens barrel to the bottom of the board, lined up with the sensor, and used the screw mechanism to adjust the focus (more on that below). My final design looks very similar, except that I used a single-piece perforated circuit board and drilled a 1/4" hole in the center for the sensor aperture - I did not include a picture because it is wrapped in a small, hot-glued foam box and mounted on the quad at the moment, and I am too lazy to take everything apart.

The image above shows the raw image data output of the mouse sensor with the optics adjusted. You can see my hand in front of a bright wall, making the hand itself appear dark. The sensor-to-hand distance is approximately 0.5 meters (give or take). The image is read out from the sensor via an Atmega which passes it on to a Perl/Tk script via serial. The serial protocol used by the ADNS 2610 is documented in the datasheet and fairly easy to emulate on a microcontroller. Since I knew that the keychain camera lens must have a focal length of a few millimeters, I basically determined the correct focus by trial-and-error (I pointed it at a dark shelf on a bright wall and adjusted until it was clearly visible). Because the focal length is short compared to the object distance (~5mm versus something of the order of 1m), this setup does a reasonable job at focusing anything further away than 0.5-1m. On the actual quad, I do not look at the raw image data, but let the optical flow implementation on the mouse sensor do the hard work for me and just read x- and y-displacements.
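The claim that everything beyond roughly 0.5-1m stays in focus follows directly from the thin-lens equation, 1/i + 1/o = 1/f. A quick numeric sketch (the 5mm focal length is the rough value assumed above, not a measured one):

```python
# Thin-lens equation: 1/f = 1/o + 1/i  ->  i = 1/(1/f - 1/o)
# For f much smaller than o, the image distance i barely moves, so one
# screw setting keeps everything beyond ~0.5 m acceptably "in focus".

def image_distance(f_mm, o_mm):
    """Image distance i behind the lens for an object at distance o."""
    return 1.0 / (1.0 / f_mm - 1.0 / o_mm)

f = 5.0  # assumed focal length in mm
for o in (500.0, 1000.0, 10000.0):  # 0.5 m, 1 m, 10 m
    print(f"object at {o / 1000:>4} m -> image at {image_distance(f, o):.3f} mm")
```

The image distance varies by only a few hundredths of a millimeter between 0.5m and 10m, which is why a single trial-and-error adjustment of the lens barrel is good enough.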
I use the altitude data from the sonar range finder together with tilt information from the IMU to get an absolute position reference.
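A minimal sketch of how those pieces combine for one axis - the constants below are hypothetical placeholders (the real ones are determined by trial and error, as discussed in the comments):

```python
# Hypothetical constants - in practice both are found empirically.
COUNTS_PER_RAD = 700.0        # counts reported per radian of pure tilt change
METERS_PER_COUNT_PER_METER = 0.002  # ground metres per count, per metre of altitude

def ground_displacement(dx_counts, d_tilt_rad, altitude_m):
    """One-axis update: subtract the apparent motion caused by the *change*
    in tilt (IMU), then scale the remaining counts by altitude (sonar)
    to get real ground distance."""
    corrected = dx_counts - COUNTS_PER_RAD * d_tilt_rad
    return corrected * METERS_PER_COUNT_PER_METER * altitude_m

# A 0.01 rad pitch change while hovering at 2 m produces counts that the
# tilt correction cancels back out to (almost) no real movement:
print(ground_displacement(7.0, 0.01, 2.0))
```

The key point is that the tilt correction happens in raw sensor counts, before the altitude scaling is applied.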

The mouse sensor rigged in this way appears to work quite well in daylight conditions outside; I have not tried to use it indoors yet (I would imagine one needs a reasonably well-lit environment). I did notice once that the quad seemed to get slightly confused by its own shadow in bright sunlight around noon, but I have not systematically tested this. You can see a screenshot of my GCS during a typical flight above. The window on the left just shows the attitude and other assorted information about the quad (the software is written in Perl/Tk - I mostly use Linux). The window on the right shows the ground-track in position-hold mode as detected by the quad. The blue bar indicates one meter; the trace starts in the center of the cross-hair. It did a reasonable job holding its position; it was a tad windy that day, I think. I am sure that with some patience one could fine-tune the PID parameters to tighten the position-hold considerably.
Hope this was interesting,


  • thanks a lot

    best regards

  • Hi Sept,

    My project described here is completely unrelated to ArduPilot, but I will try to answer your question.

    An ideal lens with focal length f will produce an image of an object at distance o from the lens at distance i on the opposite side of the lens, where the three values are related like this: 1/i + 1/o = 1/f (try googling 'lens equation' or something like that). In my use case f is much smaller than o (the lens I am using has a focal length of a few mm, the quad usually flies many hundreds or even thousands of mm high...), so the image is roughly focused at distance i~f.

    If instead you are asking about the tilting of the quadcopter and how to compensate for it, that is mostly a matter of simple trigonometry. For small tilt angles, the correctional offset to be applied to the image should be roughly proportional to the angle (with some appropriate factor).

    Hope this helps...

  • Hi Marck,
    I'm looking into how to measure surface area with ArduPilot and a camera. I found your article and wondered if you knew the relationship between the focal length of the camera, the altitude, and the measured surface area

    many thanks

  • I saw the Sparkfun breakout board - it would probably work, but for using it on a quadcopter, it almost seems unnecessarily large. You can get the raw sensor to run with just three small capacitors and one ceramic oscillator.

    For my particular purpose I do not need the optics provided by them or the LED.

  • You know you can actually buy a breakout board for this mouse sensor from Sparkfun!


    Any thoughts about using that?

  • If it works on the software side, why would one use mechanical units, which always cause errors?
  • Thanks for the explanations. For the tilt part, it looks like you are performing some kind of electronic image stabilization. Maybe you could add some servos to the mouse sensor and mechanically stabilize it against tilting.
  • @ionut: My calibration methods were rather crude so far: I walked, holding the quad as straight as possible at a fixed height (18" and 36", measured with a yard-stick), along a pre-measured course (marked with rocks in my driveway ;-) - something like a square with 72" sides) and recorded the measurements from the mouse sensor via Xbee telemetry on my laptop. I did this a couple of times and took a rough average of my measurements. So the surface was relatively featureless blacktop with a bit of dirt on it. I haven't tried other surfaces; I imagine something like grass might be better for the sensor... Adjusting for height is actually a matter of simple geometry: since I use a fixed lens, the actual distance traveled above ground is simply proportional to the distance measured by the sensor times the height (measured by the ultrasound range finder).
    The focal length of my optics is roughly 4-5mm, it is 'tuned' to give me a focused image at ~1m - if you plug this into the lens equation 1/focal_length = 1/object_distance + 1/image_distance, you can see that the image distance (i.e. the distance behind the lens at which the image is in focus) hardly moves at all for object distances that are much larger than the lens focal length. Long story short: I basically get a reasonably "in focus" image for all distances larger than ~1m.
    Since the measured distance is only used as part of a PID loop, the exact units of the distance measurement don't matter at all - I could leave them in "mouse units"xheight if I wanted to...
    The trickier part was compensating for changes in tilt (those will look like a movement to the mouse sensor even if the quad is stationary). Basically, the approximation I use is that the aperture of my optics makes the mouse sensor "see" along a cone with fixed opening angle (judging from the hand picture I posted, maybe 10-15 degrees). Therefore I add/subtract a number of "mouse units" given by the change (not absolute!) in tilt angle in the two directions pitch and roll (the sensor is aligned with those) times some constant of proportionality (notice that the height above ground does not enter here - that happens later). The constant could be determined by measuring the opening angle of said cone and using the resolution of the sensor, but I just measured it via trial-and-error (tilt the quad around by small angles while holding it in place and record the angles and mouse sensor measurements)...
    Anyways, enough rambling, I really need to find some more time to do more PID tuning on it.
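The walk-a-measured-course calibration described above boils down to solving for a single proportionality constant. A sketch with made-up numbers (the count value is purely illustrative):

```python
# Calibration sketch: walk a measured course at a fixed, known height,
# record the accumulated sensor counts, and derive one scale constant.
# 1005 counts is a made-up figure for illustration.

def scale_from_walk(total_counts, course_length_m, height_m):
    """Metres of ground travel per count, per metre of altitude."""
    return course_length_m / (total_counts * height_m)

# e.g. walking one 72" (1.83 m) side at 36" (0.91 m) height:
k = scale_from_walk(total_counts=1005.0, course_length_m=1.83, height_m=0.91)

def counts_to_meters(counts, height_m, k=k):
    # Measured distance scales linearly with height above ground,
    # which is why the constant only needs to be found once.
    return counts * height_m * k

print(round(counts_to_meters(1005.0, 0.91), 2))  # recovers the course length
```

Since the result only feeds a PID loop, the absolute units of k are not critical, as noted above.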
  • So maybe an issue will be to calibrate the mouse sensors. I mean, how do you convert mouse displacement from the sensor to real meters? Just by trial and error? How do you adjust this conversion based on altitude?
  • Marko,
    You are right. Your implementation is quite nice and it's worth trying. You should try it on different textures: grass, sidewalks. Also, do you have a method to test the accuracy? I mean, can you move the quad a certain distance and check the estimation from your sensor at a certain altitude? I think the higher you get, 2 pixels on your image will represent a bigger distance, unless you use special lenses to zoom. Did you notice if there is an optimum distance from the ground based on your lenses (1m?)