Hi,
Some people asked for a few more details on the mouse sensor position hold I am experimenting with on my quad (original post here), so here we go:
As I mentioned in the original post, the sensor is an ADNS 2610 made by Avago. This is a fairly ubiquitous sensor found in many cheap mice. The reason I chose this one is two-fold: there is good documentation available for it (e.g. the datasheet, or just google for it; plenty of people have hacked it), and it provides a mode in which the 'raw' 18x18-pixel image can be read out of it, which significantly simplifies setting up appropriate optics.
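To give a concrete idea of what the raw-image mode involves: a frame is read as 324 consecutive reads of the Pixel_Data register. The sketch below assumes the bit layout I recall from the datasheet (start-of-frame flag in bit 7, data-valid in bit 6, 6-bit pixel value in bits 5..0) - double-check this against your datasheet revision before relying on it:

```c
#include <stdint.h>
#include <stddef.h>

#define FRAME_W 18
#define FRAME_H 18
#define FRAME_PIXELS (FRAME_W * FRAME_H)

/* Assumed bit layout of the ADNS 2610 Pixel_Data register:
   bit 7 = start-of-frame flag, bit 6 = data valid, bits 5..0 = pixel. */
#define PIX_SOF   0x80
#define PIX_VALID 0x40
#define PIX_MASK  0x3F

/* Collect 324 consecutive Pixel_Data register reads into a frame
   buffer. `raw` holds the register bytes in read order; returns 0 on
   success, -1 if the first byte lacks the start-of-frame flag or any
   byte lacks the data-valid flag. */
int assemble_frame(const uint8_t *raw, size_t n, uint8_t frame[FRAME_PIXELS])
{
    if (n < FRAME_PIXELS || !(raw[0] & PIX_SOF))
        return -1;
    for (size_t i = 0; i < FRAME_PIXELS; i++) {
        if (!(raw[i] & PIX_VALID))
            return -1;
        frame[i] = raw[i] & PIX_MASK;  /* keep the 6-bit intensity */
    }
    return 0;
}
```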
I ended up not finding a mouse with that particular sensor, so I ordered the bare sensor from Digikey instead. According to the datasheet, a minimal circuit requires three capacitors and one oscillator - all of these are cheaply available at Digikey.
Above is an image of my prototype circuit: on the right you see a lens with a screw-type lens barrel from a cheap keychain camera (around $10 at Walmart, I think), in the center a bare ADNS 2610, and on the left the circuit itself. The lens I ended up using is not this exact one, but one from a different camera that looks very similar - either would have worked.

On the second image you can see the sensor again (this time upright), as well as the bottom of the PCB (never mind the sloppy soldering). In the prototype, I basically hot-glued the lens barrel to the bottom of the board, lined up with the sensor, and used the screw mechanism to adjust the focal distance (more on that below). My final design looks very similar, except that I used a single-piece perforated circuit board and drilled a 1/4" hole in the center for the sensor aperture. I did not include a picture because it is wrapped in a small, hot-glued foam box and mounted on the quad at the moment, and I am too lazy to take everything apart.

The image above shows the raw image data output of the mouse sensor with the optics adjusted. You can see my hand in front of a bright wall, making the hand itself appear dark; the sensor-to-hand distance is roughly 0.5 m. The image is read out of the sensor by an Atmega, which passes it on to a Perl/Tk script over serial. The serial protocol used by the ADNS 2610 is documented in the datasheet and fairly easy to emulate on a microcontroller.

Since I knew the keychain camera lens must have a focal length of a few millimeters, I determined the correct lens position by trial and error (I pointed it at a dark shelf on a bright wall and adjusted until the shelf was clearly visible). Because the focal length is short compared to the object distance (~5 mm versus something of the order of 1 m), this setup does a reasonable job of focusing anything further away than 0.5-1 m. On the actual quad, I do not look at the raw image data; I let the optical-flow implementation on the mouse sensor do the hard work for me and just read out x- and y-displacements.
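For tracking, only the two motion registers matter. The sketch below assumes the register addresses I remember from the datasheet (Delta_Y at 0x02, Delta_X at 0x03) and that the counts are 8-bit two's complement; the bit-banged SCLK/SDIO transaction itself is left as a placeholder:

```c
#include <stdint.h>

/* ADNS 2610 register addresses (from the datasheet - double-check
   against your revision). Reading a delta register clears it. */
#define REG_DELTA_Y 0x02
#define REG_DELTA_X 0x03

/* Motion counts are reported as 8-bit two's complement. */
static inline int8_t delta_to_counts(uint8_t reg_value)
{
    return (int8_t)reg_value;
}

/* read_reg() stands in for the bit-banged SCLK/SDIO transaction:
   clock out the address byte (MSB clear for a read), wait ~100 us,
   then clock in the 8 data bits. */
typedef uint8_t (*read_reg_fn)(uint8_t addr);

/* Accumulate total displacement in sensor counts. */
void accumulate_flow(read_reg_fn read_reg, long *x_counts, long *y_counts)
{
    *x_counts += delta_to_counts(read_reg(REG_DELTA_X));
    *y_counts += delta_to_counts(read_reg(REG_DELTA_Y));
}
```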
I use the altitude from the sonar range finder together with tilt information from the IMU to turn those displacements into an absolute position reference.
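For the curious, the conversion is roughly this: a flow count corresponds to an angular displacement, so scaling by altitude gives metres, and the apparent flow caused by pure rotation of the airframe (known from the gyros) has to be subtracted first. The sketch below uses a made-up focal length in pixel units and the small-angle approximation - the constant is illustrative, not a measured value from my setup:

```c
/* Hypothetical geometry constant - calibrate for your own optics. */
#define FOCAL_LENGTH_PX 11.0  /* lens focal length expressed in sensor pixels */

/* Convert raw flow counts (cx, cy) over an interval dt_s to ground
   displacement in metres. altitude_m comes from the sonar;
   pitch_rate/roll_rate are gyro rates in rad/s. A pure rotation of
   the airframe produces apparent flow with no translation, so that
   component is subtracted first (small-angle approximation). */
void flow_to_ground(double cx, double cy,
                    double altitude_m, double dt_s,
                    double pitch_rate, double roll_rate,
                    double *dx_m, double *dy_m)
{
    double rot_x = pitch_rate * dt_s * FOCAL_LENGTH_PX;
    double rot_y = roll_rate  * dt_s * FOCAL_LENGTH_PX;
    *dx_m = (cx - rot_x) * altitude_m / FOCAL_LENGTH_PX;
    *dy_m = (cy - rot_y) * altitude_m / FOCAL_LENGTH_PX;
}
```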

The mouse sensor rigged this way appears to work quite well in daylight conditions outside; I have not tried it indoors yet (I imagine one needs a reasonably well-lit environment). I did notice once that the quad seemed to get slightly confused by its own shadow in bright sunlight around noon, but I have not tested this systematically. You can see a screenshot of my GCS during a typical flight above. The window on the left shows the attitude and other assorted information about the quad (the software is written in Perl/Tk - I mostly use Linux). The window on the right shows the ground track detected by the quad in position-hold mode. The blue bar indicates one meter; the trace starts in the center of the cross-hair. It did a reasonable job holding its position - it was a tad windy that day, I think. I am sure that with some patience one could fine-tune the PID parameters to tighten the position hold considerably.
Hope this was interesting,
Marko
Comments

  • I agree, we have to find or create an open system.

    By the way, I found these when searching for a camera-based rangefinder:

    http://www.asl.ethz.ch/education/master/mobile_robotics/Exercise_3.pdf

    http://staff.science.uva.nl/~bredeweg/pdf/BSc/20082009/Nguyen.pdf

    An omnidirectional camera could be a good candidate for a collision-detection system.
  • Developer
    HeliCommand has something similar; yes, there are many of those, but they are all closed systems. We should try to find an open system that we can hack and exploit as much as possible for everyone's benefit :)

    That Flymentor looks rather OK for heli use too. I had a really bad experience with HeliCommand about 4 years ago when I tested one of their products... the heli went crazy on my table. The good thing was that I did not have the main blades connected, but it still did some ugly damage... Be careful with those, they are NOT toys.
  • Here is a description of the Flymentor 3D:

    http://www.helipal.com/kds-flymentor-3d-auto-stabilizer.html

    What is Positioning Mode?
    In this mode there is a solid green light on the Flymentor Mixer, and the CCD camera and the 3-axis gyro are both working. The CCD camera constantly takes pictures of the ground and compares each frame to see if the helicopter is drifting, then tells the Flymentor Mixer to apply a correction to all cyclic servos, so that the helicopter stays in a fixed position. The concept is basically an eye watching the ground and making adjustments, kind of like a real human. But note that the system is not hooked up to the throttle; the altitude of the helicopter is still in your control.

    About the CCD Camera
    Used for "Positioning Mode". The CCD camera should be mounted underneath the main shaft or on the side frame of the helicopter, with the lens pointing at the ground and free from vibration; it works best at an altitude of 1 m to 3 m. The camera's job is to take images of the ground and provide a signal to the Flymentor Mixer, which then applies corrections to the cyclic servos. For best results, make sure there is enough light and "something" to see on the ground, like grass or gravel: the camera cannot see well in dim conditions and cannot recognize flat snow or flat sand, so it will not work in dim light or over low-contrast surfaces.
    - HeliPal
  • @Jose: I already do that to some extent - the code uses a PID loop for controlling the position (input is the position deviation in the x/y direction, output is a roll/pitch angle). At the moment the velocity that goes into the D-part is obtained as a 50/50 mix of the velocity from the mouse sensor (i.e. the change in position since the last timestep) and the velocity from integrating the accelerometer data after correcting for gravity...
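    A rough sketch of that loop, per axis (illustrative gains and names, not my actual flight code):

```c
/* Position-hold loop as described above: PID on position error, with
   the D-term acting on a 50/50 blend of the flow-derived velocity and
   the accelerometer-integrated velocity instead of differentiated
   position (less noise). All names and gains are illustrative. */
typedef struct {
    double kp, ki, kd;
    double integral;    /* accumulated position error */
    double accel_vel;   /* velocity from integrating gravity-corrected accel */
} pos_axis_t;

/* Returns a commanded roll/pitch angle for one axis. */
double pos_hold_update(pos_axis_t *ax, double pos_err_m,
                       double flow_vel_ms, double accel_ms2, double dt_s)
{
    ax->accel_vel += accel_ms2 * dt_s;                    /* integrate accel */
    double vel = 0.5 * flow_vel_ms + 0.5 * ax->accel_vel; /* 50/50 blend */
    ax->integral += pos_err_m * dt_s;
    return ax->kp * pos_err_m + ax->ki * ax->integral - ax->kd * vel;
}
```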
  • Developer
    Very interesting work...
    Maybe a fusion of this sensor with the accelerometers (the x and y acceleration components extracted from the IMU) could be a more robust solution...
  • Very interesting discussion going on here, but I think I may have overstated the shadow problem: I had one situation where I thought the shadow may have confused the quad, but even that was not really reproducible. By and large the sensor seems to work just as expected as long as it is light enough...
  • @Matthew: How about, rather than solving the problem with lighting, solving it by using image processing to get rid of the shadow?

    @Öncü: Good find with that paper! Thanks! Though it would be interesting to see what moving shadows do to these techniques.

    @ionut: It would be interesting to see whether those four wires coming out of the image sensor are I2C lines...
  • I think there is already a product out there:
    http://www.hobbyking.com/hobbycity/store/uh_viewitem.asp?idproduct=...
  • An approach like this (downward-facing camera) can solve the problem of constant-velocity translational drift, but a range finder is definitely needed for aerial usage.
  • "Are you sure to fly?" :)