Position hold using ADNS2610 and webcam lens

Hi,

 

I am trying to build a position-hold sensor using an ADNS2610 and a cheap webcam lens. I managed to get a relatively focused image when I place the lens in direct contact with the sensor. I have some very basic questions:

 

1. I do not know the FOV of the lens, so how should I calculate the position from dx/dy? If I follow the equation at the link below, I think both the FOV and the resolution are needed to determine the real change in meters.

http://code.google.com/p/arducopter/wiki/AC2_OptFlow

 

2. How and why is the IMU data integrated with the optical flow sensor? How would the sensor compensate when the quad is actually moving?

 

3. How should I calibrate the sensor data?

 

Thanks in advance and best regards,

 


Replies

  • Hi,

    I have another issue now, and it is related to the sensor itself. The sensor is behaving very unreliably and is insensitive to changes. Most of the time it reports zero change in dx/dy. I know it is working, because when I turn its face toward a window with sunlight coming in, the numbers start to appear; but as soon as I place it at a predefined distance facing the floor, it stops responding.

    What I am trying to do here is calibrate it: I have a mark on the wall at 700 cm, and I am trying to move the sensor along the mark, which is also 700 cm wide. On some occasions it does seem to show some changes, but it isn't consistent. Mostly it is very stubborn and does not reflect movement in dx/dy.

    Would you please tell me whether this is normal behaviour? If yes, how is it supposed to correct position if it is so unreliable and insensitive? If not, please help me find the root cause of the problem.

    Regards,

    [two images attached]

  • Thanks Allen and Randy for the valuable comments. While implementing it, I have a couple of ambiguities which I would like to clarify.

     

    1. While calculating the real distances using the equation given on the arducopter sensor link, should I use the raw sensor values (i.e. dx, dy) or the integrated position (i.e. X += dx, Y += dy)?

    2. While calculating the expected X/Y, does the change in roll mean the absolute change, i.e. the shift from the initial levelled position, or the change over time (e.g. change in roll = roll at t2 - roll at t1)? My current reading is sketched below.
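
    To make question 2 concrete, here is how I currently read the equation, as a rough Python sketch. The FOV value and the constant names are my own guesses, not from the wiki:

        import math

        fov = math.radians(11.6)   # my guess at the lens FOV, in radians
        num_pixels = 18            # the ADNS2610 has an 18x18 pixel array

        # Multiply altitude by the tilt-corrected pixel change to get movement.
        conv_factor = (1.0 / num_pixels) * 2.0 * math.tan(fov / 2.0)
        radians_to_pixels = num_pixels / fov

        last_roll = 0.0

        def flow_to_cm(raw_dx, roll, altitude_cm):
            """Convert one raw dx sample into a ground displacement in cm."""
            global last_roll
            diff_roll = roll - last_roll   # reading roll as change over time
            last_roll = roll
            exp_change_x = diff_roll * radians_to_pixels  # pixels from tilting
            change_x = raw_dx - exp_change_x              # pixels from translation
            return change_x * altitude_cm * conv_factor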

     

    Regards,

  • You can find the FOV of a camera experimentally pretty easily:

    First, put two objects at a known distance from each other. The "objects" in your case can be two strips of black vinyl tape. Make sure you get the exact distance between them, such as 1 foot or 1 meter, and call this w.

    Then, set up your camera to output its view to the screen in real time. The Python script supplied on that page may be useful. (It's the same webpage you were looking at.)

    Finally, move your camera, with its lens, progressively away from the targets until the targets go just off-screen. Measure the distance from the targets to your camera and call this d.

     

    Now for some trigonometry:

    FOV = 2 * atan( (w / 2) / d )

     

    Note that there are three types of FOV in use: horizontal, vertical, and diagonal. Horizontal would have you find when the targets go off the x-axis, vertical the y-axis, and diagonal runs from the upper-left pixel to the lower-right.
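
    In code the measurement boils down to one line. A minimal sketch, where the numbers are placeholders you'd replace with your own measured w and d:

        import math

        w = 1.0   # measured distance between the two tape strips (meters)
        d = 1.4   # camera-to-target distance where the strips just leave the frame

        # Half the strip separation subtends half the FOV at distance d.
        fov_rad = 2.0 * math.atan((w / 2.0) / d)
        print("FOV = %.1f degrees" % math.degrees(fov_rad))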

     

    For question 2: the reason you integrate the IMU data with the optical flow sensor is to get a good, clean signal of what your craft is doing. I think the goal of sensor fusion goes along the lines of why it's a good idea to have eyes, ears, and a gut when you're driving a super-car.

    You can make an equation that finds the craft's relative ground velocity and/or the rate at which it is tilting. You'd then use something like a complementary filter or a Kalman filter to fuse that with your IMU data, although I don't know the specifics of either.
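
    For what it's worth, a minimal complementary-filter sketch, assuming you've already converted the flow into a ground velocity and rotated the IMU acceleration into the horizontal frame; the names and the 0.98 weight are placeholders, not anything from a real autopilot:

        ALPHA = 0.98   # trust the smooth-but-drifting IMU in the short term

        vel_x = 0.0    # fused ground-velocity estimate (m/s)

        def fuse(accel_x, flow_vel_x, dt):
            """accel_x: horizontal IMU acceleration (m/s^2); flow_vel_x:
            velocity derived from optical flow (m/s); dt: seconds elapsed."""
            global vel_x
            # Integrate the IMU for the high-frequency part, then pull the
            # estimate toward the flow velocity to cancel the drift.
            vel_x = ALPHA * (vel_x + accel_x * dt) + (1.0 - ALPHA) * flow_vel_x
            return vel_x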

  • Curious to see what answers arrive, as I'm interested in this approach but at an approximately fixed height. I don't suppose there's any way of collecting data under controlled conditions that would allow you to reverse-engineer the FOV? I don't know enough about optics.

    It just seems like one could find an equation that closely approximates actual dx/dy versus reported dx/dy at a given height z. With enough data one could curve-fit the points and come up with a reasonable approximation over a range of reasonable heights?
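
    Something like the sketch below is what I have in mind, assuming you log pairs of (height, reported counts per cm of actual motion) under controlled conditions; the 1/z model is just my guess from the geometry:

        import numpy as np

        def fit_scale(heights_cm, counts_per_cm):
            """Fit reported dx counts per cm of true motion against height.
            For a fixed lens the scale should go roughly as 1/height, so
            fit a line in 1/z rather than in z itself."""
            coeffs = np.polyfit(1.0 / np.asarray(heights_cm),
                                np.asarray(counts_per_cm), deg=1)
            return lambda z: np.polyval(coeffs, 1.0 / z)

        # Usage: scale = fit_scale(logged_heights, logged_counts)
        #        true_dx_cm = reported_dx / scale(current_height_cm)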
