T3

Hi all,

On the subject of mapping from the air, it is important to know how many pixels fall per cm on the ground.

Following

http://en.wikipedia.org/wiki/Angle_of_view


we get the following angles of view as a function of equivalent focal length, and from them we can compute the horizontal-to-vertical (H/V) ratio:

 

Focal length (mm)   24        28        35        43.3      50
Vertical (°)        53.1      46.4      37.8      31        27
Horizontal (°)      73.7      65.5      54.4      45.1      39.6
H/V ratio           1.3879    1.4116    1.4392    1.4548    1.4667
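
For reference, here is a minimal Python sketch that reproduces the table, assuming the 36 x 24 mm full-frame dimensions the Wikipedia table uses and a simple pinhole model, AOV = 2*atan(d / 2f):

    import math

    def angle_of_view(dimension_mm, focal_mm):
        # Pinhole-model angle of view for one sensor dimension
        return math.degrees(2 * math.atan(dimension_mm / (2 * focal_mm)))

    for f in (24, 28, 35, 43.3, 50):
        v = angle_of_view(24.0, f)   # full-frame vertical dimension: 24 mm
        h = angle_of_view(36.0, f)   # full-frame horizontal dimension: 36 mm
        print(f"f={f:>4} mm  V={v:4.1f} deg  H={h:4.1f} deg  H/V={h / v:.4f}")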

 

On the other hand you have sensor resolution:

The Canon S90 is 3648 x 2736 = 9980928 pixels, roughly 10 Mpix.

That is an aspect ratio of 3648/2736 = 1.333...

Its lens is said to be equivalent to a focal length of 28...105 mm.
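
To put numbers on the pixels-per-cm question, here is a rough sketch for a nadir (straight-down) camera. The S90 sensor width (~7.4 mm) and true focal range (6.0-22.5 mm; the 28-105 mm figure is the full-frame equivalent) are assumed values:

    def pixels_per_cm(alt_m, focal_mm, sensor_w_mm, image_w_px):
        # Similar triangles: ground strip width = altitude * sensor / focal
        ground_width_cm = alt_m * sensor_w_mm / focal_mm * 100.0
        return image_w_px / ground_width_cm

    # S90 at 100 m altitude, widest zoom (assumed values)
    print(pixels_per_cm(alt_m=100, focal_mm=6.0, sensor_w_mm=7.4,
                        image_w_px=3648))   # about 0.3 px per cm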

 

Meanwhile, all the H/V angle ratios calculated above are well above 1.33.

 

Conclusion: even standard lenses generate a wider view (in angle) than the sensor's aspect ratio,

resulting in less horizontal than vertical resolution.

True or false?

 


Comments

  • There are a few online field-of-view calculators available; here is a link to one: http://www.tawbaware.com/maxlyons/calc.htm
  • If a rough estimate (using chip size and focal length) isn't good enough, I would go for a simple geometric calibration. I mean really simple... Print two crossed rulers on a page, set it up at a known distance centered in front of the camera. Then a little bit of math (sketched below)...
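    A minimal sketch of that calibration math, assuming the ruler exactly spans the frame width at the measured distance (the numbers are hypothetical):

        import math

        def fov_deg(span_cm, distance_cm):
            # Full angle of view from a ruler that just fills the frame
            return math.degrees(2 * math.atan((span_cm / 2) / distance_cm))

        # 60 cm of ruler fills the frame at 50 cm distance -> ~61.9 deg
        print(f"{fov_deg(60, 50):.1f} deg")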
  • I am only asking because of mission planning.
  • You are now entering the world of photogrammetry!!

    All of the problems you mention (and much more) have already been solved by these people (there are whole faculties for this topic...).

    A search for "digital photogrammetry" should give you plenty of hits and good reading for some days :-)

    Very demanding math sometimes, but it gives you a feeling for the processes needed to, e.g., produce orthorectified pictures and map projections.

  • It'd be great if we could use this thread to lay down some useful math:

    Basic projection math may be simple (a given ray hits the ground at tan(x), or similar).
    But what is the math for georectification in the language of a graphics processor - the inverse problem that, for a given surface point, solves for the color (or image x,y) of the pixel at that point? (See the sketch below.)
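    A sketch of that inverse mapping under a plain pinhole model (all names and numbers here are hypothetical): for each ground point, transform into camera coordinates and apply the perspective divide. A GPU fragment shader would run the same arithmetic once per output pixel:

        import numpy as np

        def ground_to_pixel(pt_world, cam_pos, R_cam, f_px, cx, cy):
            # Which image pixel sees this ground point? (pinhole model)
            p = R_cam @ (pt_world - cam_pos)   # point in camera coordinates
            if p[2] <= 0:                      # behind the camera
                return None
            u = cx + f_px * p[0] / p[2]        # perspective divide
            v = cy + f_px * p[1] / p[2]
            return u, v

        # Nadir camera 100 m up; camera z axis points straight down
        R = np.array([[1.0, 0.0, 0.0], [0.0, -1.0, 0.0], [0.0, 0.0, -1.0]])
        print(ground_to_pixel(np.array([10.0, 5.0, 0.0]),
                              np.array([0.0, 0.0, 100.0]),
                              R, f_px=2000.0, cx=1824.0, cy=1368.0))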
  • The table from the wikipedia article is for a sensor with a 3:2 (i.e. 1.5) aspect ratio, which is standard for DSLR-type cameras and is the same as 35mm film cameras. Many digicams (apparently including the S90) work with 4:3 (i.e. 1.33) aspect ratios.

    The reason the numbers in the table don't work out to exactly 1.5 is that the distance on the image plane (i.e the sensor) is proportional to the tangent of the angle, not to the angle itself. This becomes significant for wider angles (i.e. shorter focal lengths) - if you looked at values for f past 50mm you would find that they converged on 1.5.

    Fisheye lenses are different. They do have a linear relationship between distance on the image plane and angle, but this ends up transforming straight lines into curves when you look at a picture taken with such a lens.

    On the other hand, assuming that the camera is aimed straight down, the relationship between distance on the image plane and distance on the ground will be linear so that, for a given altitude, one pixel will always represent the same number of cm on the ground both horizontally and vertically. If the camera is aimed forward rather than down then the pixels will be stretched along the ground in the direction the camera is aiming, with the effect being very significant for shallow angles towards the horizon.
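    A quick numeric check of that last point (the 100 m altitude and the per-row step of 0.001 * f are arbitrary assumptions): at nadir every pixel row covers the same ground distance, while tilting the camera stretches rows toward the horizon:

        import math

        def ground_offset_m(alt_m, tilt_deg, y_over_f):
            # Ground distance from nadir for an image row at offset y/f
            ang = math.radians(tilt_deg) + math.atan(y_over_f)
            return alt_m * math.tan(ang)

        alt, dy = 100.0, 0.001   # 100 m altitude; one row = 0.001 * f (assumed)
        for tilt in (0, 30, 60):
            d = ground_offset_m(alt, tilt, dy) - ground_offset_m(alt, tilt, 0.0)
            print(f"tilt {tilt:>2} deg: one pixel row covers about {d:.3f} m")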
  • Maybe because people like wider pictures more than taller ones?