The right tool for the job

There have been recent posts on the “wall” about scientific and “toy” cameras for mapping. The focus is on NDVI, which is simply an index that provides information about the difference between reflected red and near-infrared radiation from a target. It's called an index because it is unitless, and it is normalized so values always fall between -1 and +1. It tends to be a good indicator of plant vigor and has been correlated with different aspects of plant productivity.
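For reference, the formula is simply the normalized difference of the two bands (using reflectance or radiance in each band):

$$\mathrm{NDVI} = \frac{\rho_{\mathrm{NIR}} - \rho_{\mathrm{Red}}}{\rho_{\mathrm{NIR}} + \rho_{\mathrm{Red}}}$$

Because the denominator is the sum of two non-negative quantities and the numerator is their difference, the result always falls between -1 and +1.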

In any digital camera that I'm familiar with, a pixel starts its life as a voltage. The next step is where the scientific and point-and-shoot cameras diverge. In a scientific camera the voltage is simply calibrated to output radiance; in a point-and-shoot camera it follows a more complex processing path to output something pleasing to the human eye. Scientific cameras try to measure physical variables as accurately as possible and point-and-shoot cameras try to make a good-looking photograph – science vs. art. Point-and-shoot cameras are more complex than scientific imagers, but they use lower-quality, mass-produced parts to keep costs down, whereas scientific cameras use precision components produced in low volumes. That's a brief summary, but the bottom line is that the two cameras are designed for different uses. To imagine that a camera designed for making pretty pictures can be used for scientific studies seems a bit ludicrous – or does it? It depends on what you want to do.

There is a good bit of work going on to try to convert point-and-shoot cameras from art tools into scientific tools. This is an area that fascinates me. I realize there are serious limitations when working with low-quality sensors and imaging systems, but some (perhaps many) of those radiometric and geometric imperfections can be modeled and adjusted using calibration techniques and software. For example, there are a few articles in the peer-reviewed literature about people calibrating commercial digital cameras (usually DSLRs) to record radiance, and the results are pretty encouraging. I have been developing my own workflow to calibrate point-and-shoot cameras, although I'm using simple DIY approaches since I no longer have access to the precision lab equipment that would allow me to characterize my cameras more accurately. If anyone is interested, I post my calibration experiments on the Public Lab web site (http://publiclab.org/). I'm always looking for feedback to advance this work, so comments are welcome. My intent is to convert simple cameras into the best scientific tools possible.
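To make this concrete, here is a minimal sketch of the simplest form of such a calibration: a linear fit from camera digital numbers to radiance using a few targets of known radiance. All numbers below are hypothetical, and a linear fit like this is only appropriate for raw (or previously linearized) pixel values.

```python
# Minimal empirical DN-to-radiance calibration sketch (hypothetical values).
import numpy as np

# Mean digital number extracted from each calibration target in an image,
# paired with the radiance measured independently for that target
# (e.g., with a spectrometer). These numbers are made up for illustration.
dn = np.array([31.0, 88.0, 152.0, 214.0])     # camera digital numbers
radiance = np.array([5.2, 16.8, 30.1, 43.5])  # W m^-2 sr^-1 um^-1

gain, offset = np.polyfit(dn, radiance, 1)    # least-squares linear fit

def dn_to_radiance(pixels):
    """Apply the fitted linear model to an array of raw pixel values."""
    return gain * np.asarray(pixels, dtype=float) + offset

print(dn_to_radiance([100, 200]))
```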

When deciding which instrument to use you need to consider the goals of the project and the available financial resources. On the financial side you need to consider purchase cost, maintenance, and the replacement cost if the camera is damaged. There is no comparison from a cost perspective. On the bargain side of scientific imagers you should expect to pay a few thousand dollars, and if you want a large-format mapping camera it's in the ballpark of $1 million. Precision scientific-grade cameras are very expensive, require careful maintenance and recalibration (which can also be costly), and if you fly one in a UAV that crashes you stand to lose a lot. You can get a used digital camera and convert it to an NDVI-capable imager for well under $100, or purchase one designed for mapping, like the Mapir, for about $300.

What about accuracy, precision and stability? Clearly instruments designed with these qualities in mind will be better than something made to take pretty pictures. A more appropriate question is: what is good enough for our purposes? I'll focus on NDVI mapping; different applications (e.g., creating 3D point clouds, ortho-mapping, land cover classification) will have other qualities to consider.

One important factor is radiometric accuracy. Although I'm trying to improve what we can get from point-and-shoot cameras, I realize I will never attain the accuracy or precision possible with scientific imagers. How important are radiometric qualities for NDVI mapping? In most of the applications I see on this and similar forums, people are mostly interested in relative changes in NDVI throughout an image, not absolute NDVI values. Some folks want to monitor NDVI over time, and in that case it's important to be able to standardize or normalize NDVI, but that is possible with calibration workflows. For these applications a well-designed and calibrated point-and-shoot camera can perform well enough to provide the information required, such as spotting problem areas in an agricultural field.

One point that is often overlooked is that close-range imaging and NDVI typically do not go well together. The problem is that we are imaging scenes with leaves, stems and soil, and at the fine resolution provided by most point-and-shoot cameras we are trying to get NDVI values from very small areas on the ground. For example, we can see different parts of a leaf, and each part is angled somewhat differently, which will affect the NDVI value. Our scenes tend to be very complex, and you can have the most accurate and precise instrument available and still be disappointed because of the physical issues (bidirectional reflectance, mixed pixels, small-area shadows...) that create noise in the images. It is certainly nice to reduce as many sources of noise as possible, but I'm not convinced (at least not yet) that the improved radiometric performance of a scientific camera is significant enough, relative to all the noise coming from the scene, to justify its use.

As for the Mapir camera, I received one last week and am trying to set aside time to calibrate it and see how well it performs. My initial reaction is that it is a nice compact camera well suited to small-UAV mapping. I would prefer a red dual-pass filter, but I expect that and other enhancements will become available in future versions. I like the fact that someone is focused on developing practical low-cost mapping cameras.

I welcome any comparisons between cameras and hope we can work together to improve the output we get from simple inexpensive cameras.  


Replies

    • Mario: yeah sure... I only flew it twice in manual mode so far, and I'm still trying to improve the catapult launch, which is not, let's say, "repeatable" yet :)  Hopefully I'll have some flight data to share in the next few weeks.

      Martin: I was thinking the limiting factor would be shutter, but maybe it is indeed the 3-second interval. Doing an analysis based on the specs on the MAPIR website:

      • At 120m AGL, they say the recommended speed is 9m/s
      • GSD = 6.83cm/pixel at 120m
      • Resolution is about 4k x 3k pixels
      • One picture every 3 seconds
      • Recommended front overlap of 85%

      Hence we can conclude that, @120m AGL:

      • Image size over ground is about 273 x 205m
      • 85% overlap in the direction of flight means the camera moves either 41m or 30m between each photo, depending on the camera orientation inside the aircraft
      • Long side of the image in the direction of flight: 41m in 3 seconds = 13.7m/s
      • Short side of the image in the direction of flight: 30m in 3 seconds = 10m/s

      The last number above is really close to the recommended 9m/s, so maybe the reason is indeed the 3-second interval. That might also mean a few things:

      • It might be possible to use it at ~14m/s by changing the camera orientation inside the aircraft
      • One could use multiple MAPIRs onboard to overcome the 3-second limitation - they are light so it might be doable
      • Reducing overlap to 80% or 75% could have a big effect on maximum flight speed

      So it seems there is some hope that I might be able to use it in the X5, after all :)
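      For anyone who wants to rerun this with different altitudes, overlaps or intervals, here is the same arithmetic as a short Python sketch (numbers taken from the MAPIR specs quoted above):

```python
# Overlap/speed arithmetic from the MAPIR specs quoted above.
gsd = 0.0683            # m/pixel at 120 m AGL
long_px, short_px = 4000, 3000
interval = 3.0          # seconds between photos
overlap = 0.85          # front overlap fraction

for label, px in [("long side forward", long_px), ("short side forward", short_px)]:
    footprint = px * gsd                    # image size over ground, m
    advance = footprint * (1.0 - overlap)   # distance flown between photos, m
    print(f"{label}: footprint {footprint:.0f} m, advance {advance:.1f} m, "
          f"max speed {advance / interval:.1f} m/s")
```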

    • I'm wondering why there's a speed limit. Is it because of the 3-second interval or because of the rolling shutter?

  • Moderator

    I just wanted to post that I'm trying to put a lot of the information in this thread into a sort of study guide for myself (slowly). If I can keep it organized, I'll submit it to the wiki and cite the authors, if allowed. 

    There is a lot of good info in here and I didn't want you lot to think it was sinking.

  • Keep it coming guys, I'm trying to soak up as much as I can.
  • Moderator

    This is a quote from some back-and-forth emails with someone I would consider qualified to speak on sensor requirements for agriculture; I also just posted it in the MicaSense thread. 

    I'm going to ask if he minds if I quote him and will edit in his details if it's alright.

    EDIT: Sounds like he's going to join the conversation in the group.

    See John Sulik's response above.

    • Thanks for posting. This is great - quite informative. I'm curious what the logic is behind "NIR band to the red and green color channels and the visible band to the blue channel" - i.e., a blue filter. Is it because of the camera sensor's NIR sensitivity in the red channel? I'll also echo the comment about Chris Fastie's white balance methods. He does that primarily to be able to make decent NDVI (by reducing the visible channel response relative to the NIR channel) and nice-looking false color composites with little or no post-processing.

      I'd also like to hear thoughts on acquiring raw vs. JPEG. I prefer raw since the pixel values are linear with regard to radiance, but there are practical limitations to shooting in raw, so I am working on methods to process JPEGs so they have a linear response – with reasonable but mixed success.
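      For what it's worth, the crudest version of that JPEG linearization is inverting the standard sRGB transfer curve. Real cameras apply their own proprietary tone curves on top of (or instead of) sRGB, so treat this as a rough first approximation only:

```python
# Approximate JPEG linearization by inverting the sRGB transfer function.
# Assumes the camera's tone curve is close to sRGB, which is only roughly true.
import numpy as np

def srgb_to_linear(dn8):
    """Invert the sRGB transfer function for 8-bit values (0-255)."""
    v = np.asarray(dn8, dtype=float) / 255.0
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

print(srgb_to_linear([0, 64, 128, 255]))
```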

    • Moderator

      Hey Ned, John Sulik said that he is willing to join the discussion. I sent him a link, so hopefully we'll see a response today. I'm working on setting up his recommendation for a ground target for flat field correction. 

    • It does not make sense to include the green channel for false color composites. Therefore, just assign NIR to red, visible (red or blue) to green, and the same visible band to blue (see the sketch after this reply). The result looks more like traditional false color infrared imagery (see pic below – blue is bare soil, red is wheat near jointing).

      [image: false color infrared composite of a wheat field]

      With respect to JPEG versus RAW, see: Lebourgeois, V., et al., "Can commercial digital cameras be used as multispectral sensors? A crop monitoring test."
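      A minimal numpy sketch of the channel swap John describes, assuming a single modified camera where one channel records NIR and another records a visible band (the channel indices below are hypothetical and depend on the filter used):

```python
# Build a false color infrared composite from a single modified camera image.
import numpy as np

def false_color_composite(img, nir_ch=0, vis_ch=2):
    """img: H x W x 3 array. Assign NIR to red and the visible band to both
    green and blue (the camera's green channel is not used)."""
    nir, vis = img[..., nir_ch], img[..., vis_ch]
    return np.dstack([nir, vis, vis])
```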

    • Thanks John. For my question about band assignment I was actually wondering if you were recommending a blue filter ("visible band to the blue channel") instead of red, but after rereading my question I realize it was very poorly worded. Anyway, I would be interested in your thoughts about why so many people are using and encouraging the use of blue filters for single-camera setups. It doesn't make much sense to me, but it's likely I'm missing something. Another thing I don't understand is why one would select DVI (NIR - Red) over NDVI. To put these questions into context, I come from a more traditional remote sensing research and applications (mostly ecological) background. My experience with agriculture is mostly saying "yep - that's agriculture" or trying to understand fallow dynamics. 

    • Theoretically, a blue band will not be as sensitive to variability in chlorophyll absorption as a red band, because carotenoids and chlorophyll both absorb blue light whereas mostly just chlorophyll absorbs red light. Therefore, when chlorophyll absorption decreases, a blue band will not be as sensitive because carotenoids keep absorbing light. In the right side of the graphic below, I used ProSAIL to simulate the spectral reflectance of an erectophile canopy for two different levels of chlorophyll while holding leaf area constant. 

      [image: ProSAIL-simulated canopy spectral reflectance for two chlorophyll levels]
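      For anyone who wants to reproduce this kind of simulation, here is a rough sketch assuming the open-source `prosail` Python port of the model (https://github.com/jgomezdans/prosail). The parameter values are illustrative, not the ones used for the figure, and the keyword names may differ between package versions:

```python
# Compare blue vs. red band sensitivity to canopy chlorophyll with ProSAIL.
import numpy as np
import prosail  # pip install prosail; returns reflectance for 400-2500 nm at 1 nm

def canopy_reflectance(cab):
    # PROSPECT leaf traits + SAIL canopy geometry; lidfa=70 deg with the
    # ellipsoidal distribution (typelidf=2) approximates an erectophile canopy.
    return prosail.run_prosail(
        n=1.5, cab=cab, car=8.0, cbrown=0.0, cw=0.01, cm=0.009,
        lai=3.0, lidfa=70.0, hspot=0.01,
        tts=30.0, tto=0.0, psi=0.0,
        typelidf=2, rsoil=1.0, psoil=1.0,
    )

low, high = canopy_reflectance(20.0), canopy_reflectance(60.0)  # ug/cm^2

for name, nm in [("blue", 450), ("red", 670)]:
    i = nm - 400  # index into the 400-2500 nm output grid
    print(f"{name} ({nm} nm): cab=20 -> {low[i]:.3f}, cab=60 -> {high[i]:.3f}, "
          f"delta = {low[i] - high[i]:.3f}")
# Expected from the argument above: the red band changes much more between
# chlorophyll levels than the blue band does.
```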

      For the other half of the NDVI equation, if you look at the blue-filter-derived NIR image (posted above in earlier reply) it does not approximate the Tetracam NIR band nearly as well as the red-filter-derived NIR image. I mounted 2 Canon A2200s and a Tetracam on a Cinestar-8 and acquired these images nearly simultaneously. In addition, I measured the leaf area index for several 1/2 sq m areas in those images with a Decagon LP-80. I flat-field corrected all the images with a Teflon panel that is visible in each image. If you look at the results below, you will see that the range of NDVI values for the Canon with the blue filter is not as great as the Canon with the red filter or the Tetracam. This tells me that the blue filter is not as sensitive to variability in the biophysical attributes of the wheat canopies. Therefore, I recommend using the red filter instead.

      [image: NDVI comparison – blue-filter Canon vs. red-filter Canon vs. Tetracam]
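      For reference, a minimal sketch of a panel-based correction like the one described above: each band is converted to approximate reflectance by dividing by the mean pixel value over the in-scene Teflon panel (the panel reflectance value below is hypothetical):

```python
# Convert a linear-response band to approximate reflectance using an
# in-scene reference panel (hypothetical panel reflectance value).
import numpy as np

def panel_correct(band, panel_mask, panel_reflectance=0.95):
    """band: 2-D array of pixel values (must be linear with radiance).
    panel_mask: boolean array marking the panel pixels in the same image."""
    panel_dn = band[panel_mask].mean()
    return band.astype(float) / panel_dn * panel_reflectance
```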

      Why do so many people encourage the blue filter? One very weak rationale is given here, in which the author argues that the differences between a blue and a red filter are inconsequential because the difference between a red and a blue channel is less than the difference between NIR and any visible channel; however, that does not entail that there are no meaningful differences between visible bands. In fact, there are meaningful differences, which I demonstrated above (similar graphs can be found in the literature) and you (Ned) drew attention to in PublicLab posts. In addition, advocacy for a blue filter may simply result from path dependence; didn't the PublicLab folks originally encourage use of a blue filter? 

      If you want to know why a difference index is often employed for low-altitude imagery then read the entire page that I linked above. The rest of the article is pretty good. I will summarize it briefly in case you don't want to read it all: shadows are more frequently encountered in very high spatial resolution imagery. The summation in the denominator of a normalized difference index can be quite small in shade, and this can cause anomalously large index values, or simply bizarre behavior. 
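      A toy numerical illustration of that shadow problem, with hypothetical shaded-pixel values: both bands are tiny in deep shade, so a small per-band error swings NDVI wildly while the plain difference (DVI) stays near zero.

```python
# NDVI vs. DVI behavior in deep shade (hypothetical reflectances and errors).
nir, red = 0.02, 0.012            # tiny shaded-pixel band values
for noise in (-0.01, 0.0, 0.01):  # plausible per-band error
    n = nir + noise
    ndvi = (n - red) / (n + red)  # small denominator amplifies the error
    dvi = n - red                 # stays near zero regardless
    print(f"error {noise:+.2f}: NDVI = {ndvi:+.2f}, DVI = {dvi:+.3f}")
```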

       
