One reason I constantly check DIYdrones throughout the day is its tendency to feature well-written and informative blog posts on the front page, as opposed to some of the "hype machine news" found elsewhere, though we get our share as well. 

This blog post is really intended to be "just a blog post". I have multiple half-finished, in-progress blog posts (all relating to this topic), and I always want to add a little more substance and evidence first, in an effort to contribute to the quality of discussion I admire here. In this case, I feel it's important to urge a little movement in a specific direction and offer up some signature rambling. 

I did not get into UAS/drones for agricultural purposes, but stumbling across the topic is inevitable. In the past year or so, most of my focus has been on the agricultural side of commercial UAS use. I have zero relevant background in this regard, which is exactly what qualifies me to write THIS specific blog post. While information on requirements and uses for multispectral cameras (ranging from low-end to scientifically calibrated) is readily available in peer-reviewed articles with case studies and examples, this information is often difficult to comprehend without dedicating a significant amount of time to learning it. 

Holding true to our philosophy at the fire department, I've been attempting a "the best way to learn is to teach" approach by making my contribution to this community a guide on NDVI, based on referenced and verifiable studies that typically don't show up in a Google search. The conversations I've been having are frightening. 

The summary is "The NDVI capabilities of drones being marketed are, at best, a gross overestimation".

I really want that quote to sink in for a moment, because I have a different viewpoint on the implications. The bad side is that reputations like this have the potential to cause some involuntary career changes. The good side is that the problem can be solved by shedding some of our own ignorance, as the cause is really our lack of understanding. Thankfully, the above quote was followed by a conversation about how some of the converted cameras many of us are using DO have a use, but it is critical to understand and implement certain workflow steps, and there also needs to be an understanding of the biodiversity of the area of operations and how it pertains to the information you're trying to extract. That information needs to be understood to determine whether your sensor is capable of extracting it. 

With my limited understanding, many of these conversations with people experienced in remote sensing and agriculture (but without a stake in the UAS industry) scared the hell out of me. I've had ongoing interactions with people from almost every data processing company I can think of, as well as a few manufacturers. I'm really rooting for everyone to succeed and didn't want to start citing data that is detrimental to many efforts, especially with limited understanding myself and far less on the line than others. I still maintain that viewpoint, and it is the basis of my reasoning for writing this blog. The widespread belief that "farmers can save $____ with drone-collected NDVI" can hold true, and with a variety of camera options...but not without widespread understanding of, and accountability for, the steps that need to be taken for this data to be useful. Every drone-related agricultural service is going to contribute to the reputation of the capabilities of the others, because right now it is all seen as the same cup o' tea. 

My initial plan of compiling this data for the average Joe attempting to get into a drones-as-a-service role has expanded to include much more information than I initially intended. I foresee this being a long process of learning on my part and having the drafts reviewed by the people who have offered their assistance, but I'd like to see some progress from others, big and small, toward increasing their understanding of the service being offered. You can justify the effort on the grounds of ethics, capitalism or self-preservation, but it's an essential step either way. 




  • Moderator
    And this has been the discussion on one of the subgroups here, without much concession on either side.
    Let's pretend, for the sake of argument, that NDVI acquired with uncalibrated cameras, with all the associated issues, provides zero additional information over data acquired with an RGB camera. Even if that were true, if it creates data that is more easily defined and separated by automated programs for the creation of management zones, it's still a selling point.
    This is where the portion about credibility comes in: being honest about the limitations, accepting that there is a difference between sensors and between methods of acquiring the data, and making sure a product isn't being sold as something it's not. Six months ago, I had no idea there was any difference between them or why it mattered, and I read everything I could get my hands on...within the drone community. Stepping outside the drone world and fielding questions on capabilities and limitations is where you start seeing problems with the claims being made and realize they have the potential to damage the credibility of its many uses.
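The "automated programs for the creation of management zones" mentioned above can be as simple as binning per-pixel index values into a few discrete classes. A minimal sketch with NumPy, where the thresholds are purely illustrative and not agronomic recommendations:

```python
import numpy as np

def management_zones(ndvi: np.ndarray, thresholds=(0.2, 0.5)) -> np.ndarray:
    """Label each pixel 0..len(thresholds) by which NDVI band it falls into.

    Zone 0: ndvi < 0.2, zone 1: 0.2 <= ndvi < 0.5, zone 2: ndvi >= 0.5.
    """
    return np.digitize(ndvi, bins=np.asarray(thresholds))

# Three hypothetical pixels: bare soil, stressed crop, healthy crop.
sample = np.array([0.1, 0.3, 0.7])
print(management_zones(sample))  # zone labels: [0 1 2]
```

Whether such zones are meaningful depends entirely on whether the underlying values are comparable across the mosaic, which is exactly the calibration question being argued in this thread.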
  • T3

    Mario, that's the point! What is required is comparability of the R and NIR reflectance values. This is what allows for comparisons over space and time. I have tested various converted cameras with different filters and they are not comparable. The general trend is basically the same, but this can be seen in the RGB values as well.

  • Moderator

    James - I actually see a lot of uses for drones in agriculture, mainly in situations where much higher resolution is requested for invasive species identification. The fact that there is a lot of value is exactly why we need to avoid questionable claims, or make sure the work is performed to a (yet-undetermined) standard. 

    Daniel - "A gross problem will be obvious from both datasets, but more subtle issues appear only in the NDVI." With a caveat added about not overusing data meant for a specific purpose, I think that would be a good description of what we've been doing. I feel that it's "good enough" to perform manual comparisons side by side with RGB and ground truthing, but I don't think the value is there until the data is accurate enough to be fed into automated systems. 


    There are various needs for the data, and maybe the root of the problem is that our definition of NDVI is simply (NIR-R)/(NIR+R). If I'm selling NDVI maps from a converted camera with zero setup, FFC, is that going to be the same quality or accuracy as, let's say, Ben's NDVI maps from a radiometrically calibrated camera with the steps required to actually extract an exact reflectance value for each pixel? 
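The definition above, (NIR-R)/(NIR+R), is trivial to compute; a minimal NumPy sketch makes the point that the arithmetic is the easy part. The band arrays here are hypothetical reflectance values; everything upstream of this function (converting raw digital numbers to reflectance via calibration targets, flat-field correction, and so on) is where the maps in this thread differ.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI, (NIR - R) / (NIR + R); values range from -1 to 1."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Guard against division by zero where both bands read zero.
    return np.where(denom == 0, 0.0, (nir - red) / denom)

# Healthy vegetation reflects strongly in NIR while absorbing red,
# so the first pixel scores high and the last scores zero.
nir_band = np.array([[0.50, 0.40], [0.30, 0.20]])
red_band = np.array([[0.05, 0.10], [0.20, 0.20]])
print(ndvi(nir_band, red_band))
```

The formula itself is identical whether the inputs come from an uncalibrated converted camera or a radiometrically calibrated sensor; only the meaning of the input numbers changes.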



  • Moderator

    I'm not saying that RGB shows the same info as NDVI, I'm saying that in our case, the value of RGB is equal to the value of images like the one above.  

    Ben, I'm not the one claiming that RGB shows the same data as NDVI, I'm claiming that NDVI is more complicated than we are initially led to believe and is meant for a specific purpose. 

  • Hi All,

    I greatly agree with the underlying thesis of this post, but to me the real problem appears to be getting solid meaningful data from the various sensors RGB or NDVI (or any valid spectral data for that matter).

    In Astronomy the use of color varies from somewhat representational to completely false color and it varies greatly according to what the astrophotographer is trying to accomplish.

    For Ag use you need solid metrics, and with the assorted cameras and assorted analysis systems out there, I suspect you have anything but.

    Basically you end up with a lot of pretty pictures, which the so-called expert modifies with software and from which he then claims to extract X amount of valid data.

    Kind of like the Wizard of Oz, and roughly as meaningful, I would hazard.

    Basically "What we have here is a communication gap!"

    Assorted Cameras plus assorted Software plus assorted "Experts" = Well kind of a mess really.

    However, it also seems to me that the basic mechanisms are already there, all that has to happen is to develop them into a truly functional system with properly verifiable metrics and cross platform repeatable results.

    I am definitely not looking at this from the aspect of Ag expertise, but in these early days it just looks like a systems problem and a bunch of under-qualified snake oil salesmen.

    Need to get organized I think.



  • 3D Robotics

    @Mario I don't have a ton of time to spend digging up old image sets, but differentiating between the tall corn, medium corn, and short corn in the field referenced here was near impossible using RGB imagery. Despite the title of my blog post, you should think of NIR/NDVI imagery as a magnifying glass rather than an X-ray. A gross problem will be obvious in both datasets, but more subtle issues appear only in the NDVI. See the images of the barley field in that post for a great example.

  • This 'blog post' is a whole load of nothing. You are saying "People who don't do things right, don't do things right" - well, duh. That applies to anything at all; it's not particular to NDVI on drones.

    To me it seems that the people claiming inaccuracies in stitching, or making the equally ridiculous claim that RGB shows the same info, are in fact the ones who don't understand the process or the data.

  • In the UK I believe that black grass mapping is one of the few applications in agriculture that drones are useful for. I agree, there is a tonne of hype, which is why people need to diversify into other markets. NDVI is useful for the flood market, helping flood modelers update old maps of impermeable areas (see below).

    Working for a company who already uses this data is obviously an advantage. Find out more about our applications here: 

  • Moderator

    Hi Daniel,

    Thanks for the response and links. Our results have been similar to those you shared in the linked articles (especially considering that Agribotix processed the above image, and it was taken with an S100 as well), but a side-by-side comparison with RGB images shows that these regions are equally identifiable with VS. So our question was: why add this step if we're not obtaining any additional information? Additionally, I've seen enough inaccuracies in stitched NIR results, prior to the NDVI conversion, to make me hesitant to begin plugging this data into automated application systems. 

    I have found that RGB mosaics serve the exact same purpose without needing to explain an often-unfamiliar set of data, and examples like the blog above really need to include a side-by-side comparison. This is why one of my intended blogs has been stalled for a while, as a replacement NIR S100 arrived with defects. 

    If you're interested in the peer-reviewed articles I've been trying to sort through, I can send them to you. They're slowly getting added and referenced as I go. 

  • 3D Robotics

    You should check out these blog posts I wrote last year. There is no question the ag drone industry is overhyped, but there are some very real and very useful applications for drones in ag that are being used right now.
