

One reason I constantly check DIYdrones throughout the day is the tendency for well-written and informative blog posts to land on the front page, as opposed to some of the "hype machine news" found elsewhere, though we get our share as well.

This blog post is really intended to be "just a blog post". I have multiple half-finished, in-progress blog posts (all relating to this topic), because I always want to add a little more substance and evidence before posting, in an effort to contribute to the quality of discussion I admire here. In this case, I feel it's important to urge a little movement in a specific direction and offer up some signature rambling.

I did not get into UAS/drones for agricultural purposes, but stumbling across the topic is inevitable. In the past year or so, most of my focus has been on the agricultural side of commercial UAS use. I have zero relevant background in this regard, which is exactly what qualifies me to write THIS specific blog post. While information on the requirements and uses of multispectral cameras (ranging from low-end to scientifically calibrated) is readily available in peer-reviewed articles with case studies and examples, that information is often difficult to comprehend without dedicating a significant amount of time to learning it.

Holding true to our philosophy at the fire department, I've been attempting a "the best way to learn is to teach" approach by making my contribution to this community a guide on NDVI, based on referenced and verifiable studies that typically don't show up in a Google search. The conversations I've been having are frightening.

The summary is "The NDVI capabilities of the drones being marketed are, at best, a gross overestimation".

I really want that quote to sink in for a moment, because I have a different viewpoint on the implications. The bad side is that reputations like this have the potential to cause some involuntary career changes. The good side is that the problem can be solved by shedding some of our own ignorance, because the cause is really our lack of understanding. Thankfully, the above quote was followed up by a conversation about how some of the converted cameras many of us are using DO have a use, but it is critical to understand and implement certain workflow steps. There also needs to be an understanding of the biodiversity of the area of operations, how it pertains to the information you're trying to extract, and whether your sensor is actually capable of extracting it.
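For anyone who hasn't worked with the index itself, the math is the trivial part; everything in those studies is about what goes into it and how it's interpreted. Here is a minimal sketch, assuming you already have co-registered red and NIR bands as arrays. The example numbers are made up, and uncalibrated digital numbers from a converted camera only give you relative values, not reflectance.

```python
# Minimal NDVI sketch: assumes co-registered red and NIR bands already
# exported from your stitching software as NumPy arrays. Values fed in here
# are raw digital numbers unless you have calibrated to reflectance.
import numpy as np

def ndvi(red, nir):
    """NDVI = (NIR - Red) / (NIR + Red), kept in the valid [-1, 1] range."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)  # avoid divide-by-zero
    return np.clip(out, -1.0, 1.0)

# Made-up digital numbers, purely to show the mechanics:
red_band = np.array([[60.0, 80.0], [120.0, 40.0]])
nir_band = np.array([[200.0, 150.0], [130.0, 180.0]])
print(ndvi(red_band, nir_band))
```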

Because my own understanding is limited, many of these conversations with people experienced in remote sensing and agriculture (but without a stake in the UAS industry) scared the hell out of me. I've had ongoing interactions with people from almost every data processing company I can think of, as well as a few manufacturers. I'm really rooting for everyone to succeed and didn't want to start citing data that is detrimental to many efforts, especially with limited understanding myself and far less on the line than others. I still maintain that viewpoint, and this is the basis of my reasoning for writing this blog. The widespread belief that "farmers can save $____ with drone-collected NDVI" can hold true, and it can hold true with a variety of camera options...but not without a widespread understanding of, and accountability for, the steps that need to be taken for this data to be useful. Every drone-related agricultural service is going to contribute to the reputation of everyone else's capabilities, because right now it is all seen as the same cup o' tea.

My initial plan for compiling this data for the average Joe attempting to get into a drones-as-a-service role has expanded to include much more information than I initially intended. I foresee this being a long process of learning myself and having the drafts reviewed by the people that have offered up their assistance, but I'd like to see some progress from others, big and small, to increase their understanding of the service being offered. You can justify the effort on the grounds of ethics, capitalism or self-preservation, but it's an essential step either way. 

 


Comments

  • Moderator
Thanks Alexander. I don't suppose you've done a comparison on the unaltered pixel value for a point in multiple images, and compared it with the blended result for the different options?
  • Hey Mario - In my experiments processing raw data in Agisoft, the stitch/blend option that you choose (min/max intensity, average, etc.) makes a drastic difference in the way your mosaic turns out. Some may come out with more or less contrast, etc. That could impact your pixel intensity values.
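    To put a number on how much the blend option moves things, one rough check is to sample the same ground coordinate in each individual ortho and in the blended mosaic, then look at the spread. The file names and coordinate below are made up, and this assumes all the rasters share one projected CRS:

```python
# Hedged sketch: compare the unaltered per-image values at one point with the
# value the blended mosaic ends up with. Assumes rasterio is installed and
# that all rasters share the same projected CRS; paths are hypothetical.
import rasterio

point = (442310.0, 5023455.0)  # x, y in the shared CRS (example values only)
source_images = ["ortho_001.tif", "ortho_002.tif", "ortho_003.tif"]

raw_values = []
for path in source_images:
    with rasterio.open(path) as src:
        raw_values.append(float(next(src.sample([point]))[0]))  # band 1 at the point

with rasterio.open("mosaic_average_blend.tif") as mosaic:
    blended = float(next(mosaic.sample([point]))[0])

print("per-image values:", raw_values)
print("blended mosaic value:", blended)
print("spread across sources:", max(raw_values) - min(raw_values))
```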

    Yeah James, it's true. I think, in most cases but not all, multispectral is not good enough for early disease detection for site-specific fungicide/herbicide application at the moment. But it still has some promise in other areas such as biomass and yield estimation. When lightweight hyperspectral and thermal sensors get cheaper over the coming years, things will really get interesting.

  • I agree, there are plenty of possible applications with drones...however, actually selling the service/platform etc. for many of these applications is the hard part. You have to prove a business case to the farmer/agronomist, and this is something we are currently doing at Remote Aerial Surveys. Many agronomists in the UK are coping well by surveying their crops on foot, so you need to be able to show them some numbers before they can justify spending £10,000s on surveys or equipment. The main application where there is a business case in the UK is mapping black-grass, but even then, it's a hard sell without numbers. My algorithm is able to automatically highlight areas of black-grass, but it requires some statistical analysis before it will work with a different camera. If anyone is interested in finding out more, feel free to PM me :)

    Farmers are generally like sheep. Once one has taken the plunge, others will follow. So yes, I think there is a huge potential market, but it will take several years before people start clawing back their investment. By this time, the big boys will have come in, bought many of us out, and reaped the rewards :) PrecisionHawk are a good example.

  • Moderator

    Thanks Alexander. I've had mixed experience with different processing companies as well, but that is mainly based on the appearance of the final product. I am curious how stitching / blending affects the final product, and I would also like to have the ability to create different indices. I'm still trying to learn how to adapt the imagery to our specific environment. 

    Here are a few links to some of the articles I'm trying to sort through, in case anyone is interested. This is what I'm (slowly) pulling information from.

    http://www.ipni.net/publication/bettercrops.nsf/0/FFCA6C5FBE6875808...$FILE/BC%202014-3%20p7.pdf

    http://agri-sensing.technion.ac.il/Lectures%20PDF/FC/Long%20Wednesd...

    http://www.researchgate.net/publication/43275246_Combined_Spectral_...

    http://www.mdpi.com/1424-8220/8/11/7300/htm

    Fine Tuning Remote Sensing Technologies for Nitrogen Application in Semi-Arid Cereal Crops
  • Each camera, especially when hitting the NIR bands, typically has a unique bandwidth. So when comparing between cameras, even with their NDVIs, you will get slightly differing visual results, because you are analyzing NDVI computed from different bandwidths. Like, is your "NIR" channel just hitting the red edge region? Or is it hitting a little higher? What constitutes the true NIR region? Even scientific studies have conflicting ideas about the NIR spectral region and where it stops and starts.

    With my data from this summer, I found that a lot of the time we could visually distinguish between diseased and healthy barley without the need for red edge and NIR bands. I could use intensity-, hue- and saturation-based classifications to separate crops close to harvest. That's not to say those bands are useless, though - I was also able to detect problem areas in the field, across 3 different levels of disease, a lot better with RGB-NIR indices than with RGB alone.

    BUT, I think that this article does have a lot of merit. How do we properly interpret NDVI? Intensive groundtruthing is often needed. A low NDVI value will be useless to look at in an ortho unless you know what's on the ground. Soil can show up as a low NDVI value just as much as severe disease. Something else to think about is how these cloud processing companies are delivering the data. I find one company doesn't give optimal solutions with their final product - and I end up having to manually process it myself to extract better info. My two cents haha
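    To make the soil-vs-disease ambiguity concrete, here is a rough sketch in the spirit of the hue/saturation classifications mentioned above. The file names and every threshold are invented for illustration; the real split only comes from groundtruthing.

```python
# Rough illustration that a low NDVI pixel is ambiguous: it may be stressed
# crop or bare soil. Flag low-NDVI pixels, then use a crude hue test on a
# co-registered RGB ortho to split "probably soil" from "probably stressed".
# All thresholds and file names are arbitrary examples, not recommendations.
import numpy as np
import cv2  # OpenCV, assumed installed

ndvi = np.load("ndvi.npy")          # hypothetical NDVI array, same shape as the RGB ortho
rgb = cv2.imread("ortho_rgb.png")   # hypothetical co-registered RGB ortho (BGR order)

hsv = cv2.cvtColor(rgb, cv2.COLOR_BGR2HSV)
hue = hsv[:, :, 0]                  # OpenCV hue range is 0-179

low_ndvi = ndvi < 0.3               # "problem" pixels; threshold is an example only
soil_like = low_ndvi & (hue < 30)   # brownish/reddish hues -> likely bare soil
stressed = low_ndvi & ~soil_like    # the remainder is worth a field check

print("low-NDVI pixels:", int(low_ndvi.sum()))
print("of those, soil-like:", int(soil_like.sum()))
print("flag for groundtruthing:", int(stressed.sum()))
```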

     

  • Moderator
    Ben,
    How exactly do YOU "normalize to the same set of values"?
  • T3

    Hi Ben, not sure what comment you are referring to...

    The problem with converted cameras and filters is that the relationship between the (contaminated) R and IR bands and the real R and IR reflectance values is highly non-linear in some cases. If you have a pure R and a pure IR channel from a two-camera setup, then for sure you are right.

  • This seems a bit like saying photographs are useless for comparison unless every photographer uses the same lighting conditions with the same calibrated exposure settings.  It simply isn't the case.  They just need to be normalised to the same scale of values using known common reference points.
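    One common way to read "normalised to the same scale of values using known common reference points" is the empirical line method: fit a line from raw digital numbers to reflectance over targets of known reflectance, then apply it to the whole band. The sketch below uses invented panel values, and the non-linearity caveat raised elsewhere in this thread still applies to converted cameras.

```python
# Empirical line sketch: map raw digital numbers to reflectance using targets
# of known reflectance (e.g. calibration panels), then apply the fit to the
# whole band. The panel values below are invented for illustration.
import numpy as np

dn_at_targets = np.array([35.0, 110.0, 220.0])      # raw DNs measured over the targets
known_reflectance = np.array([0.05, 0.30, 0.70])    # their known reflectance values

gain, offset = np.polyfit(dn_at_targets, known_reflectance, 1)  # least-squares line

def to_reflectance(band_dn):
    """Apply the fitted DN -> reflectance line to a whole band."""
    return gain * band_dn.astype(np.float64) + offset

sample_band = np.array([[40.0, 180.0], [90.0, 250.0]])
print(np.round(to_reflectance(sample_band), 3))
```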

  • T3

    Btw: we should listen to the farmers very carefully before we develop products and services. They have a pretty good idea of what they need. And this is complex, because vegetation health is only one small part of the story. There is climate, soil and terrain as well. They also need integrated workflows and something that is easy to operate and robust. Otherwise they won't buy it, because they do not have time for R&D in most cases.

  • T3

    I guess there are (at least) two things that need to be separated.

    1. Uncalibrated cameras with different filters are not comparable.
    2. The VI values of an uncalibrated camera can be used to determine changes in vegetation over time.

    Being able to determine changes in vegetation over time using some index can be a selling point. But credibility can easily be damaged if the VI values of an uncalibrated camera are sold without communicating the restrictions.
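    For point 2, a hedged sketch of what tracking relative change can look like: normalise the second date onto the first using pixels assumed not to change (a gravel track, a roof), then difference the index maps. The arrays and the invariant mask here are hypothetical; this illustrates the idea, not an operational workflow.

```python
# Sketch of tracking relative change with an uncalibrated camera: normalise
# the second date onto the first over pixels assumed unchanged, then
# difference the index maps. All input files here are hypothetical.
import numpy as np

vi_june = np.load("vi_june.npy")                         # VI map, flight 1 (uncalibrated units)
vi_july = np.load("vi_july.npy")                         # VI map, flight 2, same extent and shape
invariant = np.load("invariant_mask.npy").astype(bool)   # pixels assumed not to change

# Linear match of the second date onto the first, fitted over invariant pixels only.
gain, offset = np.polyfit(vi_july[invariant], vi_june[invariant], 1)
vi_july_adj = gain * vi_july + offset

change = vi_july_adj - vi_june   # positive = greener than flight 1, negative = declining
print("mean change across the field:", float(np.nanmean(change)))
```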

    It all boils down to $$$. If you want to sell something to a farmer, it must be cost-effective. And this is not only the camera(s) but the UAV, the SfM software, the GIS, and the GPS setup. This can easily sum up to $30,000. Then you still need a hell of a lot of experience to make the whole thing operational. But if you can provide an inexpensive system and if you communicate the constraints as well as the potential benefits, I am sure that even with a converted camera you can sell it without damaging your credibility. But this is for sure not an easy task.
