As we know, there has been a lot of discussion on this subject, with no clear winner as far as I can see.
We are looking at the MicaSense RedEdge sensor, but it is a huge capital outlay for several units, so we're naturally apprehensive.
We have been playing around with a sample field for a farmer using a straightforward RGB camera, just with the colour temperature of the images turned up...
Sample one taken 14/3/2015
Then another shot of the same field on 12/4/2015
Then another on 16/7/2015
Interesting results. Obviously the multicopter camera angle changed between flights, but the farmer was able to pick out the poor areas and get a rough idea of how much crop he could obtain from this field, since its dimensions are known.
So is there software out there that will calculate the areas of the image, which we could calibrate to give us a yield estimate from the imagery, without too much messing about?
And is NDVI really necessary when we get very similar results from straightforward RGB imagery?
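For what it's worth, there are RGB-only vegetation indices that don't need a NIR band at all, e.g. VARI = (G - R) / (G + R - B). Below is a minimal sketch, assuming a NumPy array image, that computes VARI per pixel and the fraction of "poor" pixels under a threshold; the tiny test image, the threshold of 0, and the helper names are all hypothetical, and a real workflow would load an orthomosaic and calibrate the threshold against ground truth instead.

```python
import numpy as np

def vari(rgb):
    """VARI = (G - R) / (G + R - B), computed per pixel on float bands."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    denom = g + r - b
    denom[denom == 0] = 1e-6  # guard against division by zero
    return (g - r) / denom

def poor_fraction(index, threshold=0.0):
    """Fraction of pixels whose index value falls below the threshold."""
    return float(np.mean(index < threshold))

# Hypothetical 4x4 image: top half green-dominant (healthy crop),
# bottom half red-dominant (stressed/bare ground).
healthy = np.tile([60, 120, 40], (2, 4, 1))
poor = np.tile([120, 80, 40], (2, 4, 1))
img = np.concatenate([healthy, poor], axis=0).astype(np.uint8)

frac = poor_fraction(vari(img))
print(f"poor fraction: {frac:.2f}")  # half the pixels fall below 0 here
```

Since the field dimensions are known, multiplying (1 - frac) by the field area and an expected yield per hectare would give the same sort of rough estimate the farmer was doing by eye, just repeatably.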