NDVI Capture & Processing - How Did You Do That?!

I wanted to share my experience with capturing and processing NDVI imagery for agricultural health indexing.  A preview of the end result:

[Image: a preview of the finished NDVI result]

First let me list some of the hardware being used so you don't have to scroll through this post to find it all!

Equipment:

  • 3DR Iris+
  • Canon SX260HS (modified with the Event38 Near Infra-Red filter)
  • Camera mount from IMP Concepts
  • CHDK installed on the Canon camera (Instructions)

[Image: the equipment]

Background:

I have wanted to get into crop health in a more serious manner for some time now, so I ordered a Canon SX260 and installed the NIR filter from Event38.  There is a great write-up on how to do this modification yourself HERE.  I also went through the process of installing the Canon Hack Development Kit (CHDK), which includes a critical 'intervalometer' script that lets you set the camera to automatically take photos every X seconds.

Note:  You can also run a cable directly from the flight controller on the drone to the camera and let the mission "trigger" the photos.  I didn't do that, but I know others have been successful in getting it to work.

So, I am ready to plan a mission, capture some data, post-process it into some beautiful NDVI images, and hopefully give some valuable agricultural information back to the owner/grower.

Mission Planning:

For this, I simply used Mission Planner.  I think everyone is already familiar with this, but in case you aren't, Get It Here.  The great part of Mission Planner was the ability to draw an area of interest, select my desired flight altitude and camera platform, and let Mission Planner generate a flight survey for me!  Sure, there may need to be some adjustments, but it's a really great starting point!

Create a Polygon:

[Image: drawing the polygon in Mission Planner]

Open Auto Waypoint > Survey Tool:

[Image: the Auto Waypoint > Survey tool]

Select options for the Survey:

[Image: options for the Survey]

This lets me adjust the overlap, look at my camera footprints, and tweak a few other really useful options for making a mission plan that fits my needs.

Flying!:

The next step was to pick a day with cooperative weather and actually go fly!  I planned the collection mission(s) to target a 100 m altitude and set the camera to capture images every 3 seconds.  These turned out to be good settings all around, and I probably won't change too much about this for future collections.
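As a sanity check on those settings, here is a quick back-of-the-envelope sketch of the ground footprint and forward overlap in Python.  The sensor size, focal length, image resolution, and Iris+ cruise speed below are assumptions based on published specs and defaults, not values I measured, so treat the output as ballpark numbers only:

```python
# Rough ground-footprint / forward-overlap estimate for a nadir-pointing camera.
# All camera/airframe numbers here are assumptions (check your own gear):
# Canon SX260 HS: ~1/2.3" sensor (~6.17 x 4.55 mm), ~4.5 mm focal length at its
# widest zoom, 4000 x 3000 pixel images; Iris+ waypoint speed assumed ~5 m/s.

SENSOR_W_MM, SENSOR_H_MM = 6.17, 4.55    # assumed sensor dimensions
FOCAL_MM = 4.5                           # assumed focal length (widest zoom)
IMG_W_PX, IMG_H_PX = 4000, 3000          # assumed image resolution
ALTITUDE_M = 100.0                       # flight altitude used for the survey
SPEED_MS = 5.0                           # assumed Iris+ waypoint speed
INTERVAL_S = 3.0                         # intervalometer setting used

# Ground sample distance (meters per pixel) and footprint on the ground.
gsd_m = (SENSOR_W_MM * ALTITUDE_M) / (FOCAL_MM * IMG_W_PX)
footprint_w_m = gsd_m * IMG_W_PX                        # across-track width
footprint_h_m = (SENSOR_H_MM * ALTITUDE_M) / FOCAL_MM   # along-track height

# Forward overlap between consecutive frames at the chosen interval.
spacing_m = SPEED_MS * INTERVAL_S
forward_overlap = 1.0 - spacing_m / footprint_h_m

print(f"GSD: {gsd_m * 100:.1f} cm/px")
print(f"Footprint: {footprint_w_m:.0f} m x {footprint_h_m:.0f} m")
print(f"Forward overlap at {SPEED_MS:.0f} m/s: {forward_overlap:.0%}")
```

With those assumed numbers this works out to roughly a 3.4 cm/px GSD and about 85% forward overlap, which lines up with my experience that the settings didn't need much tweaking.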

Everything worked great and after all of my collection flights were over (I flew 4 total flights to collect over 114 acres), it was time to get to work processing the imagery.

Data Processing:

That term sounds so boring, but in fact this was probably the most interesting part!  I decided to use the online service provided by Event38 called the Drone Data Management System (DDMS).  If you haven't used this, I can highly recommend it.  It worked great... not perfect... but still pretty great!

[Image: the Drone Data Management System (DDMS)]

I can upload my missions (basically point to all of my collected images) and tell the service to process the imagery into results like Orthomosaics, NDVI imagery, or even 3D .obj files that can be used for modeling the actual terrain.  If you want to learn more, just check out the website, but I found the service worked well.

CONS:

  • It sometimes took several hours to process my data and give me a link to download my completed Orthomosaic
  • Limits each "Mission" to only 200 images at a time, so I had to split my images into several "Missions" (a quick splitting sketch follows these lists)

PROS:

  • It is FREE!  Well, at least you can do some level of processing each month for free right now
  • It is really easy and the interface is intuitive
  • Help is very responsive!  I had an issue with one mission, asked a question through the online portal and had an answer emailed to me in minutes
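About that 200-image limit: if you need to do the same splitting chore, a few lines of Python along these lines will handle the bookkeeping.  This is only a sketch; the folder names are made-up placeholders and the exact limit may change:

```python
# Split a folder of survey photos into numbered batches of at most 200 images,
# one sub-folder per DDMS "Mission".  The paths here are placeholders.
import shutil
from pathlib import Path

SRC = Path("collected_images")   # hypothetical folder of camera JPGs
DST = Path("ddms_missions")      # hypothetical output folder
BATCH_SIZE = 200                 # per-mission image limit at the time of writing

images = sorted(SRC.glob("*.JPG"))
for start in range(0, len(images), BATCH_SIZE):
    batch = images[start:start + BATCH_SIZE]
    batch_dir = DST / f"mission_{start // BATCH_SIZE + 1:02d}"
    batch_dir.mkdir(parents=True, exist_ok=True)
    for img in batch:
        shutil.copy2(img, batch_dir / img.name)
    print(f"{batch_dir}: {len(batch)} images")
```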

Note:  I have also used the Pix4D Mapper Pro product (and really like it as well!).  As a comparison, the results from Pix4D generally lined up better (more on that in a minute), but for just Orthomosaic generation, I couldn't see too much of an advantage there.  Of course, there are some really great tools built into Pix4D Mapper for adjusting Ground Control Points, among other features, so this comparison is really just focused on the generation of an Orthomosaic product from supplied images.

Data Adjustments & Refinements:

Of course the first thing I wanted to do with my Orthoimage was overlay it into my visualization tool of choice (everyone's will likely be different).  What I immediately noticed was that my imagery didn't quite line up with the underlying Bing Maps or Google Maps base layer.  

So, I brought the Orthoimage into QGIS to make some adjustments and also apply my own NDVI false coloring.  If you aren't familiar with QGIS, it is basically an open source Geospatial Tool Kit.  Find out more about it and/or download it HERE.

The GeoReferencer plugin was useful for making adjustments to the alignment of the image:

[Image: the GeoReferencer plugin in QGIS]
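If you would rather script that adjustment than click through the GUI, the same idea can be sketched with the GDAL Python bindings: pick a few ground control points off the base map, attach them to the orthomosaic, and warp it into place.  The file names and coordinates below are made-up placeholders, not values from my survey:

```python
# Re-georeference an orthomosaic using a few manually picked control points.
# File names and coordinates are placeholders only.
from osgeo import gdal

src = "orthomosaic.tif"          # hypothetical input from the processing service
dst = "orthomosaic_aligned.tif"  # hypothetical shifted/warped output

# gdal.GCP(map_x, map_y, elevation, pixel_column, pixel_line)
gcps = [
    gdal.GCP(-93.1000, 41.5000, 0, 100, 100),
    gdal.GCP(-93.0950, 41.5000, 0, 3900, 120),
    gdal.GCP(-93.0955, 41.4960, 0, 3880, 2900),
    gdal.GCP(-93.1005, 41.4962, 0, 120, 2880),
]

# Attach the control points, then warp to a north-up GeoTIFF in WGS84.
with_gcps = gdal.Translate("/vsimem/with_gcps.tif", src,
                           GCPs=gcps, outputSRS="EPSG:4326")
gdal.Warp(dst, with_gcps, dstSRS="EPSG:4326", resampleAlg="bilinear")
with_gcps = None  # close the in-memory dataset
```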

Applying the false-color scale to reflect the NDVI calculation was then done with the Raster Calculator, as seen here:

[Image: the Raster Calculator in QGIS]

This is basically a calculator that allows me to take the information in each of the image's collected bands and apply a formula to it.  In this case, it is the traditional NDVI formula:

(NIR - Visible)/(NIR + Visible)

For my camera and its modification, this was (band1 - band3) / (band1 + band3), as seen above, because the removal of the internal IR filter and the installation of the filter from Event38 allows NIR light to be captured in the "red" band and the filtered visible light to be captured in the "blue" band of the traditional RGB (band1, band2, band3) sensor on the camera.

I can apply that raster calculation and output the resulting image.  The result is only a "singleband" image at that point, and I can then choose how to color the values to reflect my needs.
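For anyone who would rather script this step than use the QGIS Raster Calculator, the same band math is easy to reproduce with rasterio and numpy.  This is just a sketch under the band assumptions described above (NIR in band 1, filtered visible light in band 3), and the file names are placeholders:

```python
# Compute NDVI = (NIR - Visible) / (NIR + Visible) from a modified-camera
# orthomosaic where band 1 holds NIR and band 3 holds filtered visible light.
# File names are placeholders.
import numpy as np
import rasterio

with rasterio.open("orthomosaic_aligned.tif") as src:
    nir = src.read(1).astype("float32")   # band 1: NIR after the filter swap
    vis = src.read(3).astype("float32")   # band 3: filtered visible light
    profile = src.profile

# Guard against divide-by-zero in empty/nodata areas.
denom = nir + vis
ndvi = np.where(denom == 0, 0, (nir - vis) / np.where(denom == 0, 1, denom))

# Write a single-band float GeoTIFF; the false-color ramp can then be applied
# in QGIS (or any other viewer) exactly as described above.
profile.update(count=1, dtype="float32", nodata=0)
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi.astype("float32"), 1)
```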

You can see the result here!

[Image: the resulting NDVI image]

And of course a zoomed in view of a portion of the image:

[Image: a zoomed-in portion of the NDVI image]

And some comparisons between the base imagery, raw camera output, and the NDVI results:

[Images: comparisons of the base imagery, raw camera output, and NDVI results]

This particular field is growing corn, and it is early in the season, so the plants were probably less than 6 inches tall.  I therefore expected mostly red/black colors, since it was still mostly bare soil, with the small rows of corn showing up as more yellow.  I will survey again as the season progresses!

I am very pleased with the results, and the workflow wasn't as difficult and time-consuming as I had originally anticipated.  There were a few details that can easily trip you up, but for the most part it was a relatively straightforward activity.

I hope this helps others looking into the same type of activities!

Comments

  • I'm glad you like DDMS! We're on the verge of eliminating queue times for new missions, plus a few alignment/geotag features that it sounds like you'll appreciate are in the pipeline. We'll update the mailing list when those things roll out.

  • If you search the term "Flight Riot FIJI" on Google you will find the image processing software called FIJI with a special plug-in called Photo Monitoring (http://flightriot.com/imagej-fiji/). You can open the infrared mosaic and process the data with several different LUTs (Look Up Tables - colors) and even do a stretch between the max and min values to increase the differentiation.

  • This was a little over 114 acres. I was flying at the regular waypoint speed for the Iris+. I was thinking of bumping that parameter up a bit though. I ended up with probably more images than I really needed, but I figured it was better to have more than not enough!

    This was flown a few weeks back so I'm sure it's almost time to fly again! I'll update if the next survey produces some interesting results.
  • I'm in Canada doing the same thing.  I've just flown some wheat and canola.  Probably come back in early July to catch the wheat as the heads come up.  I'm surprised the corn isn't further along.  How long does it take to come up after planting?  Are these your crops?

    My biggest issue is getting the Canon ELPH 110 (MaxMax converted) to trigger fast enough.  I was capturing the JPG and RAW at the same time, but I think it's taking too long to write to disk.  CHDK is a bit confusing with the camera parameters, the CHDK parameters, and the script (KAP) parameters, and understanding what plays what role.  I'm flying between 12 m/s and 20 m/s depending on the wind direction.  How many acres did you cover?

  • It's pretty early to tell since the crops are still sprouting, but there are some areas that look to be doing "better" than others, which could help if the grower were looking to minimize the need for nitrogen application early on.  I think the real value will be in the coming weeks, identifying trouble areas that could be pest issues, diseased crops, or other problems.

    My goal with this survey was to get the processes and procedures ironed out more than to extract any information to be used for decision making.
  • Could you elaborate on anything you learned from the imagery?  In what ways was it useful?
