Precision Agriculture & Plant Activity

As specialists in image processing for drone operators, we are pleased to share an experiment conducted on an agricultural parcel.

This experiment also tests the robustness of the results obtained with two MaxMax-modified sensors (an R-G-NIR sensor for a classic NDVI calculation, and a NIR-G-B sensor for a modified NDVI calculation). In accordance with what MaxMax announces on its website, the results are similar.

Processing chain:

1 - Producing an NDVI map with a modified formula using the NIR, GREEN & BLUE channels.

2 - Segmentation of micro-plots with ArcGIS and creation of a classification attribute table: number, location, area and plant activity.
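The two index calculations mentioned above (classic NDVI from the R-G-NIR camera, and the blue-based modification from the NIR-G-B camera) can be sketched as follows. This is a minimal illustration assuming the bands arrive as NumPy arrays; the function names and the epsilon guard are illustrative, not the authors' actual pipeline:

```python
import numpy as np

def classic_ndvi(nir, red):
    """Classic NDVI from the R-G-NIR camera: (NIR - R) / (NIR + R)."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + 1e-9)  # epsilon guards against division by zero

def blue_ndvi(nir, blue):
    """Modified NDVI from the NIR-G-B camera: (NIR - B) / (NIR + B)."""
    nir = np.asarray(nir, dtype=np.float64)
    blue = np.asarray(blue, dtype=np.float64)
    return (nir - blue) / (nir + blue + 1e-9)

# Toy 2x2 pixel example (hypothetical 8-bit values):
nir = np.array([[200, 180], [90, 60]])
red = np.array([[50, 60], [80, 70]])
print(classic_ndvi(nir, red))  # dense vegetation tends toward 1, bare soil toward 0
```

In practice these operations run per-pixel over the whole orthomosaic; the micro-plot statistics for the attribute table are then zonal means of the resulting raster.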

Vector: Drone hexacopter

Camera: CANON SX260HS

Orthomosaic: resolution 5 cm

Date: 28 March 2014




  • Hi,

    Can anyone help me determine which channels the R/G/B bands are mapped to in my Canon SX260 camera modified by MaxMax?



  • Hi John,

    thank you.
    We only work with aerial imagery, which is why we haven't done any testing on the ground.

    The interest is to scan large areas in a short time.

    We abandoned the dual-camera configuration because a turn-key multispectral camera is much easier and, above all, more effective (in that case there is no need to co-register the R and NIR channels).

    Please find below a result obtained with a dual-camera configuration (the result is interesting, but we had too many problems with the "seam lines", especially at low altitude).

    mapping of plant activity NDVI | DRONES IMAGING
    • Thanks - this is an impressive mosaic. I agree that the dual-camera configuration is a lot more involved, and issues arise when images are not perfectly registered. I mainly meant that it can be useful for comparing lens and WB settings on single-camera setups: since we know we have one pure IR channel and three pure visible channels, the lack of band contamination in a pixel provides the best reference for comparison.

      In the end what Daniel says resonates: whatever approach we use we need to walk at least part of the field and ground-truth what our images are telling us against the vitality of the plants we look at up close.

  • Thank you for sharing these results. I'd encourage you to retry a comparison between your two single-camera setups with a dual-camera setup as this is the gold standard. You don't have to collect aerial imagery to do this, you can do this on the ground with a tripod. If you look at some of the research on Public Lab you'll see that they usually compare single camera setups with a dual camera setup, using static ground-based imagery. The seamlines won't be an issue if you use a fixed tripod.

  • Hello,

    I use a plane with a dual-camera setup:

    Canon SX260HS - RGB

    Canon SX260HS - NDVI converted by

    Please, what software do you use to calculate the actual NDVI or ENDVI?

    Why do you say that the dual-camera setup is bad? What are seamlines?

    What are your experiences in France with farmers or agronomists? Are they willing to pay for this service? For instance, in Slovakia agriculture is based on EU grants, and the farmers do not care how much they produce. They have limits on how much funding they can spend per hectare, and that's all that matters to them. Making a profit here is something strange; the whole EU funding of agriculture seems ill-oriented. How can a drone company offer a service or product in such an environment? Maybe in the US or Canada people have different views and options. All the UAV market predictions I've seen say that 90-95% of commercial UAV use will be in precision agriculture. But is this based on fantasy or reality?

    • Hi Michal, I think he means that the "seamlines" are the borders of the images where there has been imperfect registration between the NIR and RGB images when two cameras are used. If you use Ned's plugin for Fiji, you can ensure that the resulting images are cropped, but I find that seamlines can still crop up. I therefore discard images with seamlines before I stitch, and I make sure I have enough images that some are redundant.

    • Hello Michal,

      I live in Brazil, and I am developing a drone for precision agriculture. I'd like to know more about your dual-camera setup. How do you merge both pictures?



      • Hi Eric,

        I do not merge anything. I use a normal RGB camera, and the second one is converted to NGB. The NGB images are then converted to NDVI or ENDVI using ImageJ. That's all. I also use RGB because I do aerial archaeology as well as precision agriculture. I am trying to find some customers among vineyards now, because agriculture is not interested in this technology here in my country. Maybe the time has not come for them to realize the potential. Maybe later.

        Also, when you show an RGB image to people who don't know anything about remote sensing, they are more likely to accept it, because all the colours are true, coming from visible light (the human eye's restriction). When you show them an NDVI false-colour image derived from NIR-G-B, people can easily get puzzled, so they need an explanation.
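        For reference, the ENDVI formula mentioned above (as popularized by the Public Lab / ImageJ workflows) combines the NIR, green and blue channels of an NGB image. A sketch, assuming the converted camera stores NIR in what was the red channel; the function name is illustrative:

```python
import numpy as np

def endvi(ngb):
    """ENDVI = ((NIR + G) - 2*B) / ((NIR + G) + 2*B).

    `ngb` is an H x W x 3 array from an NGB-converted camera:
    channel 0 holds NIR, channel 1 green, channel 2 blue.
    """
    nir = ngb[..., 0].astype(np.float64)
    g = ngb[..., 1].astype(np.float64)
    b = ngb[..., 2].astype(np.float64)
    num = (nir + g) - 2.0 * b
    den = (nir + g) + 2.0 * b
    return num / (den + 1e-9)  # epsilon avoids division by zero

# Toy single-pixel example (hypothetical values):
pixel = np.array([[[200, 100, 50]]])
print(endvi(pixel))
```

        Doubling the blue term is meant to compensate for NIR leakage into the visible channels of a single converted sensor, which is the main reason ENDVI is preferred over plain blue-based NDVI in some single-camera workflows.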


  • Hi Darius,

    yes these are test plots.

    We wanted to classify the micro-plots with chlorophyll-activity values, and thus provide data for agronomists.

    We have not walked the fields to check our results (the engineers, however, will do so). The fact that I get the same results with two different cameras and two different calculation methods has reassured me! Later on, I think I'll study the vegetation on the ground. This seems very important.

    Good luck with learning French. You did not choose the easiest language!

    Bye for now!

  • Hi Darius,

    With the NIR-G-B camera, we obtained higher values than with the R-G-NIR camera. However, the most active and least active areas are detected by both methods. If you work in relative values, I think both methods are good, but we only did one test to verify MaxMax's claims.

    I recommend you work with the R-G-NIR camera to better differentiate vegetation from soil. The best approach is to test the two cameras yourself (this is expensive, but it helps to have a spare sensor in case the first one fails).

    PS: The vegetation type is soft winter wheat, and it was photographed one month before the grain-filling stage.
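    One simple way to work "in relative values", as suggested above, is to rescale each camera's NDVI map to its own range before comparing the most and least active zones. This is a sketch of one possible normalization, not the authors' method:

```python
import numpy as np

def to_relative(ndvi):
    """Rescale an NDVI map to [0, 1] so that maps produced by
    different cameras/formulas can be compared in relative terms."""
    lo = np.nanmin(ndvi)
    hi = np.nanmax(ndvi)
    return (ndvi - lo) / (hi - lo + 1e-9)

# Two hypothetical maps with different absolute ranges but the same
# relative pattern rescale to (nearly) identical values:
a = np.array([0.2, 0.4, 0.6])   # e.g. R-G-NIR camera
b = np.array([0.4, 0.6, 0.8])   # e.g. NIR-G-B camera (higher absolute values)
print(to_relative(a), to_relative(b))
```

    With this kind of rescaling, the ranking of micro-plots by activity is preserved even when the two sensors disagree on absolute NDVI levels.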


