Precision Agriculture & plant activity

As part of our specialty in image processing for drone operators, we are pleased to share an experiment conducted on an agricultural parcel.

This experiment also tests the robustness of the results obtained with two modified MaxMax sensors (an R-G-NIR sensor for a classic NDVI calculation and a NIR-G-B sensor for a modified NDVI calculation). In accordance with what MaxMax announces on its website, the results are similar.

http://www.dronesimaging.com/plant-activity/

Processing chain:

1. Producing an NDVI map with a modified formula using the NIR, GREEN and BLUE channels.

2. Segmentation of micro-plots with ArcGIS and creation of a classification attribute table: number, location, surface area and plant activity.
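The post does not spell out either formula, but a common convention (an assumption here, consistent with the MaxMax channel layouts named above) is the classic red-based NDVI for the R-G-NIR camera and a blue-based variant for the NIR-G-B camera. A minimal NumPy sketch:

```python
import numpy as np

def ndvi_classic(nir, red):
    """Classic NDVI from an R-G-NIR camera: (NIR - R) / (NIR + R)."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    out = np.zeros_like(nir)
    # Guard against zero denominators (e.g. fully dark pixels).
    np.divide(nir - red, nir + red, out=out, where=(nir + red) != 0)
    return out

def ndvi_modified(nir, blue):
    """Blue-based NDVI variant for a NIR-G-B camera: (NIR - B) / (NIR + B)."""
    nir = np.asarray(nir, dtype=np.float64)
    blue = np.asarray(blue, dtype=np.float64)
    out = np.zeros_like(nir)
    np.divide(nir - blue, nir + blue, out=out, where=(nir + blue) != 0)
    return out
```

Both indices fall in [-1, 1]; dense vegetation pushes values toward 1 because leaves reflect strongly in NIR while absorbing in the visible band used.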

Vector: Drone hexacopter

Camera: CANON SX260HS

Orthomosaic: resolution 5 cm

Date: 28 March 2014



Replies

  • Hi,

    Can anyone help me determine where the R/G/B bands are mapped in my Canon SX260 camera modified by MaxMax?

    cheers,

    Mitko

  • Hi John,

    thank you.
    We only work with aerial imagery, which is why we haven't done any testing on the ground.

    The advantage is the ability to scan large areas in a short time.

    We abandoned the dual configuration because it is much easier, and above all more effective, to use a turn-key multispectral camera (in that case there is no need to co-register the R and NIR channels).

    Please find below a result obtained with a dual configuration (the result is interesting, but we had too many problems with the "seam lines", especially at low altitude).

    http://www.dronesimaging.com/en/mapping-of-plant-activity-ndvi/

    • Thanks - this is an impressive mosaic. I agree that the dual-camera configuration is a lot more involved, and issues arise when images are not perfectly registered. I mainly meant that it can be useful for comparing lens and white-balance settings on single-camera setups: since we know we have one pure IR channel and three pure visible channels, the lack of band contamination in a pixel provides the best reference for comparison.

      In the end what Daniel says resonates: whatever approach we use we need to walk at least part of the field and ground-truth what our images are telling us against the vitality of the plants we look at up close.

  • Thank you for sharing these results. I'd encourage you to retry a comparison between your two single-camera setups and a dual-camera setup, as this is the gold standard. You don't have to collect aerial imagery to do this; you can do it on the ground with a tripod. If you look at some of the research on Public Lab, you'll see that they usually compare single-camera setups against a dual-camera setup using static ground-based imagery. The seamlines won't be an issue if you use a fixed tripod.

  • Hello,

    I use a plane with a dual camera setup:

    Canon SX260HS - RGB

    Canon SX260HS - NDVI converted by maxmax.com

    What software do you use to calculate the actual NDVI or ENDVI?

    Why do you say that the dual camera setup is bad? What are seamlines?

    What are your experiences in France with farmers or agronomists? Are they willing to pay for this service? For instance, in Slovakia agriculture is based on EU grants and the farmers do not care how much they produce. They have limits on how much funding they can spend per hectare, and that is all that matters to them. Making a profit here is something strange. The whole EU funding of agriculture seems ill-oriented. How can a drone company offer a service or product in such an environment? Maybe in the US or Canada people have different views and options. All the UAV market predictions I've seen say that 90-95% of commercial UAV use will be in precision agriculture. But is this based on fantasy or reality?

    • Hi Michal, I think he means the 'seamlines' are the borders of the images where there has been imperfect registration between the NIR and RGB images when two cameras are used. If you use Ned's plugin for Fiji, you can ensure that the resulting images are cropped, but I find that seamlines can still crop up. I therefore discard images with seamlines before I stitch, and make sure I have enough images that some are redundant.

    • Hello Michal,

      I live in Brazil, and I am developing a drone for precision agriculture. I'd like to know more about your dual-camera setup. How do you merge both pictures?

      Thanks

      Eric

      • Hi Eric,

        I do not merge anything. I use a normal RGB camera, and the second one is converted to NGB by maxmax.com. The NGB image is then converted to NDVI or ENDVI using ImageJ. That's all. I also use RGB because I do aerial archaeology as well as precision agriculture. I am trying to find customers among vineyards now, because farmers are not interested in this technology here in my country. Maybe the time has not come for them to realize the potential. Maybe later.
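For anyone without ImageJ at hand: the ENDVI that Michal mentions is commonly given (for instance on Public Lab) as ((NIR + G) - 2B) / ((NIR + G) + 2B); whether his plugin uses exactly this form is an assumption. A minimal NumPy sketch of that per-pixel arithmetic:

```python
import numpy as np

def endvi(nir, green, blue):
    """ENDVI = ((NIR + G) - 2B) / ((NIR + G) + 2B), computed per pixel."""
    nir, green, blue = (np.asarray(c, dtype=np.float64) for c in (nir, green, blue))
    num = (nir + green) - 2.0 * blue
    denom = (nir + green) + 2.0 * blue
    out = np.zeros_like(denom)
    np.divide(num, denom, out=out, where=denom != 0)  # avoid division by zero
    return out
```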

        Also, when you show an RGB image to people who don't know anything about remote sensing, they are more likely to accept it, because all the colours are true, coming from visible light. When you show them an NDVI false-colour image derived from NIR-G-B, they can easily get puzzled, so they need an explanation.

        M.

  • Hi Darius,

    Yes, these are test plots.

    We wanted to classify the micro-plots with chlorophyll activity values and thus provide data to agronomists.

    We have not walked the fields to check our results (the agronomists will do that). The fact that I got the same results with two different cameras and two different calculation methods reassured me! Later on, I plan to study the vegetation on the ground. This seems very important.

    Good luck learning French. You did not choose the easiest language!

    Bye for now!

  • Hi Darius,

    With the NIR-G-B camera, we obtained higher values than with the R-G-NIR camera. However, the most active and least active areas are detected with both methods. If you work in relative values, I think both methods are good, but we only ran one test to check MaxMax's claim.

    I recommend working with the R-G-NIR camera to better differentiate vegetation from soil. The best approach is to test both cameras yourself (this is expensive, but it helps to have a spare sensor in case the first one fails).

    PS: The vegetation type is soft winter wheat, and it was photographed one month before the grain-filling stage.
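One way to make the "work in relative values" suggestion above concrete (my own illustration, not the author's workflow) is to convert each camera's NDVI map to per-map percentile ranks, so a constant offset between sensors drops out and only the ranking of active versus inactive zones is compared:

```python
import numpy as np

def percentile_rank(ndvi):
    """Map each pixel to its percentile rank within its own NDVI map,
    so maps from different sensors become comparable in relative terms."""
    flat = np.asarray(ndvi, dtype=np.float64).ravel()
    ranks = flat.argsort().argsort().astype(np.float64)  # integer ranks 0 .. n-1
    return (ranks / (flat.size - 1)).reshape(np.shape(ndvi))
```

Two maps whose percentile-rank images agree identify the same most- and least-active zones even if their absolute NDVI values differ.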

