Precision Agriculture & Plant Activity

As specialists in image processing for drone operators, we are pleased to share an experiment conducted on an agricultural parcel.

This experiment also tests the robustness of the results obtained with two MaxMax-modified sensors (an R-G-NIR sensor for a classic NDVI calculation and a NIR-G-B sensor for a modified NDVI calculation). In accordance with what MaxMax announces on its website, the results are similar.

http://www.dronesimaging.com/plant-activity/

Processing chain:

1. Producing an NDVI map with a modified formula using the NIR, green, and blue channels.

2. Segmenting the micro-plots with ArcGIS and building a classification attribute table: number, location, surface area, and plant activity.
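For reference, the classic NDVI and a common blue-based variant (often called ENDVI) can be sketched in a few lines of NumPy. The post does not give the exact modified formula used, so the ENDVI form below is an assumption:

```python
import numpy as np

def ndvi(nir, red):
    """Classic NDVI: (NIR - R) / (NIR + R), values in [-1, 1]."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-9)

def endvi(nir, green, blue):
    """Blue-based variant often called ENDVI:
    ((NIR + G) - 2B) / ((NIR + G) + 2B)."""
    nir, green, blue = (np.asarray(c, dtype=np.float64)
                        for c in (nir, green, blue))
    num = (nir + green) - 2.0 * blue
    den = np.maximum((nir + green) + 2.0 * blue, 1e-9)
    return num / den

# Tiny synthetic patch: healthy vegetation reflects NIR strongly,
# so the first pixel should score well above the second.
nir   = np.array([200.0, 50.0])
red   = np.array([ 40.0, 45.0])
green = np.array([ 80.0, 40.0])
blue  = np.array([ 30.0, 35.0])
print(ndvi(nir, red))
print(endvi(nir, green, blue))
```

With an R-G-NIR conversion the NIR signal typically sits in the blue channel of the JPEG, while with a NIR-G-B conversion it sits in the red channel; the band order should be confirmed for each converted camera.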

Platform: hexacopter drone

Camera: Canon SX260 HS

Orthomosaic resolution: 5 cm

Date: 28 March 2014



Replies

  • Hi,

    Can anyone help me determine where the R, G, and B bands are mapped in my Canon SX260 camera modified by MaxMax?

    cheers,

    Mitko

  • Hi John,

    thank you.
    We only work with aerial imagery, which is why we haven't done any testing on the ground.

    The advantage is being able to scan large areas in a short time.

    We abandoned the dual-camera configuration because a turn-key multispectral camera is much easier and, above all, more effective (in that case there is no need to register the R and NIR channels against each other).

    Please find below a result obtained with a dual configuration (the result is interesting but we had too many problems with the "seam lines", especially at low altitude).

    http://www.dronesimaging.com/en/mapping-of-plant-activity-ndvi/

    • Thanks - this is an impressive mosaic. I agree that the dual camera configuration is a lot more involved and issues arise when you have images that are not perfectly registered, I mainly meant that it can be useful for comparing lens and WB settings on single-camera setups. Since we know that we have one pure IR channel and three pure visible channels, the lack of band contamination in a pixel provides the best reference for comparison.

      In the end what Daniel says resonates: whatever approach we use we need to walk at least part of the field and ground-truth what our images are telling us against the vitality of the plants we look at up close.

  • Thank you for sharing these results. I'd encourage you to retry a comparison between your two single-camera setups with a dual-camera setup as this is the gold standard. You don't have to collect aerial imagery to do this, you can do this on the ground with a tripod. If you look at some of the research on Public Lab you'll see that they usually compare single camera setups with a dual camera setup, using static ground-based imagery. The seamlines won't be an issue if you use a fixed tripod.

  • Hello,

    I use a plane with a dual-camera setup:

    Canon SX260HS - RGB

    Canon SX260HS - NDVI converted by maxmax.com

    What software do you use to calculate the actual NDVI or ENDVI?

    Why do you say that the dual-camera setup is bad? What are seamlines?

    What is your experience in France with farmers or agronomists? Are they willing to pay for this service? For instance, in Slovakia agriculture is based on EU grants, and the farmers do not care how much they produce. They have limits on how much funding they can spend per hectare, and that is all that matters to them. Making a profit here is something strange. The whole EU funding of agriculture seems ill-oriented. How can a drone company offer a service or product in such an environment? Maybe in the US or Canada people have different views and options. All the UAV market predictions I've seen say that 90-95% of commercial UAV use will be in precision agriculture. But is this based on fantasy or reality?

    • Hi Michal, I think he means the 'seamlines' are the borders of the images where there has been imperfect registration between the NIR and RGB images, where two cameras are used. If you use Ned's plugin for Fiji, you can ensure that the resulting images are cropped, but I find that seamlines can still crop up. I therefore discard images with seamlines before I stitch and ensure I have enough images that some are redundant.

    • Hello Michal,

      I live in Brazil, and I am developing a drone for precision agriculture. I'd like to know more about your dual-camera setup. How do you merge both pictures?

      Thanks

      Eric

      • Hi Eric,

        I do not merge anything. I use a normal RGB camera, and the second one is an NGB conversion by maxmax.com. The NGB images are then converted to NDVI or ENDVI using ImageJ. That's all. I use RGB as well because I also do aerial archaeology alongside precision agriculture. I am trying to find customers among vineyards now, because farmers are not interested in this technology here in my country. Maybe the time has not come for them to realize the potential. Maybe later.

        Also, when you show people who don't know anything about remote sensing an RGB image, they are more likely to accept it because all the colours are true, coming from visible light (the human-eye restriction). When you show them an NDVI false-colour image derived from NIR-G-B, people can easily get confused, so they need an explanation.

        M.
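A rough NumPy equivalent of the NGB-to-ENDVI conversion described above can be sketched as follows. The channel order (NIR, G, B) and the ENDVI formula are assumptions; the actual ImageJ workflow may use a different scaling or lookup table:

```python
import numpy as np

def ngb_to_endvi_8bit(ngb):
    """Map an NGB image of shape (H, W, 3), channels (NIR, G, B),
    to an 8-bit ENDVI image: ENDVI in [-1, 1] scaled to [0, 255]."""
    nir = ngb[..., 0].astype(np.float64)
    g   = ngb[..., 1].astype(np.float64)
    b   = ngb[..., 2].astype(np.float64)
    num = (nir + g) - 2.0 * b
    den = np.maximum((nir + g) + 2.0 * b, 1e-9)
    endvi = num / den
    return np.round((endvi + 1.0) * 127.5).astype(np.uint8)

# Synthetic 1x2 NGB image: a "vegetation" pixel and a "soil" pixel.
img = np.array([[[220, 80, 30], [60, 45, 40]]], dtype=np.uint8)
print(ngb_to_endvi_8bit(img))  # vegetation pixel maps brighter than soil
```

The resulting grayscale array can then be colour-mapped (as ImageJ does with a LUT) to produce the familiar false-colour NDVI rendering.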

  • Hi Darius,

    yes these are test plots.

    We wanted to classify the micro-plots with chlorophyll-activity values,
    and thus provide data for agronomists.

    We have not walked the fields to check our results ourselves (the engineers will do that). The fact that I get the same results with two different cameras and two different calculation methods has reassured me! As a next step, I think I'll study the vegetation on the ground. This seems very important.

    Good luck with learning French. You did not choose the easiest language!

    Bye for now!

  • Hi Darius,

    With the NIR-G-B camera, we obtained higher values than with the R-G-NIR camera. However, the most active and least active areas are detected with both methods. If you work in relative values, I think both methods are good, but we only ran one test to check MaxMax's claim.

    I recommend you work with the R-G-NIR camera to better differentiate vegetation from soil. The best approach is to test the two cameras yourself (this is expensive, but it helps to have a spare sensor in case the first one fails).

    PS: the vegetation is soft winter wheat, and it was photographed one month before the grain-filling stage.

