Commercial use of drones in farms and other agriculture


Comments

  • Hello,

    I have a question about this 'NDVI' camera. I am wondering how a human looks when photographed with the modified camera. Will he or she stand out more from the background (say, in a forest or another place with lots of green plants) than in a normal photo?

  • One of the things we have to accept is that the "NDVI" we can get from cheap cameras is not directly comparable to the traditional NDVI. However, once you recognize that we are dealing with a completely new data type, we can start to explore the advantages and disadvantages without being encumbered by a wish for it to be more like a traditional NDVI.

    We mostly use cameras that see broad, overlapping bands in the blue and green spectrum, and a slightly narrower but still relatively broad band that spans the red edge with a peak sensitivity around 715nm and is separated from the blue-green by a notch filter. We most often use the blue and red (actually red/NIR) bands to produce what we call a "blue NDVI". This vegetation index can produce very useful relative data, but it cannot be used as a direct stand-in for a traditional NDVI. The advantage is that we are getting extremely high resolution data. We use cameras that can shoot in RAW mode, and we use grey panels to calibrate images for comparison over time. The reflectance data is inherently noisier than a traditional NDVI image, but in relative terms it still provides useful results. A big advantage, however, is that you can use the high resolution to get additional information related to plant structure, so you don't have to rely on reflectance data alone to gain information about plants. It is a new type of data in many respects, with lots of potential, but we need to do much more research to understand how to use it.
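    To make that concrete, here is a minimal sketch of how such a "blue NDVI" can be computed with a one-point grey-panel calibration (Python/numpy; the function and variable names are illustrative, not from any particular toolchain):

    ```python
    # Sketch only: assumes the RAW frame is already demosaiced into
    # per-band numpy arrays, and that panel_red / panel_blue are mean
    # pixel values sampled from a grey calibration panel in the scene.
    import numpy as np

    def blue_ndvi(red_nir, blue, panel_red, panel_blue, panel_reflectance=0.5):
        # One-point empirical calibration: scale the digital numbers so
        # the grey panel reads its known reflectance in both bands.
        nir = red_nir.astype(np.float64) * (panel_reflectance / panel_red)
        vis = blue.astype(np.float64) * (panel_reflectance / panel_blue)
        # Same form as NDVI, with blue standing in for the red band.
        return (nir - vis) / (nir + vis + 1e-9)  # epsilon avoids 0/0
    ```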

  • Hi Dries, I don't have these filters yet and it will take a while. I did download some of the existing material on the website, though, and noticed how difficult it would be to turn this into material that is usable for historical comparisons, or for comparisons between different sites.

    Do you know of a good article which explains how this works for fertilization monitoring? I think that for such purposes specifically, the calculation only works if some of the variables are calibrated for the site.

  • Hi Gerard,

    Yes, these cameras can indeed give good relative information, and for some applications (drought stress) this might be enough. For other applications (fertilization with N, P, K, ...) the farmer probably wants to know how much to apply. Do you have some airborne images to share?

  • Hi Dries,

    Sounds like you know what you're talking about.

    I've been checking out the results of the DIY camera myself. In my view, the camera is great if you want to look at "relative" data: given a certain lighting condition, how an area lights up provides information about how one plant is doing relative to another.

    Getting absolute data, however, is a much bigger challenge, requiring better equipment and more specific filters or configurations. The basic issue is that the DIY camera doesn't have a 'grounded reference'.

    What this means is that this kind of camera is great for getting an overview of how a patch of vegetation is doing in one place at one time (harder still if the scene is very uniform), but historical comparisons, or even comparisons between one place and another, are very challenging.

    Other than that, for the price of these things... it's a great step forward, isn't it? What would be a leap forward is a method to somehow ground these efforts on an absolute reference. Then differences in measurements would come down only to the quality of the actual equipment.

  • Just wrote a comment on cheap NDVI-converted cameras. Maybe it's also of interest here:

    Although I like the initiative to move forward with such DIY camera designs for vegetation monitoring, there is a big difference between what comes out of these cameras and what high-end earth observation sensors like LANDSAT or SPOT VEGETATION produce. To name a few differences:

    LANDSAT OLI is a multispectral sensor with 9 bands in the visible, NIR and SWIR domains. Those bands, as defined by their central wavelength and FWHM, were chosen specifically to meet a broad range of applications (vegetation monitoring being one) and to cope with the difficult task of atmospheric correction. The INFRAGRAM and other home-brewed sensors have only 3 bands: a very broad green, blue and "NIR" band, chosen not specifically for vegetation monitoring but merely by what is available in the local camera store.

    LANDSAT's NIR band (5) is defined from 0.85 - 0.88 µm, right on top of the NIR spectral plateau where we can expect maximal scattering due to internal vegetation cell structure. The NIR band of the INFRAGRAM is defined by the camera's sensitivity around 720nm (which is blocked in a normal camera) and the transparency at the same wavelength of the "super blue filter" they put in front. This 720nm is right in the middle between the low reflection characteristic of the red (due to pigment absorption) and the high reflection in the NIR (due to cell structure scattering). This is what we call the "red edge". Hyperspectral sensors use the position of this red edge to derive more information about vegetation health status, but you need at least a measurement around 800nm to derive meaningful data. If you want to see the spectral response of healthy vs. infected leaves, follow the link below and check the wavelengths!!

    http://openi.nlm.nih.gov/detailedresult.php?img=3274483_1746-4811-8...
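    To illustrate why a band near 800nm matters, here is a minimal sketch of the classic four-point linear method (Guyot & Baret, 1988) for estimating the red-edge inflection point; the reflectance values in the example are made up:

    ```python
    # Four-point linear estimate of the red-edge inflection point (REIP).
    # Needs reflectance at 670, 700, 740 and 780 nm -- wavelengths a
    # converted camera with a single ~720 nm band simply cannot see.
    def red_edge_position(r670, r700, r740, r780):
        r_edge = (r670 + r780) / 2.0  # reflectance at the inflection point
        return 700.0 + 40.0 * (r_edge - r700) / (r740 - r700)

    # Example with made-up values for a healthy leaf: low red, high NIR.
    print(red_edge_position(0.05, 0.15, 0.45, 0.55))  # -> 720.0 nm
    ```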

    LANDSAT is a pushbroom sensor calibrated in an integrating sphere to relate digital numbers (DN) to a value of real physical meaning, i.e. radiance. After modelling the scene illumination conditions and atmospheric contributions, they are able to derive reflection values, which are object-specific and not related to external factors. The INFRAGRAM (frame) camera gives you plain DN. They tell you to "calibrate" the camera before you take off, but illumination conditions are surely not identical on the ground vs. in the air.

    That leads us to the "NDVI" calculation: NDVI = (NIR - RED) / (NIR + RED), with NIR the spectral reflectance value around 800nm (not a DN around 720!) and RED the spectral reflectance value around 650nm (so not a DN derived from a BLUE band, as in the Infragram camera). As such I would not call the Infragram vegetation index NDVI; maybe DIYVI?

    For more information, go and take a peek here or contact me via PM:

    http://hyperspectral.vgt.vito.be/content/vitos-hyperspectral-research

     

    Again, like the project,  but use the data with care!

    Dries

  • Hi Keeyen Pang, great to know you're from Malaysia. I am in Jakarta. We are developing some heavy-duty UAVs here. Here's a video of what we do:
    http://youtu.be/TU47z6GR_d0. I'd love to discuss and share ideas with you; it seems like you are very actively involved in RC & UAVs. Perhaps we can even team up to do something here commercially. We are into crop inspection and general surveillance craft. PM me at ktan73@gmail.com, my no. 6282114748455. Rgds, Ken Tan

  • 3D Robotics

    From today's Wall Street Journal (excerpts):

    Farmers are starting to investigate the use of drones for a decidedly nonmilitary purpose: monitoring crops and spraying pesticides.

    Oregon State University plans to use the unmanned vehicles to monitor the school's potato crop and those of a commercial potato grower. Both crops, located near Hermiston, Ore., are expected to sprout in coming weeks. The university ran its first test flight last month.

    Oregon State is one of several universities that have begun research projects to investigate the use of the unmanned aerial vehicles in agriculture. 

    Growers can run analytics on data generated by sensors and drones to quickly find problems such as specific plants not getting enough water. Flown by a pilot on the ground, aircraft equipped with infrared cameras can take a close look at the health of plants to help growers determine whether they need water, are suffering from insect infestation or need additional fertilizer.

    Today, this type of monitoring is done by manned planes and sometimes satellites. "The biggest problem in the past with aerial imagery in agriculture is that everything is time-sensitive, and with unmanned aerial vehicles we'd be able to process that data much more quickly," said Steve Cubbage, president of Prime Meridian, a company that sells precision agriculture data services to growers. Instead of scheduling a plane, which could take longer owing to the time required to make arrangements with the local airport, drones would be lower-cost and able to operate even if there was cloud cover, he said.

    The drones, some as small as eight pounds, can be put into the air on-demand, but because of the complexity of flying them, farmers will likely hire a company to provide the service.

    Oregon State will use two types of unmanned aerial vehicles that will be equipped with cameras. The HawkEye from Tetracam Inc. of Chatsworth, Calif., is a rectangular frame with a specialized video camera, a motor and propeller that weighs eight pounds and is attached to a large parachute. The other aircraft, called Unicorn, from Lockheed Martin Corp.'s Procerus Technologies, looks like a glider. Mr. Hamm doesn't think these should be called drones because farmers will use them solely for monitoring their own crops and not for military or law-enforcement purposes.

    Read the rest there

  • Eric T was asking about how images are stitched together.

    I don't know how Keeyen Pang et al. do it, but I think the process is quite simple and the software can be developed easily from open source.

    I've experimented with "visual mapping" for ground robotics. Basically the robot develops a map from images it's seen along its route and then connects them using simple "strong statistical features" that are extracted from each frame.

    I've used the SURF routines from OpenCV to essentially find the centre of one image in the next consecutive image and just overlay them on a common centre (a rough sketch of the idea follows the links below).

    The result is good enough. See my YouTube vids

    https://www.youtube.com/watch?v=N56b2Z2HQlA&list=UUgiPZQ6TCWDC2A...

    or

    https://www.youtube.com/watch?v=DiKV1JxLwOk&list=UUgiPZQ6TCWDC2A...

    Wait for the overlays to pop up -- they're in there somewhere.
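    For anyone who wants to try it, here is a minimal sketch of that match-and-overlay idea (Python/OpenCV; the file names are placeholders, and it uses ORB rather than SURF, since SURF is patented and now lives in opencv_contrib):

    ```python
    # Sketch: match features between two consecutive frames, estimate
    # the inter-frame shift, and paste both onto a common canvas.
    import cv2
    import numpy as np

    img1 = cv2.imread("frame_000.jpg", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("frame_001.jpg", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Brute-force Hamming matching, keeping only the strongest matches
    # -- the "strong statistical features" idea from the comment.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:50]

    # Median displacement of matched keypoints ~ the inter-frame shift.
    shifts = np.array([np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt)
                       for m in matches])
    dx, dy = np.median(shifts, axis=0)

    # img2's origin expressed in img1's pixel coordinates.
    ox, oy = int(round(-dx)), int(round(-dy))
    h, w = img1.shape
    x_min, y_min = min(0, ox), min(0, oy)
    x_max, y_max = max(w, ox + w), max(h, oy + h)
    canvas = np.zeros((y_max - y_min, x_max - x_min), dtype=np.uint8)
    canvas[-y_min:h - y_min, -x_min:w - x_min] = img1
    canvas[oy - y_min:oy - y_min + h, ox - x_min:ox - x_min + w] = img2
    cv2.imwrite("mosaic.jpg", canvas)
    ```

    A real mosaic would blend the overlapping pixels and chain the offsets across many frames, but this is the core of it.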

  • Hi all!!
    Could someone say which IR camera types are best suited for precision agriculture work with a UAV?

    Regards and thanks!


Hexa with Hobbywing X8 motors

Hi, I am building a hexacopter with:
1.) Hobbywing X8 motors
2.) Pixhawk PX5 flight controller
3.) 12S 22Ah battery
4.) wheelbase of around 1800mm
5.) estimated weight around 17kg without payload
I am looking to see whether anyone has already tuned this type, so I can get a head start. Looking forward to it. Thank you, Mike

2 Replies · Reply by Mike Almart Oct 24, 2022

What are the impacts of agricultural drones on the agricultural development of various countries?

Agricultural drones have gone from being questioned by the market to being actively promoted by governments, and from tens of thousands of dollars down to thousands. Behind this is the rise of technology across the entire industry. However, the degree of adoption of agricultural drones varies greatly between regions. What do you think of the application of agricultural drones in your country? https://youtu.be/TDTW9TsvjZw (EFT agricultural drone solution)

0 Replies

Open Source Drone Payload Project Supporting In-Flight AI

I'm leading a project to develop a completely open source system for deploying artificial intelligence and computer vision on a drone. We are trying to make the system as general purpose as possible, but one application would be supporting precision agriculture. The real-time, in-flight processing aspects of the payload are probably less interesting for precision ag, but the data collection and curation capabilities might be. We are getting started with a few specific missions, we are…

0 Replies