Hello everyone. 
  
Watching the sUAS Expo in San Francisco really got me thinking... we are at the beginning of a HUGE trend.    
  
The consensus is that the number one application for drone technology over the next few years will be agriculture: more specifically, aerial mapping, NDVI vegetation analysis and multispectral imaging.  
  
This will most likely be the sector that breaks the social stigma currently attached to 'drones', turning them from deadly killing machines into friendly farming tools hovering comfortingly over wheat fields. 
 
  
Over the past couple of years I have seen incredible advances in 1) affordable open-source drone technology (Pixhawk), 2) affordable sensor technology (you can get light, high-resolution cameras off the shelf) and 3) image processing software.  
  
The problem I see is that for the average user, making sense of these individual products is daunting; even for those with more experience, integrating everything into an efficient workflow is a major challenge.  
  
As I see it there are 3 key components that need to be integrated:  
  
 

  •     A GPS-enabled drone  
  •     A camera sensor (e.g. the Sony S110)  
  •     Image processing software  

How it should work:  
  
The end goal is to produce valuable data in the form of images that help Farmer Joe make his next decision. The key is to get the hardware talking to each other through seamless integration.  
  
This is how I see the workflow going...  
  
1 – A survey grid is drawn using the flight planning software, overlaid on Google Maps (see the sketch after this list).  
 
 
2 – The drone takes off and moves to its first waypoint in the survey grid via GPS.  
 
 
3 – The drone reaches its waypoint and triggers the camera to take a picture of the area underneath.  
 
 
4 – The onboard encrypted telemetry sends the image down to a laptop.  
 
 
5 – The image processing software on the laptop starts stitching the images into a mosaic.  
 
 
6 – This process repeats until the entire survey grid is completed. The drone lands, and the stitched mosaic is automatically uploaded to the web-based application, where the customer can analyze or share the image data in a 'Google Earth'-type interface.  
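
A rough sketch of what step 1 does under the hood: the flight planner tiles the field with back-and-forth "lawnmower" passes. Mission Planner generates these grids for you, but the idea in Python looks roughly like this (the function, coordinates and spacing are illustrative assumptions, not anyone's actual implementation):

```python
# Minimal "lawnmower" survey grid sketch: alternating north/south passes
# across a bounding box, spaced to give the desired photo overlap.

def survey_grid(lat_min, lat_max, lon_min, lon_max, spacing_deg):
    """Return (lat, lon) waypoints covering the box in alternating passes."""
    waypoints = []
    lon = lon_min
    northbound = True
    while lon <= lon_max:
        if northbound:
            waypoints.append((lat_min, lon))
            waypoints.append((lat_max, lon))
        else:
            waypoints.append((lat_max, lon))
            waypoints.append((lat_min, lon))
        northbound = not northbound
        lon += spacing_deg
    return waypoints

# Example: passes roughly 30 m apart (~0.0003 degrees of longitude at
# mid latitudes; a real planner would compute this from the latitude).
for wp in survey_grid(-33.9500, -33.9480, 18.4200, 18.4220, 0.0003):
    print(wp)
```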

[Image: proposed workflow infographic]
  
What we have:  
  
1 - We have a fantastic open-source, GPS-enabled flight controller that is very powerful and fully programmable, thanks to 3D Robotics. The hardware costs only $279 with a GPS module and can already trigger cameras to take images at pre-defined waypoints.  
  
This flight controller works with the powerful open-source Mission Planner, which can handle the first three steps above: you can program missions using GPS waypoints and even generate survey grids.  
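
For anyone who wants to script this rather than click through Mission Planner, here is a minimal sketch using the pymavlink library to ask an ArduPilot/Pixhawk autopilot to fire the camera at a fixed ground distance along the survey lines (a common alternative to per-waypoint triggering). The serial port, baud rate and 30 m spacing are assumptions for your setup:

```python
# Hedged sketch: command a Pixhawk/ArduPilot to trigger the camera
# every 30 m of ground travel, using MAVLink's DO_SET_CAM_TRIGG_DIST.
from pymavlink import mavutil

# Connect over the telemetry radio (port and baud are assumptions).
master = mavutil.mavlink_connection('/dev/ttyUSB0', baud=57600)
master.wait_heartbeat()

master.mav.command_long_send(
    master.target_system, master.target_component,
    mavutil.mavlink.MAV_CMD_DO_SET_CAM_TRIGG_DIST,
    0,                      # confirmation
    30, 0, 0, 0, 0, 0, 0)   # param1: trigger distance in metres
```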
 
2 - We also have low-cost but high-resolution cameras, such as the Sony models that offer remote API functionality: https://developer.sony.com/2013/11/29/how-to-develop-an-app-using-t...  
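
From the linked docs, the Sony remote API is JSON-RPC over HTTP on the camera's own Wi-Fi network. A minimal sketch, assuming the documented default endpoint; check the address and supported methods against your specific camera model:

```python
# Hedged sketch of Sony's Camera Remote API: trigger a shot over Wi-Fi.
import requests

# Default endpoint when joined to the camera's Wi-Fi (an assumption
# that your model exposes it at this address).
CAMERA_ENDPOINT = "http://192.168.122.1:8080/sony/camera"

def take_picture():
    """Ask the camera to shoot; the response includes the image URL."""
    payload = {"method": "actTakePicture", "params": [],
               "id": 1, "version": "1.0"}
    resp = requests.post(CAMERA_ENDPOINT, json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json().get("result")

print(take_picture())
```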
  
3 - Finally, we have access to free image stitching software, such as Microsoft's Image Composite Editor (ICE): http://research.microsoft.com/en-us/um/redmond/groups/ivm/ICE/  
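
ICE is point-and-click, so for the automated pipeline sketched above you would eventually want stitching callable from your own code. As one illustration (not ICE's API), OpenCV ships a high-level stitcher:

```python
# Stitch a set of overlapping aerial photos into one mosaic with OpenCV.
import cv2

files = ["img1.jpg", "img2.jpg", "img3.jpg"]      # placeholder names
images = [cv2.imread(f) for f in files]

stitcher = cv2.Stitcher_create()                  # OpenCV 3.4+/4.x
status, mosaic = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("mosaic.jpg", mosaic)
else:
    print("Stitching failed, status code:", status)
```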
  
What needs to be done:  
  
All the components are there; we just need to get them talking to each other. This means designing a web-based program that translates their languages into one user-friendly application.  
  
The key is an emphasis on a clean, powerful, yet simple workflow.  
  
I can imagine a user having a log-in account where he can click on one of his fields, bringing up a Google Earth-type interface with the multispectral or NDVI images overlaid on top.  
  
Take a look here to see what I mean: http://demo.terravion.com/#blocks 
  
That particular company uses full-scale aircraft, however, and its service is orders of magnitude more expensive.  
  
What I really like is that you can select which overlay you want on the field, be it infrared, NDVI or visual...  
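
For what it's worth, the NDVI overlay itself is simple per-pixel arithmetic once you have registered near-infrared and red bands: NDVI = (NIR - Red) / (NIR + Red). A minimal sketch, with file names as placeholders:

```python
# Compute an NDVI map from registered near-infrared and red band images.
import cv2
import numpy as np

nir = cv2.imread("field_nir.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)
red = cv2.imread("field_red.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)

ndvi = (nir - red) / (nir + red + 1e-6)   # epsilon avoids divide-by-zero

# Rescale -1..1 to 0..255 so the result is viewable as a greyscale image.
cv2.imwrite("field_ndvi.png", ((ndvi + 1.0) / 2.0 * 255).astype(np.uint8))
```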
 
The closest I have seen to this idea is San Francisco-based DroneDeploy: http://dronedeploy.com/ 
 
I am sure many of us are also looking forward to 3D Robotics releasing "drone-share", although I could not find much info on that... 
 
Ideally, this should be integrated directly into the Mission Planner software. If we can get the telemetry to transmit live photos to the ground station, which in turn performs real-time image stitching and uploads the result to your account in the cloud for customers, that would be a really powerful tool. 
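
The ground-station end of that loop could start out very simply: watch the folder the telemetry downlink writes images into and push each new file to the web application. A rough sketch; the folder and upload endpoint are hypothetical placeholders:

```python
# Poll a downlink folder and upload each new image to a web app.
import os
import time
import requests

WATCH_DIR = "incoming_images"                    # where downlinked images land
UPLOAD_URL = "https://example.com/api/upload"    # hypothetical endpoint

seen = set()
while True:
    for name in sorted(os.listdir(WATCH_DIR)):
        if name in seen or not name.lower().endswith(".jpg"):
            continue
        with open(os.path.join(WATCH_DIR, name), "rb") as f:
            requests.post(UPLOAD_URL, files={"image": (name, f)})
        seen.add(name)
    time.sleep(2)                                # poll every couple of seconds
```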
  
So I am looking for any input you have to offer: any current projects or programs that you think I should take a look at. What is the best way to start the development process? 
  
This is a huge project with many challenges but even more potential. I am eager to hear what you think.
  
Best regards,  
  
Jethro Hazelhurst.


Replies

  • Hold your horses!

    Current farm machinery used for spraying and spreading has very limited variable controls; in the case of spreaders it's normally on/off, with a few having variable-rate control but no swath control. A few sprayers have section control, normally in 2 m sections, which is used to prevent overlap (spraying the same crop twice) or spraying dry earth on headland turns. Here is a computer simulation by John Deere of variable-rate applications, or see a video here for a real-world view.

    Sprayers do not have individual nozzle control, i.e. the ability to control the flow or mixture at each nozzle. Some machines are able to vary the rate across the whole boom, such as 40/60 or 100%; see here on page 4. But some booms are 36 m (over 100 feet).

    [Image: Househam sprayer] This is a Househam sprayer in Oxfordshire with a 30 m boom; I took this image with a Canon S90 at 100 m AGL.

    The chemical application is mixed in the farmyard, so the end result is block spraying, normally in 50 m blocks. Here is a pH application map for spreading lime:

    [Image: pH application map for lime spreading] So having satellite data of 30 m / 20 m / 15 m or 10 m resolution is fine for crop health.

    The market for UAVs within agriculture lies in filling the remote-sensing gap when it's cloudy, crop scouting (detecting weeds), producing reports on crop trials, monitoring wildlife habitats (which farmers in the EU get a subsidy for, called "set-aside"), and producing 3D maps for drainage and detecting or monitoring gully formations.

    One thing here for Jethro: a guy I know called Stan, who runs the Acclaimed Software Company in Norwich, England, has a product called Jethro which controls crop-spraying machinery; see here: Jethro.

    • @Keith Geary: "Sprayers do not have individual nozzle control, i.e. the ability to control the flow or mixture at each nozzle. Some machines are able to vary the rate across the whole boom, such as 40/60 or 100%; see here on page 4. But some booms are 36 m (over 100 feet)."

      Well, there is a ready-to-go single-nozzle control system called Smart Nozzle: http://www.h-agtec.com/product.html

      I have tested one for over a season on a demo farm in Poland and the result was impressive. It is just a matter of investment versus profit, as this is quite expensive equipment, but I believe it delivers what is paid for, with ease.

      I think the hardest point here is to make someone aware that adding real-time section control with 2-3 m sections results in money savings as well as less chemical output onto the crops. The solutions are ready to implement. 

       

    • Hi Piotr,

      How many Smart Nozzle systems have been sold?

      My comment above was in respect of the current general market. Here in the UK, the National Register of Sprayer Operators (NRoSO) controls the methods used to apply applications and certifies the operators. To my knowledge no one is using individual nozzle control; I'm working on a contract for Bayer Crop Science at the moment, and no one there has mentioned it.

      Next month we have the annual Cereals show here in the UK, so if there is new technology coming onto the market that has been certified by NRoSO, it will be demonstrated there. If it is, I will let you know via this forum.

      Best Regards,

      Keith,

    • Hello Keith,

      I am the owner of Harrison Ag Technologies, the company that designed and sells the Smart Nozzle system, an individual nozzle control system for sprayers. We are also partnered with Graham Command to provide individual row control for planters. So far we have sold 20 systems in Europe: England, Germany and Austria. We are also in Brazil, Japan and the USA. Our dealer in the UK is Altek International in Brigg.

      Our electronics use an Android tablet for the user interface and embedded controllers for real-time control of the nozzles/rows and CAN communications. The systems provide both rate and overlap control at each nozzle/row.

      I found this blog during a web search on drones and was surprised to see a reference to the Smart Nozzle system. I am very interested in what is happening with drone technology and will be monitoring this blog frequently.

      Aerial imagery is a very big topic for me, and for agriculture in general. Over the years, the incredible cost savings from overlap control alone have made farmers aware of the potential cost reductions from using these technologies. Harrison Ag has one of the few products that take farming decisions down to square-foot resolution.

  • @Mustafa, please correct me if I am wrong, but I watched a talk on this and they said a satellite can provide approx. 50 cm/pixel, whereas sUASs can give much finer detail, up to 1 cm/pixel. You also don't have to wait on the cloud cover, as Keith mentioned. 
     
    Not so long ago there were a few companies that already provided sUAS autopilots, until the APM devs and 3D Robotics provided an open-source solution; look where we are now. If we can tackle this problem with the same momentum, then I think we will have not only fantastic hardware and code but an entire ecosystem (as I learned from Chris's talk at the expo), and it is that 'ecosystem' that will take the APM project to the next level. 
     
    Like Android, it is the ecosystem that is valuable to the project. 
     
        @John: "Generating orthomaps is not as easy as stitching in ICE. You can only decently stitch orthorectified photos, and generating those requires some hefty programming power and more items in the toolchain. In my opinion, well tuned 3-axis brushless gimbals will go a long way to almost negating perspective variation in nadir photos, especially from slower flying multicopters. But you'll still need to orthorectify to get usable photomaps, it just means your success rate will be much higher." 
     
    Do you recommend georeferenced images that are tagged with their GPS location, thus requiring less processing power? I remember a talk on this too, but it requires very accurate GPS positioning, something that the LEA-6H GPS module does not offer yet. 
     
    I agree that there are incredible challenges, but if the APM project is anything to go by, it's doable! 
     
    On a different note, I am very keen to start something, but I am unfortunately still a programming newbie. I do, however, have experience running an open-source project (this one is for 3D metal printing): http://www.metalbot.org/ 
     
    So, if anyone can recommend a programming language that would be suited to developing this app, please let me know. 
     
    I am keen to get something going and can set up a website, development forums and a wiki. 
     
    Best regards, 
     
    Jethro.
    • @ Keith Geary, thank you.

      @Jethro: "Do you recommend georeferenced images that are tagged with their GPS location, thus requiring less processing power? I remember a talk on this too, but it requires very accurate GPS positioning, something that the LEA-6H GPS module does not offer yet."

      Yes, georeferencing always adds value; very few of the apps don't make use of it, and now even VisualSFM allows GPS-based transforms. It's also useful in the image-matching stage, especially if you have a large collection, and this is where a lot of processing time is used up. I see that DroneMapper now doesn't accept images that aren't geotagged. Also, injecting GPS and altitude into images in post has never been easier with apps like ExifTool, GeoSetter and GPicSync.
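
      For completeness, here is roughly what that post-hoc GPS injection looks like in code, using the piexif library rather than the desktop apps named above. The coordinates are illustrative and the degrees/minutes/seconds conversion is simplified:

      ```python
      # Write GPS EXIF tags into a JPEG after the flight (piexif library).
      import piexif

      def to_dms(deg):
          """Decimal degrees -> EXIF ((d,1),(m,1),(s*100,100)) rationals."""
          d = int(deg)
          m = int((deg - d) * 60)
          s = round(((deg - d) * 60 - m) * 60 * 100)
          return ((d, 1), (m, 1), (s, 100))

      def geotag(filename, lat, lon, alt_m):
          exif = piexif.load(filename)
          exif["GPS"] = {
              piexif.GPSIFD.GPSLatitudeRef: b"N" if lat >= 0 else b"S",
              piexif.GPSIFD.GPSLatitude: to_dms(abs(lat)),
              piexif.GPSIFD.GPSLongitudeRef: b"E" if lon >= 0 else b"W",
              piexif.GPSIFD.GPSLongitude: to_dms(abs(lon)),
              piexif.GPSIFD.GPSAltitude: (int(alt_m * 100), 100),
          }
          piexif.insert(piexif.dump(exif), filename)

      geotag("IMG_0001.jpg", -33.9485, 18.4211, 120.0)  # illustrative values
      ```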

      I am still testing my Pixhawk, and up to now I've only used my MikroKopters for mapping, but the GPS data they provide is adequate for geotagging. They use the u-blox LEA-6S module. If navigation performance is good enough to let a drone loiter in pretty much the same place, give or take several inches (in the absence of wind), then surely the GPS data in the log must also be good enough? I can get that sort of performance with my MKs, and I also got it with my APM 2.6 quad when I last flew it. I have no doubt I'll get it with a Pixhawk-based multi too.

      Regarding the satellite issue, there is no doubt that they can offer a valuable service to agriculture, and to the extent that they capture the market, we drone operators may find it hard going. However, you are correct that they cannot currently match our spatial resolution potential, and it's unlikely that they ever will: there are hard limits to processing and transmitting that much data. If they decrease their spatial coverage to increase resolution, they can service fewer areas, and so they hit diminishing returns.

      To give you some examples of high-resolution systems: IKONOS offers 1 m/pixel resolution for visible light and 4 m/pixel for multispectral, but a swath width of only 13 km and a revisit time of 11 days. Improving on this, WorldView-1, a more recent satellite, has approximately twice the resolution of IKONOS and a revisit time of only 1.7 days, but a swath width of only 16.5 km. The medium-resolution systems, such as SPOT-4, have much wider swaths (thousands of km) but coarse resolutions such as 1 km/pixel.

      So we will definitely win the high-resolution war, but it's not all just about resolution. Some farmers will be more than happy with low resolution, especially large-scale operations such as maize and sugar cane. We will likely dominate wine farms, grove-based farms and boutique farms such as flower exporters, but we have an additional advantage: satellites generate imagery by line-scanning, not from photographs. Although this produces consistent nadir views in the imagery, it also means that reflectance is consistent. We, on the other hand, can take many images of the same point from a wide variety of angles and composite them, so that every pixel is a composite of, say, 20 others. The NDVI you calculate from such a pixel is therefore nicely averaged for reflectance, something we can offer and satellites can't (it does require fast cameras and flying our drones quite slowly, though). If you also calibrate the individual images to a fixed standard before compositing them, you can offer reliable NDVI data.
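
      The averaging described above is easy to picture in code. A toy sketch, assuming you already have a stack of registered, calibrated reflectance arrays for the same patch of ground (the random arrays below are placeholders for real data):

      ```python
      # Composite ~20 registered views per pixel, then compute NDVI once.
      import numpy as np

      # Placeholder stacks: (n_views, height, width) calibrated reflectance.
      nir_stack = np.random.rand(20, 100, 100).astype(np.float32)
      red_stack = np.random.rand(20, 100, 100).astype(np.float32)

      nir = nir_stack.mean(axis=0)   # each pixel averages all 20 observations
      red = red_stack.mean(axis=0)
      ndvi = (nir - red) / (nir + red + 1e-6)
      ```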

      So it's getting interesting.

  • Several companies already provide services delivering weekly crop and crop-condition information based on optical satellite data; the most widely developed in Europe include Farmstar, GeoSys, Talking Fields and Fieldlook.

    Farmstar is owned and backed by Astrium, now part of EADS. They started in France and now operate in Ukraine, Brazil and the USA. They tried to enter the UK but were defeated by the weather (cloud cover). As you can see, our weather today and for the foreseeable future is cloudy with rain and a bit of sunshine.

    [Image: UK weather map] EO satellites only pass every few days or even weeks; the new Sentinel-2 satellites will pass every 5 days, so the chances of it being cloud-free when the satellite passes over at this time of year are very slim.

  • The only thing that holds me back about agriculture is the new satellites. I have had many meetings with many people and agencies in my country (Türkiye/Turkey). They are all planning to send a satellite into space just for agriculture, and these satellites will have the hardware required for precision agriculture. So if I were you, I would think twice before saying "the number one application for drone technology over the next few years will be agriculture".

  • I'm already working in this area and here are some tough issues:

    • Generating orthomaps is not as easy as stitching in ICE. You can only decently stitch orthorectified photos, and generating those requires some hefty programming power and more items in the toolchain. In my opinion, well tuned 3-axis brushless gimbals will go a long way to almost negating perspective variation in nadir photos, especially from slower flying multicopters. But you'll still need to orthorectify to get usable photomaps, it just means your success rate will be much higher.
    • NDVI photomaps take the problem a step further. There are various issues, but one of them is inter-period consistency and comparability. Our drones fly low enough to pick up lots of shadow, which is especially a problem with crops like vineyards. You need to overmap an area many times to be able to build a composite image from which to calculate NDVI, and even this only solves the reflectance problem (mostly). How do you adjust for differential light intensity on different days? Farmers want to monitor changes so they can reverse-diagnose interventions and fine-tune them. Again, you need further post-processing to calibrate images; some even feel you need to calibrate each time using a hand-held chlorophyll meter (a minimal calibration sketch follows this list).
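
    As promised above, one simple form that calibration can take is scaling each flight's imagery against a ground target of known reflectance, so imagery from different days becomes comparable. A minimal sketch with hypothetical panel values:

    ```python
    # Scale an image so a reference panel of known reflectance reads true.
    import numpy as np

    def calibrate(image, panel_pixels, panel_reflectance):
        """Return the image scaled so the panel region matches its known value."""
        scale = panel_reflectance / float(np.mean(panel_pixels))
        return image * scale

    raw = np.random.rand(100, 100).astype(np.float32)   # placeholder image
    panel = raw[10:20, 10:20]                           # panel's pixel region
    calibrated = calibrate(raw, panel, 0.5)             # assume a 50% grey panel
    ```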

    So all in all, I like your approach and your infographic is very helpful, but the discussion only starts here; there is a plethora of issues to deal with.

    • Well articulated John.
