Sequoia In The Wild!

Well, not the wild, perhaps, but the most mapped place on the planet: the Berkeley Marina.

(Disclaimer: I work for Parrot. I did this on my own time, and for my own knowledge and confidence in the product. I was excited by the results, and thought this group would be as well.)

I know many of you here are like me, exploring the capabilities of sUAS vehicles, sensors and processing workflows. It is a quickly changing and dramatic landscape. The business side of drones is comparable to the “Wild West,” with dramatic story-lines and pricing shoot-outs happening in the streets. Similarly, the technology side of drones is comparable to the steam engines of the same era: modern engineers are quickly iterating on and surpassing “legacy” technology whose life-span is measured in months, not years.

Enabled by this rapid pace of sensor innovation, Sequoia is a multi-spectral sensor with four discrete bands, a separate RGB sensor, and a fully integrated light incidence sensor with GPS and IMU, all packaged in a compact, common physical form factor. When Parrot announced the Sequoia sensor at the World Ag Expo a few weeks ago, I still had many questions. “The consumer-product company, Parrot, really made this sensor?” was quickly followed by more useful questions like, “Does it really do what I think it does? What’s the processing workflow? How do I explain this to people?”

As part of my job at Parrot, I have early access to these sensors. While I’m generally a very optimistic person, I’m also a skeptic at heart. I know how difficult it is to engineer, manufacture, test and ship hardware/software products at volume. When a Sequoia came across my desk last week, I couldn’t pass up the opportunity to put it in the air and perform my own, fairly unscientific, end-to-end validation test. I had very little to do with bringing this product to life, but I am excited by the quality and capabilities. Next time you see a Parrot engineer, give them a high-five.

Sequoia was designed to be extremely easy to integrate into existing hardware configurations. The only connection the system needs is 5V at 3A. I added an appropriate BEC to my 3DR Y6, spliced in a female USB connector, and the sensor was fully functional. I took an FR4 plate and cut it to fit around the lens array, and the camera fit perfectly into the gimbal. The USB cables the sensor comes with are a little bulky and stiff for this kind of installation, but with some cable routing the package was quickly ready to be put in the air.

Sequoia has an IMU in both the irradiance sensor and the camera itself. What does that mean? The compass dance. Fortunately, it’s painless and very similar to a BeBop2 in duration and complexity. It took about 20 seconds to get through yaw, pitch and roll for both sensors.

There is an SD card expansion slot in the irradiance sensor. I put a 16GB card in there to make image transfer super easy. Given the amount of data this thing collects, the fastest SD cards you can buy would be a good choice.

In addition to USB PTP control, there is also a WiFi connection available on Sequoia. This gives you access to a browser-based camera configuration tool where, among many other things, you can set time- or distance-based triggering. You can also telnet into a root prompt on the camera. Since you can do this with many other Parrot products, I’m assuming it will also be available on the production version of the sensor. I’m going to keep checking back on http://developer.parrot.com/ to see when they post more information on the API.
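
For the scripting-inclined, here is the kind of thing that configuration service should make possible once the API is published. To be clear, the address and endpoint names below are my own placeholders, not documented Parrot API; treat this as a sketch of the idea only.

```python
import requests

# Placeholder address for the camera once you join its WiFi network;
# the real address and endpoints belong in Parrot's API docs.
SEQUOIA = "http://sequoia.local"  # hypothetical

# Hypothetical endpoint name, for illustration only.
print(requests.get(f"{SEQUOIA}/status", timeout=5).json())

# e.g., switch to time-based triggering at a 2 s interval
# (endpoint and parameter names are placeholders, not the documented API).
requests.post(f"{SEQUOIA}/capture/config",
              json={"mode": "timelapse", "interval_s": 2},
              timeout=5)
```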

Once powered up in the field, a green light on both the camera and the irradiance sensor means that it’s ready to produce fully geo-tagged images. Below is a screenshot of what that means. I did zero configuration on this camera. After initialization, I pressed the shutter button twice to start time-based capture at one-second intervals (more on this later). As you can see, every time the sensor is triggered, it takes five pictures that are all geo-referenced.
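
You can verify the geotags yourself by reading the EXIF GPS headers off the files. A minimal sketch using the exifread Python library, with the file name taken from one of the sample images linked below:

```python
import exifread

# Each trigger produces five files sharing one sequence number:
# an _RGB.JPG plus _GRE, _RED, _REG and _NIR .TIF band images.
with open("IMG_160227_224926_0238_RGB.JPG", "rb") as f:
    tags = exifread.process_file(f, details=False)

# Print the position tags embedded in the image.
for key in ("GPS GPSLatitude", "GPS GPSLongitude", "GPS GPSAltitude"):
    print(key, tags.get(key))
```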

One of the unique things about this sensor is the discrete bands that it captures simultaneously. The advantage is that the sensor can measure specific wavelengths more precisely than standard RGB sensors can. When processing these wavelengths on the back-end, users are able to combine the discrete bands into whichever indices best match their specific requirements.
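
To make that concrete: NDVI, the most familiar of these indices, is just (NIR - Red) / (NIR + Red) computed per pixel. Here is a rough numpy sketch using the raw band TIFFs linked below. Note that the four lenses are physically offset, so raw frames are not pixel-registered, and these are raw digital numbers rather than calibrated reflectance; tools like Pix4D handle both of those steps, so treat this as illustration only.

```python
import numpy as np
from PIL import Image

# Raw red and near-infrared frames from a single capture.
red = np.asarray(Image.open("IMG_160227_224926_0238_RED.TIF"), dtype=np.float64)
nir = np.asarray(Image.open("IMG_160227_224926_0238_NIR.TIF"), dtype=np.float64)

# NDVI = (NIR - Red) / (NIR + Red), guarding against division by zero.
ndvi = (nir - red) / np.clip(nir + red, 1e-9, None)
print(ndvi.min(), ndvi.mean(), ndvi.max())
```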

After validating that the sensor was working properly on the ground, I started up Mission Planner. I wasn’t sure of the field of view of the sensor, so I took a guess and used the S110 camera model in the grid planning tool. At 50m planned altitude, this reasonable flight plan is what was generated. After preflight checks and starting the sensor, I sent the Y6 off on its mission.
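
In hindsight, I could have skipped the guess: with the monochrome specs listed in the comments at the bottom (4.8 x 3.6 mm sensor, 3.98 mm focal length), the ground footprint falls out of similar triangles. A quick back-of-the-envelope sketch:

```python
# Ground footprint of the monochrome bands:
# footprint = altitude * sensor_dimension / focal_length.
altitude_m = 50.0
focal_m = 3.98e-3                        # from the specs in the comments
sensor_w_m, sensor_h_m = 4.8e-3, 3.6e-3  # from the specs in the comments

width = altitude_m * sensor_w_m / focal_m    # ~60.3 m
height = altitude_m * sensor_h_m / focal_m   # ~45.2 m
print(f"footprint at {altitude_m:.0f} m: {width:.1f} m x {height:.1f} m")

# Grid line spacing for, say, 75% sidelap:
print(f"line spacing for 75% sidelap: {0.25 * width:.1f} m")
```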

The flight went smoothly, and the data was easy to validate in the field. I pulled the SD card from the irradiance sensor, put it in my laptop and made sure there were enough geo-tagged images. I was pleasantly surprised at the quality of the RGB images. A multi-rotor is an inherently bad place to be if you are trying to be precise, so I was a little worried about rolling shutter or noise, but those worries were unfounded.

RGB: https://www.dropbox.com/s/pb16jzu5dbltz3y/IMG_160227_224926_0238_RGB.JPG?dl=0

Green: https://www.dropbox.com/s/lxbwa6xm9wz0tsv/IMG_160227_224926_0238_GRE.TIF?dl=0

Red Edge: https://www.dropbox.com/s/nd2rcuyshx924ip/IMG_160227_224926_0238_REG.TIF?dl=0

Red: https://www.dropbox.com/s/tpdcaxc9jxcapjl/IMG_160227_224926_0238_RED.TIF?dl=0

NIR: https://www.dropbox.com/s/17zx8k93srs7ujv/IMG_160227_224926_0238_NIR.TIF?dl=0

Speaking of enough images, a one-second interval is clearly too frequent. On this flight, with 5 images taken every second, there are 1791 images for Pix4D to process. 90%+ overlap is unnecessary for this type of processing.

The next flight will use distance-based triggering to test proper overlap, but for now, I manually culled two out of every three picture sets to get to a more reasonable 771 pictures. Keep in mind that this is total pictures, including all discrete bands and RGB.
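
Culling by capture set is easy to script, since all five band images from a single trigger share the same sequence number in the file name. A rough sketch of the approach (keep one set in three), assuming the flat file layout shown in the sample file names above:

```python
import os
import re
from collections import defaultdict

SRC = "sdcard"  # folder holding the raw Sequoia images (assumed layout)

# Group the five per-trigger files by their shared sequence number,
# e.g. IMG_160227_224926_0238_RGB.JPG -> set "0238".
sets = defaultdict(list)
pattern = re.compile(r"IMG_\d{6}_\d{6}_(\d{4})_\w+\.(JPG|TIF)$")
for name in os.listdir(SRC):
    match = pattern.match(name)
    if match:
        sets[match.group(1)].append(name)

# Keep every third capture set; delete the other two out of three.
for i, seq in enumerate(sorted(sets)):
    if i % 3 != 0:
        for name in sets[seq]:
            os.remove(os.path.join(SRC, name))
```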

It’s still not pretty spacing, but we will see what Pix4D thinks of it. Using beta Pix4D version 2.1.34, the Sequoia images are seamlessly identified and put into an appropriate Camera Rig. https://goo.gl/2Y3Epz

I am not a Pix4D expert, and I wanted to see how the default configurations worked, so I selected the 3D Maps template and clicked “Start Processing Now!”. Again, I’m more interested in the workflow and time required than in fine-tuning the knobs available in Pix4D.

First things first, the quality report. https://db.tt/4kZm0mYA

Well, not the best-looking quality report I’ve ever seen. I assume it’s primarily due to the “shotgun” style triggering and image culling. I like to see the first four checks green here. The GSD is 5cm at 50m, which seems pretty good to me. Full initial processing took 5 hours, not too bad on an 8GB RAM machine with an Nvidia 970 GPU. I think we should see closer to ten thousand keypoints, and I’m not sure why there are three image blocks when we want one. It seems to me we’re fairly close on all of these checks, though. How did the results come out?
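
That 5cm figure lines up with what the monochrome specs (in the comments below) predict, which is a nice consistency check:

```python
# GSD = altitude * pixel_pitch / focal_length, for the monochrome bands.
altitude_m = 50.0
pixel_pitch_m = 3.75e-6   # 3.75 um pixels, from the specs below
focal_m = 3.98e-3         # 3.98 mm focal length, from the specs below

gsd_cm = 100 * altitude_m * pixel_pitch_m / focal_m
print(f"GSD at {altitude_m:.0f} m: {gsd_cm:.1f} cm/pixel")  # ~4.7 cm
```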

Coverage area is OK; the top right-hand point is not accurate, but sidelap was apparently sufficient. Not a bad result for poor image-collection technique and nearly zero input into the processing! I’m sure someone with more experience in Pix4D could clean this up nicely.

Zoomed in, the ortho also looks pretty good: nice straight lines, good level of detail, not much ghosting. Here’s a link to the full ortho. https://www.dropbox.com/s/zx1ax2mc30b13fl/second_marina_geotag_transparent_mosaic_group1.tif?dl=0

Let’s do a quick sanity check on those aggregate piles.

The larger pile is around six cubic meters and the smaller around five. These smaller piles are a little easier to estimate by eye, and by my expert eyeball measurements, six and five seem pretty close!
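
Pix4D computes these volumes for you, but the underlying idea is simple: sum the height of the surface model above an estimated base plane over every cell inside the pile boundary. A toy sketch of that calculation, assuming a DSM exported as a GeoTIFF and a pre-made boolean mask of the pile footprint (both file names are placeholders):

```python
import numpy as np
import rasterio

# Hypothetical exported DSM; Pix4D does this internally.
with rasterio.open("dsm.tif") as src:
    dsm = src.read(1)
    cell_area = abs(src.transform.a * src.transform.e)  # m^2 per cell

mask = np.load("pile_mask.npy")  # boolean footprint of one pile (assumed input)
base = dsm[mask].min()           # crude flat base-plane estimate

volume = ((dsm[mask] - base) * cell_area).sum()
print(f"pile volume: {volume:.1f} m^3")
```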

Finally, it’s time to check out what Sequoia was designed to do: produce accurate multi-spectral imagery. Pix4D has done a great job on their Index Calculator, and the changes for Sequoia in this latest beta really help simplify the process of producing vegetative indices.

You first generate the reflectance map, then define the region to analyze, and finally choose the index you want to process. In this case, we have the NDVI index. I think the data in the top right corner (red shading) is questionable and should ideally be removed from the analysis. The region tool is helpful for excluding information that doesn’t need to be processed. There are many of you here who understand this better than I do, so I will not pretend to be able to interpret this map.
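
If you prefer doing that exclusion in code rather than in the GUI, it amounts to masking the index raster before computing statistics. A trivial sketch (the file name and corner size are placeholders):

```python
import numpy as np

ndvi = np.load("ndvi.npy")  # index raster exported earlier (assumed)

# Drop the questionable top-right corner before summarizing.
trusted = np.ones_like(ndvi, dtype=bool)
trusted[:200, -300:] = False

print("mean NDVI over trusted region:", ndvi[trusted].mean())
```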

Finally, you can generate and export variable-rate prescription shapefiles to import into your precision farming applicator of choice. For example, Ag Leader.

http://www.agleader.com/blog/loading-prescription-files-into-ag-leader-integra-and-versa-displays/
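
To give a flavor of what such an export contains: a prescription file is essentially polygons carrying an application-rate attribute. Here is a rough sketch of building one from an NDVI raster with rasterio and fiona; the thresholds and rates are made-up placeholders, not agronomic advice:

```python
import numpy as np
import rasterio
from rasterio import features
import fiona

# Hypothetical input: an NDVI GeoTIFF exported from the index calculator.
with rasterio.open("ndvi.tif") as src:
    ndvi = src.read(1)
    transform = src.transform
    crs_wkt = src.crs.to_wkt()

# Bucket NDVI into three zones (placeholder thresholds).
zones = np.digitize(ndvi, [0.3, 0.6]).astype(np.int16)
rate_for_zone = {0: 150.0, 1: 100.0, 2: 50.0}  # placeholder rates, e.g. kg/ha

schema = {"geometry": "Polygon", "properties": {"rate": "float"}}
with fiona.open("prescription.shp", "w", driver="ESRI Shapefile",
                schema=schema, crs_wkt=crs_wkt) as out:
    # Vectorize contiguous zone regions into polygons with a rate attribute.
    for geom, zone in features.shapes(zones, transform=transform):
        out.write({"geometry": geom,
                   "properties": {"rate": rate_for_zone[int(zone)]}})
```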

An agronomist or other expert eye is still required to make sense of these vegetative indices in relation to individual crops, but companies like Airinov (http://airinov.fr) are working to automate the analysis based on crop-specific phenology.

We’ve come a long way in the past few years. While much of this work has been done for decades using satellites and manned aircraft, we are just now able to begin discussing viable end-to-end workflows for drones in the agricultural space.

Multi-spectral and aerial technology are merging in a way that can quickly and accurately produce vast amounts of data for one of the largest industries on the planet. Sequoia takes a great step toward commoditizing complex data acquisition for agriculture. Gathering cost-effective, timely, high-resolution data is no longer the challenge it once was.

The challenge we must now overcome is listening to farmer and agronomist requirements on ease-of-use, interoperability and data life-cycle, so that we as a drone industry can provide products and services that build trust at scale in an industry that is far more experienced, confident in its roots, and wary of newcomers.





Comments

  • Hello @John, @Nick

    I'm facing an issue with the Sunshine Sensor LED not lighting up, so I have no idea whether the GPS/GNSS and Sunshine sensor are working.

    If anybody has faced a similar issue, please let me know the solution.

    I have also posted this question in my discussion; please respond if you know the solution.

  • Hi John,

    Great thread! Any answers to your last post above?

    Thanks...

  • @James - correct, it requires 5V at 2.4A; I used a BEC.

    @Troy - I'm going to try and do the S100NIR next to Sequoia next week. I'm not sure how SenseFly is handling the eBee field upgrades. I'll ask.

    @Leo - There is someone working on the Pixhawk to Sequoia PTP bridge. I'm not sure of the current status.

    @Colin - These are the numbers I have from an engineer at Parrot. Thanks for setting up the group!

    Full specs:

    D-FOV (diagonal):
    - Monochrome: 89.6°
    - RGB: 73.5°

    Monochrome:
    - Pixel size: 3.75 µm
    - Focal length: 3.98 mm
    - Resolution: 1280 x 960
    - Sensor size: 4.8 mm x 3.6 mm

    RGB:
    - Pixel size: 1.34 µm
    - Focal length: 4.88 mm
    - Resolution: 4608 x 3456
    - Sensor size: 6174.72 µm x 4631.04 µm

  • Scratch that, found it.

    But what's the consensus on the monochrome focal length? 3.98mm as per manual or 3.02mm as mentioned above?

  • Hi guys, does anyone know the sensor size for the Sequoia? Need it for the Pix4D GSD Calculator.

    Facebook usergroup here for those interested:
    https://www.facebook.com/groups/parrot.sequoia/

  • Hi John, we are planning to do precision drone mapping with an onboard RTK system. I wonder if it's possible to trigger the Sequoia with relays from a Pixhawk, or to get an output signal each time the Sequoia takes a photo? Thanks,

  • Hi,

    We fly a standard eBee, not the Ag, and wondered whether buying a Sequoia is plug-and-play. Will a standard USB work?

    Also, has anyone compared the Canon S110 (a pretty standard cam for the eBee) to the Sequoia RGB at 100m? I know it's 16MP, but megapixels are only half the story.

    Our other drone is almost finished, with a Sony A5300 in mind, but the Sequoia might come first.

    Cheers

    Troy

  • @John C. How are you powering the Sequoia? It looks like it's only powered via USB.

  • Great study!

  • We have successfully mounted a fully gimbal-stabilized Parrot Sequoia camera on the DJI Matrice 100 (M100). The data is preprocessed and properly culled onboard, with complete onboard image registration, to output fully aligned imagery that is ready to upload. We have made it compatible with our Map Pilot app to properly manage overlap and handle the automated flight control.

    Coming from a scientific and military multispectral imaging background, we really have nothing but nice things to say about the Parrot Sequoia camera. They did a really great job. The best part is that you can take data at the same location under different lighting conditions and get the same answer.

    Here is the data sample from our maiden voyage:

    https://www.mapsmadeeasy.com/maps/public/8f4055c82d0f4f11986b342b4f...

    https://www.youtube.com/watch?v=FN3qGC2fRGk

    Sequoia 4/1
    Mission Bay Flying Field at 80m