Jeff Taylor's Posts (32)

New Event 38 Mission Planner Features

The latest version of Mission Planner for Event 38 has just been released, and we have some exciting new features to share. We often develop these items specifically for our aircraft, but I like to hear from the DIY Drones community which features, if any, you would find useful. If anything looks interesting, let me know and we'll work to get it into the main Mission Planner development branch. As always, you can download our version as-is on our Downloads Page.

Preflight Checklist

The Preflight Checklist has been updated to help ensure that the proper mission and camera are selected prior to takeoff. After checking the box next to Valid Waypoints, Mission Planner will download the waypoints off the drone and display them on the Flight Data screen's map for inspection.

If your drone uses multiple cameras, you can select the camera used for each flight and check the box to upload the correct parameters to trigger that camera.

Inflight Monitoring

The new inflight monitoring section is an easy way to tell at a glance if there is anything that needs the operator's attention. By default, it checks for low altitude, excessive groundspeed, orientation, GPS satellites, excessive current draw, and servo voltage. You can access and adjust the parameters or add a new custom monitor by pressing ctrl+p. 
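A threshold-style monitor like this is easy to sketch. The parameter names and limits below are illustrative assumptions for the sake of the example, not Mission Planner's actual monitors or defaults.

```python
# Hypothetical sketch of threshold-based inflight monitoring.
# Names and limits are made up for illustration only.
DEFAULT_MONITORS = {
    "altitude_m":     lambda v: v >= 30,          # warn below 30 m AGL
    "groundspeed_ms": lambda v: v <= 25,          # warn above 25 m/s
    "gps_sats":       lambda v: v >= 6,           # warn under 6 satellites
    "current_a":      lambda v: v <= 20,          # warn above 20 A draw
    "servo_v":        lambda v: 4.8 <= v <= 6.0,  # warn outside servo range
}

def check_telemetry(sample: dict) -> list:
    """Return the names of monitors whose checks fail for this sample."""
    return [name for name, ok in DEFAULT_MONITORS.items()
            if name in sample and not ok(sample[name])]
```

A custom monitor would then just be another entry in the dictionary, which is roughly the kind of extensibility the ctrl+p dialog exposes.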

Perimeter Run

The edges of a mission tend to have the least overlap. In cases where it's difficult to overfly neighboring areas or where it's critical that the full mission area have maximum overlap, it can make sense to run a lap around the edges of the mission area taking pictures. We've programmed this route to be added automatically when you select the Perimeter Run checkbox in the Survey/Grid screen.

Tuning Averages

This one is mostly for internal and testing use, but if you want to keep a particularly close eye on certain parameters in flight, we've added a min, max and average readout to the tuning graph and added the ability to set the time scale. We use this for trend monitoring on production aircraft.

Read more…


The latest version of the Event 38 geotagging utility is now available for download and, as always, open to the wider DIY Drones community. It's compatible with standard MAVLink telemetry logs. If you're using telemetry logs to geotag your images and find that the number of CAM messages doesn't always match the number of images taken by your camera, this utility will help by automatically sorting out which CAM messages were missed. The utility will skip tagging images for which there are no CAM messages available. We've found that missing a few tags here and there is no problem for most missions. Even missing a relatively large number does not necessarily ruin a mission, as long as the geotags available are spread evenly throughout the mission area.


In the example image above, we missed a lot of geotags in the corner of the mission while the plane was very far from the ground station. Tagging these images 1:1 with the camera messages resulted in a large error as each misaligned tag built on the last. The new utility took care of it and the overall geotag error was just 4.2m. Grab the latest utility on our downloads page, and let us know what you think.
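To illustrate the idea (this is not the utility's actual algorithm), a sketch that pairs images with CAM messages by nearest timestamp, and skips images with no message within a tolerance instead of tagging 1:1 by index, might look like:

```python
# Illustrative sketch only: pair each image with the closest unused CAM
# message in time, skipping images with no match within `tol` seconds.
def match_cam_to_images(cam_times, image_times, tol=1.5):
    """Return {image_index: cam_index}; unmatched images are left untagged."""
    matches = {}
    used = set()
    for i, t_img in enumerate(image_times):
        # nearest CAM message not already claimed by another image
        best = min((j for j in range(len(cam_times)) if j not in used),
                   key=lambda j: abs(cam_times[j] - t_img), default=None)
        if best is not None and abs(cam_times[best] - t_img) <= tol:
            matches[i] = best
            used.add(best)
    return matches
```

The key property is that a missed CAM message mid-flight no longer shifts every later tag, which is what produced the compounding error described above.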

Read more…

Process Maps, NDVI and 3d Models for Free


With the Drone Data Management System™ officially out of beta testing, I thought it would be helpful to go through and show new users how they can still use it to process orthomosaics, NDVI maps and 3d models with imagery from any drone for free. This isn't intended to be a comprehensive guide to mapping techniques, but I'll go through the basics for the sake of those with no experience. If you already have some data you're ready to try, skip to the post-processing section below.


Data Collection

First of all, to make any map, you need to collect the right kind of imagery. For a drone, that means collecting images that point almost straight down and overlap at least 60% with their nearest neighbors, front and back as well as side to side. In other words, if your camera has a field of view of 100 meters on the ground along the path of travel, it should move only 40 meters between shots. In general, 60% overlap strikes a good balance between adequate coverage and excessive data collection, which slows down processing.
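The spacing rule above is simple enough to sketch; `shot_spacing` here is a hypothetical helper for the arithmetic, not part of any planning tool.

```python
# Shot spacing from ground footprint and desired forward overlap:
# at 60% overlap, the aircraft moves 40% of a footprint between shots.
def shot_spacing(footprint_m: float, overlap: float) -> float:
    """Distance to travel between shots for a given fractional overlap."""
    return footprint_m * (1.0 - overlap)

print(shot_spacing(100.0, 0.60))  # 100 m footprint at 60% overlap: 40 m
```

The same formula applies side to side: sidelap determines the spacing between adjacent flight lines instead of between consecutive shots.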

Planning for overlap and sidelap for Pixhawk vehicles is straightforward if you use the Mission Planner Survey (Grid) function. You can read more about that here: ArduCopter Wiki

DJI doesn't provide planning tools for surveying missions, but some third parties do offer these features for the Phantom 3, 4 and Inspire 1. The UgCS app, for example, lets you plan a mission on a PC and then load it onto your drone. UgCS is free for private use and supports automated survey mission planning with camera and drone presets for DJI vehicles.

If your drone doesn't have a gimbal, do your best to fix the camera pointing straight down. This might mean aiming a little forward to compensate for a multirotor's pitch in flight. Most drones in ordinary weather conditions should be able to maintain adequate overlap. DDMS compensates for small angles off vertical that may be in some images.

Once you've collected all the raw images, give them a quick look. Make sure they aren't blurry, don't have part of your drone in frame, and that they appear to have significant overlap. It's not necessary to geotag your imagery before processing, but there are advantages to doing so if you can. It speeds up processing, scales your map, reduces warping artifacts and makes it possible to take measurements from your orthomosaic. If your drone is running on Pixhawk and you've triggered your camera using the distance trigger function, DDMS can automatically geotag your images if you upload the telemetry log (.tlog) with your images. DJI drones automatically add geotags to each image.



Uploading to DDMS is simple. First, sign up here using just an email address. Then log in and click Create Mission. Enter a description for your mission and select the images and supporting files (.tlog) from your flight. Then just click Upload. At that point your job is done; leave your browser open on this page until it reports that all images have been uploaded successfully!

To add NDVI or a 3d model to your mission, open the Missions Page in a new browser tab, select your mission and click the Analyze Images button. Be sure to leave the upload page open until it finishes. On the Analysis page, you'll have the option to add NDVI and 3d Model generation. More apps are available with a paid membership.

Sample "Analyze Images" Page, Select Apps Here


Our NDVI process is optimized for NGB (NIR, Green, Blue) converted cameras such as our own custom filters/cameras and similar options like those from MaxMax. Some converted cameras, particularly the small, cheap ones, have serious problems with rolling shutter distortions. These distortions make stitching much more difficult as they aren't consistent like the distortions from a lens, so try to check out some sample imagery before choosing a camera.

DDMS will take all your images and automatically process them into a 10cm/pixel geotiff (higher resolution is available with paid memberships). If geotags are included, the mosaic will be georeferenced. It also automatically tiles your map for viewing online using Map Viewer. You can download the raw geotiff and NDVI mosaic for offline viewing by clicking Access Downloads on the mission page. Oftentimes these files are too large for ordinary picture-viewing software to open. QGIS or GlobalMapper are good options for offline viewing of geotiffs, but it's almost always slower to work with these large files offline.

The Analyze Images page also shows you the status of each step of processing as it progresses. When each step reaches 'Ready' status, you can access the result by clicking directly on it. In Map Viewer, toggle between the ortho-mosaic and NDVI results by selecting the layer from the upper-right corner and adjusting the opacity slider.

Composite of two Map Viewer Windows, Raw NGB Orthomosaic on Left, NDVI on Right


DDMS also recreates a 3d model of each mission. Select the 3d Model app and download the files to explore using Sketchfab, Meshlab or another modeling package. Although the beta period is over, we are still actively seeking feedback from the community. We see a lot of users doing things we didn't expect them to do with drone maps and we want to keep encouraging that kind of experimentation!


You can read more about the Drone Data Management System™ here. We have more apps available for the Pro and Advanced tiers, like DVI, 3d PDF, DSM, KMZ, Volume Calculation and Point Cloud Exports. Let me know what you think: what would be useful for you?

See the original post on Droneyard.

Read more…

APM to CHDK Camera Link Tutorial

[Edit 12/31/2015: Note for Pixhawk users that Canon cameras require a 5V pulse to trigger so a voltage step-up from Pixhawk's 3.3V output is required. See compatible cable and link to more instructions here.]

ArduPilot Camera Tie-In Tutorial

This tutorial will show you how to get your CHDK-enabled camera connected to your ArduPilot Mega (APM) without buying any extra hardware or decoding PWM outputs. You might want to make this connection to take advantage of the camera trigger function based on distance covered, both to ensure a certain overlap and to avoid excessively high overlap. This modification is not necessary; if you have a CHDK camera and want to keep it simple, you won't gain much by switching to this method right away.

For this tutorial, I'll use an SX260 HS, but this should work equally well on any CHDK-supported Canon Powershot camera. You'll need just the supplies below to make the cable. The SX260 uses a USB Mini-B connector. Most Powershot cameras come with a USB cable; you can just use that one to be safe.

APM CHDK Connection

If you don’t have a crimp tool to attach the servo connector, just take a spare servo wire and solder the wires together instead.

We'll only be using the Mini-B side of this cable, so measure whatever length you need from that end, depending on how your camera and autopilot mount in your airframe. I've measured out about 14″ and cut the cable completely through. Inside are four wires; we only need the red and black ones, so cut away the green and white wires. Strip the ends of the red and black wires.


If you have the crimp tool, crimp and insert the wires into the first and third positions of a 3-position header. If you don’t have the crimp tool, solder these two wires to the ground and signal wires of a spare servo wire. Insert the ground wire into the side with an arrow so you can tell which wire is which later on.


Cover the connector with heatshrink. Be careful with the hot air near the connector as it can cause the locking plastic pieces in the receptacle to deform.

CHDK Cable

Now plug this cable into pin A9 on the side row of the APM. The text doesn't line up exactly, so be sure to count rows: SPI takes two rows, then A11 and A10 take two more, so there should be four rows of free pins before our new connector.

APM Relay Pin A9

If you already have CHDK installed, just install the script file (right click, save link as) into the scripts directory on your SD card. If not, follow the instructions for installing CHDK from the page corresponding to your camera at the CHDK wiki. Load up CHDK, go into the menu, then navigate to Miscellaneous Stuff -> Remote Parameters and make sure the Enable Remote setting is checked.


Now connect to your APM and open the Full Parameters List in the Config/Tuning tab. Set the parameters as follows:

  • CAM_TRIGG_DIST: Depends on the overlap required. Be sure to keep in mind the normal downwind speed of your plane as well as the maximum rate at which the camera can take pictures. For the SX260 and the E382, that's once every 2.7 seconds and about 18 m/s. I plan to operate below those conditions in most cases, but I do like to get as many pictures as possible, so I set my distance to 49 (meters).
  • RELAY_PIN: 13
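As a sanity check on the CAM_TRIGG_DIST choice above: the trigger distance must be at least groundspeed times the camera's minimum shot interval, or the camera will fall behind. This hypothetical helper (not part of Mission Planner) reproduces the 49 m figure from the SX260/E382 numbers.

```python
import math

# Minimum trigger distance the camera can sustain: if the aircraft covers
# more ground per shot interval than CAM_TRIGG_DIST, triggers are skipped.
def min_trigger_distance(max_groundspeed_ms: float,
                         min_shot_interval_s: float) -> int:
    """Smallest whole-meter trigger distance the camera can keep up with."""
    return math.ceil(max_groundspeed_ms * min_shot_interval_s)

print(min_trigger_distance(18.0, 2.7))  # 18 m/s x 2.7 s -> 49 m
```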



All that's left is to test it out. Load the E38_APM.bas script file like any normal CHDK script and start running it. If you can get a 3D fix indoors, you can test it on your lab bench using just USB power. Set the CAM_TRIGG_DIST to 1 or 2 meters and let your plane sit on the desk. Small movements between GPS readings will cause the distance value to count up slowly, and it should trigger the shutter every few seconds.

If your camera lens closes and opens instead of taking a picture, make sure the CHDK remote enable setting saved properly. If nothing happens at all, take your APM outside and walk around to make sure you are getting some distance covered.

Original Post

Read more…

Introducing the E386 Mapping Drone


Our goal at Event 38 is to build products that make aerial data collection for mapping as easy as possible. For a while now, our team has been focused on creating a better autonomous landing solution. At the moment, recovery of a fixed-wing drone requires either a large area for autonomous landings or an operator comfortable with taking manual control.

The two biggest concerns with any autonomous landing solution are:

  1. Onboard sensors tend to drift over time. If there’s a barometric pressure change during a long flight, the drone will develop a discrepancy between its measured altitude and the true altitude above ground level.

  2. A descending drone picks up a lot of speed as it dives toward the ground, but slow speeds are desirable for safety and soft landings.

After much research, development and testing, our engineering team has built a solution that enables a drone to descend rapidly while minimizing forward speed and flaring precisely before impact for a soft landing every time. This makes autonomous landings possible in extremely confined areas, surrounded by tall trees or other obstacles.

Today I'm very excited to share with the DIY Drones Community that this capability is available in our new E386 mapping drone, shipping as of today! The E386 has a fully customized landing algorithm which incorporates precision laser altimeter data with the ability to fully reverse the motor's thrust. See it in action below.

Descent gradients as high as 60% are possible when landing with a headwind. The overall size of clear space needed to land the E386 is just 65 x 25 meters. The E386 knows exactly where the ground is and exactly where it needs to flare, every time. Precision flaring not only protects the E386 body from impact damage, but it allows the E386 to touch down as soon as it achieves a safe speed.


The E386’s landing setup procedure is completely streamlined and only requires you to define the landing runway’s beginning and end points. Mission Planner then shows you where the E386 may touch down given various wind conditions, as well as its minimum projected altitude along its dive slope.


The E386 also comes complete with a full year subscription to the Drone Data Management System™ Professional Tier. The Drone Data Management System™ is a cloud-based set of tools that store, analyze and share data collected by any drone. DDMS™ automatically creates a geotagged orthomosaic and tiles large maps to be viewed quickly in Map Viewer, our online map tool. The entire processing workflow is automated from geotagging to DEM, NDVI, DVI and more analyses.

DIY Drones members who want to add reverse thrust landing to their planes will soon be able to! We're currently working on generalizing our dive algorithm to work with other aircraft. Once it's ready, we'll submit it for inclusion in the master ArduPlane codebase - special thanks to Tom Pittenger for his help on this! We'll also post a tutorial here about setting up the parameters as well as hardware and operational recommendations. We plan to continue to iterate on this technique to improve landing accuracy and to roll out the updates to E386 and ArduPlane users as they're ready.


Questions and feedback welcomed! Feel free to leave a message in the comments or to tweet me @mJeffTaylor.




Read more…

Drone Data Management System™ Public Beta


Since 2011, Event 38 has built a reputation as a business-focused drone and sensor provider. We've been very fortunate to work with some of the most innovative drone operators in the world, and we're really proud of what our customers have accomplished. There's always been a missing piece of the puzzle, though. The tools needed to geotag data, stitch images, calculate vegetation indices, and even just to explore the imagery have been a patchwork of disparate open source projects and expensive software packages. Today I'm very happy to announce the launch of the Event 38 Drone Data Management System's™ public beta!

The Drone Data Management System™ is a cloud-based set of tools that store, analyze and share data collected by any drone.

DDMS streamlines the entire map post-processing workflow. Upload, Process, Analyze and Share. Easily upload the images you want to include with the flight’s telemetry log (.tlog) and DDMS will automatically tag your images, even if you had a few telemetry dropouts mid-flight. DDMS automatically creates a geotagged orthomosaic and tiles large maps to be viewed quickly in Map Viewer, our online map tool. DEMs, NDVI and DVI calculations can be added as well.

All processing results are also available as high resolution originals in geotiff format, so you can continue to work with your existing GIS tools. Anything that you process in DDMS can also be privately shared with friends, clients, colleagues and advisors. Permissions are set mission by mission, making it easy to share only what’s required.

The DDMS is still in the very early stages of development but we can’t build anything good in a vacuum. We need your feedback so we can focus on the important features and functionality! Head over to the DDMS Signup Page to get access right away.

While we’re in beta, the DDMS will be completely free to use! Don’t worry, we’ll give you plenty of warning before the beta is finished, and your data will always be yours to download. In the future, we’ll be adding time series analysis, crop stand counts, topographic map exports and ground control point editing.

We’d really appreciate your feedback! What else is important to you as a drone user? What’s missing from your current tool set? Let me know in the comments below, or Tweet me @mJeffTaylor



Read more…


For anyone using Mission Planner as a ground control station, check out these additions we've made to streamline MP for our workflow. We typically run mapping missions on fixed-wing aircraft at Event 38, so these are tailored to that use, but some of our features are more general purpose. I'll outline the features we've added below and open it up to comments about what you think should be merged into the master Mission Planner codebase. Let us know what you want!

Automated Pre-flight Checklist

We're finally coming out of the dark ages: no more paper checklists! Mission Planner already has all telemetry data at its disposal, so why not have it help check off setup criteria too? While the E384 boots up and self-checks each subsystem, the operator goes through the physical inspection and completes the checklist. Green means go!


Accessible Flight Planning Functions

To improve the speed and ease of planning a mapping mission, we've taken the most common functions from Mission Planner's right-click menu and made them more accessible by placing them in the right-hand pane of the Flight Planning tab. Sometimes the simplest changes are the most appreciated!


Cam Message Indicators

Mission Planner now leaves a trail of markers where each picture was taken, making it simple to tell when an area is not being covered as expected. If the aircraft is at a high pitch or bank angle when the picture is taken, the icon appears yellow to warn you that additional coverage in the area may be needed.


There are a few more updates, admittedly more specific to the E384 platform, but feel free to check out the full list here on Droneyard. We're in the process of joining the Dronecode foundation and will work with the dev teams to get these features integrated into the next Mission Planner release. In the meantime, feel free to check out our source code on github, or just go ahead and use our compiled version right away if you want (Click Mission Planner for Event 38 at the top of the page).

Read more…


We've been hard at work improving our popular NGB filter glass over the summer, and the latest design is finally in! This design cuts off NIR light above about 770nm. That reduces the amount of light crossing over to the Green and Blue channels, which are also sensitive to NIR light, especially at longer wavelengths. The results are better than we could have expected! Check out my full post on Droneyard for more info:

Filters are available now, as are pre-converted, flight-ready SX260s and S100s with CHDK.

There are discounts available for resellers and OEMs, so please contact us before making an order!

In addition, we're now offering the 20.3MP APS-C Samsung NX1100 as a regular option on the E384. The option includes a 20mm pancake lens and a hatch-cover system to protect the lens during belly landings. More info on the NX1100 addition, including a sample mosaic, here.


Let me know what you think. We have lots of exciting projects in the pipeline to be announced shortly, and we can't post everything here, so check back for the latest!

Read more…


Check this out if you're interested in adding thermal capability to your drone. The resolution is lower than FLIR's, but the accuracy of the readings seems to be very high (0.2°F). From the Kickstarter Specs Section:

Product Specs:

  • 64x62 thermopile array with integrated optics
  • Best sensor resolution at this price, at 0.61 degree angular resolution
  • No non-uniformity correction needed with thermopile technology
  • Frame rate up to the ITAR regulation limit of 9 frames per second to any fully Bluetooth- or WiFi-capable device
  • Low power consumption; the 850 mAh battery provides up to 8 hours of continuous use without charging, or over a month if used for just 10 minutes daily
  • Android application for smartphone or tablet
  • iOS application for iPhone and iPad
  • Python & OpenCV application for Windows & Linux desktop
  • App- or button-driven laser pointer and online temperature display aligned to the center of the field of view, with 0.2°F accuracy
  • Thermal measurement range: -50°F to 450°F
Read more…

NDVI Post-Processing in Fiji


A lot of people have been asking me how we post-process imagery coming from one of our, or any other company's, NGB converted cameras. There's a very easy way to run this processing thanks to Fiji and the Photo Monitoring plugin by Ned Horning. I've just written up a quick getting started guide to this software, pasted here for convenience. If you're coming across this post in the future, see this link for the most up to date version.

Images taken with modified NGB cameras need to be processed in order to display information about vegetation health. This process is very easy using Fiji and Ned Horning's Photomonitoring Plugin. First, grab a copy of these software packages - the easiest way to get a copy with the Photomonitoring Plugin is to download a pre-configured pack from Flight Riot, here (Look for the text "Click Here to Download FIJI/IMAGEJ with PHOTO MONITORING PLUGIN pre-configured" just below the third paragraph).

Install this version of Fiji then load the program and open an NGB image. The sample image used in this tutorial is downloadable by clicking here.


Open the NDVI processing tool by clicking Single image NDVI from displayed image from the Photo Monitoring dropdown as shown above. Now the NDVI processing tool will open and display several options for how to process your image. Make the following changes to the default settings:

  1. Uncheck "Stretch the visible band before creating NDVI?"
  2. Uncheck "Stretch the NIR band before creating NDVI?"
  3. Change the output color table box to 'ndviClasses_-1_1.lut' as shown below


Now just click 'OK' and after a few seconds two images will load. One is the black-and-white raw NDVI values image and the other is the same image with the selected lookup table applied. A lookup table (or LUT) simply takes the NDVI values from the first image and applies a color depending on the magnitude, which lets us visualize the NDVI values more easily. This is just a way of visualizing the data, though, and does not change it in any way; you can select other LUT files and experiment to see which display you prefer. The default output results in the images below.


This is perfectly usable as it is, but this LUT is designed to display the highest values in green, and the highest values in this image only reach about 0.5, which corresponds to yellow. In order to show more depth in the image, we can re-scale the LUT range from -1.0 to 0.5 so that we get the full range of colors across the full range of NDVI values in this image. To make this change, open the NDVI processing tool again and this time enter 0.5 into the box titled "Maximum NDVI value for scaling color NDVI image". Applying that change gives us the following images (note the black-and-white raw image hasn't changed at all):


In this version, we can see more levels of differentiation within the leaves, and even some different levels in the snow, likely caused by vignetting in the camera's lens. When making a mosaic, stitch the mosaic first and calculate NDVI afterward: this gives the software a chance to favor image-center pixels over image-edge pixels (as most post-processing software does), and the NDVI image loses much of the detail that would otherwise be used to match up overlapping images. Fiji can safely handle mosaics up to several dozen megabytes, but above that it becomes less stable, even if you increase the maximum memory it allows itself.
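For reference, the underlying NDVI math and the LUT rescaling described above can be sketched as follows, assuming an NGB image whose red channel holds NIR and whose blue channel holds visible light (as in the converted cameras discussed here). This is a rough approximation of what the plugin computes, not its actual code.

```python
import numpy as np

def ndvi_from_ngb(img: np.ndarray) -> np.ndarray:
    """img: HxWx3 float array with channels (NIR, Green, Blue) in 0..1.
    Returns per-pixel NDVI = (NIR - VIS) / (NIR + VIS), in -1..1."""
    nir, vis = img[..., 0], img[..., 2]
    return (nir - vis) / np.clip(nir + vis, 1e-6, None)

def rescale_for_lut(ndvi, lo=-1.0, hi=0.5):
    """Stretch NDVI so the LUT's full color range spans lo..hi,
    mirroring the 'Maximum NDVI value for scaling' setting."""
    return np.clip((ndvi - lo) / (hi - lo), 0.0, 1.0)
```

With `hi=0.5`, a pixel at NDVI 0.5 maps to the top of the color table, which is exactly the extra depth the rescaled images show.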

Read more…

New, Higher Performance NDVI Converted Cameras


Late last year we released the first version of our NDVI-converted cameras using Schott BG3 filters. Using those filters we were able to generate pseudo-NDVI images with a single camera, building off the work of the Public Lab project. After working with faculty at the University of Boston and the MEASA Lab at KSU, we determined that we weren't quite getting the best results possible with the BG3 filter glass. Unfortunately, no off-the-shelf filter glass seemed to do exactly what we needed: block out red light and allow NIR light to pass instead. Luckily, we were able to find a manufacturer to help us build a custom filter to our specifications, and the results are looking really good! Below are two images, the first taken by a camera with BG3 filter glass and the second taken by a camera with the new custom glass; the difference is clear.

BG3 Filter Glass

Custom Filter Glass

Aside from higher overall values, there is more differentiation and detail in the leaves and a larger difference compared to the non-organic background material. This is especially important for this type of camera. Since these are uncalibrated, pseudo NDVI images, what is really needed most is high differentiation so that comparisons can be made between the majority of plants and potential problem areas. The higher detail available now makes it possible to catch unhealthy plants sooner.

Custom Filter Transmittance vs Wavelength


The new filters will be priced at $39.99 for the filter alone (8.9×7.9mm; fits the SX260 and S100 at least, potentially others) and $499.90 for a ready-to-go converted SX260 with CHDK. It's a little cold to be doing any crop surveys here in the Northeastern US right now; I actually ended up killing those two plants after taking them outside for just a few seconds! We'll be out collecting imagery as soon as anything starts growing, and I'll post updates as they become available.

Read more…

Bluetooth Telemetry Bridge - Last Chance


Just a friendly reminder, now is the last chance to grab a Bluetooth Telemetry Bridge before the Kickstarter ends tonight at 10:30PM EST. We've already had an overwhelming response and reached our goal in less than 24 hours! After tonight, there will be a bit of delay before we can take any more orders because we'll prioritize Kickstarter pre-orders.

Here is a link to the Kickstarter Page and some more information from my original post here on DIY Drones.

If you're not familiar with the Bluetooth Bridge, it's a simple device that connects your 3DR Radio or RFD900 telemetry stream to any Android phone or tablet running a ground station app. The most popular ground station for Android is DroidPlanner. The device is very easy to use: just power it on using the internal rechargeable battery (recharged with a USB cable). The battery lasts for over 5 hours of continuous operation. Right now, the Bluetooth Bridge does not support iOS devices because Apple requires a special type of Bluetooth hardware to integrate properly.

The Bluetooth Bridge can also help for normal laptop groundstation work since it allows you to place the telemetry antenna in any location where it might have better reception.

Let me know if you have any questions or suggestions about the project in the comments below. Thanks again for your support!

Read more…

Bluetooth Telemetry Bridge Now on Kickstarter


Thanks to everyone for your feedback and suggestions on our last post about this project. We've decided to go ahead, and have just launched a Kickstarter project for it today!

We will offer both a 433 and a 915MHz version of the Bluetooth Bridge; each is compatible with its counterpart 3DR Radio. The 915MHz version is also compatible with RFD900 radios. Range when paired with a 3DR radio is approximately 1km; when paired with an RFD900, range will be approximately 4km. The Bluetooth Bridge has been designed to be as simple and easy to use as possible. The unit comes with an internal, rechargeable battery so you won't need to find an extra LiPo with the 'right' kind of connector. Charging is as simple as plugging the unit in to a PC or laptop with a Micro-USB cable. The unit also lets you know the status of the battery with a green, yellow or red LED.


The Kickstarter page is live now and is currently the only place to make an order. Once the Kickstarter is complete, we'll begin accepting orders through our web store. The price will be $139 which includes the Bluetooth Bridge module in a durable enclosure and an internal, rechargeable battery. You'll still need your old air side radio module, but if you don't have one we'll also be selling compatible packs for $189.

Check out more details on the Kickstarter page and leave us your comments/questions below!

Read more…

We've been experimenting more and more with tablet ground stations here at Event 38 and have had a lot of success using the DroidPlanner app. The software has been working well for us but we found our ground station setup was really cluttered, the opposite of what we were going for when switching to tablets. My main complaint was having the ground side telemetry radio dangling from my phone or tablet. Not only was it a hassle and prone to coming unplugged, it was also hard to keep it in a good position for radio reception.

Following Arthur Benemann's instructions, we set up a bluetooth bridge that would let us at least set the radio down and move freely with the ground station tablet. A huge improvement, no doubt, but still a little rough around the edges, even needing power from a laptop. So we added a nice case and a single cell LiPo battery to power it all. This worked even better, and looked pretty nice as well!


It still had some power management issues but it worked well enough for our purposes. At this point we're considering a Kickstarter campaign to fund a production-scale build of these, but we'd like to get some feedback from potential users before we solidify the specs. I'd appreciate any and all comments on what features, if any, you'd like to see on something like this beyond the telemetry-to-Bluetooth bridge itself. Right now we're planning on including the following:

  • Internal LiPo battery, rechargeable by USB
  • Battery status LEDs
  • 433 and 915MHz variants to be compatible with any 3DR Radio
  • ~20m Bluetooth range
  • ~1km telemetry range
  • 5-6 hours battery life
  • A durable plastic enclosure

Is there anything else we should consider adding to improve the functionality or usability? Do you think this would be priced fairly at $139? Let us know what you think!


DIY Camera Filter Swap on a Canon Powershot


I wrote a few weeks ago about testing out a Schott BG3 filter on an SX260 for detecting vegetation stress remotely. That post and the results of this conversion are posted here. We're now selling both those filters individually and pre-modified cameras at Event 38 for those interested in testing them out. Be aware before you grab one that the processes for setting the right white balance, calibrating imagery to compare results across different pictures, and even post-processing are not yet finished. We'll keep working on improving these, but I wanted to make available what we have now because some people are interested in getting started right away. The process shown below will work just as well for any other filter from any other source.

Before you start, it helps to have a selection of screwdrivers, a brush, an air puffer, a marker and tweezers close by.


To start, remove the back section of the camera by unscrewing the five screws: one on the bottom, two on the right side and two on the left.




Once the back is loose, pull it off gently and remove the flex cable by sliding it upwards out of the socket.


Next, remove the bracket holding the right side of the LCD in place by removing the single screw at the top.


Now the LCD is loose. Slide it a bit to the right so it clears the lip on its left side, then lift it up from the left. There is a flex cable in the upper right corner that will come out if pulled too hard. Getting access to reseat it is very laborious, so be careful not to put too much stress on that cable! I use masking tape to keep the LCD up and out of the way while working beneath it.


We'll be unscrewing the 3 screws on the back of the sensor in a moment, but first we need to remove the glue holding the sensor plate down. The glue is pretty tough, so I use a very sharp X-Acto knife to scrape it away bit by bit. Quick, light cuts work better than pushing hard and are less likely to damage something else, so be gentle. Use a brush or air puffer periodically to clear the debris.


Once the glue is out of the way, mark each screw's position with a marker before unscrewing. Set the screws aside in a way that lets you return each one to its original position later.


Take the screws out and gently remove the sensor plate. Clean the area again to remove any dirt/dust that has fallen in. The reddish piece of glass in the center is the IR-block filter that normally keeps out all infrared light. Remove the rubber placeholder holding the filter in place.


Remove the filter glass carefully using tweezers and set it somewhere safe in case you need to put it back in later.


Clean the entire area again.


Now carefully drop in your filter glass.





Replace the rubber placeholder and then the sensor plate on top.


Replace the screws in the right order and screw them in until the marks match up as closely as possible.


Close the camera back up by replacing all the screws and brackets, and don't forget the flex cable on the back of the case. Finally, check that there's no dust on the sensor or inside the lens. Put the camera in aperture priority mode and open the aperture as wide as possible. Zoom the lens in all the way and focus on something far away by half-pressing the shutter button. Now aim the camera at a clear sky or a white piece of paper, making sure it fills the entire screen. Take that picture and inspect it for black spots; below is the image as it came out of the camera used in this tutorial.


If your image came out clean too then you’re all set! If you’re using a Schott BG3 filter, a quick and easy way to start processing images for vegetation stress is to set the camera’s white balance to the cloudy preset. This will allow you to process a vegetation index that represents vegetative cover and growth, although the results don’t yet cover the full scale of values as they would using a traditional multispectral camera.
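As a rough illustration of how a normalized-difference index can be computed from a BG3-converted camera, here is a minimal numpy sketch. The assumption that the red channel now records mostly NIR while the blue channel records visible blue is mine, based on the BG3's passband; verify the channel roles on your own camera against a known target before trusting the numbers.

```python
import numpy as np

def bg3_vegetation_index(rgb):
    """Normalized-difference index for a BG3-converted camera.

    Assumes the red channel records mostly NIR and the blue channel
    mostly visible blue light (an assumption based on the BG3 passband,
    not a calibrated fact).
    """
    img = rgb.astype(np.float64)
    nir, blue = img[..., 0], img[..., 2]
    # Guard against divide-by-zero on black pixels
    return (nir - blue) / np.maximum(nir + blue, 1e-9)

# Toy 1x2 "image": strong NIR over vegetation, weak NIR over pavement
pixels = np.array([[[200, 80, 40], [60, 70, 90]]], dtype=np.uint8)
index = bg3_vegetation_index(pixels)
```

Healthy vegetation reflects strongly in NIR, so it should score high and positive, while pavement or water should sit near or below zero. As noted above, these values aren't yet calibrated to a full multispectral scale.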


Parachute Recovery Tests


I've been working for the last few weeks on developing a parachute recovery system for E382s to reduce the need for manual flying. I know from my own experience that any time a human is at the controls, the risk of crashing is significantly higher. Parachutes have been made for small aircraft before, but they're often implemented in an expensive or complicated way. The idea here is a working parachute that reduces the overall risk of damaging the aircraft, is simple to use and is cheap to install.

Building the parachute itself was pretty easy. I used this gore size calculator by Scott Bryce and just printed the gores out as patterns. I accidentally printed the pattern on 8.5x11 instead of 11x17, so this parachute is actually only 22" in diameter, much smaller than it should be for this weight. The gores are cut from black ripstop nylon and sewn together. The lines attach at 8 points, simply tied through buttonholes. The line I used is too thin to be sewn into the material, but I prefer the thin Kevlar line for its weight and size.


Next I ran a few tests using a dummy mass at about the same weight as an E382, 1.70 kg, to see how quickly it would fall and whether it would even open properly. The first few tests were done with the parachute already mostly deployed, since the drops were from a fairly low altitude off my building's fire escape.
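For a back-of-the-envelope check on drop speeds, the steady-state descent rate under a round canopy follows from balancing drag against weight. The drag coefficient of 1.5 and the doubled "full size" diameter below are assumptions for illustration, not measured values for this chute:

```python
import math

def descent_rate(mass_kg, chute_diameter_m, cd=1.5, rho=1.225, g=9.81):
    """Steady-state descent speed: v = sqrt(2*m*g / (rho * Cd * A)).

    Cd = 1.5 is a typical assumption for a hemispherical canopy,
    not a measured value for this particular chute.
    """
    area = math.pi * (chute_diameter_m / 2.0) ** 2
    return math.sqrt(2.0 * mass_kg * g / (rho * cd * area))

v_small = descent_rate(1.70, 22 * 0.0254)  # the 22" chute actually built
v_full = descent_rate(1.70, 44 * 0.0254)   # assumed full pattern at 2x diameter
```

Under these assumptions the 22-inch chute comes out near 8.6 m/s, and doubling the diameter (quadrupling the area) cuts the descent speed exactly in half.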



I thought the speed it hit at was slow enough, despite the smaller chute size, not to significantly damage an airframe, so I decided to move on with the tests. I attached the parachute to a lightweight (1.5kg) airframe to see how it would fall and whether the wings would affect the speed significantly. Because of the parachute's small size, the airframe actually still had almost enough lift to fly itself. As it fell and built up speed, it would begin to pull out of the dive, then the parachute would slow its horizontal movement and it would stall, presumably repeating until it hit the ground. In a few of the tests, the parachute line also got caught under the horizontal stabilizer, further complicating the problem.


Not wanting to call it a day, I pressed on to simulate an actual deployment. Ideally, the parachute would deploy on its own, sit near the center of gravity and let the plane fall roughly level, impacting the ground on its belly. That all points to a deployment from on top of the wings, right in front of the motor. On the first try, I loosely packed the parachute on top of the wing and just tossed the plane. The chute caught wind almost immediately, which promptly pushed it straight back into the tail, where it lodged itself and let the plane crash at high speed.


But I tried again, packing the parachute more tightly so it would roll out past the tail in a compact ball before opening up and catching wind. That did the trick! The dive/stall issue is still there but that should be fixed with the next parachute build.



All four tests above plus one more live deployment test on a flying plane are in the video below.

Would you ever use a parachute as your primary recovery mechanism? Why or why not?


DIY Samsung Trigger for APM


In this post I’ll describe how to make your own cable to trigger the Samsung mirrorless cameras NX20, NX210 and NX1000. We’ll be using an ArduPilot Mega 2.5 to do the triggering but you can use any Arduino or even a soldered push button.

What you’ll need:

Some USB devices use what's known as OTG or On-the-Go cables to determine the master/slave status of each device. The resistance between the ground pin and the ID pin determines how the device responds. Most OTG cables simply short these pins, which won't work for us, so avoid OTG cables if you're recycling a USB plug. We'll use a 68k resistor so the camera identifies our cable as a remote shutter. The design we're going for is shown below in an Eagle schematic for clarity.
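To see why a 68k resistor is distinguishable from a plain OTG short, it helps to think of the ID pin as the bottom leg of a voltage divider against an internal pull-up inside the camera. The 3.3V rail and 100k pull-up below are illustrative assumptions; the camera's actual detection circuit is undocumented:

```python
def id_pin_voltage(r_id_ohms, vcc=3.3, r_pullup=100_000):
    """Voltage at the ID pin, modeled as a divider against an internal pull-up.

    The 3.3 V rail and 100k pull-up are illustrative guesses; the camera's
    real detection circuit is undocumented.
    """
    return vcc * r_id_ohms / (r_pullup + r_id_ohms)

v_otg = id_pin_voltage(0)          # plain OTG cable: ID shorted to ground
v_remote = id_pin_voltage(68_000)  # our 68k remote-shutter resistor
```

The short reads as 0 V while the 68k resistor leaves the pin at an intermediate voltage the camera can recognize, which is how one physical pin can signal several different accessory types.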


The problem here is that it's not so easy to get at pin 4 of most micro USB plugs. If you buy one from Digikey you'll see it's no problem, but if you're trying to recycle a plug from a spare micro USB cable, it can be quite challenging. I've done it and it's possible, but it can be very frustrating, and not all plugs even have pin 4 exposed!

First pull all your parts together and identify pins 5 and 4 on your USB plug. On this one, pin 1 is on the far right with the USB plug facing up.



Then add a bit of solder to pins 5 and 4 to get them ready for the resistor.



This step isn't as hard as it sounds. With the soldering iron in one hand and tweezers in the other, place the resistor onto pins 5 and 4 so it actually touches both pins; it fits almost perfectly in this position on this USB plug. Now melt some of the solder already on pin 5 (or pin 4, depending on how you're holding it), just enough to make a solid joint and hold the resistor in place. Then flip it over and add more solder to the other pin. Don't spend too long on that one or it will heat up the other side and the resistor will fall off.



Easier than you thought, right? Make sure both ends have strong joints and plenty of solder, being careful not to get things so hot that the other side melts loose. Now grab the servo cable and cut off the female end. Strip the ground and signal wires, and cut back the power wire now as well.



Add some solder to these wire ends to prepare them to be attached to the USB connector. Make sure there’s a bit extra so you can make the joint without a third hand for applying solder. According to the schematic, we have to attach ground (black wire) to Pin 5.



Then attach the signal (white) wire to Pin 3.



Now, before doing anything else, hook it up to your camera and make sure it works by shorting the white and black wires together; I touch my tweezers across the exposed metal pins to do this easily.


Make sure it actually takes a picture and doesn't just focus. If you solder the signal wire to Pin 2 instead of Pin 3, it will only focus the lens, not take the picture. All these wires are pretty close together, and over time, under mechanical stress, they could end up shorting. To prevent that, add a dab of hot glue between the connections to keep them spaced for good.



Finally, wrap it up with heatshrink!



Now you can trigger the camera shutter simply by supplying a low pulse on the servo wire from an Arduino. If you're looking to take aerial photographs using ArduPilot, the rest is done for you. Just visit the Event 38 Downloads page and download the Aphex firmware and parameter file. This firmware puts out the correct pulse on pin A7 in the row of auxiliary pins on the side of the APM approximately every 2.4 seconds. If you want to change the timing or have it respond to R/C input instead, feel free to check out and modify our source code from the Github branch AC2.9.1b_NX.


RTF Hexacopter from Event 38 - Aphex


I'm happy to announce the release of our first multirotor drone at Event 38, the Aphex aerial photography platform. This hexacopter is based on the 3DR Hexa frame. It comes ready to fly with a 2-axis stabilizing camera mount for taking aerial photographs. We modified the camera mount so it can take images not only at moderate angles but also straight down for super low altitude mapping missions. You can easily make small mosaics with resolution finer than 0.5cm/pixel. Aimed straight forward, the Aphex can be used to take amazing oblique aerial shots of houses, landscapes and large structures.


The Platform:

The Aphex is based on the 3DR Hexa-B frame using the larger 880kV motors and 11x4.7 propellers. An APM2.5 with uBlox GPS is included. The pre-configured tilt/roll stabilized camera mount keeps cameras up to 500g shooting straight and level. Adjust the tilt axis any time from your R/C controller. The 3DR battery monitor warns you when the battery is low on charge.



The Possibilities:

  • Mapping – Great for small but extremely high quality maps, up to 20 acres per flight
  • Inspection – Bridges/Buildings/Roofs
  • Photography – Get a vantage point above pole cameras and below helicopters
  • Planning – Construction, mining, logging; any large scale outdoor operation can benefit from a periodic, "big picture" vantage point


For those who want really high quality images from an aerial vantage point and don't like the distorted look and curved horizon you get from a GoPro, we've sized the Aphex to carry larger traditional cameras. Mount your own or choose one of our Canon Powershot SX260s for medium resolution (12.1MP) and automatically geotagged images. For very high quality images, we offer an automatic trigger system for Samsung mirrorless cameras: the NX1000, NX20 and NX210. These cameras are a great blend of DSLR-quality images (20MP APS-C sensor) and lightweight bodies well suited to aerial use.
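If you want to estimate the resolution a given camera will deliver from a given altitude, the standard ground sample distance formula is a quick check. The sensor and lens numbers below are illustrative assumptions for an SX260-class 1/2.3" compact; substitute your own camera's specs:

```python
def gsd_cm_per_px(sensor_width_mm, focal_mm, image_width_px, altitude_m):
    """Ground sample distance: ground covered by one pixel, in cm.

    GSD = (sensor width * altitude) / (focal length * image width in px)
    """
    return (sensor_width_mm * altitude_m * 100.0) / (focal_mm * image_width_px)

# Illustrative numbers (assumed, check your camera's own specs):
# ~6.17 mm sensor width, 4000 px across, 4.5 mm wide-angle lens, 100 m up
gsd = gsd_cm_per_px(6.17, 4.5, 4000, 100)
```

With these assumed numbers the result is roughly 3.4 cm/pixel at 100 m; flying much lower, as a multirotor can, is what makes sub-centimeter figures reachable.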

Check out some of our images below or see more here:



Click here to read more about the specifications or to order now.


Please leave your questions or comments below, and also feel free to message me directly through DIY Drones or by email. If there's any interest, I'll make another blog post detailing how to trigger Samsung NX cameras from APM.

Finally, I'd like to thank the 3DR and uDrones teams, and especially the ArduPlane, ArduCopter and Mission Planner dev teams. Thank you!!


Using UAS for Environmental Studies


At Event 38, part of our goal in offering low-cost, ready-to-fly UAS is to promote their adoption in a variety of alternate uses. We recently partnered with the physics and engineering departments of Universidad de Chile to explore the viability of using an E382 to study low altitude atmospheric effects.

As a preliminary study, we set out to measure the depth of the boundary layer over Santiago, Chile as it grows over the morning hours. In the graphs below, you can see the top of the boundary layer as the region where the temperature starts to increase, at about 910-925 hPa.


It’s only visible in the first two graphs because after that time it had grown beyond the altitude we were testing.

The flights were spaced 20-25 minutes apart and lasted about 15 minutes each. For these missions we used guided mode and simply reset the altitude after a few minutes at each altitude break.
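To relate the pressure levels in the graphs to approximate heights, the standard-atmosphere (ISA) conversion is a reasonable first pass. Note this assumes the standard 1013.25 hPa sea-level reference, which won't hold exactly on any given morning, and Santiago itself sits well above sea level:

```python
def pressure_altitude_m(p_hpa, p0_hpa=1013.25):
    """ISA pressure altitude: h = 44330 * (1 - (p/p0)^(1/5.255)).

    Uses the standard sea-level reference; actual local pressure and
    terrain elevation will shift the result.
    """
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

top = pressure_altitude_m(910)  # pressure near the boundary-layer top
```

With the standard reference, 910 hPa works out to roughly 900 m of pressure altitude, which is why a small UAS can sample the whole profile in a 15-minute flight.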


The advantage of using UAS in this scenario is the ability to reach both low and high altitudes quickly and repeatably. Logistically it is much more portable and cheaper per flight than weather balloons or tethered balloons.

Hopefully this is part of the future of small UAS once they’re completely integrated into the airspace worldwide!


Reprocessing Old Aerial Photos


I decided to run some older imagery I came across the other day through PhotoScan, and came up with some pretty cool results. Above is the park behind 3D Robotics where I used to fly after work. Below is an aerial shot of the 3DR neighborhood.


And here’s a model of a rural area outside Jakarta on a hazy day made with images taken during a training flight.


If you're working on aerial photography, I encourage you to just get out and start taking pictures; you'll be surprised what comes out when you're just flying around randomly! Just don't fly around former employers' offices unless they're into drones too ;-)
