Patrick Duffy's Posts (12)

This blog will describe how to add thermal imaging to your drone without breaking the bank! The above photo shows my drone with the thermal imaging camera, the FlirOne, an inexpensive, lightweight thermal imaging camera designed for Android and iPhone devices.

A flight test of this setup is shown here

The drone platform is arbitrary; you could follow the instructions here to add thermal imaging to any drone, either a copter or a fixed-wing UAV. The platform I chose is a 3D-printed quadcopter, and the printer files for this model are located here.

The FlirOne actually contains two cameras: a thermal imaging camera, and an HD (1440x1080) camera for blending the thermal images with the visible background. The thermal imaging camera has a resolution of 160x120 and a frame rate of 10fps. This is relatively slow and low-resolution, but this is a 'budget' camera. Higher-resolution thermal cameras are VERY expensive and much heavier, which also balloons the price of the build because you need a larger drone to carry them. This camera can be mounted on almost any drone, as it only weighs 36 grams.
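To put those numbers in perspective, here is a quick back-of-the-envelope calculation of the raw thermal data rate. The 2-bytes-per-pixel figure is my assumption for a raw 16-bit thermal frame, not a published FlirOne spec.

```python
# Rough data rate of the 160x120 @ 10fps thermal stream.
# bytes_per_pixel = 2 assumes 16-bit raw thermal samples (an assumption,
# not a FlirOne spec).
width, height, fps = 160, 120, 10
bytes_per_pixel = 2

bytes_per_frame = width * height * bytes_per_pixel   # 38,400 bytes
bytes_per_second = bytes_per_frame * fps             # 384,000 bytes/s
print(f"{bytes_per_second / 1e6:.2f} MB/s")          # prints 0.38 MB/s
```

Even uncompressed, the thermal stream is only a few megabits per second, so a modest WiFi link can carry it easily.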

In my design, I have a third camera, a Raspberry PI 2 camera, which has a much higher frame rate for FPV viewing in flight; this is not necessary unless you want the higher frame rate. The FlirOne's built-in HD camera works perfectly well if you can stand the low frame rate.

I ended up mounting both the FlirOne and the PI camera on a 3D-printed dummy GoPro, using these 2-axis gimbals.

Setup Architecture

The FlirOne is designed to work with Android; however, strapping a phone to your drone is not very practical compared to using a Raspberry PI. Fortunately, some good hackers out there figured out the FlirOne USB interface and were kind enough to publish a Linux driver using Video4Linux (V4L2). In my configuration, I chose to use GStreamer to stream the video over WiFi to the GCS (ground control station) as shown in the following diagram:


From the above diagram, you can see there are 3 video streams available. If you are using a PI cam, you can disable the second video stream from the FlirOne, or alternatively, use only the FlirOne for both normal video and thermal imaging. 

Step 1) Configure Raspberry PI WiFi

In order to get video from your drone, you will need to have a wireless connection of some kind, presumably WiFi. It is beyond the scope of this blog to describe how to connect a Raspberry PI to a WiFi ground station. Please refer to this link for documentation on how to setup WiFi. 

For my configuration, I am using a Netis 2561 500mW 5GHz WiFi dongle, with an antenna tracker on the GCS. Using this adapter with a tracker gives several miles of range, which makes the drone usable for search and rescue situations where thermal imaging is valuable.

Step 2) Install Linux Driver for the FlirOne 

The FlirOne camera carries a dual video stream over a single USB connection, so to use both streams you need to install the Linux loopback driver to split the single USB source into two Linux devices. To install the loopback driver, you will need to build it using the development tools, or download a copy that matches your firmware revision.

Please refer to this forum for additional information on how to set up a Linux system for the FlirOne. To build the loopback driver, use the following commands on your Raspberry PI. Note: you must connect your PI to the internet first.

sudo apt-get install linux-headers-rpi
sudo wget -O /usr/bin/rpi-source && sudo chmod +x /usr/bin/rpi-source && /usr/bin/rpi-source -q --tag-update
sudo apt-get install bc

git clone
cd v4l2loopback
make && sudo make install

Note: installing the headers is optional if you already have an image with the kernel source. The kernel source is not included with the standard Raspbian image.

Next, you must build the FlirOne driver, which is located here. Unpack the driver and run the 'make' command supplied with the package.

The files associated with this project can also be found here. A compiled version of the loopback driver and the FlirOne driver is included at this link; however, the loopback driver won't work unless you are running the exact same Raspberry PI firmware version as my configuration, so you will probably need to compile the loopback driver locally. I may publish the entire image if there is interest, but it's 16GB in size.

Step 3) Configure Startup Script to Start Streams

To make the streams start automatically on boot, I use the startup script "rc.local" in the directory /etc.  I have included a copy of this file in the dropbox folder.

Here are the key commands:

sudo modprobe v4l2loopback devices=5

cd /home/pi/flir8p1-gpl
sudo ./flir8p1 Rainbow.raw &
sleep 5

gst-launch-1.0 v4l2src device=/dev/video2 ! video/x-raw,width=640,height=480 ! jpegenc ! rtpjpegpay ! udpsink host= port=9001 &
gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128,framerate=1/10 ! rtpvrawpay ! udpsink host= port=9002 &

The first command loads the loopback module, configuring it for 5 possible devices. In my configuration I am actually using only 3, but if you happen to have more video devices, you will need to allow for them.

The second command starts the FlirOne driver that connects the USB output to the loopback driver. It will run in the background continuously.

The final two commands pipe the output of the two FlirOne cameras to the ground station over WiFi. I am using ports 9001 and 9002 on my GCS PC; you may want to use different IP addresses and ports for your own setup.
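If you change addresses or ports often, it can help to generate the sender command from a small script. This is an illustrative helper (the function name and the example IP address are mine, not from the blog's scripts); it reproduces the MJPEG pipeline above:

```python
# Hypothetical helper that builds the MJPEG sender command from the
# pipeline above, so the GCS host and port are easy to change.
def jpeg_sender(device: str, host: str, port: int) -> str:
    return (
        f"gst-launch-1.0 v4l2src device={device} ! "
        f"video/x-raw,width=640,height=480 ! "
        f"jpegenc ! rtpjpegpay ! udpsink host={host} port={port}"
    )

# Example with an illustrative GCS address:
print(jpeg_sender("/dev/video2", "", 9001))
```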

Step 4) Configure your Ground Control Station to Display the Video

The final step in getting all of this to work is the GCS. You must install GStreamer on your GCS. If you are not familiar with GStreamer, you can download an installer here.

To display the standard video stream use this command:

gst-launch-1.0 udpsrc port=9001 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink

To display the thermal image, use this command:

gst-launch-1.0 udpsrc port=9002 ! "application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW,sampling=(string)YCbCr-4:2:0,depth=(string)8,width=(string)160,height=(string)128,colorimetry=(string)BT601-5,payload=(int)96" ! rtpvrawdepay ! autovideosink

If all goes well, you should see both video streams on your GCS PC like this:


Happy Flying!



Google Cardboard your Drone!

This is a new Android app for running a heads-up display on a smartphone, using Google Cardboard glasses to hold the phone.

I have posted a free version on dropbox at

Your phone will need side-loading enabled to install the app. Please search online for how to side-load an app.

This app will overlay the HUD on a video stream using GStreamer, in two distinct panes for viewing in the Google Cardboard viewer. The app supports both single and dual video streams, and depending on the processing power of your phone, it's possible to have a stereo view using multiple cameras.

The app automatically listens for a MAV-Link data stream on UDP port 14550 (the default UDP port for MAV-Link) for the HUD telemetry items. To get both video and telemetry, you will need to supply both data streams to the phone; for a simple WiFi camera without telemetry, you can connect directly to the camera for live video.
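As a sketch of what "listening on UDP 14550" amounts to, here is a minimal Python receiver. The magic-byte check is a simplification of MAV-Link framing (0xFE for v1 frames, 0xFD for v2), not a full parser, and the function names are mine:

```python
import socket

def open_mavlink_socket(port: int = 14550) -> socket.socket:
    """Bind a UDP socket on the default MAV-Link port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))   # listen on all interfaces
    return sock

def looks_like_mavlink(packet: bytes) -> bool:
    # MAV-Link v1 frames start with 0xFE, v2 frames with 0xFD.
    return len(packet) > 0 and packet[0] in (0xFE, 0xFD)

# Typical use on the receiving device:
#   sock = open_mavlink_socket()
#   data, addr = sock.recvfrom(1024)
#   if looks_like_mavlink(data): ...
```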

Scenario #1: Simple connection to WiFi Camera


In the App video setup menu, (top-left toolbar) supply this string for the GStreamer pipeline:

souphttpsrc location= ! jpegdec

This string is for the Sony QX10. For other cameras, you will need to know how to connect and get the live stream using a standard GStreamer pipeline. I would suggest experimenting with GStreamer on a PC first to discover the correct string, then use it in the Android app. 

For a Kodak PixPro SL10 camera, I found this string to work:

souphttpsrc location= ! multipartdemux ! jpegdec

In the UI configuration menu (top-left toolbar) select, "Split Image" to get the same image on both viewing panes. This is for a single camera configuration.

Scenario #2: Companion PC (Raspberry PI) with camera and PixHawk/APM Telemetry


There is a blog post currently running about using the Raspberry PI 2/3/Zero as a companion computer, so I will not give details here on how to set up the PI to get telemetry via the PixHawk/APM telemetry port; please refer to those pages for instructions.

If you have a Raspberry Pi with a camera, you can use this setup:

On the PI:

raspivid -t 0 -w 1280 -h 720 -fps 40 -b 4000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse config-interval=1 ! rtph264pay ! udpsink host=$1 port=9000

The "$1" argument is the IP address of your smartphone. This command is for 720p, but you should consider the resolution of the target device and adjust accordingly; it makes no sense to send 720p video to a phone with an 800x600 display.
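A small helper can pick a stream size matched to the display. This is an illustrative sketch under two assumptions: a 16:9 source, and even output dimensions (which H.264 encoders generally require):

```python
# Pick the largest 16:9 frame that fits a given display, rounded down
# to even dimensions for the H.264 encoder.
def fit_16_9(disp_w: int, disp_h: int) -> tuple:
    scale = min(disp_w / 16, disp_h / 9)
    return (int(scale * 16) // 2 * 2, int(scale * 9) // 2 * 2)

print(fit_16_9(800, 600))    # prints (800, 450)
print(fit_16_9(1280, 720))   # prints (1280, 720)
```

For the 800x600 phone mentioned above, this suggests streaming at 800x450 rather than full 720p.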

In the Android App, configure the gstreamer pipeline with this string:

udpsrc port=9000  ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! queue ! avdec_h264

For telemetry, I suggest using MAVProxy on the Raspberry Pi and sending the UDP stream to the phone's IP address on UDP port 14550. The Raspberry PI should be acting as a WiFi access point, and the phone should be configured with a static IP address.

If you want to display data on a ground station PC and the App at the same time, you can use MavProxy to split the UDP stream, and then connect to your UAV using Mission Planner. 

Here is a sample command that I use to split the data stream with MavProxy on the Raspberry Pi.

mavproxy --master=/dev/ttyAMA0 --baudrate=115200 --out= --out=

In this case, the phone is at a static IP address and Mission Planner is running on a PC at a second address. Check your phone's documentation on how to set up a static IP for WiFi; it's usually in the 'Advanced' section of the settings. You will also need to configure your Raspberry PI to have a static IP address for its WiFi access point. The actual MAVProxy settings will depend on how you set up your PixHawk/APM, so choose the baud rate you have configured in the flight controller.
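For reference, a static address on the PI side can be set in /etc/network/interfaces. The stanza below is illustrative only (the addresses are examples, not the ones from my network), so substitute values that match your own setup:

```
# Illustrative /etc/network/interfaces stanza for a static address on
# the Pi's access-point interface. Addresses are examples only.
auto wlan0
iface wlan0 inet static
    address
    netmask
```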

Another thing to consider is the processing power of your device. If you see a lot of pixelation, your device is too slow, so you will need to change the resolution of the transmission and/or the bitrate. Some smartphones are just too slow to display the video at full resolution. I have tested 720p on my Google Nexus 6 and it works fine, but on a cheap tablet PC, I had to slow it down and switch to 800x600.  

In part two of this blog, I will explain how to set up the dual video stream and more advanced configurations for high-power/long-range FPV using an antenna tracker and the Ubiquiti Nano/Rocket M5 ground station access points.

Happy Flying!



LAS VEGAS (AP) — Chinese drone maker Ehang Inc. on Wednesday unveiled what it calls the world's first drone capable of carrying a human passenger.

The Guangzhou, China-based company pulled the cloth off the Ehang 184 at the Las Vegas Convention Center during the CES gadget show. In a company video showing it flying, it looks like a small helicopter but with four doubled propellers spinning parallel to the ground like other drones.

The electric-powered drone can be fully charged in two hours, carry up to 220 pounds and fly for 23 minutes at sea level, according to Ehang. The cabin fits one person and a small backpack and even has air conditioning and a reading light. With propellers folded up, it's designed to fit in a single parking spot.

Read more: 

I wonder if it uses PixHawk? Comments? 



Here's the link:

The news release said, "CEO Mark Fields said a goal is to set up a system, for example, where a United Nations emergency worker could launch a drone from an F-150 to survey a disaster scene.

“A drone will actually have the coordinates and be able to then land back into the bed of the F-150, even though it’s in another location,” Fields told WWJ Auto Beat Reporter Jeff Gilbert.

The rapidly deployable surveying system ideally would work like this: a response team would drive an F-150 as far as possible into an emergency zone caused by an earthquake or tsunami. Using the Ford SYNC touch screen, the driver could identify a target area and launch a drone through an app. The drone would follow a flight path over the zone, capturing video and creating a map of survivors with associated close-up pictures of each.

Using the driver’s smartphone, the F-150 would establish a real-time link between the drone, the truck and the cloud, so vehicle data can be shared. Data will be relayed to the drone so the driver can continue to a new destination, and the drone will catch up and dock with the truck."

Anybody interested in working on this challenge? 



"Drone Buzzes President Barack Obama's Motorcade in Hawaii"

Link here

Some guy who received a drone for Christmas happened to be flying when Obama drove by, and the media went WILD. But they ignore the hundreds of cars on the same highway that are a much higher risk to the president. It's going to be a big uphill struggle to get society to overcome its fear of drones when the media panics like this.

Imagine what the FAA would do if this toy had crashed into the president's motorcade?


This post will describe how to control a GoPro camera (usually mounted on a gimbal), via a Raspberry Pi with a USB WiFi dongle. 


The ideal Raspberry PI for this setup is the A+ model because it is small and has the needed single USB port for the WiFi dongle. 

Parts List:

1 GoPro Camera 

1 Raspberry PI (suggest A+ model, but any PI will work)

1 USB WiFi dongle (PI approved) Suggest this one: AirLink N150

2 Jumper wires, or 1 servo header that plugs onto 0.1 in (2.54 mm) pitch pins (with the center power pin removed)

1 APM or PixHawk board

Of course, all of this is mounted on your quad, plane, or whatever. On my hexacopter, the PI is powered by one of the ESC's UBEC outputs: I cut a micro-USB cable and soldered the red and black wires to the power output of the UBEC of the same color. The other two USB wires are not used.


Step 1: Enable the GoPro WiFi access:

The first step in making all of this work is to configure your GoPro camera to accept a WiFi connection. The camera is actually a WiFi access point.

The details on how to set up your camera are here. If you can connect the GoPro app to the camera, then you can connect the Raspberry PI. Verify that the GoPro WiFi is working with the GoPro app BEFORE connecting the PI. You can pick a unique access point name for your camera, and use the same name in the Raspberry PI configuration when connecting the PI to the camera.

Step 2: Connect the Raspberry PI to the GoPro Camera's WiFi:

Once you have established the camera's access point and assigned a name and password, you can make an entry in your PI's network configuration file to configure the connection.

From a shell prompt type the command:  sudo nano /etc/network/interfaces

Add some lines to the file as follows:

allow-hotplug wlan0
iface wlan0 inet dhcp
    wpa-ssid "mygopro"
    wpa-psk "mypassword"

Reboot the PI and it should automatically connect to the GoPro.

Step 3: Communicate with the GoPro

The GoPro camera should be at its standard IP address. If you can ping this address, you have successfully connected to the GoPro, and you are ready to send it commands.

The list of commands is here. The camera is controlled by sending HTTP request commands as URL strings.

For example, this string turns the camera on:"wifipassword"&p=%01"

Where 'wifipassword' is the password you set when configuring your GoPro.

You can experiment with sending commands with a web browser to get familiar with how to control the camera with web requests or write your own scripts.

Step 4: Use a Python Script to Control the Camera

Attached is a sample Python script that listens for a signal on GPIO pin 5 of the PI header and sends a request to the camera to take a picture when triggered.

To use the code, start the python script after connecting to the camera with the following command:

sudo python3 -photoMode

This will start the script with the camera set to take pictures. If you want it to trigger a video instead, leave out the '-photoMode' option.

Step 5: Configuring APM or Pixhawk to send the signal to the PI

To use this setup with the APM or PixHawk control board, you need to connect the output of the 'relay' pin (A9 on the APM), to the GPIO pin 5 on the PI (or the pin of your choice).  The script is configured to use GPIO pin 5.

Here is a picture of the APM board from This link


The PI pinout is here


Connect Pin 29 (GPIO 5) on the PI to A9 (S) on the APM, and GND pin 30 on the PI, to GND A9 (-) on the APM.  Check the link on the APM website for the PixHawk settings for the relay output pins as I have not used PixHawk (yet).

Step 6: Configure your Radio to Trigger the Camera

In Mission Planner, you will need to select which channel on your radio to assign to trigger the camera input. 

This link describes how to configure the shutter.


Select "Relay" for the Shutter output (not RC10 as shown in the above image example).

Then set the Ch7 option to "Camera" as shown here:


This can also be set on the "Advanced Parameters" setup area.

Step 7: Start Script on Boot:

To make all of this automatic, you can configure your PI to always connect to the GoPro and start the script when the PI boots. Or you can do it manually when you want to fly. 

To make it automatic, you can modify your '/etc/rc.local' file to make the script start on boot. Here is a sample rc.local file:
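In the same spirit, a minimal sketch of such a file could look like the following. The script filename and the sleep delay are placeholders of mine, not the actual contents of my rc.local; substitute your own paths.

```
#!/bin/sh -e
# /etc/rc.local -- illustrative sketch only; substitute your own paths
# and the actual name of your camera-trigger script.

# Give the WiFi connection to the GoPro time to come up.
sleep 10
# Start the trigger script in the background (placeholder filename).
python3 /home/pi/ -photoMode &
exit 0
```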


Once you have completed all of these steps, you should be able to trigger the camera to take a picture with a switch assigned to Ch7 on your radio. 

If you want to GeoTag your images, you can follow the instructions at the ArduCopter GeoTagging page.

Happy flying.



This is a new Android app for running a heads-up display on a smartphone, tablet, or Windows PC. It's available on the Google Play store at

If you want to side-load, I have posted a free version on dropbox at

There is also a windows version available at

This app will overlay the HUD on a video stream using GStreamer, or you can run the app without video with a normal HUD.  

The App will automatically listen for a MAV-Link data stream on UDP port 14550, the default UDP port for MAV-Link. You can also configure the app to use TCP, and on windows you can connect directly to a serial port. 

If you want to display data on a ground station PC and the App at the same time, you can use MavProxy to split the UDP stream, and then connect to your UAV using Mission Planner. 

Here is a sample command that I use to split the data stream with MavProxy:

mavproxy --master=COM5,115200 --out= --out=

The ground station PC and the smartphone are at separate addresses on my WiFi network; you will have to configure the IP addresses according to your own network. My WiFi network in the field is a Ubiquiti Rocket M5 configured as an access point, and a second M5 is on the UAV, so I can get live video from the UAV over the same network that connects the smartphone.

For the windows version, you can run the HUD on the same PC as MissionPlanner (or APM Planner), and use MavProxy with a different UDP port for the HUD.  

If you want to use the video overlay, you will need to have a valid gstreamer data stream running, either over UDP or TCP.  For my setup, I am using a Raspberry PI streaming H264 over UDP using this command on the PI:

raspivid -t 0 -w 1280 -h 720 -fps 40 -b 4000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse config-interval=1 ! rtph264pay ! udpsink host=$1 port=9000

The "$1" argument is the IP address of your GCS or smartphone/tablet. This command is for 720p, but you should consider the resolution of the target device and adjust accordingly; it makes no sense to send 720p video to a phone with an 800x600 display.

In the HUD, configure the gstreamer pipeline with this string:

udpsrc port=9000  buffer-size=60000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! queue ! avdec_h264

Do not include a video sink element, as the App will add this dynamically.  You can change the port to whatever you wish as long as your PCs do not have a firewall blocking the port.

Another thing to consider is the processing power of your device. If you see a lot of pixelation, your device is too slow, so you will need to change the resolution of the transmission and/or the bitrate. Some smartphones are just too slow to display the video at full resolution. I have tested 720p on my Google Nexus 6 and it works fine, but on a cheap tablet PC, I had to slow it down and switch to 800x600.  

The App is easy to setup, however if you have any questions or issues, please send me feedback.

Happy Flying!



This blog is a continuation of my previous post.

How to build a High-Definition FPV UAV using a Rasperry PI with HD camera, using a high speed WiFi link

This post will discuss how to use GStreamer and Mission Planner together to display the HD video with a HUD (Head-Up-Display).

Note: I have only tested this feature on Windows so the instructions given here are for Windows only. 

To give proper credit: the HUD created here was borrowed from APM Planner, a Qt-based app similar to Mission Planner. It was built from the QML HUD in the APM Planner codebase, created by Bill Bonney, who is on the APM Planner development team. To make the HUD work with the background video, I used a GStreamer library called QtGStreamer, which integrates GStreamer pipelines with painting on a Qt widget. This library is available on the GStreamer website.

The end-result is dynamically added to Mission Planner using the plug-in architecture. 

In the previous posts I discussed using a Raspberry PI and a high-speed WiFi link with GStreamer on the PI and the ground station PC. To get the HUD to work, you need to already have the video streaming successfully to your ground station.

Here are the steps to follow to install the plugin:

1) Install Mission Planner.

2) Download and install GStreamer from this link. Use the x86 version; the x86_64 version will NOT work. (Use the default path 'C:\GStreamer' when installing.) When installing GStreamer, choose the 'Custom' install and select ALL plugins to be installed.

3) Follow the steps in the previous blog noted above to get your video stream working.

4) Download the MSI installer from this link and run it.

If all went well, you should have the plugin installed.

Open Mission Planner and navigate to the "Flight Data" page and right-click on the map. You should see a menu item called "GStreamer HUD" as shown below:


Select this menu item and the following screen should appear:


In the upper-left corner is a context menu. Here is where you enter your GStreamer Pipeline string. If you had the video displaying without the HUD using a valid pipeline, enter it here.

Note: The GStreamer pipeline string should be exactly the same as the string you used before, but WITHOUT the final video sink element. The video sink is a QtGStreamer element that the plugin adds automatically.

Here is an example string I used on my setup:

udpsrc port=9000  buffer-size=60000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! queue ! avdec_h264

If all is well, you can connect to your UAV and see the HUD elements moving.  To change the HUD, right click on the display and select which elements you want to display. The default is to display everything shown here. 

If anybody has problems getting it to work, please post back and I'll update the blog in case I missed something.

Happy Flying!


Part Two: Here is the original picture of the finished product:


This is the second part of a two-part series on how to build a high-definition FPV UAV using a Raspberry PI with HD camera and a high-speed WiFi link.

In my first post on the subject (located here), I discussed the parts I used, and how to install them into a Hobby King Go-Discover FPV model. 

In this post, I will discuss installing the Raspberry PI and the PI camera in the Go-Discover gimbals, and the software configuration for both the Raspberry PI and the ground station PC.

From the previous post, step 3 was completed by installing the Ubiquiti Rocket M5 in the model. Now onto step 4:

Step 4: Install the Raspberry PI and PI Camera

Here is a photo of the position of the PI in the Go-Discover model:


The PI fits nicely just behind the camera gimbal, with the USB and HDMI ports on top. On the right side you can see the attached Cat5 network cable. This cable connects to the ethernet switch, which is also connected to the Rocket M5 input port.

The two cables shown on top are the servo control wires for the gimbal, which I have connected directly to channels 4 and 5 on my radio. Channel 4 is normally the rudder stick; since there is no rudder on a flying wing, this is a convenient channel for panning the camera left and right. I have not (yet) moved to a head tracker, but if you already have that setup, just assign the channels accordingly.

To install the PI camera, remove the stock plate from the gimbals (for a GoPro), and mount the PI camera as shown in this photo:


The PI camera case fits very nicely into the slot, and again I used a small piece of velcro to hold it down; you could use a couple of small screws instead if you want a more secure hold. The two gimbal servos are also shown here. They are simple to install: just follow the Go-Discover instructions.

Here is a front view of the PI camera installed:


Here is the block diagram describing all the connections:


Some comments on my previous post suggested that it is possible to eliminate the ethernet switch and serial-to-ethernet converter by using a serial port on the Raspberry PI. I believe that post describes how to talk to the PI over the MAVLink, but in this case I want to use the PI to bridge the connection from the ground station to the APM/PixHawk. Somebody please comment on this if you know more about it. I believe it would require a TCP/IP-to-serial link from the PI to the telemetry port on the APM, and some software on the PI to act as the bridge. The main connection to the ground station is via the Rocket M5 and TCP/IP, not through a telemetry link (900MHz or Zigbee like I used on my other models).

Step 5: Getting it all to work with software configuration (the really fun part starts now).

Check out this post on what others have done with streaming and the PI.  My experiments showed that using GStreamer on both the PI and on Windows gives really good results with very low latency, if you use the right parameters. 

Get GStreamer on the PI by following this blog.   This is the same version of GStreamer that I am using on my setup. 

Make sure your PI camera works ok by plugging in the PI to a standard monitor using the HDMI port and follow the instructions on the Raspberry PI website on how to get the camera up and running (without GStreamer).  Once you have a working PI and camera, you can then proceed to stream things over the network.  

Note: It is suggested that you first get the PI streaming video by plugging it directly into your local network, where you can also connect your ground station PC with the correct IP addresses (without the Rocket M5). I picked one address for the PI and another for the ground station. Make sure you can ping the PI from your PC and the PC from the PI.

For streaming, you will also have to make sure all the ports you intend to use are open on the firewall (described later).

For the ground station PC, you can download GStreamer here. When you install, make sure to select everything, i.e. the full installation (not the default).

Here is the command I use for the PI to pipe the camera output to GStreamer:

raspivid -t 0 -w 1280 -h 720 -fps 30 -b 1700000 -o - | gst-launch-1.0 -v fdsrc ! h264parse config-interval=1 ! rtph264pay ! udpsink host= port=9000

The command is explained as follows:

raspivid is the command to start the camera capture on the PI. The -w switch sets the width in pixels, and the -h switch the height. In this case I am using 1280x720, but you can try any combination that fits your needs.

The -b switch sets the bit rate for the encoder. In this case I chose 1.7Mbps for the stream. Again, you can experiment with higher or lower values; this setting seems to work well for me, and the latency is almost unnoticeable.

The "-o - |" part pipes the output to GStreamer. Make sure you include the dash before the pipe "|" symbol.

For the GStreamer command, the elements are separated with an exclamation point "!"; these are individual plugins that make up the pipeline. Since the PI has hardware-accelerated video, the output is in a format called H264, which is a highly compressed stream. The GStreamer elements are configured to transport the output via a UDP socket connection to the target PC. Notice the 'udpsink' element, which specifies the host (in this case your ground station) and the UDP port. I am using port 9000, but you can use any open port on your system; be sure to open the firewall or it won't work! You can also use TCP instead of UDP, but for this kind of stream I chose UDP: dropouts are certainly possible, and with UDP that is fine, while with TCP you could have socket problems and higher latency.
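Some quick arithmetic shows why the hardware H.264 encoder matters so much here: raw 720p30 video in the encoder's 4:2:0 format would need roughly 200 times the bandwidth of the 1.7Mbps stream.

```python
# Raw 1280x720 @ 30fps in I420 (YUV 4:2:0, 12 bits per pixel) vs the
# 1.7 Mbps H.264 stream used above.
width, height, fps = 1280, 720, 30
bits_per_pixel = 12                        # I420 layout

raw_mbps = width * height * bits_per_pixel * fps / 1e6
print(f"raw: {raw_mbps:.0f} Mbps")         # prints raw: 332 Mbps
print(f"ratio: ~{raw_mbps / 1.7:.0f}:1")   # prints ratio: ~195:1
```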

Note: to get the PI to execute this command on boot, make a shell script with the above command and add it to your rc.local boot sequence. That way when the PI boots, you get the stream without having to log into the PI remotely.

For the ground station PC, once you have installed GStreamer and opened the correct ports, use this command (from the command prompt) to view the stream:

c:\gstreamer\1.0\x86_64\bin\gst-launch-1.0 udpsrc port=9000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink

If all goes well, you should see the PI camera output on your PC screen in a popup window. For those of you who want to use FPV goggles, you can connect to the HDMI port on your PC to display the output, if your goggles support HDMI.

I have this command in a batch file (with a PAUSE statement at the end) to keep the window open.

WHEW!  If you got this far, you are amazing. 

The last step to complete the build is to connect to the APM from Mission Planner. The method I used was to install a utility that converts a TCP connection to a virtual serial port, but I also think that connecting Mission Planner directly to the TCP port will work; I have not tried it, and will post back after I do.

Here is the link to setup the serial to ethernet device to have an IP address and port.

Here is the link to the configuration utility for installing the virtual serial port.   

Once you have a serial-over-TCP/IP connection to the APM working, you should be able to connect with Mission Planner. On the maiden flight it worked perfectly; I didn't see a single drop in the telemetry data or anything noticeable in the video transmission, though my first flight was limited to 2km.

The last step is to connect the Rocket M5 to the Nano M5 and test everything using the OTA (over the air) connection. If all is well, you are ready to fly!  But be careful on your maiden, you just spent $700. 

Finally, here is a photo of my Antenna Tracker with the Nano M5 attached. My next update will include a video of a longer flight.  


Happy Flying!

Read more…

After 3 months of research, picking a platform, and building my own HD FPV rig, I decided to write this blog to share my experience. I have been flying FPV for about a year now, and this was the most fun project yet! 

Here is a picture of the finished product:

The platform I picked was the Hobby King 'Go Discover' FPV flying wing. There are many other possible platforms, but to accomplish my goal of creating an HD platform, this bird had the space and wing loading to accommodate the components. I like the flying-wing platform because of the simplicity of the servo setup: only two channels are needed for flight, which made it possible to use a 6-channel radio and receiver. The Go Discover also has camera gimbals for changing the viewing perspective in flight, and since I picked a flying wing, I had enough channels left over to control the gimbals and the ArduPilot flight modes with a 6-channel radio. My previous FPV plane was the Phantom FPV (also from Hobby King), but its camera space is limited and there was not enough room in the main fuselage to fit all the electronics. The Go Discover turned out to be just right for everything to fit, with nearly no room to spare. The final rig weighed in at 4.5 lbs, using a 4000mAh 4-cell LiPo. 

The final cost was around $700 (not including the antenna tracker and WiFi receiver), but for full HD this is not too bad in my opinion, since some HD systems cost that much without the plane and other electronics.

Parts List:

Go Discover Kit ($118)

You can save money by using the PNF version, however I chose to build my own and choose the motor/ESC/receiver combination.

For the motor, I picked this 700kv Outrunner ($36). It barely fits, so don't go bigger.

Turnigy Plush 60amp ESC ($35)

4000mAH 4 Cell Lipo ($37)

ArduPilot APM2.6 with GPS ($74)

Ethernet Switch Module ($16.50)

You will need the Ethernet Switch if you want telemetry over the same WiFi connection you will use for the video. To accomplish this, you will need a 'Serial-to-Ethernet converter', which can be purchased here:

Serial-to-Ethernet converter for APM Telemetry over TCP/IP ($22)

Power Supply for Raspberry PI ($7) (Optional if you want to power everything from the main battery).

1000 mAH battery to power the PI and ethernet switch ($11) (Optional if main battery used to power everything)

Raspberry PI with Case ($40)

PI Camera ($25)

Rocket M5 WiFi 5GHz ($89). Note: you will also need to buy an access point (the Nano M5, here)

I am using the Nano M5 as the access point and an antenna tracker that I built. See my post here on how to build the tracker, or buy your own.

5.8 GHz Planar Wheel RHCP Antennas; you need 2 for the MIMO connection on the Rocket M5 ($90). Less expensive antennas may work, but I have had good results with these.

6 Channel 2.4 GHz receiver ($13) (Your choice of radio/receiver combo, my radio is a Turnigy 9xR)

2 Digital Servos for the wings ($40). I don't like to skimp here; these are very good servos.

2 Analog Servos for the Gimbals ($10)

3 one-foot Ethernet cables ($7) (one for the PI, one for the Rocket M5, and one for the Serial-to-Ethernet telemetry converter)

Power-over-Ethernet cable for the Rocket M5 ($5).  You can buy these at any electronics outlet, or google it.

Misc RF connectors (Right angle SMA, and a short RG59 cable with SMA) ($10).

Total for the plane: $671. Adding the ground-station access point ($89) brings it to $760.  I am not including the cost of the antenna tracker, but from my blog you can build one for about $250.  The grand total should be under $1K, not including your ground station, but I assume you already have one.

Now for the fun part: assembly. I will describe my assembly of the Go Discover, but the process will be similar for any platform you choose. You will need to arrange the parts to get everything in and still achieve the correct CG; it took a bit of trial and error for me to find the right combination.

Step 1: Install the motor and ESC.

I chose to build from the back forward, installing the motor, ESC, then battery/electronics, then finally the Raspberry PI and camera. 

Other blogs about the Go Discover report that the motor mount is very weak, so I chose to 'beef' it up by adding some EPO foam and epoxying on a metal mount designed to support the SK3 motor. Here is a photo of my setup with the ESC and motor mounted in the rear:


Notice the area just in front of the motor mount where I added some extra foam to fill the space that exists in the stock model. I used some old EPO foam from a junk model and glued it into place. The ESC fits nicely right behind the battery. The battery is installed using Velcro to hold it in place. 

Unfortunately, the amount of electronics in this space makes it difficult to swap the battery, and I could not find a way around that, so I leave the battery in place and charge it in the plane. Of course this limits the number of flights per day, but for such an advanced toy, it's a trade-off.

Step 2: Install the ArduPilot, GPS and Serial-to-Ethernet converter.

In the middle section, there is a trapezoid-shaped area that I decided to use to mount the Ardupilot, GPS and Serial converter. Here is a picture of the installation:


In the trapezoid area, I cut out some cardboard to match the shape and placed it over the battery, then mounted the 3 components on top of the cardboard cutout. In this photo, the tail of the plane is at the top. On the right side of the plane is the GPS module (with some duct tape holding it in place); it's the purple, square-shaped PCB. I used Velcro under this unit, as I did for all the other components. 

In the center is the Serial-to-Ethernet converter. To connect it to the ArduPilot, you will need to modify the connector normally used to attach the 900MHz telemetry transceiver that you can buy from 3D Robotics. There are 4 wires: ground, power, RX, and TX.  You can find a schematic here.  I will explain later how to make this work with your PC software (downloaded from the internet). The Serial-to-Ethernet converter has the pinout silk-screened on the PCB, so it's pretty obvious where to connect the wires. Make sure you cross TX to RX and RX to TX between the converter and the telemetry port on the ArduPilot. 
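For reference, the four connections can be sketched like this (the converter-side labels are assumptions based on typical silk-screen markings; check your board):

```text
APM telemetry port          Serial-to-Ethernet converter
------------------          ----------------------------
GND  ---------------------  GND
5V   ---------------------  VCC  (power)
TX   ---------------------  RXD  (crossed)
RX   ---------------------  TXD  (crossed)
```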

Allow the pins of the serial converter to hang over the cardboard so you can plug in the connector, and velcro the unit to the cardboard. 

Finally, on the left side, you can mount the ArduPilot.  Connect the GPS port and radio as usual. Please refer to the ArduPilot website for how to configure and connect your ArduPilot; I will not cover that portion here. 

Step 3: Install the Ethernet Switch,  Rocket M5 and antennas:

There is a bit of work to do here, because the Rocket M5 comes in a big, sealed plastic case. You MUST cut it out of its case, and doing so WILL void the warranty. Use a Dremel tool and CAREFULLY cut around the edge of the case (not across the top), then remove the board from the plastic case.

It is highly recommended that you get the Rocket M5 and Nano M5 talking to each other BEFORE you install the M5 in the model. Follow the instructions on the Ubiquiti website on how to do this. It's basically the same as linking two WiFi terminals, and each unit has a built-in web page for configuration. Configure your Nano as an "Access Point" and the Rocket as a client.  You should be able to "ping" both units from your PC, then connect the Raspberry PI to the network and ping it as well. For my Raspberry PI, I set a static IP address on the ethernet port, and likewise on the ground station PC.  You will also need to set the IP addresses of the Rocket M5 and the Nano M5 when you configure them. 
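For reference, a static address on the PI's wired interface can be set in /etc/network/interfaces on the Raspbian releases of that era. The addresses below are examples only, not the ones from my build; use whatever subnet you configure on the Rocket and Nano:

```text
# /etc/network/interfaces -- example static address for eth0.
# Pick addresses on the same subnet as the Rocket M5, Nano M5,
# and the ground-station PC.
auto eth0
iface eth0 inet static
    address    # example PI address
    netmask
```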

Here is a photo of the position of the Rocket M5 installed in the plane after removal from the case:


Note: in this view, the front of the model points to the right. The Ethernet switch is mounted just in front of the trapezoid cutout that was used to mount the ArduPilot, with the PCB vertical in the fuselage. Notice the Ethernet cables connected to it on the left side of the photo. Again, I used Velcro to mount the Ethernet switch.  Notice the gray cable: this is the POE (Power over Ethernet) cable that you will need to power the M5. You can get one of these on eBay or Amazon.

There are TWO RF connectors (on the left side of the plane) on the Rocket M5 because this is a MIMO (Multiple Input, Multiple Output) setup, the same technology used in LTE phone networks. The two-antenna configuration gives this radio higher throughput, more range, and better SNR, all of which makes this setup work very well.

You should mount the antennas apart from each other; for my setup, I placed them on either side of the plane, just under the wing. Since the Go Discover has a pretty big fuselage, the antennas don't touch the ground and clear it by a couple of inches, so unless you go nose-down on landing, they should be safe.  Notice the RF connector at the top of the photo: this is for the 2nd antenna. The first antenna is mounted directly below the first RF connector on the left side of the plane.

Also not shown in this picture are the 2nd battery (1000mAH) and the control receiver, which are mounted under the Rocket M5. The power supply for the Raspberry PI is also mounted in this space, just behind the Rocket M5. 

Note: to hold the Rocket M5, I created a couple of EPO foam mounts to keep it from moving, glued them in place under the M5, and then used Velcro to hold it down. 

Here is a photo of where I installed the antennas:


Notice the two white posts on either side. These are the 5GHz planar wheel antennas. 

Here is another set of photos showing the RF connectors:



Notice also in the above photo, on the left side, the radio receiver (an OrangeRX DSMX type for my radio). Out of view (under the M5) is the power supply used to power the Raspberry PI. You can mount this anywhere, but this is a convenient spot.  The placement of the radio also makes it convenient to connect the servo wires to the ArduPilot.

Step 4: Installing the Raspberry PI and Camera

(More to come)

Read more…


For anybody interested in improving the range of your FPV and the quality of the signal (while staying FCC legal), an antenna tracker could be for you. After looking around for a 'stock' solution, I decided to build one myself. Above is a photo of the final product.

I am using the 5.8GHz band for my FPV configuration, which keeps the antenna small and the entire package very compact and portable. There are two stock full-size servos and a servo base that I purchased (parts list below).

I will not say that this solution is 'cheap', but it is very effective. I have yet to fly my plane out of video range, so I cannot say what the exact range is, but I have not gone beyond 2 miles using a 500mW transmitter. The receiver antenna is a spiral RHCP (right-hand circularly polarized) antenna with 16 turns! With this configuration the gain is 16dBi, which is excellent, but it's also very directional; hence the need for the tracker. 

The total cost of this rig was about $280 (plus the work to assemble it).  Most of the cost is the servos and the servo mounts and gears; the rest is the antenna and the Arduino. 

The link to the antenna is here (ReadyMadeRC). There are a number of other antennas that will work with this tracker as long as the footprint is not too big, but at 5.8GHz this is usually not an issue. To attach the antenna to the servo, I simply used a strip of Velcro so it can be easily removed; it is light enough that the Velcro holds it perfectly.

Here is the parts list for the servos:

Qty  Description                             Unit                Amount
1     (DDT500F) Direct Drive Tilt (Futaba)   $24.99              $24.99
       * Weight: 0.22 lbs. each

1     (SPG5485A-BM-36005U) Bottom Mount      $99.99              $99.99
         Servo Power Gearbox - 360 Degree
       * Weight: 0.62 lbs. each
       * Option: Servo is HS-5485HB          $24.99              $24.99
       * Option: Ratio is 5:1 Metal Gears
       * Option: Kit is Unassembled

1     (FUTM0718) (03.06.02) S3072HV Servo    $39.99              $39.99
       * Weight: 0.18 lbs. each

4     (91772A146) 6-32x3/8 inch  Pan Head    $0.17               $0.68
         Phillips Machine Screws (Stainless
       * Weight: 0.01 lbs. each

     Total Shipment Weight: 1.23 lbs.   Subtotal:                $190.64
                                        Shipping & Handling:     $6.99
                                        Total:                   $197.63

You can choose different servos if you prefer to spend more (or go cheap); they are just standard-size servos, so you can find them elsewhere.  

You can also buy the kit with the servo included and pre-assembled (an additional $30). Here is a link to the servo kit

If you are not good with a soldering iron, pay the extra. You will need to disassemble one of your servos, disconnect the potentiometer wires, and re-solder them to the external potentiometer supplied in the gearbox. The external feedback allows your tracker to move 360 degrees in azimuth. 

Also, I chose the 5:1 gear ratio, and this seems to work very well; even at full speed there is no jerking, and the tracker follows the plane perfectly. A lower ratio could make it jerky, but it may also work. 

Assembling the servo box is not too difficult but will take you about an hour (plus wiring if you buy the bare kit). I would suggest paying extra for the pre-assembled servo base, as I found modifying the servo to be an 'advanced' activity, though I was able to pull it off without losing parts or damaging the servo.  All the instructions come with the kit, so I will not go into detail here on how to assemble the pan/tilt box.

The assembled servo pan/tilt should look like this:


Part 2: The Arduino and Mission Planner communications to control the servos

Now comes the fun part: controlling the servos. I am using a standard Arduino Uno board, which you can purchase just about anywhere. Here is a sample link to an Arduino Uno that costs $25. 

I also added the 'Servo Shield' to my Arduino. It's possible to wire the pins directly to the Uno without the shield, but the shield has the standard servo pin headers.  Here is the link to the servo shield.

To connect the servos to the Arduino, just use the standard connectors that come with the servo. No wiring necessary if you use the Servo Shield. You can pick any servo port on the Arduino, just take care to make sure the software is using the pins that you have selected to control the servos.

The ground wire on the servo (usually black or brown) should go toward the "G"  and the signal side (usually yellow) goes to the "S" label on the Servo Shield. If you do not use the shield, you will have to solder the wires or install your own plug on the Arduino.

For the servo power, I chose to use the power supplied to the Arduino through the USB port; for two standard-size servos this seems to work fine with a lightweight antenna. If the load becomes too big, you could trip the on-board fuse, in which case you may consider an external power supply for the servos. So far, I have not had an issue just using power from the Arduino and the computer's USB port, which supplies 500mA. 

Here is an image of the shield (use the digital servo ports labeled GVS):


Connected to the Arduino (and enclosed in a metal case), the assembled unit looks like this:


Part 3: Make it work with Mission Planner:

So after all this 'hardware' assembly, you will want to test whether it works with Mission Planner. BUT you will need some software running on the Arduino.  Here is the source code that I used:


Everything is done over the Arduino serial port when Mission Planner connects to the Arduino. I chose to hard-code a serial baud rate of 57600.

There are two lines of code that determine which pins on your board will control the servos:


On my board, I used pins 9 and 10 which end up being the first two servo ports on the Servo shield. 
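As a sketch of how those pins are used (assuming the standard Arduino Servo library; the variable names are mine, not from the original source):

```cpp
#include <Servo.h>

Servo panServo;    // pan servo, driving the 5:1 gearbox
Servo tiltServo;   // tilt servo

void setup() {
  Serial.begin(57600);   // hard-coded baud rate, matching Mission Planner
  panServo.attach(9);    // first servo port on the shield
  tiltServo.attach(10);  // second servo port on the shield
}

void loop() {
  // The real sketch parses the servo commands Mission Planner sends
  // over the serial port and writes the interpolated positions here.
}
```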

There is one line of code that controls the pan calibration, which I worked out using trial and error and some calculations so the pan covers 360 degrees with the 5:1 gear ratio:

   interpVal = ((interpVal * 10) / 37) + 50;



The inputs (interpVal) are PPM values sent by Mission Planner. The hard part is done in Mission Planner, as it computes the PPM servo values from the GPS coordinates. To make it work with this setup, you will need to 'tweak' the values for your servos and the gear ratio of the pan gearbox. If you duplicate the setup I have described, you should be able to load the firmware onto the Arduino, connect Mission Planner, and it should work (in theory!). 

Load the Mission Planner and go to the "Initial Setup" tab, and select "Antenna Tracker" as shown below:


Using the default values shown here (and your serial port), you should be able to connect and control the servos using the 'Trim' sliders. If everything is working, you should see the servos move to their extreme positions: pan = 360 degrees, and tilt = 90 degrees, from horizontal to straight up. 

If all is well, then you can try it in the field. Just plug it in and connect; Mission Planner does the rest.

Happy flying!

Read more…


For anybody interested in using a joystick or flight yoke to pilot your ArduPlane: I have successfully configured the CH Products 'Flight Yoke' to work with a Spektrum DX6i, using a PPM converter and a script, written in the programming language supplied with the yoke driver, that lets you change flight modes with the yoke's buttons.

The yoke can be purchased from here

I paid $85 for one and it works great. There is a newer model, but for basic flying, this one has plenty of buttons and knobs to get the job done.

The PPM converter that I used can be purchased here

This is the 'CompuFly' PPM converter, designed for JR transmitters, but it also works with the Spektrum DX6i (and I assume other Spektrum transmitters) by flipping the polarity to 'positive' PPM.

Part 1: Configuring the Flight Yoke:

The main screen of the flight yoke software panel looks like this:

You should test the joystick with your flight simulator before trying to make it work with the PPM converter. My test was done using X-Plane. It's as simple as following the wizard and proceeding to the 'Test/Calibrate' screen to get the yoke set up for normal Windows joystick operation.

Once you have added your yoke to the CH Control Manager, you can then proceed to configure each axis to control your plane. My setup has the following configuration:

Joystick X-Axis (A1) =  Roll 

Joystick Y-Axis (A2) = Pitch 

Joystick Z-Axis (A3) = Throttle

Joystick U-Axis (A4) = Yaw

Joystick Slider-0 = Gear channel (Button 5 and 6) 

Joystick R-Axis = Button modes:

Button 9 = FBWA mode

Button 10 = AUTO mode

Button 11 = RTL mode

Button 12 = MANUAL mode

The modes are (of course) going to be whatever you have set in your ArduPilot, but these are the ones in my default config. The resolution of the mode values can also be modified, and more buttons added to the script for more mode possibilities, but my DX6i is set up for 4 modes. 

For the yaw axis there is no actual joystick input, because I do not have the pedals, so I used a mixing mode: a script adds the roll channel to the U-axis output to simulate the rudder following the roll channel. The rudder mixing applies when flying in MANUAL mode. The ideal simulator would also have the pedals, which are optional; you can buy them as an add-on and combine them with the yoke into a single simulated joystick.

I will describe how to set up the script to achieve the rudder mix, and the button-mode code that makes this setup work with the ArduPilot, for those who do not have the pedals. 

To customize the output of the yoke, the yoke must be configured to run in 'mapped' mode as shown in the following screen:


The "CMS Controls" tab will allow you to select which joystick channel will be used in the scripts and which joystick output will be assigned to that channel. 

My configuration is as follows:

CMS.A3 = R Axis (Button Modes)

CMS.A4 = U-Axis (Yaw Channel) The output is computed in the script. 

CMS.A5= Slider-0 (Gear Channel on radio)

To make these channels programmable, check the "DX Mode" box as shown above.

My transmitter is a 6-channel device; if yours has more, you can pick additional CMS inputs for the other transmitter channels and assign them accordingly.

On the "FS Yoke" tab, the first three channels should be assigned to Roll (X-Axis), Pitch (Y-Axis), and Throttle (Z-Axis) respectively, with the "DX Mode" box checked for these channels. This forces the yoke to pass these channels straight through to your transmitter without any software input. If you want to modify the behavior of these channels in the script, uncheck this box on the "FS Yoke" tab and check the box on the "CMS Controls" tab for that axis. 

Creating the Script:

The script is actually very simple. For the rudder mix, it's one line of code:

 CMS.A4 = JS1.A1/4+32;

This gives you 25% rudder movement relative to the roll when flying in MANUAL mode. You can adjust the mix as you like. If you fly FBWA, you probably already have some rudder mix there, so it's a good idea to match it here. If you don't want any rudder mix, delete this line of code and the rudder will stay fixed in MANUAL mode.  For stunt flying or 'knife edge' tricks, you may want to get the pedals, or try assigning the 'prop' or 'fuel mix' channel to the rudder and use that slider to control it; this is advanced, and I have not attempted such a configuration, but the scripts make it possible.

Here is the entire script:

// CMS Script File
// Game Title:
// Written By:
// Date:

// Rudder mix: 25% of roll (JS1.A1) applied to the yaw output.
CMS.A4 = JS1.A1/4+32;

// Flight-mode values on CMS.A3 -- one per mode button (buttons 9-12
// in my configuration; each assignment is bound to its button event
// in the CM Editor):
CMS.A3 = 0;
CMS.A3 = 110;
CMS.A3 = 190;
CMS.A3 = 255;

// Gear channel (CMS.A5), toggled by buttons 5 and 6:
CMS.A5 = 0;
CMS.A5 = 255;

To cut and paste the script into the joystick, press the "CM Editor' button on the toolbar and paste it in. Then press the "Download" button as shown in this screenshot:


Once you have 'downloaded' the configuration, you can then test it. You should see the following screen after you enter the test mode:


Move the yoke and observe the channel outputs.  "Roll" should move the X and U channels (the U channel is the rudder mix). "Pitch" should move the Y channel, and throttle should move the Z channel. 

The R channel should change when you press the buttons you have configured to control the modes; on my configuration, buttons 9 to 12 control the modes.

Lastly, buttons 5 and 6 should control the S0 (Slider 0, Gear channel). On my plane, this turns on the camera for FPV.

Part 2: Configuring the PPM converter

The CompuFly USB PPM converter has a single screen configuration as shown as follows:


For the Spektrum DX6i, set the modulation to 'Positive PPM' and 6 channels. If you have a different radio, you will have to check the documentation or experiment. For my radio, I have channels 1-4 set to INVERT (the 'inv' check box); these are the control-surface and throttle channels. You will need to experiment to get these right for your plane and transmitter, depending on how your servos are configured with the ArduPilot. For the flight-mode and gear channels, my configuration is non-inverted.

The key part of the configuration not shown here is the joystick-to-channel mapping, which lives in the settings.ini file stored in the same directory as the CompuFly.exe application.

Here is my configuration:






#Channel Axis Numbers
# X axis = 1
# Y axis = 2
# Z axis = 3
# RX axis = 4
# RY axis = 5
# RZ axis = 6
# S1 axis = 7
# S2 axis = 8
# HTX axis = 10
# HTY axis = 11
# SWC axis = 12


You can cut and paste the previous text directly into your INI file. Take care to test this completely with your plane on the ground and the prop removed! If you get the throttle channel backwards, the motor could go to full throttle on you. Be careful here.

To test your setup, plug the PPM converter into the 'Trainer' port on the back of your transmitter, then turn on the transmitter and power up your plane. Flip the 'trainer' switch on the DX6i, move things around, and see what happens! For my transmitter, I replaced the trainer switch with a more robust toggle switch after the original one broke off. 

If everything is good, you should be able to control everything using the Flight Yoke, including the flight modes. 

Happy Flying!

Read more…