Google Cardboard Head-Up Display for Android


Google Cardboard your Drone!

This is a new Android app for running a heads-up display on a smartphone, using Google Cardboard glasses to hold the phone.

I have posted a free version on Dropbox at

Your phone will need to have side-loading enabled to install the app; search online for instructions on how to side-load an app.

This app overlays the HUD on a video stream, using GStreamer, in two distinct panes for viewing in the Google Cardboard viewer. The app supports both single and dual video streams, and depending on the processing power of your phone, it is possible to get a stereo view using multiple cameras.

The app automatically listens for a MAVLink data stream on UDP port 14550 (the default UDP port for MAVLink) for the HUD telemetry items. To get both video and telemetry, you will need to supply both data streams to the phone; for a simple WiFi camera without telemetry, you can connect directly to the camera for live video.
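As a sketch of how telemetry might be supplied, MAVProxy on a PC or companion computer can forward the flight controller's stream to the phone's port 14550. The device path, baud rate, and phone IP below are placeholders; substitute your own:

```shell
# Forward flight-controller telemetry to the phone's default MAVLink port.
# /dev/ttyUSB0, 57600, and 192.168.1.50 are placeholders for your
# telemetry radio and your phone's address on the same network.
mavproxy.py --master=/dev/ttyUSB0 --baudrate=57600 --out=192.168.1.50:14550
```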

Scenario #1: Simple connection to WiFi Camera


In the app's video setup menu (top-left toolbar), supply this string for the GStreamer pipeline:

souphttpsrc location= ! jpegdec

This string is for the Sony QX10. For other cameras, you will need to work out how to connect and fetch the live stream using a standard GStreamer pipeline. I suggest experimenting with GStreamer on a PC first to discover the correct string, then using it in the Android app.
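When experimenting on a PC, you can verify a pipeline by appending a local video sink instead of relying on the app. For example, for a motion-JPEG camera (the URL here is a placeholder; substitute your camera's actual stream address):

```shell
# Desktop test: fetch an MJPEG stream over HTTP, decode it, and display
# it in a local window. http://CAMERA_IP:PORT/stream is a placeholder.
gst-launch-1.0 souphttpsrc location=http://CAMERA_IP:PORT/stream ! \
    multipartdemux ! jpegdec ! autovideosink
```

Once the stream displays correctly on the PC, drop the `autovideosink` portion and use the remaining source/decode string in the app.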

For a Kodak PixPro SL10 camera, I found this string to work:

souphttpsrc location= ! multipartdemux ! jpegdec

In the UI configuration menu (top-left toolbar), select "Split Image" to show the same image on both viewing panes. This is for a single-camera configuration.

Scenario #2: Companion PC (Raspberry Pi) with camera and Pixhawk/APM telemetry


There is a blog series currently running about using the Raspberry Pi 2/3/Zero as a companion computer, so I will not give details here on how to set up the Pi to get telemetry from the Pixhawk/APM telemetry port; please refer to those pages for instructions.

If you have a Raspberry Pi with a camera, you can use this setup:

On the PI:

raspivid -t 0 -w 1280 -h 720 -fps 40 -b 4000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse config-interval=1 ! rtph264pay ! udpsink host=$1 port=9000

The "$1" argument is the IP address of your smartphone. This command streams 720p, but you should consider the resolution of your device and adjust accordingly; it makes no sense to send 720p video to a phone with an 800x600 display.
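For a lower-resolution display, a reduced variant of the command above might look like this (the 800x600 size, 30 fps, and 2 Mbit/s bitrate are illustrative values; tune them to your device):

```shell
# Hypothetical lower-bandwidth variant for roughly 800x600 displays.
# PHONE_IP is a placeholder for your phone's address.
raspivid -t 0 -w 800 -h 600 -fps 30 -b 2000000 -o - | \
    gst-launch-1.0 -v fdsrc ! h264parse config-interval=1 ! \
    rtph264pay ! udpsink host=PHONE_IP port=9000
```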

In the Android app, configure the GStreamer pipeline with this string:

udpsrc port=9000  ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! queue ! avdec_h264

For telemetry, I suggest running MAVProxy on the Raspberry Pi and sending the UDP stream to the phone's IP address on UDP port 14550. The Raspberry Pi should be acting as a WiFi access point, and the phone should be configured with a static IP address.

If you want to display data on a ground station PC and in the app at the same time, you can use MAVProxy to split the UDP stream and then connect to your UAV using Mission Planner.

Here is a sample command that I use to split the data stream with MAVProxy on the Raspberry Pi:

mavproxy --master=/dev/ttyAMA0 --baudrate=115200 --out= --out=

In this case, the phone is at IP address (a static IP address) and Mission Planner is running on a PC at  Check your phone's documentation on how to set up a static IP for WiFi; it's usually in the 'Advanced' section of your phone's settings. You will also need to configure your Raspberry Pi's WiFi access point with a static IP address. The actual MAVProxy settings depend on how you set up your Pixhawk/APM, so choose the baud rate you have configured in the flight controller.
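As an illustration with made-up addresses (the 192.168.42.x values below are purely hypothetical; substitute your own network), a split to both the phone and a ground station PC might look like:

```shell
# Hypothetical split: phone at 192.168.42.10, ground station PC at
# 192.168.42.20, both on the Pi's WiFi access point. Bare IP:port
# targets in --out are sent as UDP.
mavproxy.py --master=/dev/ttyAMA0 --baudrate=115200 \
    --out=192.168.42.10:14550 \
    --out=192.168.42.20:14550
```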

Another thing to consider is the processing power of your device. If you see a lot of pixelation, your device is too slow, and you will need to reduce the resolution and/or bitrate of the transmission. Some smartphones are simply too slow to display the video at full resolution. I have tested 720p on my Google Nexus 6 and it works fine, but on a cheap tablet I had to slow it down and switch to 800x600.

In part two of this blog, I will explain how to set up the dual video stream and more advanced configurations for high-power/long-range FPV using an antenna tracker and the Ubiquiti Nano/Rocket M5 ground station access points.

Happy Flying!



  • There's an update to the app on the Dropbox. A few new features, like stream swapping, have been added.

  • Managed to get the Sony QX1 working with:

    souphttpsrc location= ! jpegdec

    The difference between the QX1 and QX100 is the IP address, with port 8080 and /liveviewstream added.

  • Thanks Patrick, that's great! No, they don't show in the original either. I'm sure the scroll bars for contrast etc. were working a while ago; I must have lost them in a firmware update.
  • I have posted an update to the original HUD which includes the ability to swap the colors.

  • @Glenn, not sure why the scroll bars don't show.  Do they show on the original single stream HUD?


    I'll post an update to the old HUD in a couple of days to fix the color matrix issue.

  • Hi guys, thanks for the tips. But there seems to be something wrong on my phone: none of the scroll bars are visible (see screenshot).

    Also, Patrick, is it possible to swap the color matrix on your original single stream HUD?

  • @Jon, good point, I forgot to mention that portrait mode works better because it doesn't hide the scroll.

    Also, one other feature that I didn't mention: if you have strange colors, you can swap the color matrix and usually get the correct one. BT.601 is the default, but for the Raspberry Pi and other H264-encoded streams, the GStreamer components do not report the correct value for the matrix, so you have to swap it manually.

    Just select 'Swap Color Matrix' from the UI menu.



  • @Glenn Gregory

    On my phone (Sony Aqua M4) I found it easier to enter and amend input errors to the pipeline in portrait mode. The pop up scroll button was hidden by the virtual keyboard in landscape orientation.

    The Sony QX100 also works with

  • @Glenn, one other note on the pipeline strings: if you enter a second string, it will be saved, and you can select between previous strings by clicking the dropdown button near the delete button.

  • @Glenn, on my phone I have Android Lollipop, and I can scroll by pressing on the text: a button pops up which you can slide through. Which version of Android are you running? Those strings are actually stored in a Qt settings database, so you can't edit them directly, but I may change the UI to make it easier.


    Possibly in the next release I could add a 'swap stream' feature. 

