This blog is a continuation of my previous post.
How to build a High-Definition FPV UAV using a Raspberry PI with HD camera, using a high-speed WiFi link
This post will discuss how to use GStreamer and Mission Planner together to display the HD video with a HUD (Heads-Up Display).
Note: I have only tested this feature on Windows so the instructions given here are for Windows only.
To give proper credit, the HUD created here was borrowed from APM Planner, a Qt-based app similar to Mission Planner. The HUD itself is based on the QML HUD written by Bill Bonney, who is on the APM Planner development team. To make the HUD work with the background video, I used a GStreamer library called "QtGStreamer", which integrates GStreamer with Qt so the video can be painted onto a Qt widget. This library is available on the GStreamer website.
The end result is added to Mission Planner dynamically using its plug-in architecture.
In the previous posts I discussed using a Raspberry PI and a high-speed WiFi link, with GStreamer running on both the PI and the ground station PC. To get the HUD to work, you need to already have the video streaming successfully to your ground station.
Here are the steps to follow to install the plugin:
1) Install Mission Planner.
2) Download and install GStreamer from this link. Use the x86 version; the x86_64 version will NOT work. Use the default path (C:\GStreamer) when installing, choose the 'Custom' install, and select ALL plugins to be installed. (A quick way to verify the install is shown right after this list.)
3) Follow the steps in the previous blog noted above to get your video stream working.
4) Download the MSI installer from this link and run it.
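If you want to sanity-check the GStreamer install before going further, open a command prompt in the GStreamer bin folder (under C:\GStreamer; the exact sub-folder depends on the installer version) and run gst-inspect. This is optional, but it confirms the runtime and the plugins used in the example pipeline further down are present:

gst-inspect-1.0 --version
gst-inspect-1.0 udpsrc
gst-inspect-1.0 avdec_h264

Each command should print version or plugin details rather than an error.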
If all went well, you should have the plugin installed.
Open Mission Planner and navigate to the "Flight Data" page and right-click on the map. You should see a menu item called "GStreamer HUD" as shown below:
Select this menu item and the following screen should appear:
In the upper-left corner is a context menu; this is where you enter your GStreamer pipeline string. If you already had the video displaying without the HUD using a valid pipeline, enter that pipeline here.
Note: The GStreamer pipeline string should be exactly the same as the one you used before, but WITHOUT the final video sink element. The plugin adds its own QtGStreamer video sink automatically, so simply drop the last (sink) element from your pipeline.
Here is an example string I used on my setup:
udpsrc port=9000 buffer-size=60000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! queue ! avdec_h264
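For comparison, when testing the same stream outside of Mission Planner (as in the previous post), the stand-alone pipeline would look something like the line below, with a video sink on the end. The videoconvert and autovideosink elements here are just one common choice of sink; the plugin string above is simply this pipeline with the sink portion removed:

gst-launch-1.0 udpsrc port=9000 buffer-size=60000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! queue ! avdec_h264 ! videoconvert ! autovideosink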
If all is well, you can connect to your UAV and see the HUD elements moving. To change the HUD, right-click on the display and select which elements you want to show. The default is to display everything shown here.
If anybody has problems and cannot get it to work, please post back and I'll update the blog in case I missed something.
Happy Flying!
Comments
@Ben, I forgot to answer your question about the recording. If you want to control the RPI via a switch, this is very possible. Or you can have the RPI listen for commands in the MavLink stream to trigger some event and run a script. In fact, I am experimenting with this now with a GoPro over the RPI WiFi link, and I'll be posting instructions on how to do this when I get it all working. Channel 7 on your transmitter can be configured to trigger a relay, which is connected to the RPI digital input pins. A service running on the PI then triggers a photo capture from the PI camera, or from an external camera like the GoPro. Since the GoPro is WiFi capable, you can connect it to the PI (configured as a WiFi access point) and control it remotely. The script can also record the GPS coordinates for geo-tagging the images. With the PI camera you can geo-tag the images on the fly, but with the GoPro you have to do some post-processing.
@Ben, My last response was a bit confusing. You still need a GCS computer, just not GCS software. If you are sending analog video, you won't need the RPI. The entire setup described here assumes you are using a digital link, either plain WiFi or a MIMO configuration using Ubiquiti hardware (or a similar setup).
For all of my multi-rotor projects I now use a WiFi link with an antenna tracker. I tested my quad yesterday to 1.5 miles with an ordinary WiFi adapter I bought at Fry's Electronics, and I still had around 70% RSSI. I think I can go to 3+ miles with this setup. 3D Robotics advertises that they can get 1/2 mile out of their WiFi link, but that's without a tracker and a high-gain antenna.
My advice would be to try WiFi, but that's up to you.
Thank you for the incredibly quick reply! However, I will not be using WiFi at all in the build, just sending the video signal with a 5.8 GHz transmitter. Would all I need to do then be to run the stand-alone version of the HUD on the RPI and output the video to the transmitter?
Also, would I be able to map "take a picture" and "start/stop recording" to switches on my controller?
FYI, this is the link to the stand-alone version of the HUD that will work without MissionPlanner.
https://www.dropbox.com/s/ssfxi5kwh93c0iu/GStreamerHUD.msi?dl=0
@Ben, If you are using a Raspberry PI with a webcam, GStreamer will do the job for both the PI camera and a webcam. I have experimented with the Logitech C920 myself, and a good number of folks in the community use it successfully with GStreamer on the PI.
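For anyone who wants a starting point, here is a rough sketch of a C920 sender pipeline on the PI using the camera's built-in H.264 encoder. It assumes the camera shows up as /dev/video0 and that the GCS is listening on UDP port 9000; replace <GCS IP> with your ground station's address and adjust the resolution/framerate as needed:

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-h264,width=1280,height=720,framerate=30/1 ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=<GCS IP> port=9000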
You don't need any GCS to use the HUD. The HUD will listen for MavLink messages on UDP port 14550 (the standard port), so if you configure your WiFi connection from your UAV to send UDP packets to this port on your GCS, it should work. I don't think QGroundControl has the HD HUD incorporated, but you can use it with the stand-alone version of the HUD, or with MissionPlanner. The HUD is actually a cut-down version of APM Planner, which was forked from QGroundControl.
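One way to get MavLink onto that port is to run MAVProxy on the PI and let it forward the telemetry over the WiFi link. This is just a sketch; it assumes MAVProxy is installed on the PI and that the flight controller's telemetry serial port is /dev/ttyAMA0 at 57600 baud. Replace <GCS IP> with your ground station's address:

mavproxy.py --master=/dev/ttyAMA0 --baudrate 57600 --out <GCS IP>:14550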
Patrick,
I will be attempting something very similar, although it might not be as complex, and I need a little advice. I will be running a quadcopter with a Pixhawk and an RPI, and would like to stream the webcam feed, including a HUD overlay (similar to the DJI OSD information), to a live-feed monitor via a 5.8 GHz transmitter on the quad. Would I need to run GStreamer on the RPI? Would QGroundControl work? In the future I will most likely have a similar setup with a laptop running Mission Planner, but for now all I need is the video/telemetry feed sent to a simple monitor via a 5.8 GHz transmitter.
Any advice would be greatly appreciated, thanks!
@Kevin, Glad to hear you have narrowed it down. I would try uninstalling GStreamer on the misbehaving machine, and then try installing the HUD and see what happens.
Tried a different Windows machine (i7), using the same pipeline and the same version of GStreamer (1.4.5), and it works just fine. I'm about to write it off as a hardware issue with that box. Thanks for all the help.
I moved to my Mac and fired it up with the same pipeline I was using on the Windows GCS, and it works just fine. So I really do think the Windows GStreamer setup or the machine itself is to blame. I find that odd, since TCP will fire up the window with no issue. It is really not a powerful machine (HD 4000 GPU, Intel Celeron 1007U @ 1.5 GHz, 4 GB RAM, Windows 8.1). I may try another Windows machine and see what happens.
@Kevin, A note on the config-interval. If it's missing, the pipeline on the GCS may never start because it never receives the configuration parameters (the H264 SPS/PPS headers). I have seen this happen on newer versions of GStreamer: on older releases the config-interval default was 1, and now it's 0, which means the parameters are only sent once at startup.
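For reference, here is roughly what the sender side looks like with config-interval set explicitly on rtph264pay, assuming the PI camera and raspivid as in the previous post (adjust the resolution, bitrate and address to suit your setup). With config-interval=1 the SPS/PPS headers are re-sent every second, so a receiver that starts late can still lock on:

raspivid -t 0 -w 1280 -h 720 -fps 30 -b 2000000 -o - | gst-launch-1.0 fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=<GCS IP> port=9000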