
This blog is a continuation of my previous post.

How to build a High-Definition FPV UAV using a Raspberry PI with HD camera, using a high speed WiFi link

This post will discuss how to use GStreamer and Mission Planner together to display the HD video with a HUD (Head-Up-Display).

Note: I have only tested this feature on Windows so the instructions given here are for Windows only. 

To give proper credit, the HUD used here was borrowed from APM Planner, a Qt-based app similar to Mission Planner. It is based on the QML HUD created by Bill Bonney, who is on the APM Planner development team. To make the HUD work with the background video, I used a GStreamer library called "QtGStreamer", which integrates GStreamer plugins with painting on a Qt widget. This library is available on the GStreamer website.

The end result is dynamically added to Mission Planner using the plug-in architecture. 

In the previous posts I discussed using a Raspberry PI and a high-speed WiFi link, with GStreamer running on both the PI and the ground station PC.  To get the HUD to work, you need to already have a working link with the video displaying on your ground station. 
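
For reference, the PI-side sender from that setup looked something like the line below. This is only a sketch: the resolution, frame rate, bitrate, port, and ground station IP are placeholders, and the exact commands are covered in the previous post.

raspivid -t 0 -w 1280 -h 720 -fps 30 -b 2000000 -o - | gst-launch-1.0 fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=<ground-station-IP> port=9000

This sends the camera's H264 stream as RTP over UDP to the ground station, which is what the receiving pipeline described below expects.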

Here are the steps to follow to install the plugin:

1) Install Mission Planner.

2) Download and install GStreamer from this link.  Use the x86 version; the x86_64 version will NOT work. Use the default path 'C:\GStreamer' when installing, select the 'Custom' install, and select ALL plugins to be installed. (A quick way to sanity-check the install is shown just after these steps.)

3) Follow the steps in the previous blog noted above to get your video stream working.

4) Download the MSI installer from this link and run it.
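
A quick way to sanity-check the GStreamer install from step 2 (assuming the GStreamer 'bin' folder is on your path, or you open a command prompt inside it) is to run:

gst-inspect-1.0 avdec_h264

If the element details print instead of a "No such element or plugin" error, the H264 decoder used in the pipeline below is available. You can also display a test pattern to confirm video output works:

gst-launch-1.0 videotestsrc ! autovideosink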

If all went well, you should have the plugin installed.

Open Mission Planner and navigate to the "Flight Data" page and right-click on the map. You should see a menu item called "GStreamer HUD" as shown below:

[Screenshot: right-click menu showing the "GStreamer HUD" item]

Select this menu item and the following screen should appear:

[Screenshot: the GStreamer HUD display window]

In the upper-left corner is a context menu where you enter your GStreamer pipeline string. If you already had the video displaying without the HUD using a valid pipeline, enter that same pipeline here.

Note: The GStreamer pipeline string should be exactly the same as the one you used before, but WITHOUT the final video sink element. The plugin automatically adds its own QtGStreamer video sink, so simply remove the last (sink) element from the end of your pipeline.

Here is an example string I used on my setup:

udpsrc port=9000  buffer-size=60000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! queue ! avdec_h264
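
For comparison, if you were testing the same stream standalone with gst-launch-1.0, the pipeline would end with a video sink, for example (autovideosink is just one possible sink, and the videoconvert stage may not be needed on your setup):

gst-launch-1.0 udpsrc port=9000 buffer-size=60000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! queue ! avdec_h264 ! videoconvert ! autovideosink

The trailing sink stage (videoconvert ! autovideosink here, or whatever sink you used) is exactly the part you leave off when entering the string into the plugin.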

If all is well, you can connect to your UAV and see the HUD elements moving.  To change the HUD, right-click on the display and select which elements you want to display. The default is to display everything shown here. 

If anybody has problems getting it to work, please post back and I'll update the blog in case I missed something.

Happy Flying!


Comments

  • Thanks. Let me run some tests. I need to be certain I have the same hardware on both sides.

    For our mission applications, latency is not even relevant. Our concern was that many of the students were not able to get GStreamer working, particularly with the current Raspbian.

  • @Ronald,  if you could describe your exact setup, with camera and commands used to get the UV4L stream going, I could experiment with it.

    In my experience streaming from the Raspberry PI, the bottleneck is the encoding/decoding time, not the stream encapsulation. If you have an H264 stream sent as RTP over UDP, it should work as-is with the correct pipeline string.  I am not sure where you gain a latency reduction by using a different driver, since the encoding/decoding is still the same and accounts for about 90% of the latency.

  • Hi Patrick,

    We can use the UV4L driver with H264 encoding and still have very low latency. What is required to create a Mission Planner HUD?

    Thanks,

    Ron


  • @Ronald,  

    The lower latency is due to the fact that your UV4L driver is using MJPEG and not H264.  The issue is that this more than quadruples the bandwidth requirements.  If you are using WiFi to connect to your drone and only want short range, this will work fine, but the more bandwidth you use, the greater the SNR (signal-to-noise ratio) you will need to keep the connection up.  

    Just wondering what kind of issues you are having with GStreamer? I think the later PI images have it already installed. 

  • Hi Patrick,

    Not sure I can answer your question with confidence. We have a weekly Drone Coders workshop. This week some kids presented a method largely described on Instructables. Because so many of us have struggled with GStreamer, we were excited about the simplicity and robustness of this approach. I am including here the links we used:

    "I hope you all enjoyed today's Drone Coders Workshop. Streaming video from Raspberry Pi to your ground control system (via router) is a big step towards integrating streaming video into your drone.
    Paul, Anshul, and Vikram contributed to this project. Michael and Grey have added considerably.
    We hope to continue this with integration into the aircraft and application of image pattern recognition. 
    Following are links to the software we discussed, 
    Let us know if you experience any problems, difficulties, or surprising successes!"


  • @Ronald,  I don't have any experience with UV4L on the PI. Are you using H264 or ?? with this driver?

  • Help with integrating the Raspberry Pi and the UV4L driver into Mission Planner would be much appreciated. It is much easier to set up than GStreamer and seems to have less latency.

  • Thanks Patrick, I can share .STL file with you or on thingiverse. 

  • @Alp, looks great.  Are you going to publish it for 3D printing? I just built a 3D-printed quad with the PI2/PI camera, but wanted it mounted to a gimbal, so I split the camera and mount.  Here's a link.

  • @Patrick

    I just finished a prototype of my diy-hdlink

    Compute Unit: Raspberry PI 2 
    Camera: Raspberry PI Camera
    Wireless: 802.11ac (approx. 625mW)
    Power input: 7-26V

