Hey guys,

a long time has passed since the last update on OpenFPV.

Thanks to the community for the great feedback, and special thanks to all the people around the world who wrote me feedback mails, joined the project, or donated money. I really appreciate your interest in the project.

It has been very quiet around the project these past months, and the reason is very simple:

I do not want to ship crappy unusable software.

My vision for OpenFPV:

OpenFPV should be a simple, intuitive application for all desktop platforms: easily extendable, stable, and well designed. And "easily extendable" was the biggest challenge. Do you really want to wrestle with C++? Do you want to build your GUI with Qt?

Imagine you could create a dedicated data channel on the TX side with Python and receive the data on the RX side with simple JavaScript. Lay out your UI with HTML5 elements. If you want to go deeper, why not edit the GStreamer pipeline yourself, without recompiling the whole application? Create your telemetry module in no time.
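To make this concrete, here is a minimal sketch of what a TX-side data channel could look like. Everything in it (the address, the port, the JSON framing, the field names) is an assumption for illustration, not OpenFPV's actual module API:

```python
# Hypothetical TX-side data channel: push telemetry readings to the
# ground station as JSON over UDP. Address, port, and message format
# are made up for this sketch; the real OpenFPV API may differ.
import json
import socket
import time

GROUND_STATION = ("192.168.1.10", 6000)  # assumed RX address

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    message = {
        "type": "telemetry",
        "timestamp": time.time(),
        "battery_v": 11.8,  # placeholder sensor reading
    }
    sock.sendto(json.dumps(message).encode("utf-8"), GROUND_STATION)
    time.sleep(0.1)  # 10 Hz update rate
```

On the RX side, a JavaScript module would simply parse the same JSON out of a WebSocket and bind it to HTML5 elements.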

The trouble:

It was insanely difficult to get video played back inside the WebKit browser engine without adding delay. It is not possible with HTML5 video, Flash, or WebRTC; the biggest problem is their buffering. Even Flash (the most stable technology for live streams inside a browser) cannot display the video fast enough for serious FPV flying. My OpenFPV workspace directory is full of prototypes, ideas, and more prototypes. After months of fighting with myself and my vision, I came to a dead simple idea: convert the H264 stream to a format like MPEG-1 and decode it with JavaScript.
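To give you an idea of the transcoding step, here is a rough sketch of such a pipeline driven from Python. It assumes the gst-libav elements (avdec_h264, avenc_mpeg1video) are installed and that the H264 stream arrives as RTP on UDP port 5000; this is an illustration, not the exact OpenFPV pipeline:

```python
# Sketch: receive RTP/H264, decode it, re-encode it as MPEG-1 in an
# MPEG-TS container, and expose the result on a local TCP port.
# Element names, caps, and ports are assumptions for illustration.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

pipeline = Gst.parse_launch(
    'udpsrc port=5000 caps="application/x-rtp,media=video,'
    'encoding-name=H264,payload=96" ! rtph264depay ! avdec_h264 ! '
    "videoconvert ! avenc_mpeg1video bitrate=1500000 ! mpegtsmux ! "
    "tcpserversink host=127.0.0.1 port=8081"
)

pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()  # keep the pipeline running
```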

Maybe you think: "What the ... ?"

Don't worry, that was my first thought too. I started to write an MPEG-1 decoder in JS until I stumbled upon a little-known project on GitHub which does exactly this job. After integrating the GStreamer pipelines, the transcoding, the sockets and WebSockets, and this library, I displayed the first output inside the browser: low CPU load, good quality, and best of all, no noteworthy additional lag. Sometimes the simple ideas are more difficult than the complex ones.
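The glue between such a pipeline and the browser can be as small as a WebSocket fan-out. A minimal sketch, assuming the Python websockets package (10.1 or newer) and the port numbers from the pipeline sketch above:

```python
# Hypothetical relay: read the MPEG-TS byte stream from the local
# GStreamer TCP sink and broadcast it to all browser clients over a
# WebSocket, where a JS decoder like the one mentioned above picks it up.
import asyncio
import websockets

clients = set()

async def handler(websocket):
    # Register a browser client; it only listens, never sends.
    clients.add(websocket)
    try:
        await websocket.wait_closed()
    finally:
        clients.discard(websocket)

async def pump():
    # Connect to the GStreamer tcpserversink and fan the stream out.
    reader, _ = await asyncio.open_connection("127.0.0.1", 8081)
    while True:
        chunk = await reader.read(4096)
        if not chunk:
            break
        websockets.broadcast(clients, chunk)

async def main():
    async with websockets.serve(handler, "0.0.0.0", 8082):
        await pump()

asyncio.run(main())
```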

Demo:

From this Iron Bird I created the first prototype, and I decided to give you guys a little update on the project. I'm excited to hear your thoughts and ideas.

- Tilman


Comments

  • 2.4GHz WiFi does not have to interfere with 2.4GHz radio control. I just set the WiFi to channel 13, which is at the upper end of the 2.4GHz band. The radio control works from the lower end and has heaps of band to hop around in below this.

    We have flown heaps with 1000mW 2.4GHz WiFi and a stock Taranis on 2.4GHz with no problems at all.

  • Fantastic work, thanks for the update.

    I am moving RC control and telemetry onto the 433MHz band at the moment, to free up 2.4GHz for video data transmission (guide on http://www.itluxembourg.lu/site/). Then a strong 2.4GHz transmitter can be hooked up to a Raspberry Pi / camera (http://www.amazon.com/Alfa-AWUS036NHR-High-Gain-Omni-Directional-Wi...).

    Great out-of-the-box thinking. Love how you used JavaScript to allow it to be used on any device.

    Varonis
  • Are you Ali G? :)

  • Something like the Pi will be the next evolution in flight controllers. I believe the next big thing will be something like a Pi with camera, OSD, and today's other extras in one bundle. This and the other Raspberry Pi flying projects are part of the next big thing.

  • Tilman,

    I used OpenCV called from within Python to capture the image. I think it captures the image in H264 format (this is the reason I chose this camera) but I never specifically checked.

    Lag was an issue originally, so to measure it I created a Python/OpenCV script that would print the system time on the console and simultaneously on the latest captured image from the camera. I then pointed the camera at the computer screen which displayed the console, so the image showed both times on it, and I could just subtract one from the other to get the lag. In this way I was able to measure that the lag was no more than 0.1 seconds (100ms). That was good enough for what I was doing but perhaps not fast enough for FPV. Unfortunately I didn't save the script so I can't easily share it, but hopefully you get the idea.
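    Something like this minimal reconstruction of the idea (camera index, font, and timestamp format are arbitrary):

    ```python
    # Rough reconstruction of the lag test described above: print the
    # current time to the console and stamp it onto each captured frame,
    # then point the camera at the screen so one frame shows both times.
    import time
    import cv2

    cap = cv2.VideoCapture(0)  # camera index is an assumption

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        now = time.time()
        print(f"console time: {now:.3f}")
        cv2.putText(frame, f"{now:.3f}", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
        cv2.imshow("lag test", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()
    ```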

    By the way, here's a video from the onboard camera. I only ran the updates at 5Hz so it appears very sped up.

    To be honest, I'm a little worried that the RPi you're using isn't fast enough. The Odroid is about 20x faster if you count both the faster CPU and its 4 cores against the RPi's single core. Anyway, it's early days, so you can perhaps cross that bridge when you come to it.

    rmackay9/ardupilot-balloon-finder
    Code meant to be run on an Odroid to allow an ArduCopter Pixhawk based multicopter to find red balloons for Sparkfun's AVC 2014 competition.
  • Thanks Randy! The problem with 2.4GHz is that it is the same frequency as my remote control. They interfere, and I want to avoid that. A possible problem with 5GHz is the range compared to 2.4GHz.

    Thanks for the C920 tip. Did you use the internal H264 encoder?

  • Tilman,

    I'm interested in the next step of how you'll get the video down to the ground station. It sounds like you're going for a Wi-Fi connection? I'm sure that's the right choice, although it probably won't let you fly as far as a traditional FPV system. I guess you've considered the 5GHz vs 2.4GHz question? I'm no expert in this area, just wondering.

    Re the C920 camera: this is what I used for the red-balloon popper, and I found that the camera was very laggy unless you read from it at 15Hz or more.

    Best of luck!

  • Thank you guys for the feedback. I'm looking forward to giving you more updates in the coming month.

    @Gerard The video is transmitted by the Raspberry Pi and its hardware H264 encoder. On the receiver (a MacBook Pro at the moment) the H264 decoding is done by the avlib H264 software decoder. I am trying to get it decoded with hardware acceleration too, but that depends on the libraries available for OS X / Windows / Linux. Maybe it is possible to run the RX side on a second Raspberry Pi and use its hardware H264 decoder.
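    If the RX side does end up on a second Raspberry Pi, the software decoder could simply be swapped for the Pi's hardware element. A sketch, assuming gst-omx is installed and the H264 stream arrives as RTP on UDP port 5000 (not my exact setup):

    ```python
    # Sketch: RTP/H264 receive path decoded by the Raspberry Pi's
    # hardware decoder (omxh264dec from gst-omx) instead of avdec_h264.
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)

    pipeline = Gst.parse_launch(
        'udpsrc port=5000 caps="application/x-rtp,media=video,'
        'encoding-name=H264,payload=96" ! rtph264depay ! h264parse ! '
        "omxh264dec ! videoconvert ! autovideosink"
    )

    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()
    ```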

    @Tearig My current focus is on raw data transmission via UDP sockets and displaying the data with UI elements. Rendering the telemetry stream over the video stream will require some alpha channels / keying. In general that should be possible, but it requires a modification of the decoder on the playback side.

    Some additional stuff to the post:

    The next steps are 5GHz networking (depending on available hardware), adding data channels and a HUD module for the MinIMU-9, and testing the Logitech C920 camera inside the setup.

  • Great progress. Any thoughts on sending the TX data signal digitally encoded, to be decoded at the base station? The video stream would then be rendered, and the telemetry data streamed in parallel could be picked up and rendered over the video signal... This is something that could really make FPV better. Looking forward to your updates.

  • Latency in video is the biggest issue. Great job!
