I have worked hard in my spare time to improve OpenFPV. I have received a lot of mail from interested people around the world and can't wait to share the latest results with you. Your support is amazing, and my goal is to bring this project alive as soon as I can. Special thanks to everyone who donated a few bucks to the project; every dollar helps me a lot.
A quick overview of the last month:
- MinIMU-9 v2 Gyro, Accelerometer, and Compass test integration
- Further latency optimisation
- 160° Wide Angle Fish Eye Camera
- 5 GHz transmission tests
- Oculus Rift DK2 Support (experimental)
- 720p at 40 FPS Tests
- Source code restructuring
- Optimised video rendering
Pro:
- The gyro integration works pretty well so far
- Telemetry overlay works well with the gyro
- The gyro data transmission works even if the video feed is broken (separate data channels)
- The latency is now stable over hours and pretty promising
- The wide-angle camera does its job well, nothing else to say
- The new 5 GHz (diversity) adapter gave me really good results and is affordable (20 € each)
- The Oculus Rift was mind-blowing - the image is so big that I can't describe it
- The source code is now better structured and modularised
- 720p (1280x720) works well on the ground, not tested in the air
Con:
- The CPU load is too high on the ground station (20% / 400%, 640x480 H.264, 2.3 GHz i7, macOS)
- No hardware decoding for the video stream
- No telemetry data for the Oculus at the moment
- No fragment shader for the Oculus Rift distortion
- Bad weather in Germany
The Roadmap for 2015:
- Between January and February I will release first access to the project. This version should be stable enough for testing.
- Between May and June I will release an updated version with improvements from the community.
- I am targeting August 2015 for a first "release*".
* What "release" means:
- Easy to install
- Documentation
- A ready-to-fly image for the Raspberry Pi to download
- Maybe sell a complete setup online (Amazon)
(Please note that all dates are subject to change, depending on issues, bugs, and time)
Some pictures from the lab:
Me holding a prototype with the 5 GHz diversity adapter and the 160° fisheye camera. With this setup I can even write messages on my phone through the Oculus live feed. I can't show you what it looks like, but everyone who has tested it in my lab was amazed.
Screenshot from the Oculus mode. *Note the missing fragment shader for the Rift distortion; this is not 3D (but possible in the future - for now you will see a huge 2D screen in front of you)
The prototype with the attached 5 GHz adapter, watching birds.
I have no in-flight footage yet; the main reason is the bad weather here. But I will do some recordings and pictures when the weather is a bit better and I have enough time and a working drone / plane*, you have my word.
* My last plane crashed and destroyed one OpenFPV setup during testing
TL;DR:
- First release for developers between January and February 2015
- First "consumer" release summer 2015 (maybe later)
If you have any questions, please leave a comment below or write me: contact@openfpv.org
Regards,
Tilman
Comments
Where's the code? This project is called OpenFPV but I find nothing at https://github.com/openfpv (actually, all the 'fork on github' links on the github.io site redirect back to this post on DIY Drones).
I really think that if you are asking for donations and describing a project as 'open source', the code should be 'published early and often'.
For example, here's a more open FPV project than "Open FPV": http://diydrones.com/profiles/blogs/fpv-setup-with-raspberry-pi
Hi Tilman, thanks for the info. Hope we can find a way to get rid of the flimsy Pi camera :-)
Also, have you considered two Raspberry Pis - one with the camera, one with the monitor?
Thanks for the info. Will OSD data also come over WiFi?
Thanks @benbojangles. Basically it is a single-board computer (like the Raspberry Pi) with a camera attached. This sends the signal down to the ground over standard WiFi or HSDPA. On the ground you currently need a laptop, yes. What makes it different from other projects is that you can easily add / remove on-screen data with standard HTML5 and a bit of JS if you like.
I try to get the latency as low as possible by optimising all the settings (encoder, decoder, framerate, resolution, etc.), and it should be possible to fly a drone with the current results (latency around 120 ms). The WiFi range depends on the antennas used, the frequency (2.4 or 5 GHz), the Tx/Rx power and the environment. I have received video feeds from 3000 meters line of sight, but around 500 meters seems more realistic. Did this info help you?
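As a rough illustration of why frequency and distance matter for range, here is a free-space path loss sketch. This is only a back-of-the-envelope model, not an OpenFPV measurement: it ignores antenna gain, fading and obstacles (which dominate in practice), and uses 5.8 GHz as a representative 5 GHz-band channel.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# Path loss grows ~6 dB per doubling of distance, and the 5 GHz band
# loses roughly 7.7 dB more than 2.4 GHz at the same range, which is
# why 2.4 GHz gear tends to go further with comparable power.
for d in (500, 3000):
    print(f"{d:5d} m -> 2.4 GHz: {fspl_db(d, 2.4e9):6.1f} dB, "
          f"5.8 GHz: {fspl_db(d, 5.8e9):6.1f} dB")
```

In free space the jump from 500 m to 3000 m costs about 15.6 dB, so the real-world gap between "500 m seems realistic" and "3000 m line of sight" comes down to link margin, antennas and terrain.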
Hello, can you please tell me: is this basically mounting an IP camera / WiFi camera on a vehicle and sending the signal to a laptop that has an Oculus Rift headset connected?
I'm a little bit confused.
How do you plan on solving the latency problem? And how much signal range do you hope to achieve (is it standard WiFi range)?
Thanks and good luck!
@bocorps I do not know how much it draws, but I'll try to figure it out. The Rocket M5 looks pretty robust and professional, not a bad decision. The cheaper hardware always feels cheap, even the CLS one. As mentioned in a comment before, the CLS is a bit wobbly in my USB port (MacBook Pro) and doesn't feel as good as my Alfa 2.4 GHz adapter; anyway, it works well with a short USB cable. I only use UDP, because most of the data doesn't have to be reliable. But I plan to create an architecture that can use TCP as well. In OpenFPV you will have a place where you can put a "module" on the Tx and Rx side. There you can request a connection, which will be negotiated between Tx and Rx. I am still working out how to do the software architecture right. Currently it supports Python; maybe other languages will follow ... but I am digressing :), thanks for the questions.
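The "separate data channels" idea (telemetry surviving a broken video feed) can be sketched with plain UDP sockets. This is a minimal illustration under stated assumptions, not the actual OpenFPV module API: the port number, JSON encoding and the `send_telemetry` / `recv_telemetry` names are all made up for the example.

```python
import json
import socket

TELEMETRY_PORT = 5601  # assumed port; video would travel on its own socket

def send_telemetry(sock, addr, data):
    """Fire-and-forget datagram: telemetry keeps flowing even if the video feed dies."""
    sock.sendto(json.dumps(data).encode("utf-8"), addr)

def recv_telemetry(sock):
    """Receive and decode one telemetry datagram."""
    payload, _sender = sock.recvfrom(2048)
    return json.loads(payload.decode("utf-8"))

if __name__ == "__main__":
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", TELEMETRY_PORT))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_telemetry(tx, ("127.0.0.1", TELEMETRY_PORT), {"pitch": 1.5, "roll": -0.3})
    print(recv_telemetry(rx))
```

Because each datagram is independent and unacknowledged, a dropped video stream on another socket never blocks the gyro data, which matches the behaviour Tilman describes in the Pro list.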
@Caesar Winebrenner I am considering other VR headsets, but the Samsung VR (which, AFAIK, is made together with Oculus) depends on a Galaxy phone / phablet. Here I ran into a problem: the current environment doesn't support phones and other mobile devices, due to its combination of video feed(s) and HTML. It will be possible to get the feed working on phones, but overlays (telemetry etc.) as they are built in OpenFPV will be a serious problem. I hope to overcome this, but currently the target device is a notebook / netbook, not mobile devices. There are other solutions (I think pod is his username) which merge the telemetry data into the stream; this approach is much better for that case. All other VR headsets, if they work in a similar way to the Oculus, should work with OpenFPV.
To everyone who wants to know why supporting VR headsets is not an easy task, I want to give a short overview of the problems. Number one is chromatic aberration. The lenses of VR headsets shift the colors away from the center of each eye. OpenFPV uses shaders to correct this effect, but they are not completely done yet, and the chromatic aberration differs with every camera, image size, VR headset and so on.
Number two is the incredibly large image size in the Oculus. It is so big that it makes you dizzy at first. There should be options to adjust the image size to fit the user's needs. I am working really hard to get this solved.
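To make the per-channel correction concrete, here is the idea in plain Python (a real implementation would live in a GLSL fragment shader on the GPU): each output pixel samples red, green and blue at slightly different radii from the lens center. The polynomial model and every coefficient below are placeholder assumptions; a real headset and camera each need their own calibration.

```python
def radial_scale(r2, k0, k1, k2):
    """Radial distortion scale factor as a polynomial in the squared radius."""
    return k0 + k1 * r2 + k2 * r2 * r2

def correct_pixel(x, y, channel_coeffs):
    """Map one output pixel (lens-centered coords in [-1, 1]) to per-channel
    source positions. Sampling R, G and B at slightly different radii is
    what cancels the color fringing of the headset lens."""
    r2 = x * x + y * y
    return {ch: (x * radial_scale(r2, *k), y * radial_scale(r2, *k))
            for ch, k in channel_coeffs.items()}

# Placeholder coefficients -- NOT measured for any real headset or camera.
COEFFS = {
    "red":   (1.00, 0.21, 0.05),
    "green": (1.00, 0.22, 0.06),
    "blue":  (1.00, 0.23, 0.07),
}

# Near the image edge the three channels sample visibly different positions;
# at the exact center (r2 == 0) they coincide, so there is no fringing there.
samples = correct_pixel(0.5, 0.5, COEFFS)
```

This is also why the correction "differs with every camera, image size and VR headset": the coefficients encode one specific lens-and-image combination.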
Please be sure to consider the Samsung VR as a supported device. Oculus will have a much smaller client footprint initially.
One more question: do you use the TCP/IP or UDP protocol?
I didn't see whether you use GStreamer to pipe the data.
Hi Tilman, you must be right; I just read that this adapter is MIMO. The CLS looks way better than the Alfa AC1200. Do you know how many amps your adapter draws?
I am actually working on a similar project, but I bought a Ubiquiti Rocket M5 as the air adapter. The possibility of getting telemetry from the Pixhawk appealed to me. Now maybe I will have to try your setup to get rid of all those heavy RJ45 connectors :-)