FPV + Oculus Rift Kickstarter

Hey all, keen on Full HD, head tracking FPV with a wide FOV?

We are currently working on a project for a Raspberry Pi + Oculus Rift based FPV solution. Here are some of the advantages:

  • Full HD video feed
  • Head tracking built in from day one
  • Wide FOV on the high quality Oculus Rift display
  • No significant head tracking latency and a target video streaming latency of 100-150 ms
  • Plenty of room for hacking and scripting with an open hardware approach

We're live now on Kickstarter! Check us out, let us know what you think and maybe even back us :) .

https://www.kickstarter.com/projects/267372731/fly-like-a-bird/

Comments

  • Seems like a great project.

    Just a thought: the new Odroid C1 seems like it might be a much better fit than the Raspberry Pi, and the cost is similar.

    I like the expanded view being used to reduce lag, great idea.

    I know the implementation of 3D stereo vision is complicated, but stereo perspective can be manipulated optically and/or programmatically to enhance stereo separation (depth perception), and there are other Oculus-based FPV systems working on incorporating it.

    Best regards,

    Gary

  • @Jerry Giant

    Yes, the DJI LightBridge certainly is an interesting tx/rx. It does some things we don't and we do some things that it doesn't.

    I'm afraid I don't quite follow all of the points you raise, but here is my best attempt at some responses:

    • We are not planning to support 3D (see earlier post on that), but SBS stitching would certainly be doable - at least with a shader in VC4. The only question with that would be whether or not you can avoid an extra frame of latency with whatever you drop in the pipeline.
    • Gstreamer is not necessary for SBS stitching. Whether or not it would be the best solution is a different question, of course (a rough sketch follows this list).
    • Yes, Wi-Fi regulations vary a fair bit between countries. This is why, to my understanding, Linux and its drivers include a list of channel/power settings for each relevant region.
    • I'm not sure what you are asking about the gimbal here. There is a 3-axis gimbal for head tracking included at the relevant support levels. If you weren't aware of this, then I suggest you check out the project page :) .
    • I'm not sure how electronics packaging comes into this?
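
    Purely for illustration, here is a rough sketch (not our pipeline) of how two camera feeds could be composited side by side with GStreamer's compositor element and then H.264-encoded. The test sources, resolutions, encoder choice and ground station address are all placeholders - on a Pi you would use the hardware encoder rather than x264enc - and it says nothing about the extra-latency question raised above.

        import gi
        gi.require_version('Gst', '1.0')
        from gi.repository import Gst, GLib

        Gst.init(None)

        # Two 1280x720 feeds stitched into one 2560x720 SBS frame, then encoded
        # and sent as RTP over UDP. Real cameras would replace the videotestsrc elements.
        pipeline = Gst.parse_launch(
            "compositor name=mix sink_0::xpos=0 sink_1::xpos=1280 ! "
            "x264enc tune=zerolatency ! rtph264pay ! "
            "udpsink host=192.168.1.10 port=5000 "   # placeholder ground station address
            "videotestsrc pattern=ball  ! video/x-raw,width=1280,height=720 ! mix.sink_0 "
            "videotestsrc pattern=smpte ! video/x-raw,width=1280,height=720 ! mix.sink_1"
        )
        pipeline.set_state(Gst.State.PLAYING)
        GLib.MainLoop().run()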

    Yes, we have the confidence - hence the Kickstarter :) .

    @Dave Smith

    Wireless range is one of those tricky things and, of course, it will always depend heavily on your particular setup and environment. Because so many factors flow into it, it is actually hard to find any definitive information on the range you can expect from a given technology or device.

    Nevertheless, people appear to readily achieve more than 100 metres of range at decent speed with "normal" WiFi AC setups, as long as it's line of sight and there is no interference on the same channel. This is what our number is based on. Of course, you can go way beyond that if you are willing to tweak your setup a little. We have seen reports suggesting several hundred metres of useful range with high-powered WiFi dongles, and if you throw in a directional antenna, like a DIY helical, people routinely achieve several kilometres of range.

    Keep in mind that the "base station" end of the link is up to the users, although we will definitely provide recommendations on the hardware we find to perform well.

  • Have a look at DJI's product for an amateur "high def" video downlink and you will see the sophistication of the design: there is an HDMI converter, a hardware encoder and an SDR-like OFDM modem integrated on board, and it still does not produce an ideal result for FPV.

    The RPi doesn't need a dedicated high-bandwidth video capture interface, and its VC4 core is capable of 1080p@30fps, but building a system to that spec on it does not look feasible to me. Here is a list of issues, as I understand them:

    • I would hope for hardware logic that stitches raw SBS frames into a 3D video input.
    • GStreamer may be the answer, but that is a hack, not a solution.
    • Wi-Fi operating conditions vary between countries, and a kernel driver for the TX/RX may be needed.
    • A servo gimbal with head tracking, or real-time FPGA video cropping as on the Parrot Bebop? Neither, I assume.
    • A+ or SODIMM packaging?

    If you really can deliver what you've promised, it will be very advanced work on an RPi; otherwise I don't see the point of using a VR HUD, and your product will be merely a hack on an ancient application processor.

  • @Dave Smith

    Cool, that's good to hear :) . Unfortunately, our solution will still require some form of computer on the receiving side. However, the ground station software does support Linux, so, as I mentioned in the other comment I just posted, running the ground station off a mini computer like another Raspberry Pi is certainly a possibility. That said, we can't tell yet whether the limited resources on the Pi itself might cause problems with that, and we will be developing on laptop-level hardware (at least for now). Still, I don't see why it couldn't be supported in a future version, and it is definitely a neat option which we will keep in mind as we go!

  • @Matthias Badaire

    1: Yes, that is correct. We chose to go for 2D in our first Kit for several reasons, including cost, complexity and doubtful benefit. We haven't had a chance to try a 3D FPV rig ourselves, unfortunately, but we are aware that stereoscopic depth perception only plays a strong role for things which are fairly close. For example, in the official Oculus Rift World Demo (Tuscany) you can enable and disable stereoscopic rendering on the fly; the effect is nice, but turning it off really doesn't feel like you lose much, and we find it adds far less to the feeling of immersion than the head tracking alone does.

    Extending the Kit with a second camera would definitely be possible, and software support for such an extension could easily be added on the ground station side. The kit side would be more complex, though, so we wouldn't do this for the initial Kickstarter version unless we substantially exceed our funding goal.

    2: We don't currently have access to a working DK1, but supporting it shouldn't be a problem. The DK2 is what we are developing on.

    On having to use a laptop: we won't know whether this would just work until our software is finalised, but running the ground station on another Raspberry Pi should be doable (though it may be a bit of extra work and thus another future/more-funding item). That way you could pretty much strap a battery to a Pi and put it in your pocket, with the Rift powered and fed from the Pi. Would this be a feature you would vote for, if we had some spare resources :) ?

    @Bjoern Kellermann

    Sorry, I'm not sure I quite follow all of your post (specifically about APM integration). I will try to give you a more thorough response after I get some sleep.

    The data link can certainly be used for additional applications (as long as they aren't super bandwidth-hungry, like another video stream). Streaming MAVLink over the link shouldn't be a problem.
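
    To make that concrete, here is a minimal sketch (purely illustrative, not code from our Kit) of forwarding MAVLink from an autopilot on the Pi's UART to the ground station over the same link, using pymavlink; the serial device, baud rate and ground station address are assumptions:

        from pymavlink import mavutil

        # Autopilot attached to the Pi's serial port (device and baud rate are assumptions)
        vehicle = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
        # Ground station reachable over the WiFi link (placeholder address)
        ground = mavutil.mavlink_connection('udpout:192.168.1.10:14550')

        while True:
            msg = vehicle.recv_match(blocking=True)   # next MAVLink message from the autopilot
            if msg is not None:
                ground.write(msg.get_msgbuf())        # forward the raw bytes unchanged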

    Yes, NAT is definitely an annoying issue, and it is one of the main reasons why we chose WiFi as our primary solution. I came across the problem of NAT traversal in one of my toy projects, and the research I looked at at the time suggested that there is no single traversal technique that works for every situation and (if I remember correctly) even combining techniques only works for around a quarter of possible NAT situations (although I may have been looking at completely server-independent solutions at the time, come to think of it). Of course you can always have a relay server, but we wouldn't want to rely on that for a whole bunch of Full HD streams. If you happen to have some expertise or can point me to some definitive information on NAT traversal solutions and their reliability, that would be handy. I suspect my knowledge on that topic might be a little out of date by now, and I haven't played with carrier-grade NAT specifically.
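
    For illustration only, the basic UDP hole-punching idea looks roughly like this; the rendezvous server and message format are entirely made up, and this is exactly the kind of technique that fails on some NAT types (e.g. symmetric NAT):

        import socket
        import time

        RENDEZVOUS = ('rendezvous.example.com', 9999)  # hypothetical helper server

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.sendto(b'register drone-1', RENDEZVOUS)   # server records our public endpoint
        data, _ = sock.recvfrom(1024)                  # server replies with the peer's public endpoint
        host, port = data.decode().split(':')

        # Both sides now send to each other; the outgoing packets open mappings in
        # each NAT, after which direct traffic can flow (NAT type permitting).
        for _ in range(10):
            sock.sendto(b'punch', (host, int(port)))
            time.sleep(0.5)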

    Downloading Stills in Flight: Shooting and downloading them shouldn't be a problem! (Phew, an easy one :D ) .

    None of the communication channels you list should be a problem, since the WiFi (or other) link would be bidirectional and nothing there requires vast amounts of bandwidth compared to the video (as long as you are happy for a file download to take a few seconds when bandwidth is short). Point 6 is one of the features we specifically want to include in the "1.0 Kickstarter version"; points 3, 4 and 5 are possibilities for the future or an over-funding situation.

    Nice to hear you already have Pis in the air :D .

  • 1 - So no 3D? Or did I miss something?

    2 - DK1 compatible, or only DK2?

    +1 for Superwalloon: I would prefer to be able to use my existing analog link with no laptop. A matter of choice, I suppose.

  • I would appreciate an option for mobile networks (3G/4G/LTE) and opening the link for MAVLink.

    With that solution you could reuse the APM gimbal control and override it for head tracking over MAVLink, and integrate the video into the Mission Planner HUD. Maybe your solution could become part of, or a plugin for, DroidPlanner, Andropilot, Mission Planner and APM Planner 2.
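
    Roughly what I have in mind, as a sketch only (the connection string and angles are made up), sending head-tracker angles to the mount over MAVLink with pymavlink:

        from pymavlink import mavutil

        # Listen for the vehicle's MAVLink stream (placeholder endpoint)
        link = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
        link.wait_heartbeat()

        def point_gimbal(pitch_deg, roll_deg, yaw_deg):
            # Command the mount to the given angles via DO_MOUNT_CONTROL
            link.mav.command_long_send(
                link.target_system, link.target_component,
                mavutil.mavlink.MAV_CMD_DO_MOUNT_CONTROL, 0,
                pitch_deg, roll_deg, yaw_deg, 0, 0, 0,
                mavutil.mavlink.MAV_MOUNT_MODE_MAVLINK_TARGETING)

        point_gimbal(-10.0, 0.0, 30.0)  # example: look slightly down and to the right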

    Did you consider carrier-grade NAT (cgNAT) for your 3G/4G/LTE option?

    In Germany, mobile connections are NATed. As a consequence, you cannot open a port on the drone's mobile connection and connect to it from another computer. You could consider NAT punch-through, especially for double-NAT situations: drone on mobile and notebook on mobile.

    If you consider mobile networking, cgNAT and MAVLink, I am happy to support development/testing and will pledge.

    I have a Raspberry Pi installed in my Skywalker X8 flying wing, and the 3DR Y6 copter is waiting ;)

    At a later stage, downloading still images from the Pi while flying would be amazing. Full-featured communication needs several channels, though:

    1. Video downlink

    2. Head tracking, could be MAVLink

    3. MAVLink uplink for mission planning

    4. MAVLink downlink for telemetry

    5. File transfer downlink

    6. Configuration synchronisation between the Pi on the drone and the ground station, which can advise the drone to adjust the video downlink (frame rate, resolution)

    For more thoughts please PM.

    BR Bjoern

  • Ok, thanks, now we have something we can discuss :) .

    You might be surprised by how much latency is in common FPV setups today.

    For example, this video concludes that a GoPro hooked up through a traditional analogue vtx/vrx setup has a latency of about 100 ms and is actually quite good for flying a quadcopter. Other cameras in that test range up to 200 ms, and all of them are used for FPV flying. Our Kit already supports a low-latency mode (at 640x480) of about 100 ms, matching the GoPro in that test. If you have tried the GoPro setup and find it too laggy then, indeed, this project is definitely not the right thing for you, but it appears that we are right in the range of FPV setups which are commonly used today :) .
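
    As a rough illustration only (not our actual implementation), a bare-bones low-latency pipeline on the Pi can look like this with the picamera library, pushing hardware-encoded H.264 at 640x480 straight over a TCP socket; the address, bitrate and GOP length here are placeholders:

        import socket
        import picamera  # Raspberry Pi camera module library

        GROUND_STATION = ('192.168.1.10', 5000)  # placeholder ground station address

        with picamera.PiCamera(resolution=(640, 480), framerate=30) as camera:
            sock = socket.socket()
            sock.connect(GROUND_STATION)
            stream = sock.makefile('wb')
            # H.264 comes straight from the VC4 hardware encoder; a modest bitrate
            # and short GOP keep per-frame latency and recovery time low.
            camera.start_recording(stream, format='h264', bitrate=2000000, intra_period=30)
            camera.wait_recording(3600)
            camera.stop_recording()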

  • Any time you add processing, it will add lag. 150 ms isn't acceptable in my opinion.

  • That is impressively unconstructive, Mike T. Can you phrase that as a convincing argument against the material we have provided?
