This is the first post in what I hope will be a series on exploring the possibilities of using the Oculus Rift alongside a UAV. The long-term plan is to create a GCS for the Oculus Rift or other HMDs that gives you the feeling of actually sitting up in the air, able to look around just by turning your head, whether as pilot, copilot or passenger. Add to that the ability to control your UAV and interact with an augmented reality: maps for planning missions or flying in guided mode, overlays giving information about your surroundings and so on, using for example a Sixense STEM or a Razer Hydra (Sixense TrueMotion) system.

Anyway, here are the first trembling steps.

For some reason the embedded link doesn't seem to work, so here is a direct link.

This is a really dodgy setup and the code right now is a mess. But here you can see my laptop connected to a Pixhawk via MAVLink, running ArduCopter 3.2-rc9 (yeah, it's a clone, but there weren't any originals available in Sweden/Europe when I needed one). Received MAVLink data is displayed in the Oculus, though for now only GPS status, satellite count, HDOP and attitude roll, so there is a lot of work to do. A green cross is displayed in the Oculus to show which way is straight ahead. The application then takes tracking data (roll, pitch, yaw) from the Oculus and assembles MAVLink messages in a 50 Hz loop that are sent to the Pixhawk to control the gimbal. The gimbal is controlled with an AlexMos 8-bit SBGC. As you can see, the gimbal is not very well tuned. The cables from the webcams are really stiff, making it very hard to tune. A smaller gimbal with less inertia on the roll and yaw axes would probably be better suited to the task, but this is what I had at home.
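To give an idea of how that loop fits together, here is a minimal pymavlink sketch (not the actual project code; the connection string and the get_head_pose() helper standing in for the Rift tracker read are assumptions):

```python
# Minimal sketch (not the project's actual code): forward head-tracker
# angles to a MAVLink-controlled mount at 50 Hz using pymavlink.
import time
from pymavlink import mavutil

def get_head_pose():
    """Hypothetical stand-in for reading the Rift tracker.
    Should return (roll, pitch, yaw) in degrees."""
    return 0.0, 0.0, 0.0

# Connection string is an assumption; adjust port/baud to your setup.
master = mavutil.mavlink_connection('/dev/ttyUSB0', baud=57600)
master.wait_heartbeat()

PERIOD = 1.0 / 50.0  # 50 Hz loop

while True:
    t0 = time.time()
    roll, pitch, yaw = get_head_pose()
    # MOUNT_CONTROL carries pitch/roll/yaw as input_a/b/c, in
    # centidegrees when the mount is in MAVLink-targeting mode.
    master.mav.mount_control_send(
        master.target_system, master.target_component,
        int(pitch * 100),  # input_a: pitch, centidegrees
        int(roll * 100),   # input_b: roll, centidegrees
        int(yaw * 100),    # input_c: yaw, centidegrees
        0)                 # save_position: don't store as home position
    # Telemetry for the HUD (GPS fix/sats/HDOP, attitude) can be read
    # non-blocking in the same loop:
    msg = master.recv_match(type=['GPS_RAW_INT', 'ATTITUDE'], blocking=False)
    time.sleep(max(0.0, PERIOD - (time.time() - t0)))
```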

The image is delivered from a Microsoft LifeCam Studio webcam captured with DirectShow. Only one camera is used since my laptop refuses to capture two at the same time, though on my workstation I can use both cameras along with the Oculus, which is really cool. I can also use a Hauppauge USB-Live2 capture card to bring in an analog camera over a wireless link, but that image is really crappy. My plan here has been to use two webcams and an ODROID and stream stereo video over a 4G/LTE network using low-latency H.264 encoding. So far I have not had much time to look into this. Maybe a DTV link using a Hides DVB-T dongle or an Aires Pro 5.8 GHz link would be better.
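For the capture side, a rough OpenCV sketch of grabbing two webcams through the DirectShow backend and packing them side by side as a stereo pair could look like this (not my actual capture code; the camera indices and the naive packing are assumptions, and a real Rift renderer would also apply the per-eye barrel distortion):

```python
# Rough sketch (not the actual code): capture two webcams through
# OpenCV's DirectShow backend and pack them side by side.
import cv2
import numpy as np

# Camera indices are assumptions; swap them if left/right come out reversed.
left = cv2.VideoCapture(0, cv2.CAP_DSHOW)
right = cv2.VideoCapture(1, cv2.CAP_DSHOW)

while True:
    ok_l, frame_l = left.read()
    ok_r, frame_r = right.read()
    if not (ok_l and ok_r):
        break  # one of the cameras failed to deliver a frame
    # Identical cameras deliver same-sized frames, so a simple
    # horizontal stack gives a crude side-by-side stereo image.
    cv2.imshow('stereo', np.hstack((frame_l, frame_r)))
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

left.release()
right.release()
cv2.destroyAllWindows()
```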

Right now I will continue to work on cleaning up the code and the HUD, and hopefully I'll manage to do a test in the air soon.


Comments

  • This is spectacular progress!  Nice work.

  • Hi Nils,

    Some time ago I also looked at the Rift and ground stations, but instead projected a monocular camera image (actually, a pre-recorded video) onto a 3D surface, so the objectives are somewhat different. I wanted to figure out what the experience would be like if you remove the 2D screen that is hard to read in the sun, and if one maps the telemetry, maps and other controls onto different surfaces to separate their use.

    http://diydrones.com/profiles/blogs/virtual-ground-station-on-the-o...

    That was all done with the UDK.
