Oculus Rift DK2 with MavLink head tracking and HUD, first steps (OVRPilot part 1)

This is the first post in what I hope will be a series on exploring the possibilities of using the Oculus Rift alongside a UAV. The long-term plan is to create a GCS for the Oculus Rift or other HMDs that gives you the feeling of actually sitting up in the air, able to look around just by turning your head, whether as pilot, copilot or passenger. Add to that the ability to control your UAV and interact with augmented-reality elements such as maps for planning missions or guided mode, plus overlays giving information about your surroundings, using for example a Sixense STEM or a Razer Hydra (Sixense TrueMotion) system.

Anyway, here are the first trembling steps.

For some reason the embedded link doesn't seem to work so here is a...

This is a really dodgy setup and the code right now is a mess. But here you can see my laptop connected to a Pixhawk via MAVLink running ArduCopter 3.2rc9 (yeah, it's a clone, but there weren't any originals available in Sweden/Europe when I needed one). Received MAVLink data is displayed in the Oculus, though for now I only show GPS status, sats, HDOP and attitude roll, so there is a lot of work to do. A green cross is displayed in the Oculus to show which way is straight ahead. The application then takes tracking data (roll, pitch, yaw) from the Oculus and assembles MAVLink messages in a 50 Hz loop that are sent to the Pixhawk to control the gimbal. The gimbal is controlled with an AlexMos 8-bit SBGC. As you can see, the gimbal is not very well tuned; the cables from the webcams are really stiff, making it very hard to tune. A smaller gimbal with less inertia on the roll and yaw axes would probably be better suited to the task, but this is what I had at home.
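For reference, the head-tracking loop boils down to reading the Rift's orientation and pushing a mount target to the flight controller at a fixed rate. Below is a minimal sketch of that idea using pymavlink and MAV_CMD_DO_MOUNT_CONTROL; the serial port, the get_head_euler_deg() stub and the exact message used are assumptions for illustration, not the actual OVRPilot code.

```python
# Minimal sketch of the 50 Hz head-tracking loop (assumptions: pymavlink,
# MAV_CMD_DO_MOUNT_CONTROL for mount targeting, and a stubbed-out Rift query).
import time
from pymavlink import mavutil

def get_head_euler_deg():
    """Placeholder for the Oculus SDK tracking query (roll, pitch, yaw in degrees)."""
    return 0.0, 0.0, 0.0

master = mavutil.mavlink_connection('COM5', baud=57600)  # port and baud are assumptions
master.wait_heartbeat()

def send_mount_target(roll_deg, pitch_deg, yaw_deg):
    # DO_MOUNT_CONTROL: param1 = pitch, param2 = roll, param3 = yaw (degrees)
    master.mav.command_long_send(
        master.target_system, master.target_component,
        mavutil.mavlink.MAV_CMD_DO_MOUNT_CONTROL, 0,
        pitch_deg, roll_deg, yaw_deg, 0, 0, 0,
        mavutil.mavlink.MAV_MOUNT_MODE_MAVLINK_TARGETING)

while True:
    roll, pitch, yaw = get_head_euler_deg()
    send_mount_target(roll, pitch, yaw)
    time.sleep(1.0 / 50)  # roughly the 50 Hz rate mentioned above
```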

The image is delivered from a Microsoft LifeCam Studio webcam captured with DirectShow. Only one camera is used since my laptop refuses to capture two at the same time, though on my workstation I can use both cameras along with the Oculus, which is really cool. I can also use a Hauppauge USB-Live2 capture card to use an analog camera with a wireless link, but that image is really crappy. My plan here has been to use two webcams and an Odroid and stream stereo video over a 4G/LTE network using low-latency H.264 encoding, though so far I have not had much time to look into this. Maybe a DTV link using a Hides DVB-T dongle or an Aires Pro 5.8 GHz link would be better.
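As a rough illustration of the two-camera capture idea (not the DirectShow code in the application itself), something like the OpenCV sketch below grabs both webcams through the DirectShow backend and shows them side by side; the camera indices are assumptions.

```python
# Illustration only: capture two webcams through the DirectShow backend with
# OpenCV and show them side by side. Camera indices 0 and 1 are assumptions.
import cv2

left = cv2.VideoCapture(0, cv2.CAP_DSHOW)
right = cv2.VideoCapture(1, cv2.CAP_DSHOW)

while True:
    ok_left, frame_left = left.read()
    ok_right, frame_right = right.read()
    if not (ok_left and ok_right):
        break  # one of the cameras failed to deliver a frame
    stereo = cv2.hconcat([frame_left, frame_right])  # naive side-by-side view
    cv2.imshow('stereo', stereo)
    if cv2.waitKey(1) == 27:  # Esc quits
        break

left.release()
right.release()
cv2.destroyAllWindows()
```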

Right now I will continue to work on cleaning up the code and the HUD and hopefully I'll manage to do a test in the air soon.


Comment by Gerard Toonstra on September 13, 2014 at 4:20pm

Hi Nils,

Some time ago I also looked at the Rift and ground stations, but instead projected a monocular camera image (actually a pre-recorded video) onto a 3D surface, so the objectives are somewhat different. I wanted to figure out what the experience would be like if you remove the 2D screen that is hard to read in the sun, and if you map the telemetry, maps and other controls onto different surfaces to separate their uses.

http://diydrones.com/profiles/blogs/virtual-ground-station-on-the-o...

That was all done with the UDK.

Comment by Brennan Zelener on September 13, 2014 at 6:25pm

This is spectacular progress!  Nice work.

Comment by Nils Högberg on September 14, 2014 at 3:04am
@Gerard
Those are some really interesting thoughts, Gerard. I'm planning on having different surfaces that could be shown on demand containing maps, controls etc. It would also be really cool to draw 3D objects like safe flight paths or dangerous areas to augment the experience.
Though as you say, interaction will be a big problem.

@Brennan
Thanks!
Comment by Gerard Toonstra on September 14, 2014 at 7:24am

Hi Nils,

This app uses a pipeline where it renders the video on a 2D surface (ortho view) in OpenGL and then renders other 3D components over it. When you do this, you need to make sure that the camera characteristics of OpenGL match those of the real camera (fov angles, aspect, etc).
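A small illustration of what matching the camera characteristics means in practice (not taken from Gerard's UDK app): build the projection matrix from the real camera's field of view and aspect ratio so the rendered overlay lines up with the video. The FOV value below is just an example.

```python
# Illustrative helper: a perspective projection matrix built from the real
# camera's vertical FOV and aspect ratio, so the render frustum matches the video.
import math
import numpy as np

def projection_from_fov(vfov_deg, aspect, near=0.1, far=1000.0):
    f = 1.0 / math.tan(math.radians(vfov_deg) / 2.0)
    return np.array([
        [f / aspect, 0.0, 0.0,                         0.0],
        [0.0,        f,   0.0,                         0.0],
        [0.0,        0.0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0.0,        0.0, -1.0,                        0.0],
    ])

# Example values only: roughly a 43 degree vertical FOV at 16:9.
P = projection_from_fov(43.0, 16.0 / 9.0)
```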

As you can see, the telemetry received over wifi doesn't arrive at the same time, so you get some jumpiness on those 3D objects. You won't have those for elements that are not tied to, essentially, the IMU of the aircraft, like slide-in maps or plane controls.
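One common way to hide that jumpiness is to low-pass filter the incoming attitude before using it to place HUD elements. The sketch below is a generic exponential-smoothing example, not something from the app described here, and it ignores yaw wrap-around for simplicity.

```python
# Generic exponential smoothing for a telemetry channel, e.g. roll in degrees.
# Not from the app above; angle wrap-around (yaw) is ignored for simplicity.
class AttitudeSmoother:
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # 0..1; lower = smoother but laggier
        self.value = None

    def update(self, sample):
        if self.value is None:
            self.value = sample
        else:
            self.value = self.alpha * sample + (1 - self.alpha) * self.value
        return self.value

roll_filter = AttitudeSmoother()
smoothed = roll_filter.update(12.5)  # call once per incoming telemetry sample
```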

It's also a good idea to render something pretty close; this helps people get an idea of the size of what they are looking at and how far away the objects they're seeing are. Otherwise, distances become a bit difficult to measure.

Comment by Gary McCray on September 14, 2014 at 3:15pm

Hi Nils,

Excellent to see such a great start.

Maybe bigger gimbal motors and more static gimbal motor current could allow faster response (less lag)?

Or is most lag a result of transmission, reception, display delays?

Not that the lag is excessive, just possibly somewhat motion-sickness inducing (definitely familiar with that on the DK2).

I also noticed that although horizontal pan/yaw closely followed the DK2 headset, pitch appeared not to be scaled correctly and the camera followed at noticeably less than the head (DK2) tilt.

Planning on doing something similar with an AlexMos 32-bit board and a DK2.

Really informative to see your results so far.

Thank You,

Gary

Comment by Nils Högberg on September 15, 2014 at 12:06pm

Thanks!

I haven't put too much effort into tuning the gimbal yet, so it might be that the motors don't have enough power or a high enough I value or something. I'm going to try a servo-based gimbal with an analog camera and see how that works out latency-wise for the head tracking. I have around 60 ms of video latency using that setup. IIRC the MS camera has around 100-150 ms of latency in good lighting conditions; in the above video the light wasn't very good, so it was probably much higher.

The pitch-axis behavior is probably also caused by the gimbal controller; maybe I haven't calibrated the accelerometer correctly. If I set the gimbal to neutral via MAVLink and enter 90 or -90 degrees on the pitch axis, it doesn't look straight up or straight down.

I'm pretty familiar with motion sickness; I almost feel ill just looking at my DK2 : ). Have you tried the Helix roller coaster?

Comment by John Githens on September 15, 2014 at 3:23pm

This is one cool project. After some internal debate, I cataloged this blog post on the DS 'Alternative remote control' page.

Comment by ChiggenWingz on September 15, 2014 at 5:47pm

I came across this program yesterday which takes video input and outputs it to the DK2
https://www.youtube.com/watch?v=vUeBBUfSGyw
https://developer.oculusvr.com/forums/viewtopic.php?f=28&t=11001

Highly recommend it, as it does a lot of cool maths with regard to the warping of the image. Took my rover out for a quick spin with it last night and will be trying a few more things tonight.

Keep up the good work, I'll be watching keenly!

Comment by Gary McCray on September 19, 2014 at 6:18pm

+1 on that last comment Nils - I know the feeling.

Although I now have an adequate computer, I really need to upgrade my video card to something in the GTX 770 range to have any chance of surviving the heavy-duty stuff.

Lag and jitter - it takes hours to get over it. Reminds me of when I was first learning to fly: while I was looking down my inside wing in a steep turn, my instructor would tell me to clear the outside one. You have no idea - well, actually you do.

Comment by zane on September 21, 2014 at 7:00am

http://www.emrlabs.com/index.php?pageid=3

Check out the Transporter 3D by EMR Labs!
