
This is the first post in what I hope will be a series on exploring the possibilities of using the Oculus Rift alongside a UAV. The long-run plan is to create a GCS for the Oculus Rift or other HMDs that gives you the feeling of actually sitting up in the air, able to look around just by turning your head, whether as pilot, copilot or passenger. Add to that the ability to control your UAV and interact with augmented reality, like maps for planning missions or guided mode, and overlays giving information about your surroundings, using for example a Sixense STEM or a Razer Hydra (Sixense TrueMotion) system.

Anyway, here are the first trembling steps.

For some reason the embedded link doesn't seem to work, so here is a direct link.

This is a really dodgy setup and the code right now is a mess, but here you can see my laptop connected via MAVLink to a Pixhawk running ArduCopter 3.2-rc9 (yeah, it's a clone, but there weren't any originals available in Sweden/Europe when I needed one). Received MAVLink data is displayed in the Oculus, though for now that's only GPS status, sats, HDOP and attitude roll, so there is a lot of work to do. A green cross is drawn in the Oculus to show which way is straight ahead. The application takes the tracking data (roll, pitch, yaw) from the Oculus and assembles MAVLink messages in a 50 Hz loop that are sent to the Pixhawk to control the gimbal. The gimbal itself is driven by an AlexMos 8-bit SBGC. As you can see, the gimbal is not very well tuned; the cables from the webcams are really stiff, making it very hard to tune. A smaller gimbal with less inertia on the roll and yaw axes would probably be better suited to the task, but this is what I had at home.
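
Not my actual code, but a minimal sketch of that loop, assuming pymavlink on the ground-station side; the serial port, baud rate and the get_hmd_angles() helper are placeholders:

```python
# Minimal sketch of the 50 Hz head-tracking loop (pymavlink assumed).
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('COM3', baud=57600)  # placeholder port/baud
master.wait_heartbeat()

def get_hmd_angles():
    """Placeholder for the Oculus SDK call; returns (roll, pitch, yaw) in degrees."""
    return 0.0, 0.0, 0.0

PERIOD = 1.0 / 50.0  # 50 Hz
while True:
    t0 = time.time()
    roll, pitch, yaw = get_hmd_angles()
    # MOUNT_CONTROL carries pitch, roll, yaw in centidegrees.
    master.mav.mount_control_send(
        master.target_system, master.target_component,
        int(pitch * 100), int(roll * 100), int(yaw * 100),
        0)  # save_position = 0
    time.sleep(max(0.0, PERIOD - (time.time() - t0)))
```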

Image is delivered from a Microsoft LifeCam Studio webcam captured with DirectShow. Only one camera is used, since my laptop refuses to capture two at the same time; on my workstation, though, I can use both cameras along with the Oculus, which is really cool. I can also use a Hauppauge USB-Live2 capture card to bring in an analog camera over a wireless link, but that image is really crappy. My plan has been to use two webcams and an ODROID and stream stereo video over a 4G/LTE network using low-latency H.264 encoding, though so far I have not had much time to look into this. Maybe a DTV link using a HiDes DVB-T dongle or an Aires Pro 5.8 GHz link would be better.
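
If you want to play with the capture side, here is roughly the idea as a sketch using OpenCV's DirectShow backend (my code talks to DirectShow directly; the device indices and resolution here are assumptions):

```python
# Sketch: capture two webcams via OpenCV's DirectShow backend and show them
# side by side, one half of the panel per eye. Indices 0/1 are assumptions.
import cv2

left = cv2.VideoCapture(0, cv2.CAP_DSHOW)
right = cv2.VideoCapture(1, cv2.CAP_DSHOW)
for cap in (left, right):
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)

while True:
    ok_l, frame_l = left.read()
    ok_r, frame_r = right.read()
    if not (ok_l and ok_r):
        break
    stereo = cv2.hconcat([frame_l, frame_r])  # side-by-side stereo frame
    cv2.imshow('stereo', stereo)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

left.release()
right.release()
cv2.destroyAllWindows()
```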

Right now I'll continue working on cleaning up the code and the HUD, and hopefully I'll manage to do a test in the air soon.


Comments

  • @John
    Thanks!

    @ChiggenWings
    Haven't had time to try it out yet. I read about it on the Oculus VR forum, though. Sounds interesting.

    @Gary
    I bought the GTX 770 (Asus) when I got the Rift. It's nice, but it feels like a better card wouldn't hurt if you're going to play a lot of games. For demos and just trying out the Rift it's great.
  • It's been awfully quiet from me for too long now. I haven't had much time for this the last couple of weeks, though that doesn't mean I haven't done anything.

    I've been working on the HUD, which is almost finished. It will be very similar to the Mission Planner HUD to start with; actually, almost exactly the same, since I'm using a lot of code from Mission Planner's HUD. I'll post a video of it soon.

    I've also built a test quad. Since I'm a very inexperienced FPV flyer, to say the least, I thought it would be a shame to crash my Y6 with my gimbal on the first tests.

    This is using a Fat Shark servo gimbal with a Mobius and a 5.8 GHz video link. One great thing I noticed about the Mobius is that it can output 1280x960 as a webcam. The Oculus has a 1920x1080 screen shared between both eyes, and 1920/2 is... that's right, 960. What's not so great is that it has almost 400 ms of latency in that mode; in 1280x720 it's around 300 ms. Anyway, that doesn't matter here, since when using the analog out it has around 100 ms of latency.

    Unfortunately this new build has given me a lot of headaches. First of all it wouldn't update correctly to AC 3.2-RC10, or any 3.2 RC. It turned out I had an old px4io firmware that wouldn't update at boot. This was easily fixed by holding the safety button at boot, but it took me quite some time to figure out. Now I have some really great problems with the GPS. I was going to do the first flight tests with the Oculus this morning; everything was set up and ready to go.


    Loiter didn't work at all, though. I had very few sats, and HDOP was bouncing between around 1.8 and 3.0 all the time. I tried turning off the Mobius and the gimbal servos, but it didn't get much better. I'm going to try another GPS I have lying around and see if things improve.
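
    Just as an illustration, a quick way to watch the sat count and HDOP from the bench, assuming pymavlink (the port and baud are placeholders):

    ```python
    # Print fix type, satellite count and HDOP from GPS_RAW_INT messages.
    from pymavlink import mavutil

    master = mavutil.mavlink_connection('COM3', baud=57600)  # placeholder
    master.wait_heartbeat()

    while True:
        msg = master.recv_match(type='GPS_RAW_INT', blocking=True)
        # eph holds HDOP * 100 in GPS_RAW_INT
        print(f"fix={msg.fix_type} sats={msg.satellites_visible} "
              f"hdop={msg.eph / 100.0:.2f}")
    ```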

     

  • http://www.emrlabs.com/index.php?pageid=3

    Check out the Transporter3D by EMR Labs!

  • +1 on that last comment, Nils - I know the feeling.

    Although I now have an adequate computer, I really need to upgrade my video card to the GTX 770 range to have any chance of surviving the heavy-duty stuff.

    Lag and jitter, and it takes hours to get over it. It reminds me of when I was first learning to fly: while I was looking down my inside wing in a steep turn, my instructor would tell me to clear the outside one. You have no idea... well, actually, you do.

  • I came across this program yesterday that takes video input and outputs it to the DK2:
    https://www.youtube.com/watch?v=vUeBBUfSGyw
    https://developer.oculusvr.com/forums/viewtopic.php?f=28&t=11001

    Highly recommend it, as it does a lot of cool maths for warping the image. I took my rover out for a quick spin with it last night and will be trying a few more things tonight.

    Keep up the good work, I'll be watching keenly!

  • This is one cool project. After some internal debate, I cataloged this blog post on the DS 'Alternative remote control' page.

  • Thanks!

    I haven't put too much effort into tuning the gimbal yet, so it might be that the motors don't have enough power or a high enough I value or something. I'm going to try a servo-based gimbal with an analog camera and see how that works out latency-wise for the head tracking. I have around 60 ms of latency in the video using that setup. IIRC the MS camera has around 100-150 ms of latency in good lighting conditions; in the above video the light wasn't very good, so it was probably much higher.

    The pitch-axis behavior is probably also caused by the gimbal controller; maybe I haven't calibrated the accelerometer correctly. If I set the gimbal to neutral via MAVLink and enter 90 or -90 degrees on the pitch axis, it doesn't look straight up or straight down.

    I'm pretty familiar with motion sickness; I almost feel ill just looking at my DK2 : ). Have you tried the Helix roller coaster?

  • Hi Nils,

    Excellent to see such a great start.

    Maybe bigger gimbal motors and more static gimbal motor current could allow faster response (less lag)?

    Or is most lag a result of transmission, reception, display delays?

    Not that the lag is excessive, just possibly somewhat motion-sickness-producing (definitely familiar with that on the DK2).

    I also noticed that although horizontal pan/yaw closely followed the DK2 headset, pitch appeared not to be scaled correctly, and the camera followed at noticeably less than the head's (DK2's) tilt.

    Planning on doing something similar with an AlexMos 32-bit board and a DK2.

    Really informative to see your results so far.

    Thank You,

    Gary

  • Hi Nils,

    This app uses a pipeline where it renders the video on a 2D surface (ortho view) in OpenGL and then renders the other 3D components over it. When you do this, you need to make sure that the camera characteristics in OpenGL match those of the real camera (FOV angles, aspect ratio, etc.).
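
    A minimal sketch of that matching step, assuming a simple pinhole camera model (the FOV and resolution numbers are just example values):

    ```python
    # Build an OpenGL-style projection matrix that matches the real camera's
    # vertical FOV and aspect ratio (pinhole model; values are examples).
    import math
    import numpy as np

    def perspective(fov_y_deg, aspect, near, far):
        f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
        return np.array([
            [f / aspect, 0.0, 0.0, 0.0],
            [0.0, f, 0.0, 0.0],
            [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
            [0.0, 0.0, -1.0, 0.0],
        ], dtype=np.float32)

    # e.g. a webcam with roughly a 43 degree vertical FOV at 1280x720:
    proj = perspective(43.0, 1280.0 / 720.0, 0.1, 1000.0)
    ```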

    As you can see, the telemetry received over WiFi doesn't arrive in sync with the video, so you get some jumpiness in those 3D objects. You won't have that problem for elements that are not tied to the aircraft's IMU, like slide-in maps or plane controls.
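
    Purely as an illustration, a simple exponential filter on the incoming angles (with yaw wrap-around handled) can hide most of that jitter:

    ```python
    # Exponentially smooth incoming attitude angles to mask telemetry jitter.
    # Angles in degrees; the 0/360 wrap on yaw is handled explicitly.
    def smooth_angle(prev, new, alpha=0.2):
        # Blend along the shortest path around the circle.
        delta = (new - prev + 180.0) % 360.0 - 180.0
        return (prev + alpha * delta) % 360.0

    yaw = 359.0
    for sample in (1.0, 2.0, 3.0):  # new samples crossing the wrap
        yaw = smooth_angle(yaw, sample)
        print(round(yaw, 2))  # 359.4, 359.92, 0.54
    ```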

    It's also a good idea to render something pretty close; this helps people get a sense of the size of what they are looking at and how far away the objects are. Otherwise, distances become a bit difficult to judge.

  • @Gerard
    Those are some really interesting thoughts, Gerard. I'm planning on having different surfaces that can be shown on demand, containing maps, controls, etc. It would also be really cool to draw 3D objects like safe flight paths or dangerous areas to augment the experience.
    Though, as you say, interaction will be a big problem.

    @Brennan
    Thanks!