VR-Eye Spherical vision camera


This is a single-lens spherical vision camera capable of streaming video with a 360 x 240 degree field of view (FOV). It is optimized for drones and robotics thanks to its small size and weight.


The viewer can visualize the live or recorded stream as a 3D geo-registered video bubble placed on the map, or enter the sphere for a more immersive experience. Additional viewing modes include “virtual pan-tilt”, “target lock-on” and “video draping”, where the video region of interest (ROI) is rectified in real time. The VR-Eye cam software is also compatible with Android-based VR headsets such as the Gear VR, as well as generic VR headsets that work with Android smartphones, such as Google Cardboard. An iOS version is in the works.


The VR-Eye cam software can combine the spherical video with the MAVLink stream to offer a unique “geo-registered” view, where the video sphere is placed at its exact location and orientation on the map.
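As a rough illustration of how such a geo-registered pose could be assembled, the sketch below combines fields from the MAVLink GLOBAL_POSITION_INT and ATTITUDE messages (field scalings per the MAVLink common message set); the function name and dict layout are hypothetical and not part of the VR-Eye software:

```python
import math

def sphere_pose(global_position_int, attitude):
    """Combine MAVLink GLOBAL_POSITION_INT and ATTITUDE fields into a
    pose for a geo-registered video sphere (illustrative only)."""
    return {
        "lat": global_position_int["lat"] / 1e7,      # degE7 -> degrees
        "lon": global_position_int["lon"] / 1e7,
        "alt_m": global_position_int["alt"] / 1000.0, # mm -> metres (AMSL)
        "roll_deg": math.degrees(attitude["roll"]),   # ATTITUDE is in radians
        "pitch_deg": math.degrees(attitude["pitch"]),
        "yaw_deg": math.degrees(attitude["yaw"]),
    }

pose = sphere_pose(
    {"lat": 379839600, "lon": 235727500, "alt": 120000},
    {"roll": 0.02, "pitch": -0.01, "yaw": math.pi / 2},
)
print(pose["lat"], pose["yaw_deg"])  # 37.98396 90.0
```

A renderer would then anchor the video sphere at that latitude/longitude/altitude and orient it by the three attitude angles.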

The video below is a screen recording of a real-time stream as received by the ground station. The immersive video can be viewed live using the VR-Eye cam software or VR headsets, and simultaneously recorded. The recording can be replayed for post-mission analysis, where the viewer can select and monitor the camera's surroundings from any desired angle, combined with geolocation information.

For more information about the VR-Eye cam, click here.

The VR-Eye cam is expected to be in production in early September 2016.



  • follow-up

    If you don't offer Z-depth maps, your spherical vision camera is not fit for obstacle-avoidance applications.

  • @MAGnet Systems

    I mean 360 panoramas,

    since it's easy to turn a smartphone into a wide-angle, fisheye camera

    with Photo Sphere and 360 Panos for Android.


    What I mean is: have your mathematicians ever worked on generating depth maps and 3D point clouds from spherical video?

    With more sophisticated, real-time image/video processing, you can turn your smartphone into a 3D scanner (manually operated, moved by hand) today.

    The problem with building such a 3D scanner once you replace the lens with a wide-angle or fisheye one is that image distortions keep your graphics processor busy projecting one sphere onto another whenever you move (i.e. change the geolocation or tilt of your camera).

    Z-depth maps in 360 panoramas are not just another great idea; they are a high-tech challenge.

    You must be large to take on such a challenge.

    What you offer is offered by others.


    Indoor 360 panoramas by Google set the challenge;

    what comes next is Z-depth maps in 360 panoramas,
    and the challenge is still open.
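For readers wondering what depth from a moving spherical camera involves at its core, below is a minimal motion-stereo sketch: triangulating a 3D point from two bearing rays observed at two camera positions. This is the generic textbook midpoint method, not anything from the VR-Eye software:

```python
import numpy as np

def triangulate_rays(c1, d1, c2, d2):
    """Midpoint triangulation of two bearing rays (unit vectors d1, d2)
    from two camera centres c1, c2 -- the core step of motion-stereo
    depth from a moving spherical camera."""
    d1, d2 = np.asarray(d1, float), np.asarray(d2, float)
    b = np.asarray(c2, float) - np.asarray(c1, float)
    # Solve for ray parameters t1, t2 minimising |(c1 + t1 d1) - (c2 + t2 d2)|
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    t1, t2 = np.linalg.solve(A, np.array([d1 @ b, d2 @ b]))
    p1, p2 = c1 + t1 * d1, c2 + t2 * d2
    return (p1 + p2) / 2.0  # midpoint of closest approach

# A point at (0, 0, 10) seen from two camera positions 1 m apart:
p = triangulate_rays(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                     np.array([1.0, 0.0, 0.0]),
                     np.array([-1.0, 0.0, 10.0]) / np.linalg.norm([-1.0, 0.0, 10.0]))
print(np.round(p, 6))  # recovers a point near (0, 0, 10)
```

With bearings extracted from consecutive spherical frames and camera motion from GPS/IMU, this per-point step is what a depth map generator would repeat densely across the sphere.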

  • @ GI, As described in the “why we made it” section of the product’s page, the VR-Eye cam is the first-of-its-kind single-lens camera that gives an immersive 360° spherical view with extreme simplicity and low cost. This is a milestone on its own, as it introduces several new possibilities in the way live video is experienced. The VR-Eye cam is intended for increased (panoramic) situational awareness.

    The supporting software includes several other functions, e.g. real-time geotagging of video frames, video draping, target lock-on, etc., but Z-depth maps and point clouds are not part of it, as they were not among this product’s objectives. We receive several fresh ideas for additional features daily, and some of them will be included in the first release this September, as they are easy to implement. But we have to take a breath now and focus on getting this product into serial production, since we get a large number of emails from users who want it ASAP; that’s our immediate priority.

  • @MAGnet Systems,

    I don't get your answer, so I assume you don't offer a Z-depth spherical projection solution, represented either as

    a 3D point cloud or as on-the-fly generated true depth maps.

    The problem is really complicated, since your fisheye-lens camera projects a sphere onto the flat rectangle of the sensor plane.

    Such a projection is lossy (a 3D/2D sphere to a flat rectangle),

    and reversing the 2D-to-3D projection requires calculating the missing points on the sphere,

    mapping each 1D line segment back to a 1D circular arc.

    Are your mathematicians ready to offer us a Z-depth map over a 2D sphere model (frankly speaking, a sphere's surface is a 2D object)?

    I have experimented in the past with holographic and optical computers to process live video and extract Z-depth maps.
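As an aside on the sphere-to-plane projection being discussed, the sketch below uses the equidistant fisheye model (r = f·θ), one common idealization of fisheye optics; the actual VR-Eye lens model is not public, so the model choice and focal length here are assumptions:

```python
import math

def fisheye_project(theta, phi, f):
    """Equidistant fisheye model: image radius r = f * theta.
    theta: angle from the optical axis (rad), phi: azimuth (rad)."""
    r = f * theta
    return r * math.cos(phi), r * math.sin(phi)

def fisheye_unproject(x, y, f):
    """Back-project an image point to a unit direction on the view sphere --
    the 'reverse' mapping discussed above (per-pixel direction, no depth)."""
    r = math.hypot(x, y)
    theta, phi = r / f, math.atan2(y, x)
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

x, y = fisheye_project(math.radians(60), 0.0, f=300)
print(round(x, 3), round(y, 3))  # 314.159 0.0
dx, dy, dz = fisheye_unproject(x, y, f=300)
print(round(dz, 3))              # 0.5  (cos 60 deg)
```

Note that unprojection recovers only the ray direction, not the distance along it; that missing radial depth is exactly the Z-depth information the commenter is asking about.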

  • Great stuff here. Cannot wait.

  • Hello Glenn,

    Thank you for your nice comments. We have something in the works for software stabilization and I think we are very close.

    First, let me clarify that we don’t want the image stabilization to take place on the camera itself, as we wish to keep the camera’s integration complexity, form factor and cost low. Doing so would require additional processing boards, IMUs, etc. But we have found a workaround so we can do it on the receiving end (the GCS software), where all the processing takes place anyway:

    As the VR-Eye cam video sphere is a “geo-oriented” object by nature (i.e. it has its own xyz axes and location attributes), we can feed it on the receiving side with the MAVLink pitch, roll and yaw data, since the GCS software receives both the VR-Eye spherical video and the MAVLink IMU data concurrently. This process will counteract any platform movements and stabilize the video; it just needs to happen really fast. This is how motorized gimbals work, but in this case the process is purely digital. We still have some issues to solve, e.g. precise timing synchronization of video frames with the corresponding MAVLink IMU data, but that’s the easy part. The only challenge I see is how to get the MAVLink IMU data streamed down to the GCS at a really fast update rate (~100 Hz would be a good starting point) so it’s adequate for image stabilization.
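A minimal sketch of the counter-rotation idea described above, assuming the standard aerospace Z-Y-X (yaw-pitch-roll) convention for the MAVLink ATTITUDE angles; this illustrates the principle only, not the VR-Eye implementation:

```python
import numpy as np

def rot_zyx(roll, pitch, yaw):
    """Body-to-world rotation matrix from attitude angles (radians),
    aerospace Z-Y-X (yaw-pitch-roll) convention."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def stabilize(direction_body, roll, pitch, yaw):
    """Rotate a camera/body-frame viewing direction into the world frame,
    cancelling platform motion (digital counterpart of a gimbal)."""
    return rot_zyx(roll, pitch, yaw) @ np.asarray(direction_body, float)

# A ray straight out of the camera nose while the drone yaws 90 deg
# should point along world +Y after stabilization:
v = stabilize([1.0, 0.0, 0.0], roll=0.0, pitch=0.0, yaw=np.pi / 2)
print(np.round(v, 6))  # [0. 1. 0.]
```

In a renderer, applying this rotation to the whole video sphere each frame, using the attitude sample time-matched to that frame, keeps the rendered horizon level regardless of platform motion.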

  • Congratulations guys this is pretty impressive! I can think of numerous applications. Software stabilisation would be great!
  • Hi Nikola,

    I just discussed this with the software developers, and it’s certainly doable (and actually a very good suggestion), but our priorities through the end of August have already been set: bringing the entire functionality to Android so it can support the full range of (VR-capable) Android smartphones. We are also developing a custom web-based version that will serve very special needs over cellular networks with H.265 compression (same resolution at less bandwidth). But we have placed your suggestion in the queue for the following releases, as it has certain advantages. I’ll keep you posted, and you can ping me if you don’t get an update.

    I have an important announcement though: we have just achieved 5 MP resolution at 20 FPS for the video sphere. We were expecting this much further down the road, but Javier and KKM (the software and algorithm developers working behind the scenes) just nailed it. So expect very crisp HD spherical videos in the next release.
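For scale, a back-of-the-envelope bandwidth estimate for the quoted 5 MP at 20 FPS figure; the compression ratio used below is a generic H.264/H.265 ballpark assumption, not a figure from the VR-Eye team:

```python
# Rough bandwidth arithmetic for a 5 MP, 20 FPS video sphere.
pixels = 5_000_000
fps = 20
bits_per_pixel_raw = 12  # 4:2:0 YUV, a common raw-video assumption

raw_mbps = pixels * fps * bits_per_pixel_raw / 1e6
print(round(raw_mbps), "Mbit/s raw")                              # 1200 Mbit/s raw
print(round(raw_mbps / 150, 1), "Mbit/s at an assumed ~150:1")    # 8.0 Mbit/s
```

Numbers in that single-digit-Mbit/s range are what make streaming such a sphere over a long-range telemetry/video modem plausible at all.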

  • This looks really interesting.  Are you planning to have an SDK available for those of us who roll our own mission planning software?  Translation: a Mac and Linux version.

  • @ Marc, Thank you for your nice comments. Although the Pharos antenna is an independent system from the VR-Eye cam, they both plug into the Microhard modem, as you pointed out, and they both work together with the Pixhawk. The results in this case are extraordinary in terms of performance. Here are some facts:


    As we use a single modem and a single antenna for both telemetry and video on the drone side, we reduced the weight and the space used inside our drone.

    Removing the gimbal while retaining the 360° video capability saved us another ~250 g (no mechanical video stabilization, though).

    A combination of some very efficient motors, a 13 Ah 6S Li-ion battery and the lightweight setup above currently gives us 60 minutes of flight time and 10+ km of (video + telemetry) data-link range. This means the drone we are currently testing flies out 10+ km, stays there for approx. 35 minutes and RTLs, while streaming 360° video and telemetry during the whole flight.


    This is not a cheap setup, as the modems alone are $630 each, but we are confident that we stretched the envelope to its combined range and endurance limits. A 10+ km range / 60 min quad is not easy to build, but now it’s a fact. As soon as we complete our tests, we will post images and more detailed information about this specific setup.
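The quoted figures are internally consistent; a quick arithmetic check of the cruise speed they imply:

```python
# Back-of-the-envelope check of the stated mission profile:
# 60 min total flight, 10 km out, ~35 min loiter on station, then RTL.
total_min = 60.0
loiter_min = 35.0
range_km = 10.0

transit_min = total_min - loiter_min           # time left for out + back legs
cruise_kmh = 2 * range_km / (transit_min / 60.0)
print(round(transit_min), "min transit,", round(cruise_kmh), "km/h cruise")
# 25 min transit, 48 km/h cruise
```

About 48 km/h (~13 m/s) is a realistic cruise speed for an efficiency-oriented quad, which supports the stated endurance/range claim.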
