The Exo360 brings together the latest in drone and VR technology for the ultimate immersive experience. With five integrated 4K cameras, the Exo360 delivers uninterrupted 360 video for use with your VR headset. The footage can be viewed in real time or uploaded to YouTube or Facebook360 upon landing.

The drone can capture 16MP still images and video in both linear and spherical modes. With our 360 video capture, you will never need a gimbal again!

To learn more, please visit Queen B Robotics.



  • Even if the GPU wasn't an issue, you've got four image sensors that jut out past the arms with their lenses fixed at the tips. I hope they are planning on incorporating an easy way for the user to replace them, because I really don't see them holding up to general consumer use.

  • Developer

    With the fixed cameras focused at infinity the 360 merging becomes a relatively simple static process.

    1. Remove camera input lens distortion

    2. Stabilize video

    3. Stitch videos into 360 panorama, no need to analyze image content if cameras are fixed in the frame and focused at infinity. You only need to apply stabilization offsets and blend.

    4. Encode 360 video

    Except for step 4 (which uses dedicated encoder hardware instead), these are the kinds of tasks that the GPU architecture excels at.
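
    Step 3, for example, can be sketched in plain NumPy. With rigidly mounted cameras focused at infinity, the overlap between adjacent frames is a calibration constant (a hypothetical value here), so stitching reduces to a fixed-offset paste plus a linear feather blend, with no feature matching. Grayscale frames, two cameras, and the `feather_blend` name are all assumptions for illustration.

```python
import numpy as np

def feather_blend(left, right, overlap):
    """Join two horizontally adjacent grayscale frames whose relative
    position is fixed by the airframe. `overlap` (in pixels) would come
    from a one-time calibration; no image-content analysis is needed."""
    h, wl = left.shape
    wr = right.shape[1]
    out = np.zeros((h, wl + wr - overlap))
    out[:, :wl - overlap] = left[:, :wl - overlap]   # left-only region
    out[:, wl:] = right[:, overlap:]                 # right-only region
    alpha = np.linspace(0.0, 1.0, overlap)           # 0 -> 1 blend ramp
    out[:, wl - overlap:wl] = (left[:, wl - overlap:] * (1 - alpha)
                               + right[:, :overlap] * alpha)
    return out
```

    A real five-camera rig would first undistort each input (step 1) and apply the stabilization offsets (step 2) before blending all adjacent pairs.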

  • I have thought: what if you had the raw data, i.e., video and IMU data, saved for export and later post-processing? It could be stabilized and either distributed as 360 video or edited down to normal "2D" video, with the person directing the shoot having access to an essentially unlimited-view source video and the ability to choose after the fact which shot they want. This would eliminate the need for heavy on-board video processing and could make the package less resource-intensive.
  • Hi Rob,

    The stitching is probably not so much of a deal, but I am going to go out on a limb here and say yes, I actually think it probably could.

    Its 256-core GPU architecture is perfect for this sort of problem.

    There is one caveat, however: latency.

    You probably can't buy anything remotely close to this capability that you could either afford or stuff into a small quadcopter chassis.

    And its architecture really is perfect for this sort of problem.

    But there will be some latency, and you would probably want a two-tier approach: an FPV mode and a video-recording mode, probably overlaid on top of each other.

    The FPV video would not need to be "perfect," just good enough, while the recorded video would need to be as good as you could make it.

    I am thinking the FPV video could be extracted earlier in the process than the record mode video.

    Still, I would not bet my life on it, and I think 1080p x 5 would be a lot easier to pull off than 4K x 5.

    Also, as you know, you need an oversized image from which to extract the "stabilized" video, so the sensors really need to capture rather more than 1080p or 4K.
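
    A minimal NumPy sketch of that oversized-frame idea (function name and offsets are hypothetical): pitch/yaw stabilization just slides a fixed-size output window around inside a larger sensor frame by the per-frame measured offsets, and that sliding margin is exactly why the capture resolution must exceed the delivered 1080p or 4K.

```python
import numpy as np

def stabilized_crop(frame, out_h, out_w, dy, dx):
    """Extract a stabilized out_h x out_w window from an oversized frame.
    (dy, dx) are per-frame pitch/yaw offsets in pixels (e.g. from the
    IMU); the window is clamped so it never leaves the sensor."""
    h, w = frame.shape[:2]
    y0 = min(max((h - out_h) // 2 + dy, 0), h - out_h)
    x0 = min(max((w - out_w) // 2 + dx, 0), w - out_w)
    return frame[y0:y0 + out_h, x0:x0 + out_w]
```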

    And although pitch and yaw are simple vertical or horizontal pixel displacements that are easy to compensate for, roll requires a rotary correction about a variable center point, a non-trivial problem that is still bedeviling the BeBop 2 and, to a somewhat lesser degree, perhaps even the latest Sony action cam.

    Doing a convincing job of correcting roll is the big issue for all digital stabilization, and it is processing-intensive.
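
    To make that cost concrete, here is a bare-bones NumPy sketch of roll correction as an inverse-mapped rotation about a variable center (nearest-neighbour sampling, no interpolation; the `deroll` name is made up). Unlike the row/column shifts that fix pitch and yaw, every output pixel has to be resampled, which is where the processing load comes from.

```python
import numpy as np

def deroll(frame, angle_rad, cy, cx):
    """Undo roll by rotating the frame by angle_rad about the per-frame
    center (cy, cx), using inverse mapping with nearest-neighbour
    sampling. Every output pixel is individually resampled."""
    h, w = frame.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]            # output pixel coordinates
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    # rotate each output coordinate back into the source frame
    ys = c * (yy - cy) - s * (xx - cx) + cy
    xs = s * (yy - cy) + c * (xx - cx) + cx
    yi = np.clip(np.rint(ys).astype(int), 0, h - 1)
    xi = np.clip(np.rint(xs).astype(int), 0, w - 1)
    return frame[yi, xi]
```

    A production version would use bilinear or bicubic interpolation and fuse this warp with the undistortion map, but the per-pixel resampling cost remains.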

    Probably a good idea to prove it with one or maybe two cameras first; that would make me happy, anyway.

    The 360 view is a cool capability, but adequately solving it with a digitally stabilized system is what will sink this project if anything will.

    The most significant features of the TX1 are (a) it is really small and (b) it draws only 10 watts; nothing else even comes close.

    Of course, if the TX1 wouldn't work, maybe the new GTX 1080 board with 2,560 CUDA cores in the new Pascal architecture would (though the 180 watts and the need for a PC backplane might cause second thoughts). On the plus side, you can run four of them at once.

    Hopefully they won't need the other Nvidia supercomputer, the DGX-1; it would be a little heavy and pricey anyway. :)



  • Gary, could a TX1 actually accomplish really high quality 5-camera HD video stabilization and stitching?

    And then, besides the question of electronics, we get back to the reality of glass.  It's all about the glass.

  • The VR drone mentioned by NiMA_Asghari was probably the SPHERIE.


  • It seems like sticking camera lenses out on stalks, as first points of contact, is probably not a good idea. Evolutionary biology chose not to put eyeballs on a lifeform's hands and feet for good reason. I also agree with the majority of folks who posted above: the video looks like a concept video rather than a completed design displaying its true capabilities. @Kibir +1

  • Developer
    Hi Chris, as Gary and Rob asked, could you please clarify whether the footage shown in the video comes from your actual vehicle?
    It seems to me that the clips do not: the distinct lack of radial distortion for such wide cameras, the smooth panning, etc. make it look like it was shot from a gimbaled camera ship.
    Reminds me of the Lily promotional video...
  • Small company designing/using a monocoque hull, that's the kind of advancement I like to see.

  • Sounds great Chris,

    If you can really pull that off, it should be an excellent system.

    My guess is that serious real-time video processing power is required to accomplish that on board, on the order of a significant multicore-GPU system like the Nvidia TX1.

    I will be very interested to see what you come up with.
