Not strictly Drone related, but if the weight and power requirements are manageable, the new Kinect sensor from Microsoft could be pretty amazing as a navigation/obstacle avoidance/3D mapping tool. And it can see in the dark and read your pulse rate.

From Engadget:

The new Kinect for Windows sensor won't be available publicly until some point in 2014, but developers can apply for an early, $400 development kit starting right now (due before July 31st at 9AM PT), Microsoft announced today. In that $400, developers (if accepted) will get early SDK access, a pre-release "alpha" version of the device, a final retail version (at launch), and private access to both APIs and the Kinect for Windows engineering team (in private forums and webcasts). Should you get in, you'll find out more come this August.

Some info on the new Kinect is at this page: Unofficial Kinect Info Page


I have to think a bunch of labs are going to jump on this.



  • Hi all,

    The PX4 already has had some work done to integrate a Kinect here: https://github.com/soonhooi/pixhawk-kinect-pkg

    One of the more important aspects of the Kinect is how much preprocessing is built in, reducing the load on the target processor.

    I am building a hanging pendulum robot using two brushless electric bicycle wheels.

    I will be using a Kinect on a stabilizing platform as the primary sensor on it.

    It is my current intention that the Kinect will initially be hooked up to my laptop (on board), but a PX4 will provide interface, sensor, and control functions.

    The reason I am initially going through a laptop is that Microsoft's robotics development library provides substantial support for both the Kinect and for laser scanners, and the laptop offers plenty of processing power and memory.

    Right now I am working my way through controller issues and this is not going to be a quick project.

    Hopefully, by the time I get it working properly with the "old" Kinect the new one will be available.

    Full sun is pretty much an impossible condition for the current Kinect, and a time-of-flight system is not inherently better in that respect.

    The limiting factor is the power of the lasers/LEDs they can reasonably (safely) use for illumination, and the fact that the IR lasers radiate at a wavelength where sunlight is also bright.

    This is also a problem with laser scanners, and most of the "safe" ones have quite limited range outdoors in full sunlight.

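The preprocessing point above can be illustrated with a sketch: once the Kinect has done its onboard work, the host mostly just has to thin the depth image down. Assuming a depth frame arrives as a NumPy array of millimeters (roughly how libfreenect delivers it), a coarse nearest-obstacle grid is cheap to compute. The frame contents here are invented for the example:

```python
import numpy as np

def downsample_depth(depth_mm: np.ndarray, factor: int = 8) -> np.ndarray:
    """Reduce a dense depth frame to a coarse grid by taking the
    nearest (minimum) valid reading in each factor x factor cell.
    Zeros mean 'no reading' and are ignored."""
    h, w = depth_mm.shape
    h, w = h - h % factor, w - w % factor
    blocks = depth_mm[:h, :w].reshape(h // factor, factor, w // factor, factor)
    blocks = blocks.transpose(0, 2, 1, 3).reshape(h // factor, w // factor, -1)
    # Swap zeros (invalid) for a huge value so min() picks real returns.
    masked = np.where(blocks == 0, np.iinfo(np.uint16).max, blocks)
    return masked.min(axis=2).astype(np.uint16)

# Hypothetical 480x640 frame: everything 3 m away, one obstacle at 1 m.
frame = np.full((480, 640), 3000, dtype=np.uint16)
frame[200:240, 300:340] = 1000
grid = downsample_depth(frame, factor=80)
print(grid.shape)   # (6, 8)
print(grid.min())   # 1000 -> nearest obstacle, in mm
```

A 6x8 grid like this is small enough for even a modest flight controller to reason about every frame.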
  • The demo I got showed off both low-light and bright-light settings, but I agree full sun might be an issue. Range too, but for the sort of things I think it would be useful for, range isn't that big a deal. I am thinking indoor obstacle avoidance, spatial mapping, etc. The combination of close-in depth data with video should make very interesting things possible, especially on a powerful processor.

  • Developer

    I actually tried to connect the original Kinect up to the APM, but the APM couldn't act as a USB host.  I then tried to use the SparkFun USB Host Shield but found it was not up to the task for some reason I now forget... I think there was a software issue in which some features required to talk with the Kinect were missing.  I also realised while doing this that the amount of data passing from the Kinect to the APM was going to be huge, and we'd need some ARM-based processing in between that would basically interpret and compress the data down into just what the autopilot needed.

    Anyway, I'm hoping we get back to trying this again some day, more likely with the PX4 than the APM2.

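The "ARM in between" idea above (interpret and compress the depth stream into just what the autopilot needs) might look something like this sketch. The sector count and wire format are purely hypothetical, invented for illustration:

```python
import struct
import numpy as np

N_SECTORS = 8  # split the field of view into 8 vertical sectors

def depth_to_sectors(depth_mm: np.ndarray, n: int = N_SECTORS) -> list:
    """Collapse a full depth frame to one 'closest obstacle' range per
    horizontal sector -- roughly what an autopilot actually needs."""
    h, w = depth_mm.shape
    edges = np.linspace(0, w, n + 1, dtype=int)
    out = []
    for i in range(n):
        sector = depth_mm[:, edges[i]:edges[i + 1]]
        valid = sector[sector > 0]              # zero = no return
        out.append(int(valid.min()) if valid.size else 0xFFFF)
    return out

def pack_message(sectors: list) -> bytes:
    """Made-up wire format: 0xA5 header byte + one uint16 range per sector."""
    return struct.pack("<B%dH" % len(sectors), 0xA5, *sectors)

# Hypothetical frame: open space at 4 m, an obstacle at 0.9 m on the far left.
frame = np.full((480, 640), 4000, dtype=np.uint16)
frame[:, 0:80] = 900
msg = pack_message(depth_to_sectors(frame))
print(len(msg))   # 17 bytes per frame instead of ~600 KB of raw depth
```

The point is the ratio: the intermediate processor throws away almost everything, and the autopilot's serial link only ever sees a handful of bytes.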
  • Developer

    The new sensor is reported to use Time-of-Flight instead of structured-light, so that is a major improvement. But the sensor range is still a problem for our usage.

    Resolvable depth: 0.8 m to 4.0 m

    I also suspect that direct sunlight in outdoor scenarios will still be a problem even with ToF sensing.
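For context on why that range window is so demanding: time-of-flight ranging recovers distance from the round-trip travel time of light, so the timing the sensor must resolve is on the order of nanoseconds. A minimal arithmetic sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance implied by the round-trip time of a light pulse."""
    return C * round_trip_s / 2.0

# The 0.8 m to 4.0 m window quoted above corresponds to round-trip
# times of only a few nanoseconds:
for d in (0.8, 4.0):
    t = 2.0 * d / C
    print(f"{d} m -> {t * 1e9:.2f} ns round trip")
```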

  • As far as I know, the Kinect uses structured IR light to measure depth and construct 3D data. It even has trouble in a brightly lit room, so I think it is still some way off from being useful in a sunlit outdoor environment, flying tens of meters above the ground.

  • This is the camera sentry I referred to, which I believe is closed source.

    This is what I think may be the best of the open source sentry solutions. Though I haven't looked at this in a while. They also sell a controller board.

  • I like your thinking. I bet this device could be de-cased and made light enough to fly. I can't wait to see what some bleeding-edge people can do with this. I've been mulling over what could be done with the open source code that is out there to run paintball sentry guns. In one iteration of this they had two cameras mounted on a gimbal: a low-res webcam and an HD cam. The software runs motion-tracking/following code on the webcam, as it takes less compute, and moves the gimbal to follow the motion based on the programmed parameters (speed, size, color, etc.). The HD cam follows along and is used to record or view footage for security or whatever. The problem I found with making a solution like this fly is that the software is based on an Intel graphics library that AFAIK only runs on Windows, which makes putting it in the air quite difficult. I think something like this could be used to control an APM and actually make the aircraft follow a moving object autonomously. Sorry for the long-winded post, not intending to threadjack; I just get a bit overexcited by this aspect of the hobby.
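The low-res tracking loop described above doesn't actually need a Windows-only library. As a hedged sketch (pure NumPy here so it stays portable; a real build would use OpenCV for robustness), this is the core of a frame-differencing follower that turns motion into a gimbal steering error. The frames, threshold, and error convention are all invented for the example:

```python
import numpy as np

def motion_centroid(prev: np.ndarray, curr: np.ndarray, thresh: int = 25):
    """Return the (x, y) centroid of pixels that changed between two
    grayscale frames, or None if nothing moved."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    ys, xs = np.nonzero(diff > thresh)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def steering_error(centroid, frame_w, frame_h):
    """Normalized pan/tilt error in -1..1 for a gimbal or autopilot loop."""
    x, y = centroid
    return (x / frame_w) * 2 - 1, (y / frame_h) * 2 - 1

# Two hypothetical 120x160 frames with a bright blob appearing off-center.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = np.zeros((120, 160), dtype=np.uint8)
curr[50:60, 100:110] = 255
c = motion_centroid(prev, curr)
print(c)                             # (104.5, 54.5)
print(steering_error(c, 160, 120))   # blob right of and slightly above center
```

Feeding that error into the gimbal servos (or, as suggested above, into an APM as a velocity target) closes the follow loop.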
