  • This has a $5k laser scanner on top, so what is the Kinect even used for in this robot? Just gestures? Why not just use its depth information?

  • I don't own an Xbox, but I might have to own a Kinect.
  • Here is a link to a method based on 3D registration
  • 1. I think stitching will be something like this: you get a 3D scene that is relative to the camera coordinate system, but you know the transform between the camera coordinate system and your whole world scene (you control the camera movement with motors, e.g. rotate 30 degrees to the left). Just by applying the camera transformation to your acquired 3D scene, you obtain the current 3D scene view in your world coordinate system.
    As for the position of the camera, maybe you can use the DCM algorithm from the IMU to get the position of your cameras.

    2. Also, for stitching you may use software fusion (registration): one scene is rotated until it matches another scene.
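    The first idea above, applying a known camera transform to bring a scanned scene into world coordinates, can be sketched roughly as follows. This is only an illustration under simplifying assumptions (pure yaw rotation about the vertical axis, a known camera position); the function name and values are made up:

    ```python
    import numpy as np

    def camera_to_world(points_cam, yaw_deg, cam_pos):
        """Map a 3D scene from camera coordinates into world coordinates.

        points_cam: (N, 3) array of points as seen by the camera.
        yaw_deg: commanded camera rotation about the vertical axis, in
                 degrees (e.g. "rotate 30 degrees to the left").
        cam_pos: (3,) camera position in world coordinates
                 (e.g. from odometry or the IMU/DCM estimate).
        """
        a = np.radians(yaw_deg)
        # Rotation about the z (vertical) axis by the commanded yaw.
        R = np.array([[np.cos(a), -np.sin(a), 0.0],
                      [np.sin(a),  np.cos(a), 0.0],
                      [0.0,        0.0,       1.0]])
        # Rotate each point, then translate by the camera position.
        return points_cam @ R.T + cam_pos

    # A scan taken after a commanded 30-degree left turn, mapped into
    # the world frame of the first scan:
    scan = np.array([[1.0, 0.0, 0.5],
                     [2.0, 0.0, 0.5]])
    world = camera_to_world(scan, 30.0, np.array([0.0, 0.0, 0.0]))
    ```

    Each new scan transformed this way lands in the same world frame, so stitching reduces to concatenating the transformed point sets.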
  • Bill,
    This is more than optical flow. Optical flow gives you 2D movement between two video camera frames; you usually use optical flow if you have only one 2D camera.
    With a depth camera like the Kinect you get 3D views, so the software implementation will be different. Each frame will be a 3D scene. You move the robot, you get another 3D scene. You control the camera movement, so you know how to stitch the different 3D views. You keep your whole 3D scene in memory, and you can do collision detection between the walls and your desired path.
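    The collision check mentioned above can be sketched as a nearest-distance test between the stitched wall points and the planned path. This is a minimal illustration, not the commenter's actual implementation; the function name and the 0.3 m clearance value are made-up examples:

    ```python
    import numpy as np

    def path_collides(walls, path, clearance=0.3):
        """Check a planned path against accumulated wall points.

        walls: (N, 3) points from the stitched 3D scene kept in memory.
        path:  (M, 3) waypoints of the desired path, same world frame.
        clearance: minimum allowed distance (metres) to any wall point.
        """
        # Pairwise distances between every waypoint and every wall point.
        d = np.linalg.norm(path[:, None, :] - walls[None, :, :], axis=2)
        # Collision if any waypoint comes closer than the clearance.
        return bool((d.min(axis=1) < clearance).any())

    walls = np.array([[1.0, 1.0, 0.0]])
    safe_path = np.array([[0.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
    blocked_path = np.array([[0.0, 0.0, 0.0], [1.0, 0.9, 0.0]])
    # path_collides(walls, safe_path)    -> False
    # path_collides(walls, blocked_path) -> True
    ```

    A real system would use a spatial index (e.g. a k-d tree or occupancy grid) instead of this brute-force pairwise check, which grows as N×M.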
  • Now, if they'd just get the Neato XV-11's lidar hacked, they could ditch that super expensive Hokuyo lidar on top ($1,200, ouch!)
  • I don't know if it is accurate enough for the readings, and over time the error adds up... I think :P
  • There have been many posts here on DIY optical flow sensors. Wouldn't that cover it?
  • To implement this on an indoor UAV, like a quadcopter, we'd need a reliable odometry solution.
  • It was only a matter of time. We are already looking at buying a few at work.