Lidar SLAM for $150


Since posting the BreezySLAM package for Python and other languages, I've received several inquiries about supporting the Lidar unit on the XV-11 vacuuming robot from Neato Robotics.  Although I have an XV-11 and a Python package for it as well, I was reluctant to pull the Lidar unit from it to put on an MAV or other robot.  Now, thanks to GetSurreal.com, I was able to get the XV Lidar with a handy USB adapter built in (which makes clever use of the popular Teensy board).  So I wrote a little acquisition module for the XV Lidar and added SLAM support for it in BreezySLAM; a rough sketch of how the pieces fit together is below.
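Here's a minimal sketch for readers who want to try it. The SLAM classes come from BreezySLAM's Python API, but the Laser parameters are only approximate XV-11 figures, and get_scan() is a hypothetical stand-in for the acquisition code that reads scans from the Teensy USB adapter:

```python
# Minimal sketch: feed XV Lidar scans into BreezySLAM and read back pose + map.
# The numbers below are approximate XV-11 figures, and get_scan() is a
# hypothetical placeholder for the serial acquisition code.

from breezyslam.algorithms import RMHC_SLAM
from breezyslam.sensors import Laser

MAP_SIZE_PIXELS = 500
MAP_SIZE_METERS = 10

# Laser(scan_size, scan_rate_hz, detection_angle_degrees, distance_no_detection_mm):
# roughly 360 samples per revolution at ~5 Hz, 360-degree field of view, ~6 m range.
xv_laser = Laser(360, 5.5, 360, 6000)

slam = RMHC_SLAM(xv_laser, MAP_SIZE_PIXELS, MAP_SIZE_METERS)
mapbytes = bytearray(MAP_SIZE_PIXELS * MAP_SIZE_PIXELS)

def get_scan():
    '''Hypothetical stand-in for the acquisition module: return a list of
    360 distances in millimeters read from the Teensy USB adapter.'''
    raise NotImplementedError

while True:
    slam.update(get_scan())                    # fuse the latest scan
    x_mm, y_mm, theta_degrees = slam.getpos()  # current pose estimate
    slam.getmap(mapbytes)                      # occupancy grid, one byte per pixel
```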

Comments

  • LOL! I'm a big fan of Will Smith. I've got "one of these" on my desk right now. More information will be posted soon.

  • @LaserDeveloper: What can I say but ... https://www.youtube.com/watch?v=g2O7rZTBs7w

  • Hey Simon, don't let benbojangles get you down ;). I think you've done the right thing for the right reason, and most people don't realize that something with the limited processing performance of the Teensy can actually do SLAM and advanced navigation. Just one thing: you might have started with a slightly more capable LiDAR - one with built-in navigation tools, like the one that created the image below. The scale is in meters, and the overlays show the results of navigation decisions made in less than 300 ms, as follows:

    1. The safety alarm zone is clear - red circle 2m in diameter that stops the drone if it gets too close to people
    2. The forward alarm warns of an approaching obstacle - yellow triangle looking 7m ahead
    3. In response to the forward alarm, the flight controller sends a request to check that turning right 45 degrees will be a safe course change - green triangle
    4. The LiDAR replies that it found an obstacle 17m away so maybe the controller should consider a different course of action
    5. The controller asks the LiDAR to find a safe corridor to the right
    6. The LiDAR replies that it can go right for 45m before hitting something
    7. The controller wants to get further away so it asks the LiDAR for a safe corridor to the left
    8. The LiDAR replies that it can go left for 65m
    9. The controller changes course to the angle specified by the LiDAR
    10. Repeat.

    IMHO, this kind of intelligent interaction between a non-real-time controller and a smart sensor is the way to handle complex mission decisions (a rough sketch of such an exchange follows the image below). So maybe we will see your Teensy making some smart moves after all :)

    [LiDAR scan image: scale in meters, with the overlays described above]
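    To make that exchange concrete, here is a purely hypothetical sketch of the decision loop; none of these calls correspond to a real product API, they simply mirror the steps listed above:

    ```python
    # Hypothetical controller/LiDAR interaction, mirroring the list above:
    # check the safety zone, watch the forward alarm, probe a corridor on
    # each side, and steer toward the one with the most clearance.

    def plan_step(lidar, controller):
        if not lidar.safety_zone_clear(diameter_m=2.0):    # red circle (step 1)
            controller.stop()                              # someone is too close
            return

        if lidar.forward_alarm(lookahead_m=7.0):           # yellow triangle (step 2)
            # Ask for a safe corridor on each side; each reply carries the
            # clearance found and the heading the sensor suggests (steps 3-8).
            right = lidar.find_corridor(side='right')      # e.g. 45 m clear
            left = lidar.find_corridor(side='left')        # e.g. 65 m clear
            best = left if left.clearance_m > right.clearance_m else right
            controller.set_course(best.bearing_deg)        # step 9: change course
    ```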

  • In my dreams! ;^)  No, the Teensy is just being used to send the scan data from the Lidar over USB.  That was GetSurreal's innovation.

  • Can I just get this right: a *TEENSY* board is doing S.L.A.M. calculations?

  • Does this do object avoidance well?
