3D Robotics

### Using LIDAR-Lite to sweep an arc for sense-and-avoid

I was curious how well LIDAR-Lite, which out of the box is just a single-point laser range finder, would work as a full sweeping LIDAR unit, so I set up this demo unit. The unit updates at 100Hz, so to detect a 10cm object (like a telephone pole) anywhere in a 5m arc at 10m distance (with 2x oversampling), I calculate that you need to sweep 30 degrees back and forth each second (5m = 10m*sin(30°)).
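
The arithmetic above can be sketched as follows: a 10cm target at 10m subtends about 0.57°, so with 2x oversampling the sweep can advance roughly 0.29° per sample, and at 100 samples per second that gives close to 30°/s. A minimal sketch (function names are my own, not from the posted code):

```python
import math

def required_sweep_rate(update_hz, target_width_m, range_m, oversample=2):
    """Max degrees per second the sensor can sweep while still putting
    `oversample` samples on a target of the given width at the given range."""
    # Angle subtended by the target at that range
    target_angle_deg = math.degrees(math.atan2(target_width_m, range_m))
    # Angular step between consecutive samples
    step_deg = target_angle_deg / oversample
    return update_hz * step_deg

# 100 Hz sensor, 10 cm pole at 10 m, 2x oversampling -> roughly 29 deg/s
rate = required_sweep_rate(100, 0.10, 10.0)
```

A faster sweep covers the arc more often but risks stepping past a narrow target between samples; a slower sweep oversamples more heavily at the cost of update rate across the arc.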

That's well within the speed of a regular servo, so I threw together this test. It uses an APM as an Arduino, sends the LIDAR-Lite range and servo position data over serial, and reads and graphs the data with a Processing sketch on a laptop. The Arduino code and Processing sketch are here: LIDAR sweep.zip
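
The laptop-side job is essentially to turn each (servo angle, range) pair into a point on a 2D plot. A minimal sketch of that conversion, assuming a hypothetical "angle,distance" line format (the actual serial format in the posted sketch may differ):

```python
import math

def polar_to_xy(angle_deg, distance_cm):
    """Convert a (servo angle, range) pair into x/y plot coordinates."""
    theta = math.radians(angle_deg)
    return distance_cm * math.cos(theta), distance_cm * math.sin(theta)

def parse_line(line):
    """Parse one 'angle,distance' serial line (hypothetical format)
    into an (x, y) point."""
    angle_s, dist_s = line.strip().split(",")
    return polar_to_xy(float(angle_s), float(dist_s))
```

The Processing sketch presumably does the equivalent with its own drawing primitives; the key point is just the polar-to-Cartesian mapping.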

BTW, the LIDAR-Lite sensor is already fully supported by the APM code as a range finder (for altitude hold with copters and/or autolanding assist with planes). You can read more about using it here.

The code for object avoidance using LIDAR-Lite is already written (thanks to Robert Lefebvre), and this would just add a sweeping component. Here's a video of it working in static mode, on a 2-axis stabilized gimbal:

Some observations:

• It works! I can spot telephone poles with no problem.
• That said, the effective range of the LIDAR-Lite unit in this application is only about 10m. I'm not using a low-pass filter on the cable, which is normally recommended, and I'm only taking one data point at each position, so I think the range can be improved with a smarter sampling strategy.
• In practice, this would be better implemented by moving a mirror rather than the entire LIDAR unit, to avoid shaking and other off-axis movement that can interfere with sampling.
• Laser range finders are SO MUCH BETTER than sonar.
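
One "smarter sampling strategy" mentioned above could be as simple as taking several readings per servo position and keeping the median, discarding dropouts. A sketch of that idea (the `read_fn` callback and zero-means-dropout convention are assumptions, not part of the posted code):

```python
import statistics

def filtered_range(read_fn, n=5):
    """Take n readings at the current servo position and return the median,
    discarding obvious dropouts (zero / negative values)."""
    readings = [r for r in (read_fn() for _ in range(n)) if r > 0]
    return statistics.median(readings) if readings else None
```

The median rejects the occasional wild reading better than a mean does, at the cost of slowing the sweep by a factor of n.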

• @Coastwise, that's a popular misconception. Calculating motion vectors is actually quite trivial. The trick is recognizing that it is only the relative motion that is important, not the absolute motion of the two aircraft.

Since data is collected from one of the aircraft, motion is measured relative to this frame of reference. Calculating the relative motion vector of the other craft is a matter of taking a number of readings and watching the movement.

In the plot below, each set of LiDAR data is subtracted from the previous one. What is left is the difference between the data sets plus a small amount of random noise. The "vapor trail" of the moving object can clearly be seen and calculating the relative vector of motion involves a conventional "best fit" calculation. This trajectory can then be extrapolated to find the point of closest approach.
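The differencing-plus-best-fit approach described above can be sketched in a few lines. This is an illustration of the idea, not the commenter's actual code; the nearest-neighbour threshold and the (x, y) point format are assumptions:

```python
import math

def difference_frames(prev, curr, threshold=0.5):
    """Return points in `curr` (list of (x, y) tuples) that have no near
    neighbour in `prev` -- i.e. the points that moved between scans."""
    moving = []
    for (cx, cy) in curr:
        nearest = min(math.hypot(cx - px, cy - py) for (px, py) in prev)
        if nearest > threshold:
            moving.append((cx, cy))
    return moving

def fit_track(points):
    """Least-squares straight line y = m*x + c through the 'vapor trail'."""
    n = len(points)
    xbar = sum(x for x, _ in points) / n
    ybar = sum(y for _, y in points) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in points)
         / sum((x - xbar) ** 2 for x, _ in points))
    return m, ybar - m * xbar

def closest_approach(m, c):
    """Perpendicular distance from the sensor (origin) to the fitted track:
    the point of closest approach if the relative motion continues."""
    return abs(c) / math.hypot(m, 1.0)
```

Static background returns cancel in the differencing step, the surviving points trace the intruder's relative track, and extrapolating the fitted line gives the miss distance.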

• I don't have experience with servos. Obviously, I'd like to use the lightest one possible. How much torque is necessary to oscillate a LIDAR-Lite? There's a very light HiTec servo that produces 25 oz-in. Is that sufficient? More torque comes at the cost of tripling the weight.
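
A back-of-envelope estimate suggests inertia torque is negligible here. Assuming a ~22g sensor with its centre of gravity ~2cm from the shaft (both figures are my assumptions, not specs from this thread) swept sinusoidally through 60 degrees once per second:

```python
import math

def peak_torque_ozin(mass_kg, radius_m, sweep_deg, period_s):
    """Peak torque to oscillate a point-mass approximation of the sensor
    sinusoidally: theta(t) = A*sin(w*t), so alpha_max = A*w^2, tau = I*alpha."""
    inertia = mass_kg * radius_m ** 2        # point-mass approximation
    amplitude = math.radians(sweep_deg / 2)  # half the total sweep
    omega = 2 * math.pi / period_s
    torque_nm = inertia * amplitude * omega ** 2
    return torque_nm / 0.00706155            # N*m -> oz*in

# ~22 g LIDAR-Lite, CG ~2 cm from the shaft, 60-degree sweep once per second
tau = peak_torque_ozin(0.022, 0.02, 60, 1.0)
```

Under these assumptions the result is a small fraction of an oz-in, far below 25 oz-in, so friction, cable drag, and holding the load against vibration would dominate the servo choice rather than sweep inertia.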
• When mounting the Lidar, do you have it facing straight forward or straight down?

• Great work. I want to bring attention to my students' work also on UAV sense and avoid:

A technical report explaining the design: http://ucdart.github.io/files/master-plan-ii-project-report-minjie-...

The problem with lidar sensors in general is that their beam is too narrow, so it is easy to miss a target such as a tree. As Sam suggested, a small radar sensor would be better, and we have indeed been working on one.

Here is a student report on the design of a small 24GHz radar sensor:

http://ucdart.github.io/education/files/eec134-2014-2015/Team_Stefa...

There is still work to be done to improve its performance but so far it looks promising!

• I think there is a lot of potential for small-scale radar. It just seems to fit.

My expertise in vision-based systems is as bad as it is in radar, but Raviv Raz is doubtless in a better position to comment.

There is a discussion he started here that suggests there is some potential.

Back in 1998 I went to a UAV conference in London and "Sense and Avoid" was one of the major topics. I'm a mechanical engineer so it's largely out of my comfort zone but it seems it's still a major hurdle for the industry.

Sam

Here's an article about it --> http://www.microwavejournal.com/articles/21161-ghz-fmcw-multi-chann...

It is very expensive at the moment, but I'm hoping that the price will come down in the near future.

• Mark,

The Neato was really developed for indoor use; you wouldn't be able to get reliable results outdoors with a UAV. The mechanism itself is very simple, though, so there would be no problem building an outdoor-capable system around essentially the same mechanism as the XV11 LIDAR unit, just with a suitable laser unit (which we have available now thanks to lightware and pulsedlight3d). The next step is integration. I believe all the tools are available to us, and it's a case of integrating the best combination of hardware and software to get the best result, aiming chiefly at a highly compact, low-weight system that can run fast enough to do useful things with the data streams the system produces.

I think one area overlooked right now is how best to integrate these kinds of devices onto a quadcopter. Ideally, you need a clear hemispherical field of view, and it seems likely that these devices will work best when placed right where we like to have camera gimbals! It may well be the case that you can't effectively achieve 360-degree coverage with one unit; a distributed approach integrating two or even three units would seem best to cover the whole sphere around a vehicle. The other approach is to sacrifice coverage in some sectors to maximise capability in the most important ones, so for most of our applications the lower and forward sectors would seem the most appropriate to focus on. Of course, novel vehicle designs can accommodate these requirements more easily, but there's a need to retro-fit a lot of existing vehicles.

• The Neato has a LIDAR spinning 360 degrees to read all around it. They sell for about $100 on eBay, although one is on auction now at 99 cents. It's not as impressive as the one you showed here, Chris, but do you think a Neato LIDAR could be used for coverage all around the unit? If a vacuum cleaner can use it, why couldn't it be incorporated into flying robots? I can see the speed of the drone becoming an issue if it's going too fast, but otherwise do you think it would work?

• Agreed, Sam; however, vision-based drone navigation systems are scarce at the moment.

I think navigation systems that need to fly in a "chaotic" space face a different challenge from those flying in a structured space. A warehouse is very structured, with a rectangular layout, and we use simple distance measurements because all the dimensions are known to us in our database. However, apart from Dan above, there appear to be none of these simple systems available either!

The good news is that we see more and more sensors being integrated into the standard FCUs. Pixhawk supports lightware laser range finders, sonar, and optical flow, so it's a matter of time before the FCU itself will be doing the navigation. We have designed our system to be modular, so as more support for these sensors becomes available on stock FCUs, we will offload those sensors from our system onto the FCU. Eventually all we will need to provide is our unique payload; the FCU will handle all of the sensors.