[Image: the SF40 LIDAR]

Small LIDAR (scanning laser) systems can be used for obstacle detection and SLAM from a moving drone. Whilst more work is still needed to integrate the data from these sensors into the flight software, the theoretical performance can be evaluated from the specifications of the laser and the scanning mechanism.

Ideally, we would like to have an obstacle detection system that has no "gaps" in the data, gaps which might allow small objects to get dangerously close. We would also like each scan of the surrounding area to be updated instantaneously. Of course, these requirements may not be practical or cost effective but it is still useful to know how good a particular LIDAR might be in practical circumstances.

For the purposes of this discussion, I will examine a simple single axis scanner, since these are the most commonly available type of small LIDAR, but the same analysis can be applied to multi-axis or multi-beam devices.

Looking first at the refresh rate of the entire data set: this is determined by the time that it takes the LIDAR to complete a full set of measurements, which is controlled by the speed of the motor. We can call this rate Frefresh [Hz].

Taking the laser measuring rate as Flaser [Hz], we can calculate the point separation, Psep [deg], of each measurement as follows:

Point separation:   Psep = 360 * Frefresh / Flaser

The smaller the point separation, the smaller the gaps in the data through which obstacles could go undetected. There is always some divergence of the laser beam, so if the point separation is less than or equal to this beam divergence then the data can be regarded as "saturated", in the sense that there is 100% coverage and no obstacles will be missed. This means that our first requirement for no "gaps" in the data is, surprisingly, achievable for some combinations of refresh rate and laser measuring rate.
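
As a quick sanity check in code, here is a minimal Python sketch of the point separation and saturation condition (the function and parameter names are my own, for illustration only):

    def point_separation_deg(f_refresh_hz, f_laser_hz):
        # Psep = 360 * Frefresh / Flaser
        return 360.0 * f_refresh_hz / f_laser_hz

    def is_saturated(f_refresh_hz, f_laser_hz, beam_divergence_deg):
        # 100% coverage when the angular gap between readings
        # is no wider than the beam itself
        return point_separation_deg(f_refresh_hz, f_laser_hz) <= beam_divergence_deg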

Let's take a closer look at the refresh rate. Suppose that we consider drones traveling at a speed of Vdrone [m/s], where the time taken to stop or take avoiding action is Tstop [s]. The stopping distance, Dstop [m], can be calculated as:

Stopping distance:   Dstop = Vdrone * Tstop

Even if our LIDAR has zero refresh time, it has to be able to measure at least this stopping distance to prevent the drone from crashing. For a non-zero refresh time it needs to measure further, to allow both a complete refresh of the data and sufficient time for the drone to stop. The measuring range, Drange [m], of the LIDAR needs to be:

Measuring range:   Drange = Vdrone * (1 / Frefresh + Tstop)
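
The stopping distance and measuring range translate just as directly; again, a minimal sketch with illustrative names:

    def stopping_distance_m(v_drone_ms, t_stop_s):
        # Dstop = Vdrone * Tstop
        return v_drone_ms * t_stop_s

    def required_range_m(v_drone_ms, f_refresh_hz, t_stop_s):
        # Drange = Vdrone * (1 / Frefresh + Tstop):
        # one full data refresh plus the stopping time
        return v_drone_ms * (1.0 / f_refresh_hz + t_stop_s)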

--------------------------------------------------------------

Let's look at an example:

A drone is traveling at 30 kph and can stop in 2 seconds. Can a LIDAR with a range of 25 m, a beam divergence of 0.2 degrees and taking 500 readings per second protect this drone from hitting a telephone pole?

We can regard a telephone pole as a small target, so to get 100% coverage the point separation must not exceed the 0.2 degree beam divergence. Setting Psep = 0.2 in the formula above and rearranging gives the maximum refresh rate:

Maximum refresh rate:   Frefresh = 0.2 / 360 * 500 = 0.28 [Hz]

At this refresh rate, and taking 30 kph as 8.3 m/s, the required range would be: Drange = 8.3 * (1 / 0.28 + 2) = 46.2 m

This range is further than the LIDAR can measure so the drone might well hit the pole, even if the LIDAR does see it.
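
Re-running the example in Python with exact values (rather than the rounded intermediates above, which is why it gives roughly 46.7 m instead of 46.2 m) shows the 25 m LIDAR falling short either way:

    v_drone   = 30.0 / 3.6                    # 30 kph = 8.33 m/s
    f_laser   = 500.0                         # readings per second
    psep_max  = 0.2                           # beam divergence [deg]
    t_stop    = 2.0                           # seconds to stop
    f_refresh = psep_max / 360.0 * f_laser    # saturation limit, ~0.28 Hz
    d_range   = v_drone * (1.0 / f_refresh + t_stop)
    print(round(f_refresh, 2), round(d_range, 1))   # 0.28 Hz, ~46.7 m needed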

--------------------------------------------------------------

I think the most surprising thing about this result is that a LIDAR needs to have BOTH a very fast update rate and long measuring range before it can effectively protect a drone from hitting "small" objects like poles, wires, tree branches and all the things that drones seem inexplicably attracted to. It is for these reasons that we have been working on long range, high speed laser modules that can be built into LIDAR systems. The SF40 LIDAR pictured above (SF40 web page) uses an SF30/C laser module (SF30/C web page), so how good will its performance be in practice?

--------------------------------------------------------------

As a second example, the SF40 LIDAR is configured to measure at 10 kHz and rotates 5 times per second. The laser module can detect thin wires at 25 m, poles at 50 m and walls and trees at 100 m. If a drone can stop in 2 seconds, how fast can it fly and still safely avoid an unexpected tree?

Point separation:   Psep = 360 * 5 / 10000 = 0.18 degrees

Since this is no greater than the beam divergence, the data is 100% saturated with no gaps, so even small obstacles will be detected.

Using the 100 m detection range for trees, the maximum safe speed of the drone is: Vdrone = 100 / (1 / 5 + 2) = 45.5 m/s (164 kph)

How fast could this drone fly without hitting a power line? 

Using the 25 m detection range for thin wires, the maximum safe speed of the drone is: Vdrone = 25 / (1 / 5 + 2) = 11.4 m/s (41 kph)
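
Rearranging the measuring range formula as Vdrone = Drange / (1 / Frefresh + Tstop) lets us run the same arithmetic for all three of the SF40's quoted detection ranges; a minimal sketch:

    def max_safe_speed_ms(d_range_m, f_refresh_hz, t_stop_s):
        # Invert Drange = Vdrone * (1 / Frefresh + Tstop)
        return d_range_m / (1.0 / f_refresh_hz + t_stop_s)

    for obstacle, d_range in [("wall or tree", 100.0), ("pole", 50.0), ("thin wire", 25.0)]:
        v = max_safe_speed_ms(d_range, 5.0, 2.0)
        print(obstacle, round(v, 1), "m/s,", round(v * 3.6), "kph")
    # wall or tree: 45.5 m/s, 164 kph; pole: 22.7 m/s, 82 kph; thin wire: 11.4 m/s, 41 kph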

This result suggests that the SF40 LIDAR will work well for both high speed operation, where large obstacles might be encountered, and slower speed operation closer to small obstacles.

--------------------------------------------------------------

Fly safely :) LD


Comments

  • Moderator

    Hi LD, 

We are using the LX30 ... with a 270 degree angle of view.

We have already implemented obstacle avoidance and SLAM functionality and it works very well. We also have a version that already works on the RPI2. If you have a sample we can test it for you and help develop a ROS driver.

My email is r.navoni@virtualrobotix.com. If you need more information you can contact me there or on my Skype account: virtualrobotix.

    Best

    Roberto 

  • That's cool Roberto! Can you tell us which Hokuyo sensor you are using? At this stage we have a Pi2 based viewer that is just looking at obstacle detection over long ranges. I'll post something soon showing the graphics then we can compare notes.

  • Moderator

If you need some specific info I can produce it for you ... In our design we have integrated our flight control into our ROS environment. Our companion computer, the VR Neuron, uses different kinds of CPU: RPI2, Odroid, i.MX, Qualcomm. It is based on Ubuntu OS.

In our design we take all the info from the Hokuyo LIDAR, add an additional IMU to the Hokuyo, collect the information through a ROS driver, and then use it with several different kinds of 2D / 3D SLAM algorithms.

On top of this architecture we add some custom-developed drivers for VRBrain to interface the data from SLAM and use it for indoor positioning and obstacle avoidance applications. We are currently working on some advanced research projects where the target is to develop powerful and reliable obstacle avoidance applications.

Here you can find an example of our application. If you are interested in evaluating the performance of your LIDAR we can do some benchmarks for you; we are ready with a complete platform for evaluating the performance of your device. In the video, on the left you can see the real-time LIDAR information in the VR Lidar / SLAM viewer; on the right you can see the reconstructed map of the flight environment with the position of the drone during the indoor flight test. The current limit of the Hokuyo is its 270 degree angle of view, but this isn't a big problem for positioning and mapping solutions.

    Best

    Roberto

  • @Gary @Roberto

Thanks for the comments, gents. This analysis is just theoretical and in practice there will be other effects to consider, such as the deceleration profile and the angle of the airframe. This is beyond the scope of this post, but FYI we are now making multi-beam laser modules that can provide some automatic correction for airframe angle, and we also have initial processing capability based on the P2.

    @Roberto, thank you very much for the offer, I will let our software team know that you can help. Do you have any pictures or performance data for the Hokuyo LIDAR that you could show everyone here?

  • Moderator

    @LD 

Great work, but there is an issue in your analysis: the orientation of the laser versus the direction of flight of your drone. For example, at 40 mph your pitch could be 20 to 25 degrees. When you fly, you maintain altitude with a pitch of around 20 degrees, so you can still hit an obstacle because your LIDAR doesn't scan along the direction of flight. You need a stabilized platform if your LIDAR has a very narrow beam.

I have already developed a complete obstacle avoidance system based on the Hokuyo LIDAR. If you need support to develop your ROS driver and complete the system, I'm available to support your work.

    Best

    Roberto Navoni

  • Excellent work LD,

Great analysis; it definitely points out the value of the rapid refresh rate you have achieved.

    Even for the more computationally intense work of SLAM, the saturated coverage and rapid data acquisition rate should provide real up-front benefits.

    Best,

    Gary
