4-channel LIDAR laser

Dimensions: 8 mm x 5 mm
Peak optical output: 85 W at 30 A per channel
Wavelength: 905 nm
Pulse length: < 5 ns
Operating voltage: 24 V

The overall lidar system covers 120 degrees in the horizontal plane with 0.1-degree resolution, and 20 degrees in the vertical plane with 0.5-degree resolution. In daylight it should detect cars at least 200 meters away and pedestrians at 70 meters.

The MEMS mirror chip can operate at up to 2 kHz.
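As a rough sanity check of what those figures imply, the point budget works out as sketched below. The assumption that one horizontal sweep corresponds to one 2 kHz mirror cycle is ours, not stated in the article, so the frame rate is only indicative.

```python
# Rough point-budget estimate from the published specs.
# Assumption (not from the article): one horizontal sweep per 2 kHz mirror cycle.

h_fov, h_res = 120.0, 0.1   # horizontal field of view and resolution, degrees
v_fov, v_res = 20.0, 0.5    # vertical field of view and resolution, degrees
mirror_rate_hz = 2000.0     # quoted MEMS mirror rate

points_per_line = h_fov / h_res          # 1200 samples per horizontal sweep
lines_per_frame = v_fov / v_res          # 40 vertical steps
points_per_frame = points_per_line * lines_per_frame      # 48,000 points

frames_per_second = mirror_rate_hz / lines_per_frame      # ~50 Hz under the assumption above
points_per_second = points_per_frame * frames_per_second  # ~2.4 M points/s

print(f"{points_per_frame:.0f} points/frame, ~{frames_per_second:.0f} frames/s, "
      f"~{points_per_second / 1e6:.1f} M points/s")
```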

The company says test samples will be available in 2017, and that commercial models could arrive in 2018.

With mass production, the price should drop to around 40 euros (US $43.50).

http://spectrum.ieee.org/cars-that-think/transportation/sensors/osr...

http://www.osram-group.de/en/media/news/press-releases/pr-2016/07-1...


Comment by Patrick Poirier on November 14, 2016 at 7:36am

Very interesting, so technically, using 3 modules we can get a full 360-degree system with a 20-degree vertical plane and a 2 kHz scan rate for under $150... Long live MEMS!!!
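For reference, the quick arithmetic behind that estimate, using the ~40 € / $43.50 mass-production price quoted in the article (low-volume pricing would of course be higher):

```python
# Quick arithmetic behind the three-module estimate (mass-production price assumed).
modules = 3
horizontal_fov_deg = 120      # per module, from the article
unit_price_usd = 43.50        # ~40 EUR mass-production price quoted above

coverage_deg = modules * horizontal_fov_deg   # 360 degrees of horizontal coverage
total_cost_usd = modules * unit_price_usd     # ~$130.50, comfortably under $150

print(f"{coverage_deg} degrees for about ${total_cost_usd:.2f}")
```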

Comment by JB on November 14, 2016 at 7:44am

Nice!

Comment by Global Innovator on November 14, 2016 at 10:08am

Could someone provide a technical description of the electronic parts featured in this LIDAR module?

In theory, MEMS-controlled micro-mirrors can be replaced by digital optics (no moving parts).


Comment by Hein du Plessis on November 14, 2016 at 10:13am

Would this potentially replace the optical lidar for auto-landing of fixed wings using Pixhawk?

Comment by Gary McCray on November 14, 2016 at 11:29am

This is a very significant breakthrough; it is exactly what is needed to permit all kinds of autonomous vehicle and robot operation in our complex environment.

We have primarily been doing autonomous operations with the absolute position control of GPS, which does not take surrounding obstacles into account at all, but only provides a position on the surface of the Earth.

Lately, simple vision and lidar systems have been providing some position feedback and obstacle-response capability.

This sensor provides the capability to operate relative to a static and dynamic object-rich environment (basically the one we actually live in).

When this level of capability becomes available at the price quoted, it will cause an explosion in autonomous ground-based operations of all kinds.

It should be noted that although the maximum laser power levels quoted here are very high, the actual pulses are exceedingly short, with a very low duty cycle (0.01% net on time). Eye hazard is proportional to total on time, and in this case it should represent essentially no hazard even if you were looking straight into it (though I am sure they will advise against that).
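To put a rough number on the average power implied by that duty-cycle figure, here is a back-of-the-envelope sketch only. It is not an eye-safety assessment; actual laser safety classification depends on pulse energy, wavelength, beam geometry and exposure conditions.

```python
# Rough average-power sanity check using the figures quoted in this thread.
# NOT an eye-safety calculation; safety classification (e.g. IEC 60825-1) depends
# on pulse energy, wavelength, beam geometry and exposure duration.

peak_power_w = 85.0    # peak optical output per channel, from the article
duty_cycle = 0.0001    # 0.01% net on time, the figure quoted above

average_power_w = peak_power_w * duty_cycle
print(f"Average optical power per channel: ~{average_power_w * 1000:.1f} mW")  # ~8.5 mW
```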

The other main ingredient necessary to make a competent 3D vision/navigation system is a serious multi-CPU or GPU-core computing system. The Nvidia TX1 could work very well with this, and by the time this sensor is available, I expect there will be similar multi-GPU-based systems with 5 to 10 times the capability of the TX1, which should really be adequate.

Best,

Gary

Comment by Patrick Poirier on November 14, 2016 at 11:48am

Gary, that is exactly what JB and I were talking about in a private chat :-).

Looking at this laser as a reference: http://www.excelitas.com/downloads/DTS_SMD_Laser.pdf

There is a safe zone of operation for human safety; within it, the longest pulse is 100 ns at full power. And considering this type of unit is operated in intermittent mode, the power factor is much lower than at maximum continuous operation.

Concerning the required CPU, depending on the desired mission, this device could be operated with an RPi-class computer using the new features that Randy and Tridge are experimenting with. Basically, it re-uses the defined fence, but the distance to the fence is passed into the AP_Proximity library, which is used by AC_Avoidance. This means you can set up a fairly complicated fence and then try out your object-avoidance algorithms on it. This is what Aerotenna did to win first prize at the Unmanned Traffic Management (UTM) Preliminary Drone Sense & Avoid technology.
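For readers who want to picture the general idea, here is a minimal, hypothetical sketch of sector-based proximity data feeding a simple speed-limit check. The names and structure are illustrative only and are not the actual AP_Proximity/AC_Avoidance interfaces.

```python
# Minimal, hypothetical sketch of sector-based obstacle avoidance.
# Names are illustrative only, NOT the real AP_Proximity/AC_Avoidance API.

NUM_SECTORS = 8                        # 45-degree sectors around the vehicle
SECTOR_WIDTH_DEG = 360 / NUM_SECTORS

class ProximityMap:
    """Keeps the closest measured distance per horizontal sector."""
    def __init__(self):
        self.distances_m = [float("inf")] * NUM_SECTORS

    def update(self, angle_deg, distance_m):
        sector = int((angle_deg % 360) / SECTOR_WIDTH_DEG)
        self.distances_m[sector] = min(self.distances_m[sector], distance_m)

def limit_velocity(prox, heading_deg, desired_speed_mps,
                   stop_distance_m=2.0, max_speed_mps=5.0):
    """Scale the commanded speed down as the obstacle (or fence) ahead gets closer."""
    sector = int((heading_deg % 360) / SECTOR_WIDTH_DEG)
    ahead = prox.distances_m[sector]
    if ahead <= stop_distance_m:
        return 0.0                                    # stop before the obstacle/fence
    # Linear ramp between the stop distance and a comfortable margin.
    scale = min(1.0, (ahead - stop_distance_m) / (3 * stop_distance_m))
    return min(desired_speed_mps, max_speed_mps) * scale

# Example: a lidar return 3.5 m away, directly ahead, while asking for 4 m/s.
prox = ProximityMap()
prox.update(angle_deg=0, distance_m=3.5)
print(limit_velocity(prox, heading_deg=0, desired_speed_mps=4.0))  # -> 1.0 m/s
```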

Comment by Gary McCray on November 14, 2016 at 12:12pm

Hi Patrick,

Of course you can use this with any CPU, but to get maximum benefit from it you need serious CPU horsepower to visualize, analyze and respond to a rich dynamic 3D environment like the one we actually live in.

Until now, affordable sensors have been performance-limited; this one isn't.

The future of robotics and autonomous vehicles isn't going to be just detecting and responding to a few immediate obstacles; it is going to be aware interaction with the external environment: a different ball game.

Best,

Gary

Comment by Patrick Poirier on November 14, 2016 at 12:27pm

What is really interesting about these new devices for the DIY community is the possibility to experiment with advanced features at a reasonable price. Wouldn't it be quite nice if we could achieve a factor-of-10 price reduction for a complete lidar system within the next 2 years?

Comment by Gary McCray on November 14, 2016 at 2:21pm

Hi Patrick,

Back in August, MIT actually introduced this style of single-chip lidar solution.

http://spectrum.ieee.org/tech-talk/semiconductors/optoelectronics/m...

And it really does change everything.

The Osram chip in this article is just an enhanced version of it, using MEMS instead of light guides, antennas and phase shifters.

These single chip sensors are truly revolutionary and will make this technology available very cheaply.

The potential new uses for these that haven't even been thought of yet are nearly boundless.

I can't wait for these to become available.

Best,

Gary

Comment by Gary McCray on November 14, 2016 at 2:36pm

It is important to understand that the MIT chip is a complete single-chip LIDAR, whereas the Osram chip discussed here is simply a 4-channel pulsed laser which requires an additional MEMS chip for beam steering, as well as a sensor chip to acquire the return signal.

That said, this chip actually has the power and coverage for many useful applications, whereas the existing MIT chip is seriously limited in range.

The MIT chip will undoubtedly continue to evolve, but the system shown here is almost certain to provide a low cost viable solution much more quickly.

Best Regards,

Gary
