Osram's Laser Chip for Lidar

4-channel LIDAR laser


Dimensions: 8 mm × 5 mm
Peak optical output: 85 W at 30 A per channel
Wavelength: 905 nm
Pulse length: < 5 ns
Operating voltage: 24 V

The overall lidar system covers 120 degrees in the horizontal plane with 0.1 degree of resolution, and 20 degrees in the vertical plane with 0.5 degree of resolution. In daylight, it should detect cars from at least 200 meters away and pedestrians at 70 meters out.

The MEMS mirror chip can operate at up to 2 kilohertz.
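The published figures imply a point budget for each scan. Here is a rough back-of-the-envelope sketch in Python; note that how the MEMS oscillations and the four laser channels map onto scan lines is my own assumption, since the article does not specify it:

```python
# Rough point budget implied by the published specs (assumed scan mapping).
H_FOV, H_RES = 120.0, 0.1   # horizontal field of view and resolution, degrees
V_FOV, V_RES = 20.0, 0.5    # vertical field of view and resolution, degrees
MEMS_HZ = 2000              # MEMS mirror rate, up to 2 kHz
CHANNELS = 4                # laser channels on the chip

h_points = round(H_FOV / H_RES)        # 1200 points per horizontal line
v_lines = round(V_FOV / V_RES)         # 40 scan lines top to bottom
points_per_frame = h_points * v_lines  # 48,000 points per full frame

# If each MEMS cycle traces one horizontal line and the 4 channels cover
# 4 adjacent lines in parallel, a frame needs v_lines / CHANNELS cycles:
frames_per_sec = MEMS_HZ / (v_lines / CHANNELS)   # 200 frames/s upper bound
pulses_per_sec = points_per_frame * frames_per_sec

print(f"{points_per_frame} points/frame, up to {frames_per_sec:.0f} frames/s, "
      f"{pulses_per_sec / 1e6:.1f}M pulses/s across all channels")
```

Under those assumptions the chip would produce 48,000 points per frame at up to 200 frames per second.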

The company says test samples will be available in 2017, and that commercial models could arrive in 2018.

With mass production, the price should drop to around 40 euros (US $43.50).





  • Global,

    From you:

    "sorry Gary, but the resolution of Lidar degrades for far objects exactly as a camera image does."

    In no way does this support the claim that the accuracy of Lidar degrades over distance.

    In fact, Lidar distance is measured as time of flight by very accurate clocks, so the accuracy at 1 foot is identical to the accuracy at 1,000 feet, because it is counted in discrete time units.

    This is the actual quote from the above article you linked:

    "Light moves at a constant and known speed so the LiDAR instrument can calculate the distance between itself and the target with high accuracy."

    If it has 1/4" accuracy at one foot, it can have 1/4" accuracy at 1,000 feet.

    Modern scanning LIDAR systems are now almost all direct time-of-flight and simply depend on the accuracy of measuring each laser pulse's time of flight.

    Camera distance determination is done by the visual angular offset of lines or "areas" computationally determined to be coincident, either from two cameras in a true stereoscopic setup or from the same camera using pictures taken at different times with a known displacement between shots.

    With the camera scenario, the further away the visualized object is, the smaller the angular difference between cameras (or successive shots), and therefore the less accurate it is.

    In fact, this difference in accuracy is extreme.

    A camera or cameras are in no way an equivalent device to a Lidar.

    They can be fine for X/Y measurements, as they are an X/Y device, but they completely lack the intrinsic granular accuracy of LIDAR, which actively measures the distance to each spot.

    It appears to me that your claim of equivalence is completely baseless.

    Because it is time-scan based, TOF Lidar can be scan-rate limited, so you can trade off update frequency, maximum distance, and X/Y resolution for the envelope that best captures what you want.

    Also, since Lidar actually requires dynamic laser illumination, it is limited in how far the laser can effectively be reflected off the target environment.

    But as long as you can achieve a valid reflected return the accuracy is guaranteed to be equivalent to clock accuracy.

    The angular separation between stereo camera readings simply becomes less and less as distance increases so the resolution becomes equivalently less and less.

    I believe these simple facts are - simply - indisputable.
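    The contrast described above can be put in numbers. Here is a small illustrative Python sketch; the clock resolution, stereo baseline, focal length, and disparity error are my own example values, not figures from this thread:

```python
# Illustrative comparison: lidar TOF range error (constant with range)
# versus stereo depth error (grows with the square of the range).
C = 299_792_458.0  # speed of light, m/s

def tof_range_error(clock_resolution_s):
    """Range uncertainty from timing resolution alone: dR = c * dt / 2."""
    return C * clock_resolution_s / 2.0

def stereo_depth_error(z_m, baseline_m, focal_px, disparity_err_px=0.5):
    """dZ ~= Z^2 * d(disparity) / (f * B) for a calibrated stereo pair."""
    return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)

for z in (1.0, 10.0, 100.0, 300.0):
    lidar = tof_range_error(100e-12)              # assume 100 ps timing resolution
    stereo = stereo_depth_error(z, 0.3, 1000.0)   # assume 30 cm baseline, 1000 px focal
    print(f"{z:6.0f} m: lidar +/-{lidar * 100:.1f} cm, "
          f"stereo +/-{stereo * 100:.1f} cm")
```

    The lidar figure stays constant while the stereo figure grows with the square of the range, which is exactly the point being made above.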

  • Brain fart - is anyone attempting to combine 3D point cloud phased array radar chips with the solid state LIDAR chips as a single chip? Are the semiconductor processes even compatible?

  • @Gary


    "resolution greatly degrades with increasing distance, whereas with Lidar it remains constant."


    Sorry Gary, but the resolution of Lidar degrades for far objects exactly as a camera image does.



    The actual calculation for measuring how far a returning light photon has travelled to and from an object is quite simple:

    Distance = (Speed of Light x Time of Flight) / 2

    The LiDAR instrument fires rapid pulses of laser light at a surface, some at up to 150,000 pulses per second.


    A 10 Mpix camera clocked at 30 fps or 60 fps delivers 300M to 600M single-point image samples per second.

    How fast does a LIDAR laser diode have to be pulsed to get the same data volume?

    One pulse every 2 to 3 nanoseconds.


    Resolution based on camera pixels is only valid for x-y data, not Z (depth).

    Multiple (more than 2) images can refine the depth estimate, but it still never approaches the accuracy of LIDAR.


    read above


    And with the new short-pulse LIDAR there is simply NO eye safety hazard.


    Are you ready to sit for such a test and spend an hour directly exposed to a short-pulsed, high-power LIDAR laser?
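    The data-volume comparison above is easy to check. A quick sketch (the 10 Mpix sensor and frame rates come from the comment; the rest is arithmetic):

```python
# Pulse rate a lidar would need to match a camera's per-pixel sample rate.
PIXELS = 10_000_000  # 10 Mpix sensor, per the comment above

for fps in (30, 60):
    samples_per_sec = PIXELS * fps           # 300M or 600M samples/s
    pulse_period_ns = 1e9 / samples_per_sec  # required spacing between pulses
    print(f"{fps} fps -> {samples_per_sec / 1e6:.0f}M samples/s, "
          f"one pulse every {pulse_period_ns:.2f} ns")
```

    That 2 to 3 ns spacing is shorter than the Osram chip's own < 5 ns pulse length, and far beyond the 150,000 pulses per second quoted above, which is why a scanning lidar does not try to match a camera point for point.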

  • Hi LD and Patrick,

    While Innoluce is quoting EU2495.00 for its MEMS test/development board, that really is not representative of large-quantity pricing for the chip itself, and it is completely normal for large corporations' early development board products.

    The Osram chip isn't available even on demo boards yet, but the claim is that it will be available in 2 years in large quantities for EU40.00.

    To me, that translates, with the rest of the circuitry involved, into an automotive-scale manufacturing cost probably below EU200 per complete LIDAR unit.

    It does not necessarily mean we are going to be able to buy them for EU200.00 per unit.

    That will depend on whether the manufacturers are interested in selling to small, independent or DIY developers at all.

    Osram certainly is used to selling headlights for mass-produced automobiles and has shown little support for small enterprise.

    That said, they are not alone in this endeavor, and the SS laser and MEMS mirror approach to future LIDAR development is clearly being taken very seriously, so it does seem like a trend likely to continue across the board.

    I agree that mixed camera and LIDAR are the likely key sensor ingredients in future 3D vision or navigation systems, but the clarity of 3D solids detection and positioning performed by the LIDAR, and the ease of obtaining and using that information, to me make it likely to be the star player.

    In any case, cameras produce just an interpretation of reflected light and are subject to ambiguity in software interpretation; Lidar essentially is not.

    This was very well illustrated, unfortunately, by the recent fatal Tesla accident, where the camera and computer apparently interpreted the side of a white truck to be clear sky.

    Best Regards,


  • Lidar is definitely the grail for UAVs; the take-off is only months away. Listing all contenders in phased arrays/VCSELs would be as awesome as it would be exhausting.

    I believe stereophotogrammetry will find its niche in micro-UAVs and ethology. I firmly believe there is just enough time to get prepared for LIDAR technology. From what I understand of the market, there will soon be a time when LIDAR performance will outscale flight control and AI performance; from that point, I think it will be some time before flight agility fully catches back up to that new level of situational awareness. What I mean is that when such OSRAM chips are out, what will keep small UAVs from aggressively moving through confined spaces, as in some Hollywood teleporting scenes, like no cat, bat, nor monkey could dream of? Software is not ready for that :)

    Back to our Laplace transform and SLAM courses!

  • On a philosophical technology tangent:

    My question is whether anyone wants self-driving cars at all, if most of the time they just stand around in traffic, or at home in the garage anyway. Better utilisation would be to just sit at home, in the car parked in the garage, and telecommute. I suspect self-driving will become an Uber service for Joe Public and for those few who can afford personal mobility in an energy-constrained world. Is personal mobility even a requirement in an automated world? And if it is, wouldn't you rather zip there in a hyperloop in nearly no time at all, which, in itself, does not actually need sense and avoid anyway? Talk about making solutions that are in search of a problem!

    Technology convergence means that the things we take for granted today will sound like an absurd idea tomorrow. When was the last time anyone saw a horse-drawn carriage on the Tesla-riddled autobahn? What even drives us to pursue this ridiculous notion that an investment in even greater speed is required to save the little precious time we have left to live? Traffic congestion is the result of poor scheduling that results in bad infrastructure and resource utilisation. This applies to vehicular traffic on land, sea and in the air, as it does to data on the internet, food production, economic development, energy consumption, or the manufacturing and delivery of nearly any services and products, including, it would seem, photon and electron congestion, which makes it difficult for LIDAR to generate point clouds from OSRAM MEMS devices running faster than the eye can see! (Just so we stay on subject!)

    There are so many systems in between that need to be fixed, and at some point the only currency that actually exists, time, will become the predominant driver of our behaviour, at which point I expect us all to take a big breath and consider why on earth we invested so much time into something we didn't actually need and most definitely didn't want. It's time for a change, but what, this time, should we change our future to?

    Please excuse the ramblings of a former, future technologist.

    So when can I finally buy this OSRAM powered LIDAR you're making LD?  ;-)

  • Yep, but the car market is big and the trend is strong, so it is plausible. 

    And then, it will be up to you to integrate these into a nice little package, just like you did with the SF40 :-)

  • @Patrick - that looks about right. There is some work to go from theory to a workable solution. Innoluce quoted EU2495.00 for a test board with one MEMS mirror and their ASIC driver to generate a single axis, fixed frequency sine wave pattern. I don't see an EU40 solution any time soon ;).

  • @Laser, thanks for your input.

    To get an idea of a complete system, here is a slide taken from an Infineon presentation:


  • 100KM

    I thought this was solid state? Solid state FTW.
