Velodyne's $50 U.S. "Breakthrough" Solid-State Lidar

Today, Velodyne is announcing "a groundbreaking design for a solid-state lidar sensor that can deliver a subsystem cost of under $50 U.S. when sold in high-volume manufacturing scale."



What you're seeing is the GaN integrated circuit that forms the heart of the solution that Velodyne is proposing. This is the chip that does the transmit function.


Comment by Rob_Lefebvre on December 14, 2016 at 6:42am

Wow, pretty interesting development.  But, I won't get really excited until I can actually buy one.

Comment by JP on December 14, 2016 at 8:52am

Agree with Rob but that looks great

Comment by Nikola Rabchevsky on December 14, 2016 at 9:30am

Yeah, ok, high-volume probably means 100,000.  There's another company working on this type of tech and those were the numbers.  This stuff is being built for the automotive industry so it will be a few years before the DIY community gets its hands on it.
But I have a technical question: I've got an old X-band radar detector in my car.  That thing always went off in the vicinity of some kinds of automatic door openers.  Now it goes off in proximity to some newer high-end cars and SUVs.  So when there are hundreds of LIDAR sensors all spewing out light rays, how does the system reject light from other sensors?

Comment by Thomas Stanley-Jones on December 14, 2016 at 9:58am

In astronomy we called it light pollution.  Interesting problem...

Comment by Gary McCray on December 14, 2016 at 11:57am

Another solid state Lidar,

There are at least two or three others in development out there: Osram and a couple of universities, as I recall.

Clearly they are starting to realize the critical importance of this technology for perceiving 3D space in an optimally machine useful way.

Right now it is oriented primarily to automobiles, but in the long term, it is one of the most critical elements for all of robotics and environment relative perception.

Lidar has one huge advantage over conventional camera (or multi camera) perception for determining 3D space and navigating in it.

It produces useful data directly rather than requiring considerable software interpretation before it can be used.

That provides a tremendous speed advantage while requiring considerably less computational resources.

Of course, ultimately Lidar (or flash TOF cameras) will be supplemented by stereoscopic camera systems to provide additional information and discrimination.

Can't wait for some of the second generation stuff (like Lightware's new small Laser rangefinder) to become available, let alone this new third generation stuff.

Best Regards,


Comment by Jack Crossfire on December 14, 2016 at 12:20pm

The press release was

Basically, they're integrating the emitter & receiver on a single 4mm x 4mm $50 chip. The rotating platform, slip ring, and optics are still required, so at least $3200 for a 64-channel unit, which is a lot less than the five figures for the current 64-channel unit. The sensor chip is within the pricing of existing laser rangefinders.
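A quick sketch of that cost arithmetic (the $50-per-channel figure is the press-release claim; treating it as a simple per-channel multiplier is my assumption):

```python
# Illustrative only: floor price of a 64-channel unit if each channel
# needs one of the claimed $50 transmit/receive chips.
CHIP_COST_USD = 50
CHANNELS = 64

print(CHANNELS * CHIP_COST_USD)  # → 3200
```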

Comment by Laser Developer on December 15, 2016 at 12:29am

@Nikola Rabchevsky, @Thomas Stanley-Jones - This is a very interesting problem indeed! There is a huge difference between the effects of light pollution in a LiDAR scenario compared with the situation found in astronomy. I suspect that this is something that the VC's investing in LiDAR for automotive applications haven't yet considered.

In astronomy, light pollution shows up during long exposure photography. Light from streetlights at ground level scatters off dust particles high in the air, producing a whitening of the normally dark background. This reduces the contrast of faint objects. The critical point here is that exposure times are on the order of seconds or longer, so any transient changes in the intensity of the background light are averaged out to produce the milky white background in the image - you can think of this as white noise with fairly uniform spatial distribution and low intensity.

The LiDAR situation works completely differently. The signals have a very short duration (equivalent to a very short exposure time) AND the background noise from other LiDAR systems also has a very short duration. In fact, there is no difference between the return signals from your LiDAR and the ones from the vehicle next to you. Unlike astronomy, the background noise doesn't show up as a general increase in the intensity of the background light. Instead it shows up as full strength signals that occupy the same temporal and spatial locations as the signals that you want to detect. In the image below, which signal on the yellow line is yours and which one is from the other LiDAR?
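The ambiguity can be put into rough numbers. A minimal sketch, with made-up distances and delays and the neighbour's path geometry simplified to a single round trip:

```python
C = 3.0e8  # speed of light, m/s

def range_from_tof(tof_s):
    """Convert a round-trip time of flight into a one-way distance."""
    return C * tof_s / 2.0

# Your own pulse, reflecting off a target 30 m away, returns after:
my_return = 2 * 30.0 / C                 # 200 ns

# A neighbouring LiDAR happens to fire 33 ns before you; if its light
# travels an (assumed) 70 m total path before reaching your detector,
# its pulse lands in almost the same detection window as yours:
other_return = 2 * 35.0 / C - 33e-9      # ~200.3 ns after YOUR shot

print(range_from_tof(my_return))     # 30.0 m  (real target)
print(range_from_tof(other_return))  # ~30.05 m (phantom target)
```

The receiver has no way to tell, from timing alone, which of those two pulses was its own.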

The first time I saw this effect in action I couldn't believe it. Logically, it seemed almost impossible that two signals could occupy the same "detection window" when coming from two different sources. Laser beams are relatively narrow and the time to take a measurement is very short, with a long gap between each measurement. So in order to get interference you would need to have the beams aiming exactly parallel towards exactly the same point on a target whilst firing at exactly the same time. Impossible, right? Wrong.

Firstly, the return signal doesn't come from the laser itself, it comes from the reflection off a target surface. This means that it can be detected from any direction; the laser beams don't need to be parallel. Secondly, the detectors in LiDAR systems have a field of view that is surprisingly wide, meaning that they can detect signals that don't come from directly ahead. Thirdly, adjacent laser systems will eventually fire at almost the same time.

It's the third item that floored me. How can totally independent laser sources fire at almost exactly the same time, to within nanoseconds of each other? It has to do with the precision of modern components. If you use a crystal as the timebase of your LiDAR system then it has a very precise frequency, to within a few parts per million or better. Two crystals in two different LiDARs will run at almost the same frequency, but not quite. This difference of a few parts per million starts to add up over time. So whilst initially the lasers might be firing at totally different times, the slow drift of the crystals gradually changes the relative firing moment until, surprise, surprise, the two LiDARs start to fire at almost the same time.

Of course, they don't stay in sync for long, but even a single overlap of return signals will give one or both LiDARs a false result. And the more precise the crystals, the longer the LiDARs stay locked together, so you can easily get many seconds of false results. To add insult to injury, the interfering signals look exactly like moving targets - you can figure this out for yourself ;).
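The crystal-drift argument above can be checked with a back-of-envelope calculation. A minimal sketch (the firing rate, ppm error, and window width are assumed illustrative values, not from any real unit):

```python
# Two LiDARs firing at nominally the same rate, with crystals that
# differ by a few parts per million. The offset between their pulse
# trains slips by a tiny, fixed amount every shot until they overlap.

PULSE_PERIOD = 10e-6   # assumed 100 kHz firing rate -> 10 us period
PPM_ERROR = 2e-6       # assumed 2 ppm difference between the crystals
WINDOW = 100e-9        # assumed detection window for a return pulse

drift_per_shot = PULSE_PERIOD * PPM_ERROR    # 20 ps of slip per shot

# Worst case: the trains start half a period apart and slide together.
initial_offset = PULSE_PERIOD / 2
shots_until_overlap = (initial_offset - WINDOW) / drift_per_shot
time_until_overlap = shots_until_overlap * PULSE_PERIOD

# While the offset stays inside the window, every shot can collide:
shots_in_collision = 2 * WINDOW / drift_per_shot
collision_time = shots_in_collision * PULSE_PERIOD

print(f"slip per shot:       {drift_per_shot * 1e12:.0f} ps")
print(f"time until overlap:  {time_until_overlap:.2f} s")   # ~2.45 s
print(f"duration of overlap: {collision_time:.2f} s")       # ~0.10 s
```

Note that better crystals (a smaller ppm error) shrink the slip per shot, so once the two units do line up, the overlap lasts proportionally longer - LD's "many seconds of false results".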

So what is so special about the LW20 laser module that is due to be launched by LightWare in Jan 2017? It encodes its laser pulses so that it can identify its own return signals against a background of numerous other laser sources. This is absolutely critical for applications where more than one LiDAR is used.

Comment by Rob_Lefebvre on December 15, 2016 at 5:51am

Wow, that's very interesting LD.

So, in the application for cars though, will it even be possible to encode all the signals so they don't interfere?  When you have hundreds of cars on the same road, all with very high frequency systems?  Can you do encoding on a signal when it's firing at the rate of 100,000 samples per second?

Comment by Laser Developer on December 15, 2016 at 6:32am

@Rob - The answer is yes to all of the above.

We originally designed the encoding system so that drones could have redundant laser altimeters aiming at the same patch of ground. We didn't realize how serious the interference problem was until we did flight testing and found crazy results in some of the data.

Once we started on the development (3 years ago) we realized that any solution would need to be extremely tolerant to signals from multiple sources, so we designed the encoding hardware to be fed from any standard encryption algorithm running in software. This allows for best practice to be applied on the encoding similar to secure Internet transactions.
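A toy sketch of this kind of pulse-train encoding (the slot counts, code length, and seeding below are all invented for illustration; this is not LightWare's actual scheme, which isn't published here - only the principle of matching a keyed firing pattern):

```python
import random

SLOTS = 64  # time slots in one measurement frame

def make_code(seed, n_pulses=8):
    # Per-unit pseudo-random firing pattern. In practice this could be
    # driven by any keyed generator - the comment above mentions feeding
    # the encoder from a standard encryption algorithm.
    rng = random.Random(seed)
    return sorted(rng.sample(range(SLOTS), n_pulses))

def correlate(events, code, shift):
    # Count how many of our expected pulse slots, delayed by `shift`,
    # actually contain a detected event.
    return sum((slot + shift) in events for slot in code)

mine = make_code(seed=1)
other = make_code(seed=2)
delay = 5  # true round-trip delay of OUR echoes, in slots

# The detector sees both echo trains mixed together:
events = {s + delay for s in mine} | {s + 3 for s in other}

# Scanning candidate delays, only the true one lines up our whole code;
# the other unit's pulses match only by chance at a few slots.
scores = {d: correlate(events, mine, d) for d in range(16)}
print(f"match at true delay: {scores[delay]} / {len(mine)} pulses")
```

The decoder accepts a delay only when (nearly) all of its own pattern lines up, which is what lets it reject full-strength pulses from other units occupying the same detection window.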

Comment by Cliff Dearden on December 15, 2016 at 9:42am

Fantastic work and explanation in your earlier article LD



© 2020   Created by Chris Anderson.