Today, Velodyne is announcing "a groundbreaking design for a solid-state lidar sensor that can deliver a subsystem cost of under $50 U.S. when sold in high-volume manufacturing scale."
Before:
After (?):
What you're seeing is the GaN integrated circuit that forms the heart of the solution that Velodyne is proposing. This is the chip that does the transmit function.
Comments
Hi LD,
Can't wait for your LW-20.
I would like to purchase a pair of them as soon as they are available.
Without case would be best.
Please let me know.
Best regards,
Gary
Fortunately, the modern units that I've seen have addressed the interference issue with a more sophisticated signal-processing approach: they use what amounts to an encryption and/or deliberately unique signature, which the receiver DSP then analyzes to confirm a legitimate signal and determine range, regardless of the interference source (sunlight, other lidars, etc.). This is opposed to the older, simple time-domain process, and it also gets rid of the expensive uber-clock requirement, which is why they're getting so cheap and light.
@LD thank you very much for sharing, really interesting reading
Fantastic work and explanation in your earlier article, LD
@Rob - The answer is yes to all of the above.
We originally designed the encoding system so that drones could have redundant laser altimeters aiming at the same patch of ground. We didn't realize how serious the interference problem was until we did flight testing and found crazy results in some of the data.
Once we started on the development (3 years ago) we realized that any solution would need to be extremely tolerant to signals from multiple sources, so we designed the encoding hardware to be fed from any standard encryption algorithm running in software. This allows for best practice to be applied on the encoding similar to secure Internet transactions.
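The idea LD describes can be sketched in a few lines of Python. This is only a toy illustration, not LightWare's actual scheme: each outgoing burst is marked with a pseudo-random chip code (which, as LD says, could be fed from any standard encryption algorithm), and the receiver cross-correlates the incoming trace against its own code, so only its own return produces a strong peak even when another unit's burst lands almost on top of it.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 31-chip pseudo-random code marking "our" pulses.
# In a real system the code source could be any standard crypto/PRNG.
code = rng.integers(0, 2, 31) * 2 - 1           # +/-1 chips

# Simulated received trace: our encoded burst at sample 500, an
# interfering burst (different code) at sample 505, plus noise.
trace = rng.normal(0, 0.2, 2000)
trace[500:500 + 31] += code                     # our return
other = rng.integers(0, 2, 31) * 2 - 1
trace[505:505 + 31] += other                    # someone else's return

# Correlating against our own code: only our burst gives a big peak.
corr = np.correlate(trace, code, mode="valid")
detected = int(np.argmax(corr))
print(detected)                                 # -> 500, our return's position
```

The interfering burst at sample 505 is just as strong as ours, but because its code doesn't match, its correlation stays down at the noise floor.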
Wow, that's very interesting LD.
So, in the application for cars though, will it even be possible to encode all the signals so they don't interfere? When you have hundreds of cars on the same road, all with very high frequency systems? Can you do encoding on a signal when it's firing at a rate of 100,000 samples per second?
@Nikola Rabchevsky, @Thomas Stanley-Jones - This is a very interesting problem indeed! There is a huge difference between the effects of light pollution in a LiDAR scenario compared with the situation found in astronomy. I suspect that this is something that the VC's investing in LiDAR for automotive applications haven't yet considered.
In astronomy, light pollution shows up during long exposure photography. Light from streetlights at ground level scatters off dust particles high in the air, producing a whitening of the normally dark background. This reduces the contrast of faint objects. The critical point here is that exposure times are on the order of seconds or longer, so any transient changes in the intensity of the background light are averaged out to produce the milky white background in the image - you can think of this as white noise with fairly uniform spatial distribution and low intensity.
The LiDAR situation works completely differently. The signals have a very short duration (equivalent to a very short exposure time) AND the background noise from other LiDAR systems also has a very short duration. In fact, there is no difference between the return signals from your LiDAR and the ones from the vehicle next to you. Unlike astronomy, the background noise doesn't show up as a general increase in the intensity of the background light. Instead it shows up as full strength signals that occupy the same temporal and spatial locations as the signals that you want to detect. In the image below, which signal on the yellow line is yours and which one is from the other LiDAR?
The first time I saw this effect in action I couldn't believe it. Logically, it seemed almost impossible that two signals could occupy the same "detection window" when coming from two different sources. Laser beams are relatively narrow and the time to take a measurement is very short, with a long gap between each measurement. So in order to get interference you would need to have the beams aiming exactly parallel towards exactly the same point on a target whilst firing at exactly the same time. Impossible right? Wrong.
Firstly, the return signal doesn't come from the laser itself, it comes from the reflection off a target surface. This means that it can be detected from any direction; the laser beams don't need to be parallel. Secondly, the detectors in LiDAR systems have a field of view that is surprisingly wide, meaning that they can detect signals that don't come from directly ahead. Thirdly, adjacent laser systems will, sooner or later, fire at the same time.
It's the third item that floored me. How can totally independent laser sources fire at almost exactly the same time, to within nanoseconds of each other? It has to do with the precision of modern components. If you use a crystal as the timebase of your LiDAR system then it has a very precise frequency, to within a few parts per million or better. Two crystals in two different LiDARs will run at almost the same frequency, but not quite. This difference of a few parts per million starts to add up over time. So whilst initially the lasers might be firing at totally different times, the slow drift of the crystals gradually changes the relative firing moment until, surprise, surprise, the two LiDARs start to fire at almost the same time.
Of course, they don't stay in sync for long, but even a single overlap of return signals will give one or both LiDARs a false result. And the more precise the crystals, the longer the LiDARs stay locked together, so you can easily get many seconds of false results. To add insult to injury, the interfering signals look exactly like moving targets - you can figure this out for yourself ;).
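LD's drift argument can be put to numbers. A minimal back-of-the-envelope sketch, using toy figures (100 kHz firing rate, 2 ppm crystal offset, 50 ns detection window) rather than anything from a real lidar:

```python
# Toy numbers, not from any real product: two lidars firing at ~100 kHz,
# crystals offset by 2 ppm, detection window 50 ns wide.
period_a = 10_000.0                   # ns between pulses, unit A
period_b = period_a * (1 + 2e-6)      # unit B runs 2 ppm slow

offset0 = 3_000.0                     # initial firing offset between units, ns
window = 50.0                         # detection window, ns

# Each pulse, the relative offset slips by the difference in periods.
slip_per_pulse = period_b - period_a              # ~0.02 ns per pulse

# How long until the 3 us offset drifts down to a collision?
pulses_to_collide = offset0 / slip_per_pulse      # ~150,000 pulses
seconds_to_collide = pulses_to_collide * period_a * 1e-9
print(seconds_to_collide)                         # ~1.5 s until collision

# Once aligned, how long do they stay inside the 50 ns window?
pulses_in_window = window / slip_per_pulse        # ~2,500 pulses
seconds_in_window = pulses_in_window * period_a * 1e-9
print(seconds_in_window)                          # ~0.025 s of false returns
```

So with these assumed figures the two units collide every second and a half, and each collision produces a couple of thousand consecutive corrupted measurements - exactly the "many seconds of false results" behaviour described above, scaled by how well the crystals match.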
So what is so special about the LW20 laser module that is due to be launched by LightWare in Jan 2017? It encodes its laser pulses so that it can identify its own return signals against a background of numerous other laser sources. This is absolutely critical for applications where more than one LiDAR is used.
The press release is here:
http://venturebeat.com/2016/12/13/velodyne-lidar-announces-breakthr...
Basically, they're integrating the emitter & receiver on a single 4mm x 4mm $50 chip. The rotating platform, slip ring, and optics are still required, so figure at least $3,200 for a 64-channel unit, which is a lot less than the five figures for the current 64-channel unit. The sensor chip is within the pricing of existing laser rangefinders.
Another solid-state lidar.
There are at least two or three others in development out there; Osram and a couple of universities, as I recall.
Clearly they are starting to realize the critical importance of this technology for perceiving 3D space in an optimally machine useful way.
Right now it is oriented primarily to automobiles, but in the long term, it is one of the most critical elements for all of robotics and environment relative perception.
Lidar has one huge advantage over conventional camera (or multi camera) perception for determining 3D space and navigating in it.
It produces useful data directly, rather than requiring considerable software interpretation before it can be used.
That provides a tremendous speed advantage while requiring considerably fewer computational resources.
Of course, ultimately lidar (or flash ToF cameras) will be supplemented by stereoscopic camera systems to provide additional information and discrimination.
Can't wait for some of the second generation stuff (like Lightware's new small Laser rangefinder) to become available, let alone this new third generation stuff.
Best Regards,
Gary
In astronomy we called it light pollution. Interesting problem...