A key component of many autonomous driving systems is lidar (a portmanteau of "light" and "radar"), which bounces light, usually ultraviolet, visible, or near-infrared, off objects to map them digitally in three dimensions. But while lidar systems are good at identifying potential obstacles, they don't always spot them quickly enough. At 70 miles per hour, for instance, detecting an object 60 meters ahead does little good if the car needs roughly 100 meters to come to a stop. Post-processing the sensor data introduces yet another delay.
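To put rough numbers on that claim, here is a back-of-the-envelope stopping-distance calculation. It is illustrative only: the reaction time and braking deceleration below are assumed values for dry pavement, not figures from AEye or the article.

```python
# Back-of-the-envelope stopping distance at highway speed.
# Assumed values (not from the article): 1.5 s reaction time and
# 7 m/s^2 braking deceleration, plausible for dry pavement.

MPH_TO_MPS = 0.44704

def stopping_distance(speed_mph: float,
                      reaction_time_s: float = 1.5,
                      decel_mps2: float = 7.0) -> float:
    """Distance traveled during reaction plus braking, in meters."""
    v = speed_mph * MPH_TO_MPS           # convert to meters per second
    reaction = v * reaction_time_s       # distance covered before brakes engage
    braking = v ** 2 / (2 * decel_mps2)  # v^2 / (2a), from basic kinematics
    return reaction + braking

if __name__ == "__main__":
    d = stopping_distance(70)
    print(f"~{d:.0f} m to stop from 70 mph")  # roughly 115-120 m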
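Under these assumptions the car travels well over 100 meters before stopping, which is the article's point: a 60-meter detection range leaves essentially no margin.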

That is why the new sensor from startup AEye, the iDAR, is built for speed first and foremost. The intelligent detection and ranging sensor fuses 360-degree depth data, captured at ranges of up to 300 meters, with camera imagery to create a dynamic, manipulable point cloud that AEye calls True Color Lidar. When the sensor, which will cost less than $3,000, begins shipping to original equipment manufacturers in July, it will be the first on the market to mechanically combine lidar and camera data at the hardware level, AEye says.
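AEye has not published how iDAR performs this fusion internally. As a rough sketch of the general idea behind a colorized point cloud, the snippet below projects lidar points into a pinhole camera model and attaches the RGB value of the pixel each point lands on; the function name, camera model, and calibration inputs are all assumptions for illustration, not AEye's method.

```python
import numpy as np

def colorize_points(points_lidar: np.ndarray,  # (N, 3) xyz in the lidar frame
                    image: np.ndarray,         # (H, W, 3) RGB camera frame
                    K: np.ndarray,             # (3, 3) camera intrinsics
                    T_cam_lidar: np.ndarray    # (4, 4) lidar-to-camera extrinsics
                    ) -> np.ndarray:
    """Return an (M, 6) array of [x, y, z, r, g, b] for points the camera sees."""
    # Transform points from the lidar frame into the camera frame.
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera.
    in_front = pts_cam[:, 2] > 0
    pts_cam = pts_cam[in_front]
    kept = points_lidar[in_front]

    # Pinhole projection into pixel coordinates.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)

    # Discard points that project outside the image bounds.
    h, w = image.shape[:2]
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)

    colors = image[v[valid], u[valid]]  # sample one RGB value per point
    return np.hstack([kept[valid], colors])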
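In most systems today this kind of fusion happens in software after each sensor has produced its output; AEye's claim is that iDAR does the equivalent at the hardware level, which is where the latency savings would come from.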

“There is an ongoing argument about whether camera-based vision systems or lidar-based sensor systems are better,” said Luis Dussan, founder and CEO of AEye. “Our answer is that both are required.”

The research has been spearheaded by Dr. Allan Steinhardt, whose impressive career includes stints at DARPA, where he conducted research on space-based, terrestrial, and naval radar systems, and at Booz Allen Hamilton, where he served as a vice president and chief scientist. Steinhardt also headed radar search at MIT's Lincoln Laboratory.
