LiDAR Needs to Evolve for Self-Driving Cars

【Summary】The automotive industry has a heavy interest in LiDAR, with OEMs and suppliers racing to employ new, non-mechanical technologies for mass production of self-driving cars.

By Eric Walz    May 19, 2017 3:51 PM PT

Many autonomous-driving development plans rely on deploying a group of solid-state LiDAR sensors on each vehicle to scale for production. However, the LiDAR modules used for today's prototype vehicles are bulky, expensive, mechanical systems with moving parts.

Google, Uber, and GM all use self-driving test vehicles that are easily identifiable by the large spinning LiDAR system mounted to the roof. The automotive industry has a heavy interest in LiDAR technology, with OEMs and suppliers racing to employ new, non-mechanical technologies for mass production of self-driving cars.

What is LiDAR?

LiDAR, an acronym for "light detection and ranging," refers to a technique in which a light pulse is emitted and the distance to a target is determined from the pulse's round-trip time and the speed of light. LiDAR is an optical method for measuring distance and speed that is very similar to radar, except that laser pulses are used instead of radio waves.
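To make the time-of-flight arithmetic concrete, here is a minimal sketch in Python; the echo time is invented for illustration and the function name is ours, not part of any LiDAR vendor's software.

    # Toy time-of-flight calculation: distance to a target from a single LiDAR pulse.
    SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

    def distance_from_round_trip(round_trip_s: float) -> float:
        """The pulse travels out and back, so halve the total path length."""
        return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0

    # A pulse that returns after roughly 1.33 microseconds corresponds to about
    # 200 m, the range target mentioned later in this article.
    print(f"{distance_from_round_trip(1.334e-6):.1f} m")  # -> 200.0 m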

Interest in automotive applications stems from LiDAR's use of emitted laser light to measure the distance to objects, functioning much like radar. The laser lets the system provide high-resolution imagery in low-light environments, and in rain or snow.

[Image: How Delphi's LiDAR works]

"High-resolution LiDAR is a necessary technology for autonomous driving because its capabilities are available in all lighting and weather conditions," said Dean McConnell, Director of Customer Programs, Advanced Driver Assistance Systems, at Continental North America. "We're capturing images at 30 Hz, constructing 3D point clouds thirty times per second."

The technology also helps advanced driver assistance systems (ADAS) zero in on objects of interest. "LiDAR acts more like the human eye: it views a broad scene, doing a quick scan, then if it sees something interesting, it can focus in on that," said Chris Jacobs, General Manager of Automotive Safety for ADI.

Several small companies have developed solid-state LiDAR technologies that are not quite ready for automotive application, and some of these companies have been purchased by major automotive players in the past 18 months. Ford recently made a large investment in Velodyne, Continental acquired Advanced Scientific Concepts, ZF bought a 40% stake in Ibeo, and Analog Devices Inc. (ADI) acquired Vescent Photonics Inc. in 2016.

Solid-State LiDAR

LiDAR providers, including Velodyne, are currently working to develop compact solid-state modules, because the large mechanical "pucks" now used by autonomous-driving researchers are too bulky and costly to go into production vehicles. Researchers are striving to shrink the sensors while still offering the necessary range and field of view for self-driving cars. One such company, Quanergy, has recently introduced a small solid-state LiDAR.

[Image: The Quanergy S3 solid-state LiDAR]

"Our solid-state box measures 9 x 6 x 6 cm, about the size of two decks of cards," said Louay Eldada, Quanergy's CEO. "Currently, it has a 120-degree field of view, so with three you have 360 degree coverage. There will always be two in the front, on the right and left sides, and one in the back middle or one on each corner."

Developmental Challenges for Autonomous Driving

Range, the distance at which the vehicle can detect objects, is a key parameter for safety, and it can be increased by narrowing the field of view. Developers are trying to achieve the same distance levels as cameras and radar, with a goal of around 200 m (656 ft). To achieve suitable distance performance, several tradeoffs are being considered.

Mounting location is another key parameter that helps determine field-of-view requirements; modules looking to the sides, for example, do not require the same range capability as forward-facing units, so their field of view can be wider.
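To illustrate how placement and field of view interact, the sketch below (a hypothetical check, not any supplier's tool) tests whether a given set of mounting angles and fields of view leaves any blind spot around the vehicle; the three-sensor layout loosely follows the arrangement Eldada describes above.

    # Hypothetical coverage check: do the mounted sensors see all 360 degrees?
    def covers_full_circle(sensors, step_deg=1):
        """sensors: list of (mount_azimuth_deg, fov_deg), azimuth measured from
        straight ahead. True if every heading falls inside at least one FOV."""
        for heading in range(0, 360, step_deg):
            covered = any(
                abs((heading - azimuth + 180) % 360 - 180) <= fov / 2
                for azimuth, fov in sensors
            )
            if not covered:
                return False
        return True

    # Illustrative layout: two 120-degree units at the front corners, one at the rear.
    print(covers_full_circle([(60, 120), (-60, 120), (180, 120)]))  # True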

"We've demonstrated 70 meters (230 ft) with a 15-degree field of view, which is clearly not sufficient," said Aaron Jefferson, Director of Product Planning for ZF's Active and Passive Safety Division. "It needs to go up to 50 or 60 degrees to start. When the cost gets down, it's conceivable that they could be integrated into taillights and headlights."

Managing Large Amounts of LiDAR Data

LiDAR will complement cameras and radar, providing information that typically will be stitched together with data from other vehicle sensors to create a reliable image of the vehicle's surroundings. All of these sensors generate terabytes of data, making communication protocols and efficient data management an important consideration in the design process.


Efficiently processing the huge amount of data generated by an autonomous vehicle is a concern for self-driving car makers. 3D LiDAR sensors create a significant amount of data, but as with radar and digital cameras, cropping, resolution, and compression methods help minimize the data volume, discard useless or unimportant data, and extract detail from the data of interest. Additional techniques can be applied to filter the data and to group and identify objects, which determines how much data ultimately needs to be processed.
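As a rough sketch of the cropping and filtering described above, the snippet below uses plain NumPy on a randomly generated cloud that stands in for one LiDAR sweep: it drops returns outside a region of interest and then thins the remainder with a simple voxel grid. A production pipeline would be far more sophisticated, but the principle of shrinking the data before heavier processing is the same.

    import numpy as np

    # Synthetic stand-in for one LiDAR sweep: N x 3 array of x, y, z in metres.
    rng = np.random.default_rng(0)
    points = rng.uniform(low=[-100, -100, -2], high=[100, 100, 5], size=(100_000, 3))

    # 1. Crop: keep only returns inside a region of interest around the vehicle.
    roi = (np.abs(points[:, 0]) < 60) & (np.abs(points[:, 1]) < 30) & (points[:, 2] > -1.5)
    cropped = points[roi]

    # 2. Downsample: keep one point per 20 cm voxel to cut the data volume.
    voxel_ids = np.floor(cropped / 0.2).astype(np.int64)
    _, keep = np.unique(voxel_ids, axis=0, return_index=True)
    reduced = cropped[keep]

    print(f"{len(points)} raw -> {len(cropped)} in ROI -> {len(reduced)} after voxel filter")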

"Solid-state LiDAR will be in production later this year, but for pilots and software development, you don't need solid-state," Quanergy's CEO Eldada said. "Though we plan to ship solid state products in September, we won't have automotive-grade parts ready until a year later."

LiDAR to Supplement Existing Vehicle Sensors

Once LiDAR is in use, many developers don't expect it to displace other sensors. A range of technologies is needed to provide the capability and redundancy required to drive autonomously in all weather conditions.

"We do not see 3D LiDAR as a sensor replacement, but rather as an innovation that can enables the high resolution sensing needed to realize Level 4-plus automated driving," Aaron Jefferson said. "3D solid-state LiDAR, camera, radar, ultrasonic sensing and other technologies will continue to play a role—a combination of these will be necessary to properly sense the vehicle environment in 360 degrees, in real time."

Quanergy's Eldada disagrees. "Ultrasonics will go away," he said. "Video is needed for color, things like seeing traffic lights. Fusing LiDAR and cameras 'colorizes' our data so it's more valuable. Radar is needed for redundancy; you need another sensor before deciding to steer or hit the brakes."
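The "colorizing" Eldada mentions amounts to projecting each LiDAR point into the camera image and reading the pixel underneath it. The sketch below shows the idea with a pinhole camera model; the intrinsic matrix and the synthetic data are stand-ins, not values from any real sensor, and a real system would also need the extrinsic calibration between the two sensors.

    import numpy as np

    # Assumed intrinsics for a hypothetical 1280 x 720 camera (fx, fy, cx, cy).
    K = np.array([[1000.0,    0.0, 640.0],
                  [   0.0, 1000.0, 360.0],
                  [   0.0,    0.0,   1.0]])

    def colorize(points_cam, image):
        """Attach an RGB color to each LiDAR point already expressed in the camera
        frame (x right, y down, z forward). Points behind the camera or outside
        the image get NaN instead of a color."""
        h, w, _ = image.shape
        colors = np.full((len(points_cam), 3), np.nan)
        z = points_cam[:, 2]
        in_front = z > 0.1
        proj = (K @ points_cam.T).T                      # rows are (u*z, v*z, z)
        u = np.zeros(len(points_cam), dtype=int)
        v = np.zeros(len(points_cam), dtype=int)
        u[in_front] = (proj[in_front, 0] / z[in_front]).astype(int)
        v[in_front] = (proj[in_front, 1] / z[in_front]).astype(int)
        valid = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
        colors[valid] = image[v[valid], u[valid]]
        return colors

    # Synthetic demo: random points in front of the camera, random image.
    pts = np.random.uniform([-5, -2, 2], [5, 2, 40], size=(1000, 3))
    img = np.random.randint(0, 256, size=(720, 1280, 3), dtype=np.uint8)
    print(colorize(pts, img)[:3])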
