By Mark Crawford
Autonomous vehicles (AVs) in development today rely on a suite of cameras, radar, and other sensors to automate features such as cruise control, emergency braking, and blind spot monitoring. Yet what cars equipped with these advanced technologies can “see” is still limited in range and depth.
Lidar, however, is a next-generation technology that increases an AV’s ability to see by providing high-precision 3D images of the vehicle’s surroundings, improving safety and shortening response time. Lidar technology continues to evolve: sensors can now detect objects up to 400 meters away, distinguish colors and reflectivity, and operate better in low-light conditions. Improvements also continue in optical phased array (OPA) technology, which is 100% solid-state lidar and therefore resistant to vibration, providing “rich data for object detection, tracking, and classification, and the support of a variety of functions including truck platooning, ADAS [advanced driver-assistance systems] functions, autonomous valet parking, and self-navigation in geo-fenced areas,” says Tianyue Yu, chief development officer for Quanergy.
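For readers curious how a lidar “image” actually comes together, the sketch below shows, under simplified assumptions, how a sensor’s raw range-and-angle returns can be converted into a 3D point cloud. The beam angles, ranges, and the function name are illustrative, not taken from any particular sensor’s interface.

```python
import numpy as np

def returns_to_point_cloud(ranges_m, azimuth_deg, elevation_deg):
    """Convert raw lidar returns (range plus beam angles) into 3D points.

    Each return is a time-of-flight distance along a known beam direction;
    converting those measurements to Cartesian coordinates yields the 3D
    "image" of the surroundings described in the article. Inputs are
    illustrative NumPy arrays of equal length.
    """
    az = np.radians(azimuth_deg)    # horizontal beam angle
    el = np.radians(elevation_deg)  # vertical beam angle
    x = ranges_m * np.cos(el) * np.cos(az)  # forward
    y = ranges_m * np.cos(el) * np.sin(az)  # left/right
    z = ranges_m * np.sin(el)               # up/down
    return np.column_stack((x, y, z))

# Example: three hypothetical returns, one near the 400-meter range
# figure cited in the article.
points = returns_to_point_cloud(
    ranges_m=np.array([12.5, 80.0, 395.0]),
    azimuth_deg=np.array([-15.0, 0.0, 2.0]),
    elevation_deg=np.array([-1.0, 0.5, 0.0]),
)
print(points)  # one (x, y, z) row per return, in meters
```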
Current AV-Lidar Market
The market for automotive lidar is strong, both in technology R&D and in customer interest. With its high-resolution 3D imaging, lidar is vital to the future of transportation. Many automakers have concluded that camera- and radar-based perception is not sufficient for full autonomy; they also need the ground-truth reliability that lidar provides.
According to Tracxn, a market intelligence firm, there are about 65 autonomous vehicle lidar companies around the world. Below are six leaders that are doing advanced R&D in this field:
- Ouster: Lidar sensors for long, mid, and short-range applications, as well as an ultra-wide lidar sensor with a built-in inertial measurement unit (IMU)
- Velodyne: Lidar-based directional, close, and surround sensors for automotive, smart city, and robotic applications
- Luminar: Lidar combining its hardware and proprietary software, including slim, customized sensors designed to mount along the vehicle roofline
- Quanergy: High-performance mechanical lidar sensors enable a variety of mapping, security, and smart city and smart space applications
- Aeye: Solid-state lidar with ADAS functionality, plus AI-enabled perception software with features like real-time pixel integration
- Sense Photonics: Proprietary laser arrays, interference mitigation, and 3D depth-sensing technologies used to build flash lidars that illuminate an entire scene at once
Lidar is a critical sensing technology for automating the safe and efficient transportation of people and goods. “Lidar sensors are already being adopted in on- and off-road vehicles, from consumer vehicles and autonomous shuttles to autonomous long-haul trucks and mining vehicles,” states Heather Shapiro, director of communications for Ouster. “Lidar powers everything from drones and farm equipment to last-mile delivery robots.”
Matt Weed, senior director of product management for Luminar, believes the first large-scale implementation of on-road autonomy will likely be for highway driving. “Enabling ‘eyes-off, hands-off’ highway autonomy on production vehicles across consumer cars and commercial trucking represents the biggest near-term market opportunity for autonomous technology,” he says.
The new infrastructure bill recently signed into law by the Biden administration, which includes billions of dollars for roads, broadband internet, and electric utilities, will push lidar technology further to the forefront of the AV market. The bill includes $15 billion for electric vehicles, including a $7.5 billion investment in charging stations.
Several safety improvement measures in the bill will also benefit from lidar, including the Safe Streets and Roads for All Grant Program, which will support new vehicle or transportation-related technologies that reduce or eliminate roadway fatalities and injuries.
Latest Developments
To achieve widespread adoption, lidar sensors must be high-performing, compact, power-efficient, and affordable, and the technology is increasingly meeting those requirements, broadening the possibilities for adoption. By building its digital lidar on silicon CMOS chips, Ouster can improve performance and drive down costs through a simplified digital system architecture. “The ability to drive greater performance at a lower cost will open up new opportunities for additional applications to leverage lidar technology, to the point where digital lidar technology becomes as commonplace as the digital camera,” says Shapiro.
Luminar is working to industrialize its first series production lidar sensor. Building on the strengths of its automotive-grade Iris Lidar, the company is developing an integrated hardware and software solution that enables a proactive approach to safety. This proactive safety system “surpasses traditional ADAS capabilities in efficacy and speed because it provides the vehicle with higher-confidence detections of all the critical risk cases faster and farther away than camera and radar,” says Weed.
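To see why detecting hazards “faster and farther away” matters, the short calculation below compares the time budget available to a vehicle at highway speed for two hypothetical detection ranges. The specific ranges and speed are assumptions chosen for illustration, not Luminar specifications.

```python
def reaction_budget_s(detection_range_m: float, speed_m_s: float) -> float:
    """Seconds between first detection and reaching the object,
    ignoring braking distance and processing latency for simplicity."""
    return detection_range_m / speed_m_s

highway_speed = 30.0  # about 108 km/h, an assumed highway speed
for sensor, rng in [("short-range perception", 80.0),   # assumed range
                    ("long-range lidar", 250.0)]:        # assumed range
    print(f"{sensor}: {reaction_budget_s(rng, highway_speed):.1f} s to react")
# Roughly 2.7 s versus 8.3 s under these assumed ranges; the extra
# margin is what makes a proactive, rather than reactive, response possible.
```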
Outside the vehicle itself, infrastructure-mounted lidar sensors, like those in Velodyne’s Intelligent Infrastructure Solution (IIS), can identify issues at an intersection and communicate them back to the vehicle well ahead of its arrival. Existing technology does not allow for near-miss detection of all types of road users. Velodyne’s sensors, however, can analyze how close a particular pedestrian comes to being hit by a vehicle, and can therefore better analyze usage patterns and answer questions about intersection safety.
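As a rough illustration of how near-miss analytics of this kind can work, the sketch below flags an encounter whenever the tracked positions of a vehicle and a pedestrian come within a chosen distance threshold at the same timestamp. The track format, function name, and 2-meter threshold are assumptions for the example, not details of Velodyne’s IIS.

```python
from math import hypot

def detect_near_misses(vehicle_track, pedestrian_track, threshold_m=2.0):
    """Flag timestamps where a vehicle and a pedestrian pass dangerously close.

    Each track is a list of (t, x, y) tuples sampled at the same timestamps,
    e.g. object centroids extracted from an intersection-mounted lidar.
    """
    events = []
    for (t, vx, vy), (_, px, py) in zip(vehicle_track, pedestrian_track):
        gap = hypot(vx - px, vy - py)  # straight-line separation in meters
        if gap < threshold_m:
            events.append((t, round(gap, 2)))
    return events

# Hypothetical tracks: the pedestrian steps into the crosswalk just as the
# vehicle passes at t = 2 s.
vehicle    = [(0, 0.0, 0.0), (1, 8.0, 0.0), (2, 16.0, 0.0), (3, 24.0, 0.0)]
pedestrian = [(0, 16.0, 6.0), (1, 16.0, 4.0), (2, 16.0, 1.5), (3, 16.0, -0.5)]
print(detect_near_misses(vehicle, pedestrian))  # [(2, 1.5)]
```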
Moving Forward
Even though lidar technology for AVs is still early on the adoption curve, more automotive companies are starting to utilize lidar in their semi- or fully autonomous systems. “While some consumer vehicles use lidar today, we expect adoption to ramp up in 2025, in line with SAE [Society of Automotive Engineers] Level 3 ADAS programs,” says Shapiro.
Volvo, for one, has decided to utilize Luminar’s technology on its next-generation electric SUV. “We were also recently designed into Nvidia’s autonomous driving reference design, which paves the way for future automotive design projects,” says Weed. “Our Iris sensor is on track for series production with partners including Volvo, SAIC, and Polestar in late 2022.”
Ouster lidar sensors are being integrated into traffic intersections, helping to make crosswalks safer and reduce roadway congestion and highway accidents. They are also mounted on drones to monitor critical infrastructure such as bridges, railways, and power lines to prevent hazardous incidents. “Our sensors are also powering semi-autonomous freight trucks moving goods across the country, as well as sidewalk delivery robots bringing goods to your doorstep,” Shapiro adds.
Lidar technology can also aid in solving infrastructure problems, improving traffic and crowd flow efficiency, advancing sustainability, and protecting vulnerable road users.
This type of advanced technology will eventually be integrated into smart city infrastructure. For example, Velodyne’s IIS initiative uses lidar and best-in-class AI to create real-time 3D maps of roads and intersections. IIS improves traffic safety, providing near-miss analytics that can be used to predict, diagnose, and address road safety challenges before the next collision happens. Today’s camera-based solutions require several cameras per intersection or monitored public area and typically need longer processing times to produce a final analysis. Velodyne offers a real-time solution for all road users with only a single sensor required for an entire intersection.
Other smart city applications for lidar technology include using extracted road-user trajectory data around intersections to predict potential collisions. Lidar can detect collisions and near-miss incidents in real time, providing data to emergency response services for faster dispatch in both urban and rural environments. IIS also delivers reliable real-time traffic data to optimize traffic light timing based on congestion and throughput in all types of weather and lighting conditions, and it covers all road users, including vulnerable pedestrians and cyclists, whereas current technologies provide data for vehicles only. Other sensors are designed for autonomous applications in sidewalk, commercial, and industrial settings and can even enable robots to safely navigate crowded urban areas and corridors for delivery and security applications.
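As a simplified sketch of the kind of signal-timing optimization described above, the example below splits a fixed signal cycle’s green time across intersection approaches in proportion to the vehicle counts a sensor might report. The counts, cycle length, minimum green time, and function name are all assumed values for illustration, not part of any Velodyne product.

```python
def allocate_green_time(counts, cycle_s=90.0, min_green_s=10.0):
    """Split available green time across approaches in proportion to demand.

    counts: mapping of approach name -> vehicles observed in the last cycle.
    Every approach gets at least min_green_s; the remainder is shared
    according to relative demand.
    """
    spare = cycle_s - min_green_s * len(counts)
    total = sum(counts.values()) or 1  # avoid dividing by zero on empty roads
    return {
        approach: round(min_green_s + spare * n / total, 1)
        for approach, n in counts.items()
    }

# Hypothetical per-approach counts reported by an intersection sensor.
demand = {"northbound": 42, "southbound": 38, "eastbound": 12, "westbound": 8}
print(allocate_green_time(demand))
# {'northbound': 31.0, 'southbound': 29.0, 'eastbound': 16.0, 'westbound': 14.0}
```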