A piece about LiDARs from our friends at EETimes
Cameras, Radars, LiDARs: Sensing the Road Ahead
May 16, 2023
Anne-Françoise Pelé
“We will see different generations of sensors and computing that will be more performant, enabling different levels of autonomy,” says Yole Intelligence analyst Pierrick Boulay.
Sensors have become prevalent in advanced driver-assistance (ADAS) and autonomous-driving systems to enhance passengers’ safety and protect vulnerable road users. A combination of cameras, radars and LiDARs is now used to scan the vehicle’s environment, process data in real time and respond more quickly and accurately than human drivers can.
To gain an objective picture of the current market trends in automotive cameras, radars and LiDARs, EE Times Europe consulted Pierrick Boulay, senior technology and market analyst in the Photonics and Sensing Division at Yole Intelligence, part of Yole Group. Pierre Cambou, principal analyst in the Photonics and Sensing Division at Yole Intelligence, also contributed to the analysis.
Yole analyst Pierrick Boulay
EE Times Europe: More sensors are being deployed in the vehicle to respond to safety issues proactively, but the sensor count can’t increase indefinitely. Will sensors be more specific in that we will see sensors for short-, mid- and long-range applications?
Pierrick Boulay: It is true that the number of sensors cannot increase indefinitely, for cost reasons but also for integration reasons. Nobody wants a car that looks like a robotaxi; each sensor must be perfectly integrated or hidden, as radars are. Cars already carry sensors dedicated to short-, mid- or long-range applications. Radars, for example, can be dedicated to short-/mid-range or to long-range applications; these are different radars, mounted at different locations on the car. LiDARs in cars today are mostly long-range, but in the future we also expect short-range LiDARs for use cases such as automated lane changes or inter-urban driving between highways, or even city driving, where the car's surroundings must be monitored perfectly to avoid any blind zones. We will see different generations of sensors and computing that will be more performant, enabling different levels of autonomy.
Cameras
EETE: According to Yole, there were 2.6 cameras per car produced in 2021 on average, and the number is projected to rise to 4.6 cameras per car by 2027. Advanced prototypes presented today are typically fitted with 11 to 12 cameras, and the roadmap extends beyond 20 cameras per car for the long-awaited consumer AVs. What are the technology and market trends in thermal, visible-imaging and night-vision cameras?
Boulay: Thermal cameras are used in only a few car models due to their high cost and low resolution. The primary use case is detecting pedestrians or animals on the road. Growth for such cameras can be expected only at a lower price point. Today, thermal cameras are still very expensive, but they are starting to be implemented in AVs such as those from Waymo, Cruise and Zoox. A few more years are probably needed before they become mainstream in ADAS.
For visible imagers, the trend is toward higher resolution. A few years ago, the forward ADAS camera had a resolution of 1.3 megapixels; we are now already at 8 megapixels, and resolution is expected to keep increasing. That is not the only trend. Frame rates are also rising, with 60 frames per second today, and some players are already pursuing higher rates. Dynamic range matters as well, both for operation in low-light conditions and for rapid transitions from low light to intense light, for example when exiting a tunnel. We can also highlight the development of cameras that ignore flickering light: traffic lights are made from LEDs, and the way LEDs are driven induces a flicker that could be misinterpreted by the camera's processing.
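The flicker problem Boulay describes comes down to timing: a pulse-driven LED is dark for most of each drive cycle, so a short exposure can land entirely in an off interval and the camera records an unlit traffic light. A minimal sketch of that effect, using assumed illustrative values (90 Hz drive, 10% duty cycle) rather than figures from the article:

```python
# Sketch: why short camera exposures can miss PWM-driven LEDs.
# The 90 Hz drive frequency and 10% duty cycle are illustrative
# assumptions, not values from the article.

def led_on_fraction(t0: float, exposure: float, period: float, duty: float,
                    steps: int = 10_000) -> float:
    """Fraction of the exposure window during which the LED is on."""
    on = 0
    for i in range(steps):
        t = t0 + exposure * i / steps
        if (t % period) < duty * period:   # LED is on early in each cycle
            on += 1
    return on / steps

period = 1 / 90   # assumed LED drive period (90 Hz)
duty = 0.10       # LED on for 10% of each cycle

# A 0.5 ms exposure that starts while the LED is off captures no light:
short_frac = led_on_fraction(t0=0.002, exposure=0.0005, period=period, duty=duty)

# An exposure spanning a full drive period always captures the on-time:
full_frac = led_on_fraction(t0=0.002, exposure=period, period=period, duty=duty)

print(short_frac)  # 0.0    -> frame shows a "dark" traffic light
print(full_frac)   # ~0.10  -> LED captured in every frame
```

This is why flicker-mitigation schemes effectively guarantee that sampling covers at least one full LED drive period, at the cost of dynamic-range handling in bright scenes.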
EETE: Visible cameras can fail to detect dangerous situations at night or in bad weather conditions, where thermal cameras are very efficient. Will we see them work increasingly in tandem?
Boulay: It is true that visible cameras are not good in low light or bad weather. Cameras in the far infrared (FIR) or short-wave infrared (SWIR) could solve this issue. But these cameras are still too expensive, and their performance needs to be improved to implement them in cars.
No single modality can solve the AV technology challenge. Cameras can improve in resolution, speed and dynamic range, but then the problem is downstream real-time computing power. New approaches, such as event-based cameras, could bring additional performance, but then developers must rethink their systems from the ground up. Regardless, cameras will have to work in tandem with 4D radars, LiDAR, and Inertial/GNSS units. One thing is certain: Advances in ADAS and AV technology will require more cameras and more diversified camera types.
Radar
EETE: According to Yole, the automotive radar platform market reached US$5.8 billion in 2021 and is expected to grow at 14% CAGR to US$12.8 billion by 2027. What are the main drivers for growth?
Boulay: The main drivers for radar implementation are linked to safety features, like AEB (autonomous emergency braking), that work in coordination with the forward ADAS camera. For many years, radars have been used for blind-spot detection at the rear of the car. More recently, a new driver for radar has been safety at complex intersections; to address it, OEMs are increasingly implementing front-corner radars. A final driver is in-cabin sensing, where radar will be used for child presence detection (CPD). Radar is the only technology that can detect a child even when the child is covered by a blanket.
EETE: Who is leading the way? Will giant tech companies change the game through massive investments? Are any challengers arising?
Boulay: The leading players are Continental, Bosch, Hella (now Forvia), Denso, Aptiv and Veoneer. These six players have ~80% of the market. New players are rising, like Arbe, Vayyar, Uhnder and Metawave. They are focusing on the development of imaging radars that can provide a better angular resolution and, therefore, a better perception of objects.
EETE: Are you seeing interesting technology synergies?
Boulay: In addition to integration, there has been a fundamental evolution in RF performance. Radars initially operated with analog beamforming and mechanical steering before moving to digital beamforming (full-scene illumination). In the digital-beamforming era, MIMO techniques were introduced, increasing the virtual aperture while keeping a reasonable physical size. Currently, MIMO scaling is offered by all players in the industry (from 0.2k to 2k pixels) to achieve sub-1° angular resolution for 4D imaging radar. Still, form factor and cost matter, so we expect a sweet spot to emerge. With that, the evolution of RF in radar is likely coming to an end, unless frequencies beyond 77 GHz are used. In next-generation radars, the focus will shift toward radar signal computing, which is why the industry has started investigating machine-learning and AI algorithms on radar signals.
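The link between MIMO scaling and angular resolution can be made concrete with a standard rule of thumb: a MIMO radar with N_tx transmitters and N_rx receivers forms N_tx × N_rx virtual channels, and a uniform linear array with half-wavelength spacing resolves roughly 2/N radians at broadside (θ ≈ λ/(N·d)). The antenna counts below are illustrative assumptions, not figures from any specific product:

```python
import math

# Rule-of-thumb broadside angular resolution for a lambda/2-spaced
# uniform linear MIMO array. Antenna counts are illustrative assumptions.

def angular_resolution_deg(n_tx: int, n_rx: int) -> float:
    """theta_res ~ lambda / (N * d) = 2 / N radians for d = lambda/2,
    where N = n_tx * n_rx is the MIMO virtual channel count."""
    n_virtual = n_tx * n_rx
    return math.degrees(2.0 / n_virtual)

# A classic 3Tx x 4Rx corner radar: 12 virtual channels
print(angular_resolution_deg(3, 4))    # ~9.5 degrees

# An imaging radar, e.g. 12Tx x 16Rx: 192 virtual channels
print(angular_resolution_deg(12, 16))  # ~0.6 degrees, i.e. sub-1 degree
```

This is the arithmetic behind the shift to imaging radar: sub-1° resolution needs on the order of a hundred-plus virtual channels, far beyond what a handful of physical antennas provides without MIMO.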
LiDAR
EETE: According to Yole Intelligence, the automotive LiDAR market for 2027 is expected to reach US $2.0 billion, up from US$38 million in 2021. What emerging trends do you see in LiDAR technologies?
Boulay: First of all, the types of components inside LiDARs are changing. For example, mechanical-scanning LiDARs that use EELs [edge-emitting lasers], APDs [avalanche photodiodes] and an FPGA for processing are progressively transitioning to VCSELs, SiPMs and ASICs. These changes improve the range, the resolution and, therefore, the data quality. Regarding LiDAR types, there is also a transition from hybrid solid-state LiDAR, with a mechanical moving part, to pure solid-state LiDAR with no mechanical moving parts. In the next five to 10 years, we also expect the emergence of optical-based LiDAR using the FMCW [frequency-modulated continuous wave] principle for ranging and optical scanning.
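The FMCW ranging principle Boulay mentions is simple to state: the returned chirp is delayed by the round trip 2R/c, so mixing it with the outgoing chirp yields a beat frequency f_b = 2·R·S/c, where S is the chirp slope, and range resolution is set by the swept bandwidth as ΔR = c/(2B). A minimal sketch, with illustrative chirp parameters assumed for the example:

```python
# Sketch of FMCW ranging: beat frequency f_b = 2*R*S/c, where S = B/T
# is the chirp slope. The chirp parameters below are assumed examples.

C = 3.0e8  # speed of light, m/s

def fmcw_range(beat_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    """Range recovered from the measured beat frequency: R = c*f_b / (2*S)."""
    slope = bandwidth_hz / chirp_s        # chirp slope S, in Hz/s
    return C * beat_hz / (2.0 * slope)

def range_resolution(bandwidth_hz: float) -> float:
    """Smallest resolvable range step: dR = c / (2*B)."""
    return C / (2.0 * bandwidth_hz)

# Example: a 1 GHz frequency sweep over a 10 microsecond chirp
bw, t = 1.0e9, 10e-6
print(fmcw_range(beat_hz=1.0e8, bandwidth_hz=bw, chirp_s=t))  # 150.0 m
print(range_resolution(bw))                                   # 0.15 m
```

Because the target information sits in a frequency rather than a pulse arrival time, FMCW LiDAR can also read out radial velocity from the Doppler shift, one reason it is seen as a next step beyond time-of-flight designs.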
EETE: Who is leading the way? Are Chinese players pushing?
Boulay: In the LiDAR market, few players are really able to deliver products to OEMs.
Valeo is a clear leader and has been in mass production since 2018 with Audi. Behind Valeo, we find Chinese companies, such as Hesai and Robosense. These three players dominate the market today, but others such as Innovusion, Continental and Huawei are also delivering LiDAR in lower volumes. Luminar and Innoviz are still not in mass production.
EETE: LiDAR was the dominant category in terms of financing rounds and the amount of capital raised in 2020 and 2021. How is it today?
Boulay: Investments in LiDAR companies totaled more than US$2.6 billion in 2021 but were more than 10 times lower in 2022. We think that investors are looking more carefully at where to invest. They want to invest in the right startup with technology that can be produced in high volume at low cost while meeting automotive requirements. At the same time, investors are looking for companies that can deliver LiDARs to OEMs and avoid the false promises and dreams sold by some LiDAR companies in the past.
EETE: Are we going to see consolidation in that space?
Boulay: Yes, clearly. If we look at the automotive camera and radar markets, four to five players represent 75% of the market. In the mid to long term, we expect similar dynamics in the LiDAR space. Consolidation among LiDAR manufacturers, as with Velodyne/Ouster, will continue, or Tier-1 companies will acquire LiDAR companies. Others will die or shift their focus, to industrial applications for example, to generate revenue more rapidly.
EETE: What are we still missing with perception sensors?
Boulay: There is a clear need to reduce the cost of LiDAR sensors to implement automated-driving features, but cost is also linked to volume. Today, volumes are too low, and higher volumes are essential to significantly reduce costs. More development will also be needed to enable automated driving in difficult weather conditions. Recently, Waymo published a video in which robotaxis drove autonomously, without a safety driver, in rainy conditions. So it is possible, but more work is needed from automotive players, on both the sensor hardware and the associated perception software.