How to achieve the best Sensor Fusion for ADAS/AV vehicles?

Vehicles equipped with Advanced Driver Assistance Systems (ADAS) and Autonomous Vehicles (AVs) use different sensors such as cameras, radars, lidars, and ultrasonic sensors. These sensors act as the eyes of the vehicle and help it perceive its surroundings, i.e., people, objects, traffic, road geometry, weather, etc. This perception is critical because it ensures that the AV can make the right decisions, i.e., stop, accelerate, turn, etc.

As we reach higher levels of autonomy in autonomous driving, the complexity increases substantially, and multiple sensors are required to understand the environment correctly. But every sensor is different and has its limitations: a camera works very well for lane detection or object classification, whereas a radar provides better data for long-range detection or in varying light conditions.

Capabilities                                     Camera    Radar    Lidar
Long-range detection                             Average   Good     Average
Different light conditions                       Average   Good     Good
Different weather conditions (light rain, fog)   Poor      Good     Poor
Object classification                            Good      Poor     Good
Stationary object detection                      Good      Poor     Good

Table 1 – Sensor capabilities

Using sensor fusion techniques, data from multiple sensors is fused in autonomous vehicles to provide the best possible input so that the AV makes the correct decisions (brake, accelerate, turn, etc.).
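As a simple illustration of what fusion means at the measurement level, the sketch below combines a camera estimate and a radar estimate of the same object's longitudinal distance using inverse-variance weighting, so the more certain sensor contributes more to the fused value. The variance figures are illustrative assumptions, not values from the article.

```python
# Minimal sketch of measurement-level fusion: two sensors measure the same
# quantity (longitudinal distance to an object) with different uncertainty,
# and the fused estimate weights each by the inverse of its variance.
# The variance values are illustrative assumptions.

def fuse_two_estimates(z_cam, var_cam, z_radar, var_radar):
    """Inverse-variance weighted fusion of two scalar measurements."""
    w_cam = 1.0 / var_cam
    w_radar = 1.0 / var_radar
    fused = (w_cam * z_cam + w_radar * z_radar) / (w_cam + w_radar)
    fused_var = 1.0 / (w_cam + w_radar)
    return fused, fused_var

# Camera is less certain at long range; radar is more certain here.
fused_dist, fused_var = fuse_two_estimates(z_cam=52.0, var_cam=4.0,
                                           z_radar=50.5, var_radar=0.25)
print(f"fused distance: {fused_dist:.2f} m, variance: {fused_var:.3f}")
```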

Sensor fusion improves the overall perception capability of an autonomous vehicle. There are multiple fusion techniques, and which one to use depends on the feature's Operational Design Domain (ODD). Some examples of the different types of fusion techniques are mentioned in the original article.

The complexity of the environment and the feature specifications drive what kind of fusion strategy and what type of fusion requirements need to be worked out. Executing sensor fusion is a complex activity, and one should account for numerous challenges; the article describes the eight key challenges faced during sensor fusion.

To ensure high levels of accuracy with sensor fusion, one must define the right KPIs and use a robust design. Key aspects include:

  • Ensuring that all high-level KPIs are in place
  • Achieving higher accuracy through design
  • Data synchronization
  • Coordinate transformation
  • Validation gating and data association (a minimal sketch follows this list)
  • State estimation and prediction
  • Track management
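
To make the gating and state-estimation steps concrete, here is a minimal sketch assuming a constant-velocity motion model, a 2-D position measurement, and illustrative noise values. It runs one predict/gate/update cycle with a Kalman filter and a chi-square validation gate on the Mahalanobis distance; it illustrates the general technique, not the article's specific design.

```python
import numpy as np

# Constant-velocity state: [x, y, vx, vy]; measurement: [x, y].
# All matrices and noise values below are illustrative assumptions.
DT = 0.05  # sensor cycle time in seconds (assumed)

F = np.array([[1, 0, DT, 0],
              [0, 1, 0, DT],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)          # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)          # measurement model
Q = np.eye(4) * 0.1                                # process noise (assumed)
R = np.eye(2) * 0.5                                # measurement noise (assumed)
GATE_THRESHOLD = 9.21                              # chi-square, 2 dof, ~99%

def predict(x, P):
    """Kalman predict step: propagate state and covariance one cycle."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def gate_and_update(x_pred, P_pred, z):
    """Validation gating followed by a Kalman update if the detection passes."""
    y = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    d2 = float(y.T @ np.linalg.inv(S) @ y)  # squared Mahalanobis distance
    if d2 > GATE_THRESHOLD:
        return x_pred, P_pred, False        # detection rejected by the gate
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new, True

# Example: one predict/gate/update cycle for a single track.
x = np.array([10.0, 2.0, 5.0, 0.0])    # current track state
P = np.eye(4)                           # current track covariance
z = np.array([10.3, 2.1])               # new detection in vehicle coordinates

x, P = predict(x, P)
x, P, associated = gate_and_update(x, P, z)
print("associated:", associated, "state:", x)
```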

The article also covers some key aspects of design: selection of algorithms for data association and estimation, fusion strategy, track management, and filter tuning.
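
As one hedged example of track management, the sketch below uses a simple hit/miss counter scheme to confirm new tracks and delete stale ones. The thresholds and the Track structure are assumptions chosen for illustration, not the article's specific design.

```python
from dataclasses import dataclass

# Illustrative thresholds (assumed): confirm after 3 consecutive hits,
# delete after 5 consecutive misses.
CONFIRM_HITS = 3
DELETE_MISSES = 5

@dataclass
class Track:
    track_id: int
    state: list                 # e.g. [x, y, vx, vy] from the state estimator
    hits: int = 0
    misses: int = 0
    confirmed: bool = False

def update_track_status(track: Track, associated: bool) -> bool:
    """Update a track's lifecycle after one fusion cycle.

    Returns False if the track should be deleted."""
    if associated:
        track.hits += 1
        track.misses = 0
        if track.hits >= CONFIRM_HITS:
            track.confirmed = True  # report downstream only once confirmed
    else:
        track.misses += 1
        if track.misses >= DELETE_MISSES:
            return False            # drop tracks that are no longer observed
    return True

# Example: keep only tracks that survive this cycle.
tracks = [Track(1, [10.0, 2.0, 5.0, 0.0]), Track(2, [30.0, -1.0, 0.0, 0.0])]
association_result = {1: True, 2: False}   # from the gating/association step
tracks = [t for t in tracks
          if update_track_status(t, association_result[t.track_id])]
```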

It is also essential to validate fusion to ensure software quality and scenario coverage. However, validation can involve millions of scenarios, so it is important to use the right validation strategy to test fusion with simulated data, real-world data, or vehicle testing.

Fusion is thus a critical perception component for AD performance. One must consider the key practical aspects and define the right strategy for design and validation to achieve the highest AD software maturity level.

At KPIT, we have been working on fusion for the past seven years. We have invested in and developed a robust design and validation strategy to ensure the highest level of software quality. The design takes care of variant handling for multi-sensor fusion (Radar+Radar, Camera+Radar, Camera+Radar+Lidar), different sensor topologies and layouts, sensor characteristics, and sensor degradation. We have filed multiple patents for various fusion techniques and have been supporting multiple OEMs and Tier-1 customers on fusion projects across different model year programs.


