What’s in Your Sensor Fusion Toolbox?

July 25, 2017


Fusion is the art of combining two (or more) disparate items and creating something that is greater than the sum of its parts. Like kimchi tacos. Or chocolate nachos. Or Chinese pizza. Or, in the case of advanced driver assistance systems (ADAS) and autonomous vehicles, radar + lidar + camera + GPS + gyroscope – that is, sensor fusion.

Between 2015 and 2025, the share of new vehicle platforms with sensor-fusion modules is projected to grow from 4 percent to 21 percent. Over the same period, the sensor-fusion-module market is expected to experience a 20-percent compound annual growth rate (CAGR) – one of the highest growth rates among automotive components.

Sensor fusion combines data to increase accuracy and decrease response time, thereby improving vehicle safety. There are several types of sensor fusion algorithms, including:

Complementary. In this scenario, each sensor’s data is incomplete; combining the sensors’ data provides a more comprehensive result. For example, fusing images from both front- and rear-facing cameras gives a better description of a car’s environment than either camera alone.

Competitive. In this scenario, several sensors independently measure the same quantity. This kind of redundancy is critical in the development of ADAS and autonomous vehicles.

Cooperative. This is the most complex scenario, in which several sensors’ data is combined to derive information that no individual sensor could provide. For example, “9-axis sensor fusion” integrates data from a 3-axis magnetometer (Earth’s magnetic field), a 3-axis accelerometer (linear acceleration), and a 3-axis gyroscope (angular rate). A minimal two-sensor sketch of this idea appears after this list.
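To make the idea concrete, here is a minimal complementary-filter sketch in C. It is an illustration only, not code from any particular product; the function name and signal parameters are hypothetical. A gyroscope integrates accurately over short intervals but drifts, while an accelerometer’s gravity-derived tilt is noisy but drift-free, so blending the two yields a pitch estimate better than either sensor provides alone.

    #include <math.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    /* Hypothetical helper: fuse a gyro rate with an accelerometer-derived
     * angle to estimate vehicle pitch. The gyro is accurate short-term but
     * drifts; the accelerometer is noisy but drift-free. */
    double fuse_pitch(double prev_pitch_deg, /* previous estimate (deg)  */
                      double gyro_rate_dps,  /* gyro pitch rate (deg/s)  */
                      double accel_x_g,      /* longitudinal accel (g)   */
                      double accel_z_g,      /* vertical accel (g)       */
                      double dt_s)           /* sample period (s)        */
    {
        const double alpha = 0.98; /* weight on the integrated gyro term */

        /* Short-term estimate: integrate the gyro rate. */
        double gyro_pitch = prev_pitch_deg + gyro_rate_dps * dt_s;

        /* Long-term reference: tilt angle from the gravity vector. */
        double accel_pitch = atan2(accel_x_g, accel_z_g) * 180.0 / M_PI;

        /* Blend: the gyro dominates at high frequency, while the
         * accelerometer corrects low-frequency drift. */
        return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch;
    }

The blend weight alpha sets the crossover point: values closer to 1.0 trust the gyro more between accelerometer corrections.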

As ADAS and autonomous vehicles become more sophisticated and safety regulations proliferate, the way embedded software handles the integration and fusion of sensor data is a growing concern among developers. Part of the solution lies in merging individual electronic control units (ECUs) into domain control units (DCUs) – complex printed-circuit-board (PCB) assemblies that gather and route sensor data. Robust sensors and high-performance, multi-core microprocessors designed for automotive applications are also required. But just as important as the underlying hardware are the software development tools used to build the embedded applications that transform raw sensor data into meaningful, actionable information.

Best practices – such as choosing an ASPICE-certified compiler, linker, and debugger, and making sure the development tools are tightly coupled with the hardware to maximize performance and safety – yield embedded software that can handle sensor-fusion tasks such as cross-traffic assist and autonomous obstacle avoidance. Certified libraries that are highly optimized for the target architecture make developing such solutions easier and faster. For example, the time-critical sections of ADAS code often involve complex array manipulations built on linear-algebra operations, and an optimized LAPACK library can deliver as much as an order of magnitude better performance than open-source offerings or in-house implementations.
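As a concrete illustration, the short C program below solves a small dense linear system through LAPACKE, the standard C interface to LAPACK. The article names no specific routine or library, so the choice of the dgesv solver, the matrix values, and the link flags are assumptions for this sketch; the point is that a reference LAPACK and an architecture-optimized one sit behind the same call.

    #include <stdio.h>
    #include <lapacke.h> /* standard C interface; link with -llapacke -llapack */

    int main(void)
    {
        /* A is a row-major 3x3 matrix; b holds the right-hand side and is
         * overwritten with the solution x on return. */
        double a[3 * 3] = { 4.0, 1.0, 2.0,
                            1.0, 5.0, 1.0,
                            2.0, 1.0, 6.0 };
        double b[3]     = { 7.0, 8.0, 9.0 };
        lapack_int ipiv[3]; /* pivot indices from the LU factorization */

        /* Solve A * x = b via LU factorization with partial pivoting. */
        lapack_int info = LAPACKE_dgesv(LAPACK_ROW_MAJOR, 3, 1, a, 3, ipiv, b, 1);
        if (info != 0) {
            fprintf(stderr, "dgesv failed: info = %d\n", (int)info);
            return 1;
        }
        printf("x = [%f, %f, %f]\n", b[0], b[1], b[2]);
        return 0;
    }

Because the interface is standardized, moving from a reference build to a vendor-optimized LAPACK is typically a link-time change rather than a source change.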

Tools for Success

By itself, each sensor in a car (such as radar, cameras, ultrasound, and lidar) has its limitations. By combining input from multiple sensors, individual systems complement one another and enable enhanced ADAS functions. As part of cutting-edge autonomous driving systems that make critical decisions on their own, sensor-fusion applications must be designed to meet the highest safety and security standards and must reliably deliver exceptional performance. Choosing the right hardware and software development tools leads to efficient development of robust sensor-fusion embedded software applications.

Learn more by reading the white paper, “Software Plays a Crucial Role in Sensor Fusion.”
