Sensor fusion combines data from multiple sensor technologies in a single application. Examples include location and object detection systems that merge cameras, LiDAR, ultrasonic, and radar sensors, and electric vehicle (EV) battery packs that combine distributed voltage, temperature, and current sensors. Sensor fusion can improve system performance, but it also adds to energy consumption.
This FAQ looks at common uses of sensor fusion in EVs and autonomous vehicles and closes by considering the impact of sensor fusion on vehicle energy consumption.
As seen in Figure 1, a wide range of sensors can be found in an EV battery management system (BMS), including temperature, potential (voltage), current, gas, and pressure sensors. In various combinations, the data from those sensors can be used to improve estimates of state of charge (SOC), state of power (SOP), state of energy (SOE), and state of health (SOH).
This information supports improved thermal management and cell balancing, optimization of charging and discharging, preventative maintenance, and enhanced safety.
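As an illustration of how these measurements might be grouped before estimation, here is a minimal sketch of a fused telemetry record. The structure and field names are hypothetical, not drawn from any specific BMS:

```python
from dataclasses import dataclass

# Illustrative snapshot of fused BMS measurements; the structure and
# field names are hypothetical, not taken from any specific BMS.
@dataclass
class PackTelemetry:
    cell_voltages_v: list[float]   # per-cell potential (voltage) sensors
    cell_temps_c: list[float]      # distributed temperature sensors
    pack_current_a: float          # pack current sensor
    pack_pressure_kpa: float       # pack pressure sensor
    gas_ppm: float                 # gas sensor (e.g., venting detection)
```

Downstream SOC, SOP, SOE, and SOH estimators would consume a stream of such records rather than any single sensor in isolation.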
Examples of the application of sensor fusion in the BMS include:
Improved SOC estimation. Combining voltage and current measurements for individual cells and modules with temperature data enables more comprehensive and accurate SOC estimation algorithms (a minimal estimation sketch follows this list).
Improved thermal management. Merging and analyzing data from multiple temperature sensors physically spread throughout the battery pack enables the identification of dynamic temperature changes and supports improved thermal management.
Optimizing charging and discharging. Fusing per-cell measurements lets the BMS monitor the performance of individual cells and support cell balancing by optimizing charging and discharging (a balancing sketch also follows this list). Preventing cell imbalances is essential for maximizing cell and battery pack lifetimes and minimizing safety concerns.
Enabling predictive and preventative maintenance. Improved SOH estimates for cells and battery modules allow the BMS to predict SOE and SOP performance capabilities and identify needs for preventative maintenance.
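The sketch below shows one way voltage, current, and temperature data can be fused for SOC estimation: coulomb counting with a temperature-corrected capacity, blended with a voltage-based estimate. The capacity derating and open-circuit voltage (OCV) curve are illustrative placeholders, not data from any real cell, and production systems typically use Kalman-filter variants rather than this simple blend.

```python
def capacity_ah(temp_c: float, nominal_ah: float = 50.0) -> float:
    """Usable capacity shrinks at low temperature (rough linear derating)."""
    derate = 1.0 - max(0.0, 25.0 - temp_c) * 0.005
    return nominal_ah * derate

def soc_from_ocv(voltage_v: float) -> float:
    """Crude inverse OCV curve for a Li-ion cell (placeholder values)."""
    return min(1.0, max(0.0, (voltage_v - 3.0) / (4.2 - 3.0)))

def update_soc(soc: float, current_a: float, dt_s: float,
               voltage_v: float, temp_c: float, alpha: float = 0.98) -> float:
    """One fusion step: coulomb counting blended with a voltage estimate."""
    # Coulomb counting with temperature-corrected capacity (discharge > 0).
    soc_cc = soc - current_a * dt_s / 3600.0 / capacity_ah(temp_c)
    # Blend toward the voltage-based estimate; in practice the OCV
    # correction is only trusted when the cell is near rest.
    fused = alpha * soc_cc + (1.0 - alpha) * soc_from_ocv(voltage_v)
    return min(1.0, max(0.0, fused))
```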
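Cell balancing decisions can then be driven by the fused per-cell SOC estimates. This fragment, using an illustrative 2% threshold, flags cells that drift above the pack mean:

```python
def cells_to_balance(cell_socs: list[float], threshold: float = 0.02) -> list[int]:
    """Return indices of cells whose SOC exceeds the pack mean by > threshold."""
    mean_soc = sum(cell_socs) / len(cell_socs)
    return [i for i, soc in enumerate(cell_socs) if soc - mean_soc > threshold]

# Example: cell 2 sits ~3% above the pack mean, so it would be bled down
# (passive balancing) or used to charge its neighbors (active balancing).
print(cells_to_balance([0.80, 0.80, 0.84, 0.79]))  # -> [2]
```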
Sensor fusion and perception
Merging data from cameras, LiDAR, and radar can improve an autonomous vehicle's perception and understanding of its surroundings. Basic perception models use the sensors separately and can't leverage the power of sensor fusion.
Perception applications can be implemented based on raw data fusion or object-level fusion (Figure 2):
- Raw data sensor fusion feeds the raw data streams from the individual sensors into a single fusion and perception engine, which combines them in one perception model. Developing a single perception model results in a more robust system with fewer false classifications.
- Object-level sensor fusion first processes each sensor's data independently to identify objects, then feeds those object lists into the fusion process to build an environmental model (a simplified sketch follows). In this case, the individual sensor perception models can generate incomplete or contradictory outputs that increase misclassifications and errors.
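To make the object-level approach concrete, here is a minimal sketch, assuming each sensor pipeline already outputs detections as (x, y) positions with a confidence score. The gating distance and field names are hypothetical:

```python
import math

def fuse_objects(camera_objs: list[dict], radar_objs: list[dict],
                 gate_m: float = 2.0) -> list[dict]:
    """Merge per-sensor detections ({'x', 'y', 'conf'} dicts) into one list."""
    fused, unmatched_radar = [], list(radar_objs)
    for cam in camera_objs:
        # Match the first radar detection within the gating distance.
        match = next((r for r in unmatched_radar
                      if math.hypot(cam['x'] - r['x'], cam['y'] - r['y']) < gate_m),
                     None)
        if match:
            unmatched_radar.remove(match)
            # Confidence-weighted average of the two position estimates.
            w = cam['conf'] + match['conf']
            fused.append({'x': (cam['x'] * cam['conf'] + match['x'] * match['conf']) / w,
                          'y': (cam['y'] * cam['conf'] + match['y'] * match['conf']) / w,
                          'conf': max(cam['conf'], match['conf'])})
        else:
            fused.append(cam)          # camera-only detection
    return fused + unmatched_radar     # radar-only detections pass through
```

Note that camera-only and radar-only detections pass through unverified, which is one way the incomplete or contradictory per-sensor outputs described above propagate into the fused environmental model.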
Sensor fusion and power consumption
Maximizing the benefits of sensor fusion requires using large numbers of sensors. Those sensors can consume significant amounts of power and reduce the driving range of EVs. In the case of autonomous vehicles, sensor fusion is expected to improve driving patterns, at least partially offsetting the sensors’ power consumption and helping maintain the driving range.
One study used simulation to estimate the impact of sensor fusion power consumption on EVs. It considered two use cases:
- The EPA Urban Dynamometer Driving Schedule (UDDS) is for city driving of light-duty vehicles like automobiles.
- The Highway Fuel Economy Test (HWFET) is for long-range highway driving of light-duty vehicles like automobiles.
The simulations ranged from zero sensor fusion power consumption up to 2 to 4 kW, which is representative of existing autonomous driving test vehicle systems. The reduction in driving range was up to 38% for a sensor fusion system consuming 4 kW, as measured over the UDDS cycle. The test vehicle systems with high power consumption generally use distributed sensor fusion systems and object-level perception.
The development of centralized sensor fusion systems based on raw data fusion is expected to drop the power consumption to as little as 100 W, which would significantly improve driving range (Figure 3).
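A back-of-the-envelope check shows why a constant sensor load matters so much in city driving. The base consumption of 200 Wh/km and the UDDS average speed of roughly 31.5 km/h below are illustrative assumptions, not values from the cited study:

```python
def range_reduction(sensor_w: float, base_wh_per_km: float = 200.0,
                    avg_speed_kmh: float = 31.5) -> float:
    """Fractional range loss from a constant sensor load (assumed figures)."""
    base_w = base_wh_per_km * avg_speed_kmh   # average propulsion power, W
    return sensor_w / (base_w + sensor_w)     # extra load's share of total power

print(f"{range_reduction(4000):.0%}")  # ~39% for a 4 kW distributed system
print(f"{range_reduction(100):.0%}")   # ~2% for a 100 W centralized system
```

With these assumed numbers, a 4 kW load costs roughly 39% of the range, in line with the figure above, while a 100 W load costs only about 2%.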
Summary
Sensor fusion is essential for improving the performance of BMS and perception systems in EVs and autonomous vehicles. There are various ways to implement sensor fusion, and the energy consumption of sensor fusion systems is an important consideration that must be minimized to reduce the negative impact on EV driving range.
References
- A Review of Sensor Applications in Electric Vehicle Thermal Management Systems, MDPI energies
- Design challenges and opportunities for electric powertrain with vehicle autonomy, Siemens
- How Sensor Fusion Can Enhance Battery Management System, Enrgtech
- Sensor Fusion and Perception, Leddar Tech
Images
- Figure 1, MDPI energies, Page 12, Figure 9
- Figure 2, Leddar Tech, Fusion of Figures 1 & 2
- Figure 3, Siemens, Page 3, Figure 1