Advanced driver-assistance systems (ADAS) in electric vehicles (EVs) rely on sophisticated sensors to rapidly collect and analyze petabytes of real-time data. This article highlights popular EV ADAS features, ranging from Tesla’s autopilot and full self-driving (FSD) to night vision and autonomous parking. It also discusses the essential role sensors play in various EV ADAS applications, such as cameras (machine vision), LiDAR, and radar.
Spotlighting Tesla’s autopilot and full self-driving
Tesla’s ADAS supports a comprehensive set of active safety features, including obstacle-aware deceleration, automatic emergency braking, blind spot monitoring, and collision warnings. Tesla EVs also offer advanced features such as autopilot, which uses traffic-aware cruise control to adjust speed and autosteer to maintain lane position on crowded city roads and rural highways.
Expanding on autopilot, Tesla’s FSD (Figure 1) introduces additional autonomous ADAS capabilities. For example, navigate on autopilot and auto lane change efficiently guide EVs from highway on-ramps to off-ramps. When paired with traffic light and stop sign control, autosteer identifies and responds to dynamic road conditions, obeying signals and signs while avoiding pedestrians, vehicles, and traffic cones. Lastly, autopark automates parallel and perpendicular parking, while smart summon deftly navigates parking lots, garages, and driveways.
Notably, Tesla’s ADAS leverages a combination of advanced sensors and machine vision to operate safely and efficiently in various traffic and weather conditions.
Improving driving safety and convenience
Although not as comprehensive as Tesla’s FSD, many EV ADAS improve driving safety and convenience with some — or all — of the features detailed below (Figure 2).
- Cruise control and autopilot: Helps drivers maintain consistent speed and lane discipline while monitoring surrounding traffic to prevent collisions. Advanced cruise control and autopilot systems automatically accelerate, decelerate, and stop in response to traffic signals, vehicles, objects, or pedestrians.
- Autonomous or assisted parking: Guides drivers by incrementally turning the steering wheel and moving forward or in reverse to fit into tight spaces on city streets or in crowded parking garages.
- Automatic emergency braking: Identifies potential collision scenarios, alerts the driver, and activates braking systems. Advanced ADAS can take additional preventive actions, such as incrementally reducing speed or engaging adaptive steering to avoid accidents.
- Crosswind stabilization: Maintains stability by detecting and counteracting crosswind pressure, dynamically adjusting wheel motor control units (MCUs) and braking systems.
- Navigation: Provides on-screen instructions and voice prompts, allowing drivers to follow planned or dynamic routes while keeping their eyes on the road. Most systems display real-time traffic data and suggest alternative routes to avoid congestion.
- Adaptive light and beam control: Adjusts headlights at night or during inclement weather. These systems detect the intensity of other vehicles’ lights, modifying headlight strength, direction, and rotation accordingly.
- Night vision: Boosts visibility in low-light conditions, with active systems projecting infrared light and passive systems analyzing thermal energy from other objects.
- Blind-spot monitoring: Alerts drivers when objects are detected in traditional blind spots, including the rear quarters and the area directly behind the vehicle.
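To illustrate the decision logic behind automatic emergency braking described above, the sketch below computes a time-to-collision (TTC) from the detected range and closing speed, then escalates from monitoring to warning to braking. The thresholds and function names are illustrative assumptions, not values from any production system.

```python
def ttc_seconds(range_m: float, closing_speed_mps: float) -> float:
    """Time to collision; infinite if the gap is not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return range_m / closing_speed_mps

def aeb_action(range_m: float, closing_speed_mps: float) -> str:
    """Map TTC to an escalating response (illustrative thresholds)."""
    ttc = ttc_seconds(range_m, closing_speed_mps)
    if ttc < 1.0:
        return "full_brake"
    if ttc < 2.5:
        return "warn_and_precharge_brakes"
    return "monitor"
```

Real systems feed this kind of logic with fused radar and camera data and also factor in road surface, driver inputs, and steering-based avoidance options.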
Importantly, EV ADAS uses sensor fusion to efficiently analyze, infer, and act on data from various systems and components.
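One simple form of sensor fusion is inverse-variance weighting: range estimates from two sensors (say, radar and camera) are combined so the less noisy sensor counts more, and the fused estimate is more certain than either input. This is a minimal sketch of the idea only; production fusion pipelines typically use Kalman filtering and track objects over time.

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of (value, variance) pairs.

    Returns the fused value and its (smaller) fused variance.
    """
    weights = [1.0 / var for _, var in measurements]
    fused_value = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused_value, fused_var

# Example: radar estimates 40.0 m (variance 0.25), camera estimates
# 41.0 m (variance 1.0); the fused estimate sits closer to the radar.
value, variance = fuse([(40.0, 0.25), (41.0, 1.0)])
```

Note how the fused variance (0.2) is lower than either sensor's variance alone, which is the core benefit of fusing independent measurements.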
EV sensors: From video cameras to LiDAR
Machine vision sensors embedded in EV ADAS video cameras create extensive three-dimensional models of the vehicle’s surroundings. For example, Tesla’s camera-based Tesla Vision system, which powers autopilot, relies on a network of cameras (Figure 3) for traffic sign recognition, object detection, and obstacle avoidance.
EV cameras typically employ CMOS or CCD sensor technology. Although CMOS is considered more power-efficient and cost-effective, CCD offers improved dynamic range and resolution. Night vision cameras use specialized sensors, typically CMOS, whose output is often augmented by other systems, such as radar, to minimize issues caused by rain or snow.
Typically operating at frequencies between 24 and 79 GHz, sensors in ADAS radar systems help detect large objects in front of EVs. These sensors measure the time it takes for radio waves to bounce back from objects, allowing radar systems to accurately determine object distance, size, and relative speed. EV radar systems are particularly useful for high-speed driving on highways, as they detect objects up to 300 meters away and continue to function in most adverse weather conditions. EV radar systems support collision warning, adaptive cruise control, blind-spot detection, and lane-change assistance features.
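The distance calculation follows directly from the round-trip echo time: the radio wave travels to the object and back at the speed of light, so the one-way distance is c·t/2. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in m/s

def radar_range_m(round_trip_s: float) -> float:
    """One-way distance to an object from its round-trip echo time."""
    return C * round_trip_s / 2.0

# An echo returning after 2 microseconds puts the object roughly 300 m
# away, near the upper end of a typical automotive radar's range.
distance = radar_range_m(2e-6)
```

Relative speed is measured separately, from the Doppler shift of the returned wave rather than its round-trip time.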
In contrast, ADAS sonar (or ultrasound) generates high-frequency sound waves to detect objects close to EVs. Sonar sensors (Figure 4) measure the reflections of these sound waves to accurately determine the distance and location of nearby objects.
Sonar is commonly used in backup detection and self-parking systems, with sensors mounted on the front, rear, and corners of the vehicle. Ideal for low-speed applications, sonar has a limited range of roughly 8 to 15 feet (2.5 to 4.5 meters), compared with radar’s range of up to 984 feet (300 meters).
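Sonar ranging works the same way as radar, but with sound, which travels far more slowly (roughly 343 m/s in air at 20 °C), so round-trip times are on the order of milliseconds rather than microseconds. The sketch below shows the distance calculation plus a simple parking-aid proximity classification; the zone boundaries are illustrative assumptions.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def sonar_range_m(round_trip_s: float) -> float:
    """One-way distance to an object from its round-trip echo time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def parking_zone(distance_m: float) -> str:
    """Illustrative proximity zones for a parking aid."""
    if distance_m < 0.5:
        return "stop"
    if distance_m < 1.5:
        return "slow"
    return "clear"
```

For example, an echo returning after 10 ms corresponds to an object about 1.7 m away, which the classifier above would flag as a "slow" zone.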
Meanwhile, ADAS LiDAR uses laser beams and sensors to capture millions of data points per second at ranges of up to 300 meters, building detailed, real-time, high-resolution 3D models of the vehicle’s surroundings known as point clouds.
Although affected by fog and rain, LiDAR systems, which can include up to 128 lasers, provide higher precision than radar or sonar. Delivering high accuracy and detail, LiDAR, which is integrated into Volvo’s EX90 (Figure 5), is expected to increasingly augment EV cameras to improve obstacle detection and autonomous navigation.
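Each LiDAR return can be converted from the sensor’s native spherical measurement (range, azimuth, elevation) into an x, y, z coordinate; the point cloud is simply the set of all such points in a scan. A minimal conversion sketch:

```python
import math

def lidar_point(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert one LiDAR return to Cartesian (x, y, z) coordinates."""
    horiz = range_m * math.cos(elevation_rad)  # projection onto the ground plane
    x = horiz * math.cos(azimuth_rad)
    y = horiz * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)
```

A 128-laser unit spinning through 360° of azimuth emits one such point per laser per firing, which is how a single scan accumulates into a dense 3D model.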
Lastly, EV ADAS relies on Global Navigation Satellite System (GNSS) receivers, of which GPS is the best-known constellation, to facilitate accurate positioning. While standard consumer GNSS offers meter-level accuracy, augmented systems can achieve centimeter-level precision, critical for assisted and autonomous driving. In areas where GNSS signals are obstructed, such as tunnels or parking garages, sensors embedded in inertial measurement units (IMUs) provide dead reckoning capabilities until GNSS signals are restored.
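Dead reckoning, in the sense described above, means integrating the vehicle’s speed and heading forward from the last good GNSS fix over small time steps. A simplified planar sketch, which ignores IMU drift and the error accumulation that makes real dead reckoning degrade over time:

```python
import math

def dead_reckon(x: float, y: float, heading_rad: float,
                speed_mps: float, dt_s: float):
    """Advance a planar position estimate by one time step."""
    return (x + speed_mps * dt_s * math.cos(heading_rad),
            y + speed_mps * dt_s * math.sin(heading_rad))

# Example: driving due east (heading 0) at 10 m/s for 1 s moves x by 10 m.
position = dead_reckon(0.0, 0.0, 0.0, 10.0, 1.0)
```

Because each step compounds the previous estimate’s error, production systems re-anchor to GNSS as soon as a signal is reacquired.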
Summary
Sophisticated sensors embedded in EV ADAS rapidly collect and analyze petabytes of real-time data. Tesla’s ADAS, for example, relies on a combination of cameras and sensors to enable a comprehensive set of active safety features, autopilot, and FSD. Other key EV ADAS components include LiDAR, radar, sonar, and GNSS. All these systems use sensor fusion to efficiently analyze, infer, and act.
References
- Autopilot and Full Self-Driving Capability, Tesla
- Tesla Vision Update: Replacing Ultrasonic Sensors with Tesla Vision, Tesla
- Types of ADAS Sensors in Use Today, DEWESoft
- 5 Types of ADAS Sensors You Should Know About, HESAI
- ADAS Sensors Guide, CARADAS
- ADAS Radar Sensor Guide: Automotive Radar and How it Works, CARADAS
- ADAS: Everything You Need to Know, CarAndDriver
- What is ADAS?, Synopsys