
Sensor Fusion: The Key to Unlocking the Future of Autonomous Cars

Ultimately, the goal of an ideal autonomous (better known as self-driving) car is to replace human drivers with a perfectly safe and perhaps more efficient "high-tech" substitute. In essence, this means that Level 5, or "fully autonomous", cars will be able to sense their surroundings, make sense of them, and then navigate safely with little or no human intervention.

When humans drive, they rely heavily on their sensory organs. Think of your eyes as visual sensors that continuously collect data from the environment. They perceive another car, a pedestrian, or an approaching speed bump, and transmit that data to the brain. Your brain interprets those signals, estimates relative distance and speed, and then relays an appropriate response to your arms and legs. If you are too close to a pedestrian, your senses detect it and your brain produces an appropriate response, so you hit the brake pedal and avoid an impending collision.

Sensors and Sensor Fusion

For an autonomous vehicle to achieve this, it needs sensors and sensor fusion.

The key difference between a non-autonomous and an autonomous car is the absence or presence of sensors and in-vehicle technologies. Autonomous vehicles, like human drivers, depend on sensors to perceive their environment. Sensors such as cameras, lidar, radar, sonar, GPS, and IMUs can be combined to build proficient autonomous cars. They collect data and pass it on to the sensor fusion system, typically over interfaces such as MIPI (Mobile Industry Processor Interface).
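Conceptually, whatever the physical interface, each sensor hands the fusion system a timestamped measurement with some notion of how trustworthy it is. The following is a minimal sketch in Python; the field names and values are purely illustrative and do not correspond to any real MIPI or vendor data format.

# Minimal sketch of what a sensor might hand to the fusion system.
# Field names and values are illustrative only, not a real MIPI or vendor interface.
from dataclasses import dataclass

@dataclass
class SensorMeasurement:
    sensor_id: str       # e.g. "front_radar", "front_camera"
    timestamp_s: float   # time of the measurement, in seconds
    distance_m: float    # estimated distance to the nearest detected object
    variance_m2: float   # how uncertain the sensor is about that distance

front_radar = SensorMeasurement("front_radar", 12.40, 6.1, 0.04)
front_camera = SensorMeasurement("front_camera", 12.41, 5.9, 0.25)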

For instance, radar sensors transmit radio waves and measure the time taken for the reflected wave to return, thereby calculating the distance between the car and the obstacle. This returned radio wave can be thought of as sensory input and needs to be passed on to the sensor fusion system.
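As a rough illustration of that calculation (the echo time below is an invented value, not taken from any real radar), the distance follows from the round-trip time of the radio wave and the speed of light:

# Minimal sketch: estimating distance from a radar echo's round-trip time.
# The only physics here is the speed of light and the factor of 2 for the
# out-and-back path; the 40-nanosecond echo time is a made-up example value.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def radar_distance(round_trip_time_s: float) -> float:
    """Distance to the reflecting object, in metres."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

print(radar_distance(4e-8))  # an echo after 40 ns corresponds to roughly 6 metres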

What does sensor fusion do, then? It combines the delivered data, in our case the radar's distance estimate, with data from other sources to create a coherent output. Sensor fusion plays a pivotal role in reducing uncertainty, because it compares signals from multiple sensors. Joint information reduces ambiguity and makes the system less vulnerable to interference or to the failure of an individual sensor.
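One common way to combine two readings of the same quantity is inverse-variance weighting: the more certain sensor gets the larger weight, and the fused estimate ends up more certain than either input. A minimal sketch with invented numbers (this is one standard textbook technique, not necessarily what any particular vehicle uses):

# Minimal sketch: fusing two distance estimates by inverse-variance weighting.
# The fused variance is always smaller than either input variance,
# which is the "reduced uncertainty" described above.
def fuse(d1: float, var1: float, d2: float, var2: float) -> tuple[float, float]:
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused_distance = (w1 * d1 + w2 * d2) / (w1 + w2)
    fused_variance = 1.0 / (w1 + w2)
    return fused_distance, fused_variance

# Radar says 6.1 m (quite certain), camera says 5.9 m (less certain).
print(fuse(6.1, 0.04, 5.9, 0.25))  # -> (about 6.07 m, about 0.034 m^2)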

While sensor fusion works just fine with a single sensor, it is far more effective when multiple sensors are integrated. The fusion system may also take multiple measurements successively, at different instants, to build a more definitive perception. A single sensor, with its obvious limitations, cannot reduce uncertainty by much on its own. Multiple sensors, on the other hand, enable the system to keep providing reliable information even in the case of a partial failure, like a camera malfunction or a jammed transmitter.
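A simple way to picture both points, repeated measurements over time and robustness to a failed sensor, is a loop that fuses whatever valid readings arrive at each instant and skips any sensor that reports nothing. The readings below are made up for illustration, and a simple average stands in for the weighted fusion shown earlier:

# Minimal sketch: at each time step, fuse whichever sensors actually delivered a reading.
# A sensor that has failed (None) is skipped, so the system still produces an estimate.
def fuse_available(readings: list[float | None]) -> float | None:
    valid = [r for r in readings if r is not None]
    if not valid:
        return None            # nothing to work with at this instant
    return sum(valid) / len(valid)

# Made-up readings: the camera drops out at the second instant.
for camera, radar in [(6.2, 6.0), (None, 5.4), (4.9, 4.7)]:
    print(fuse_available([camera, radar]))   # 6.1, then 5.4, then 4.8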

Part of the reason this works is that sensor fusion draws on the tenets of conditional probability. Conditional probability is a fundamental concept in probability theory: it describes the probability of one event occurring (for convenience, let's call it event A) given that one or more other events (say, event B) did or did not take place. Put simply, what are the chances of our event of interest A happening if event B occurred?
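Numerically, this is just P(A|B) = P(A and B) / P(B), and Bayes' rule lets us turn a sensor's detection statistics into the probability we actually care about. The probabilities below are invented for the example, not taken from any real sensor:

# Toy illustration of conditional probability, with invented numbers.
# A = "a pedestrian is really there", B = "the radar reports a detection".
p_a = 0.01                    # prior chance a pedestrian is in front of the car
p_b_given_a = 0.95            # radar fires given a pedestrian is there
p_b_given_not_a = 0.05        # radar fires with no pedestrian (false alarm)

p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
p_a_given_b = p_b_given_a * p_a / p_b    # Bayes' rule gives P(A|B)
print(round(p_a_given_b, 3))             # about 0.161, up from the 0.01 prior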

Sensor fusion, after compiling data from multiple sensors, gives the vehicle what it needs to make a decision. Say the camera as well as the radar detect a nearby object, like a passing pedestrian roughly six metres away. Sensor fusion does a couple of things here: it confirms that the pedestrian really is about six metres away by filtering the "noise" out of the measurements. If so, it calculates how much the vehicle needs to slow down in order to avoid a collision. Then it passes that information on to the vehicle controls, eventually leading to braking or even a full stop, depending on the velocity of the pedestrian and the car.
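Continuing the example with invented numbers: from a fused distance and the car's current speed, the deceleration needed to stop short of the obstacle follows from the standard stopping-distance relation a = v^2 / (2d). The speed, threshold, and distance below are illustrative assumptions, not figures from the article:

# Minimal sketch: is braking needed, and how hard? All numbers are illustrative.
def required_deceleration(speed_mps: float, distance_m: float) -> float:
    """Constant deceleration (m/s^2) needed to stop just before the obstacle."""
    return speed_mps ** 2 / (2.0 * distance_m)

MAX_COMFORTABLE_BRAKING = 6.0   # assumed comfort limit in m/s^2, roughly 0.6 g

speed = 8.0        # car travelling at 8 m/s (about 29 km/h)
distance = 6.0     # fused estimate of the pedestrian's distance
needed = required_deceleration(speed, distance)
print(needed, "m/s^2, normal braking" if needed <= MAX_COMFORTABLE_BRAKING else "m/s^2, emergency stop")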

Concluding Remarks

Sensor fusion merges data from multiple sources and dramatically reduces the level of uncertainty. For safer autonomous cars, we need impeccable sensors and a state-of-the-art sensor fusion system. Sensor fusion attempts to replicate the functions of the human central nervous system. In fact, it can compensate for the deficiencies of individual sensors by identifying sensor failures or accuracy limitations. Much like human memory, sensor fusion can also draw on historical data points to make fitting decisions. Indeed, autonomous cars are the future of transportation, and sensor fusion is the key to this exciting, limitless future!


Author

Tushar Bhagat
Director
Uffizio India Software Pvt

Published in Telematics Wire
