Autonomous Vehicle

Secret ingredients of autonomous vehicles

A hot topic these days in automotive research is autonomous vehicles. Almost all automotive companies are racing, either jointly or independently, to dominate the market by producing them. Nevertheless, winning the race is not as easy as it may seem, because of the many challenges to be overcome on the way from research to market.

Challenges

Investment

Companies have to invest a considerably large amount in research, since the ratio of resources spent to final product is much higher than in production. This forces companies into long-term investment planning, with little prospect of recovering the investment soon, let alone with a margin of profit.

Patience and perseverance

The teams working on it require a lot of patience and perseverance: some of the research efforts succeed, some take longer than expected, and some do not succeed at all. The companies have to stand by their teams and encourage them; otherwise the research effort invested so far may be lost. Moreover, in times of crisis, it is difficult for companies to keep the research going.

Market Readiness

The market has to be ready to accept the latest technology in vogue and put it to use. Be it MaaS (mobility as a service) or ownership, how do we convince the customer to take a ride and assure them that it is safe? How do ride-hailing companies give the same assurance to their customers? How do we convince an owner who enjoys the ride and is passionate about driving the car? FaaS (freight as a service), however, might not face as many difficulties, since it does not put the lives of people riding inside the vehicle at risk.

Lack of general intelligence

Take the trolley problem that arises in most people's minds: the answers depend on each individual's psychology. In such a scenario, it is difficult to train an expert system to make split-second decisions that are morally correct. For example, think of an ambulance shifting a critical patient to a hospital downhill in hilly terrain. Suppose a pedestrian is crossing the road: avoiding the pedestrian might plunge the ambulance down the cliff, while hitting the pedestrian may cause death on the spot. Should the ambulance driver kill the patient or the pedestrian?

Liability

In case of an accident, who is liable? If the vehicle driver is said to be liable, the vehicle was never driven by a human. If the service provider is said to be liable, they never developed the system. If the manufacturer is said to be liable, they may never have had data for the scenario in question, for which the vehicle was neither developed nor tested. The same goes for traffic rule violations.

Infrastructure

Infrastructure has to be geared up to handle mechanical breakdowns, with towing services or nearby repair stations to which the vehicle can drive itself. Software updates, on the other hand, can be pushed over the air with the help of recent technologies.

Dilemma

Society may face a dilemma in accepting advanced developments in technology along with their implications. One example is the fear of job losses among taxi drivers.

Security and safety

There are many ways in which the security of an autonomous driving system can be compromised. A good example is hacking, as shown in "The Fate of the Furious" and other movies, which puts the lives of riders at risk. There is also the possibility of an autonomous vehicle being stolen or crashed somewhere, making it nearly impossible to track. A nightmare scenario would be vehicles being held hostage for ransom, or the unauthorized collection of customers' pickup and drop-off points, to list a few. Given these facts, a good intrusion detection and mitigation system, an in-vehicle SOS, a black box like the ones aeroplanes carry, and other safety and security systems have to be designed.

With all these challenges, some automotive companies are researching SAE Level 4 Autonomous Vehicles that are developed robustly, tested rigorously and deployed to run within a geographically fenced area under controlled conditions, concentrating on entering the market through ride hailing as the first step in putting their vehicles on the road. Others are working on ADASIS v3, which enables mapping and localization for SAE Level 3 Autonomous Vehicles: the reach of the sensors is extended with the help of a map and the vehicle's position, thereby making the products they sell more robust and safer.

J3016 Levels of Driving Automation
The table above shows the different levels of driving automation [1].

SAE Level 3 Autonomous Vehicles are an application of Braitenberg vehicles, which exhibit reactive behaviours. These vehicles are incapable of making decisions on their own, as they lack the ability to "plan" (as shown in the figure below). A good example of this is the line-follower robot with which students start learning robotics. SAE Level 3 Autonomous Vehicles take actions based on what is perceived, from lane keeping to emergency braking, and the driver has to take control of the vehicle when the system requests it.

The figure above shows the steps taken by SAE Level 3 Autonomous Vehicles.
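This reactive behaviour can be illustrated with a short sketch. The Python snippet below keeps a hypothetical vehicle centred in its lane and brakes when an obstacle is too close; every sensing and actuation function here is an assumed placeholder, not any real vehicle API, and the gains are purely illustrative.

# Minimal sketch of a reactive (sense-act) controller in the spirit of a
# Braitenberg vehicle: no planning, each action follows directly from the
# latest sensor reading. All sensor/actuator functions are hypothetical.
def reactive_tick(sense_lane_offset, sense_obstacle_distance,
                  set_steering, set_brake,
                  kp=0.5, brake_distance_m=10.0):
    offset_m = sense_lane_offset()          # lateral offset from lane centre, metres
    distance_m = sense_obstacle_distance()  # distance to nearest obstacle ahead, metres
    set_steering(-kp * offset_m)            # steer proportionally back towards the centre
    if distance_m < brake_distance_m:
        set_brake(1.0)                      # emergency braking: react, do not plan
    else:
        set_brake(0.0)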

SAE Level 4 Autonomous Vehicles are an application of mobile robotics, which sense, plan and act (as shown in the figure below). These vehicles have a dedicated planning module that commands the vehicle to stop, steer and throttle based on the environment perceived. This requires the vehicle to be aware of the environment in which it operates and of its current position in that environment, so that it can decide where and how it should navigate to reach the destination safely and cost-effectively. This also means the vehicle can only be driven within a geographically fenced area. This can be compared to robots that work in hostile and dangerous but known environments, lessening the burden and risk on human beings.

The figure above shows the steps taken by SAE Level 4 Autonomous Vehicles.
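As a rough sketch of how the sense-plan-act cycle differs from the purely reactive one, the loop below inserts a dedicated planning stage between perception and actuation; every function and the pose.at() arrival check are hypothetical placeholders, not any particular vendor's stack.

# Sketch of a sense-plan-act loop for a Level 4 stack. All callables are assumed.
def autonomy_loop(perceive, localize, plan_route, plan_motion, actuate, destination):
    route = None
    while True:
        world = perceive()                            # SENSE: obstacles, lanes, signs
        pose = localize(world)                        # SENSE: where am I on the map?
        if route is None:
            route = plan_route(pose, destination)     # PLAN: global route on the road network
        trajectory = plan_motion(pose, world, route)  # PLAN: local, collision-free motion
        actuate(trajectory)                           # ACT: steering, throttle, brake commands
        if pose.at(destination):                      # hypothetical arrival check
            break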

ADASIS v3 prepares the SAE Level 3 Autonomous Vehicle, via HD maps, to take action ahead of detection by the vision sensors. Also, when other vehicles cover a sign, or the cameras suffer from glare from the sun in the morning or evening, the map can act as a complementary sensor for taking action, provided the sign is geographically tagged in the map and validated. However, this brings another challenge: validating the map, since the environment may have changed since it was mapped.
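One way to picture the map acting as a complementary sensor is a look-up of geo-tagged signs within an electronic horizon ahead of the vehicle. The sketch below uses a deliberately simplified sign list keyed by distance along the route; it is an illustration only, not the actual ADASIS v3 interface.

# Assumed map format: a sorted list of (position_along_route_m, speed_limit_kmh).
def next_speed_limit(map_signs, route_position_m, horizon_m=500.0):
    for sign_position_m, limit_kmh in map_signs:
        if route_position_m < sign_position_m <= route_position_m + horizon_m:
            return sign_position_m - route_position_m, limit_kmh  # distance ahead, limit
    return None  # nothing tagged within the electronic horizon

# Example: vehicle is 1200 m along its route; a 60 km/h sign is tagged at 1350 m.
signs = [(800.0, 80.0), (1350.0, 60.0), (2100.0, 100.0)]
print(next_speed_limit(signs, 1200.0))  # -> (150.0, 60.0)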

Mapping and Localization is a concept borrowed from robotics: the vehicle first learns about its surroundings in order to plan "How should I reach the destination?", and the next question to be answered is "Where am I now?". In addition, the map can make the vehicle turn on specific vision algorithms to search for landmarks and sign boards when it enters their vicinity, based on data from the map.

Mapping and Localization is a chicken-and-egg problem: in order to answer "Where am I?", the vehicle needs to know its environment, which is on the map; but in order to create a map from the sensor measurements perceived from the environment, the vehicle needs to know how far it has travelled from a landmark it perceived before. Along the way, the sensors introduce errors for various reasons. For example, if the environment is poorly illuminated, the data from the camera becomes erroneous, so the system has to rely on wheel rotation and magnetometers, which give rise to another set of errors when the robot skids. The real problem is that the system has to decide, without human intervention, which sensors to trust more, so that the perceived data is as accurate as possible. This is solved using probabilistic estimation and data assimilation.
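To make "probabilistic estimation" concrete, here is a toy one-dimensional Kalman filter that fuses noisy wheel odometry with a noisy landmark measurement. The numbers are invented, and a real stack would use multi-dimensional filters or full SLAM, but the idea of weighting each sensor by its uncertainty is the same.

def kalman_step(x, p, u, z, q=0.5, r=2.0):
    # x, p: prior position estimate and its variance
    # u: distance travelled according to wheel odometry (process noise q)
    # z: position measured against a known landmark (measurement noise r)
    x_pred = x + u                      # predict: dead-reckon with odometry...
    p_pred = p + q                      # ...uncertainty grows
    k = p_pred / (p_pred + r)           # Kalman gain: how much to trust the landmark
    x_new = x_pred + k * (z - x_pred)   # update: blend in the measurement...
    p_new = (1.0 - k) * p_pred          # ...uncertainty shrinks
    return x_new, p_new

# Skidding-wheel case: odometry claims 1.0 m of travel, the landmark says we are at 9.2 m.
print(kalman_step(x=8.0, p=1.0, u=1.0, z=9.2))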

This does not necessarily mean that each and every vehicle has to hold a local copy of the map of the entire geographical area in which it drives. With recent technology advancements, an AV should be able to swiftly pull the part of the map covering a couple of square kilometres around it from a dedicated server, which should suffice to keep it working in the event of network connectivity issues.
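A sketch of what "pulling a part of the map" could look like is shown below: fetch and cache only the tiles around the current position. The tile server URL, the binary tile format and the slippy-map tiling scheme are all assumptions for illustration, not a real map service.

import math
import urllib.request

TILE_SERVER = "https://example-hd-map-server/tiles/{z}/{x}/{y}.bin"  # hypothetical server
_cache = {}

def tile_index(lat_deg, lon_deg, zoom=14):
    # Standard slippy-map tile indexing at the given zoom level.
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi) / 2.0 * n)
    return x, y

def get_local_map(lat_deg, lon_deg, radius_tiles=1, zoom=14):
    # Fetch (or reuse cached) tiles in a small square around the vehicle.
    cx, cy = tile_index(lat_deg, lon_deg, zoom)
    tiles = {}
    for x in range(cx - radius_tiles, cx + radius_tiles + 1):
        for y in range(cy - radius_tiles, cy + radius_tiles + 1):
            key = (zoom, x, y)
            if key not in _cache:
                _cache[key] = urllib.request.urlopen(TILE_SERVER.format(z=zoom, x=x, y=y)).read()
            tiles[key] = _cache[key]
    return tiles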

Updating the map when the environment changes can also be crowdsourced, much as Google has made information about nearby places available at our fingertips. Nevertheless, this raises the question of how to validate the crowdsourced data on which the AV is then driven.

Having sensed the environment and the position of the vehicle, a plan has to be developed during the mission, and this happens in two stages. First comes global path planning, which plans the route to the destination by choosing which roads to travel on. Next comes local path planning, with obstacle avoidance, overtaking, etc., while following the traffic rules; this is achieved in many ways. Some indoor mobile robots, which operate at slow speeds and have the luxury of on-axis rotation, implement a carrot planner, where the goal point is placed dynamically in front of the robot, like the carrot in front of the donkey, keeping the robot motivated to catch up with the goal until it reaches the destination. However, this does not suffice for autonomous vehicles, which require implementations like curve fitting and parabolic blends, because a vehicle cannot attain a set velocity instantly but has to increase it smoothly through acceleration, and a car cannot perform an on-axis rotation.
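The "smooth increase through acceleration" can be captured by a velocity profile with parabolic blends: constant-acceleration ramps at both ends and a cruise segment in between. The sketch below is one simple way to express it, with illustrative speed and acceleration limits rather than values from any real planner.

def velocity_at(s, path_length, v_max=15.0, a_max=2.0):
    # Speed (m/s) as a function of distance s (m) along the local plan.
    ramp = v_max ** 2 / (2.0 * a_max)           # distance needed to reach/leave v_max
    if path_length < 2.0 * ramp:                # short path: triangular profile, v_max never reached
        ramp = path_length / 2.0
        v_peak = (2.0 * a_max * ramp) ** 0.5
    else:
        v_peak = v_max
    if s <= ramp:                               # accelerating blend: v^2 = 2*a*s
        return (2.0 * a_max * s) ** 0.5
    if s >= path_length - ramp:                 # decelerating blend towards a stop
        return (2.0 * a_max * max(path_length - s, 0.0)) ** 0.5
    return v_peak                               # cruise segment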

Once the planning is complete, the plan has to be put into action. The vehicle has to rotate the steering and modulate brake and throttle based on the Ackermann kinematic model, which is generally used for all cars in the automotive industry, irrespective of FWD (front wheel drive) or RWD (rear wheel drive).
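A kinematic bicycle approximation of the Ackermann model is commonly used for this step; the sketch below propagates the rear-axle position and heading given a speed and steering angle, with an assumed wheelbase and time step.

import math

def ackermann_step(x, y, theta, v, delta, wheelbase=2.7, dt=0.05):
    # Advance the vehicle state (x, y, heading theta) by one time step dt
    # at speed v (m/s) with front-wheel steering angle delta (rad).
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += (v / wheelbase) * math.tan(delta) * dt   # yaw rate follows from the steering angle
    return x, y, theta

# Example: 2 seconds at 5 m/s with a constant 5-degree steering angle.
state = (0.0, 0.0, 0.0)
for _ in range(40):
    state = ackermann_step(*state, v=5.0, delta=math.radians(5.0))
print(state)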

The entire pipeline that makes an autonomous vehicle function is largely an application of the engineering mathematics students study during their courses: linear transformations, vector decomposition, coordinate geometry, trigonometry, Fourier transforms, differential equations, integral equations, probability, Euler angles, model fitting, polynomial fitting, parabolic blends, real numbers, and eigenvectors and eigenvalues, to name a few. All these concepts, implemented together using programming languages, give birth to a startling new product like this.
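To show one of these concepts in action, the snippet below fits a second-degree polynomial to made-up lane-boundary points detected ahead of the vehicle, so the lane position can be evaluated at any look-ahead distance; the points and the NumPy-based approach are purely illustrative.

import numpy as np

# Hypothetical detected lane-boundary points: x is distance ahead (m), y is lateral offset (m).
x = np.array([2.0, 5.0, 10.0, 15.0, 20.0])
y = np.array([0.10, 0.18, 0.42, 0.80, 1.35])

coeffs = np.polyfit(x, y, deg=2)        # least-squares fit of y = a*x^2 + b*x + c
lane_at_12m = np.polyval(coeffs, 12.0)  # lateral position of the boundary 12 m ahead
print(coeffs, lane_at_12m)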

Safety is of utmost concern in autonomous vehicles, and the mindset has to be borrowed from aerospace. For example, an engine failure during the flight of an aircraft is never acceptable. Similarly, the software that runs the autonomous vehicle must never fail or crash; in fact, even a couple of milliseconds of lag could be fatal. Designing a safety system so that a software malfunction or crash is detected and the vehicle at least pulls over to the side of the road and stops safely in a degraded mode is still a big challenge.
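One simple ingredient of such a safety system is a heartbeat watchdog running in an independent supervisor: if the driving software stops reporting in time, a degraded-mode manoeuvre is triggered. The sketch below only illustrates the idea; the timeout and the trigger callback are assumptions, not a production safety design.

import time

class Watchdog:
    def __init__(self, trigger_minimal_risk_manoeuvre, timeout_s=0.1):
        self.trigger = trigger_minimal_risk_manoeuvre  # e.g. hand over to a simple pull-over controller
        self.timeout_s = timeout_s
        self.last_beat = time.monotonic()

    def heartbeat(self):
        # Called by the driving software every control cycle.
        self.last_beat = time.monotonic()

    def check(self):
        # Called periodically by an independent supervisor process.
        if time.monotonic() - self.last_beat > self.timeout_s:
            self.trigger()  # software is stalled: pull over and stop in degraded mode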

Open-source platform

In the field of robotics, ideas have often originated in colleges and universities. Young students with a strong desire to implement something new start developing a robot with the available resources, something I too experienced back in my student days. As is well known, students can spend only limited time on this alongside their studies, in spite of much trial and error. And, as the saying goes, "no two clocks agree": the next student who wanted to continue the work had trouble accepting the software and hardware developed previously and integrating it to build further. Non-standard documentation practices also make it difficult to understand the intent of the implementations in a program. As a result, a self-motivated and highly ambitious student starts everything from scratch, discarding the previous research and development, underestimates the problems that transpire, and ends up in the same state, from which things may not improve further. This inhibited research in robotics, since the platform could never be developed further.

When the Robot Operating System (ROS) was made open-source with standard, well-defined procedures, research and development using ROS took off, and today a large share of the world's robots are powered by ROS. Many companies started using ROS as the platform on which to develop their Level 4 autonomous driving stacks. Nevertheless, ROS has a single point of failure (the ROS master), which is dangerous to deploy on an autonomous vehicle. Also, each ROS distribution reaches EoL (end of life), which requires retesting of all features whenever there is a distribution upgrade. Some companies built their own "ROS-like" platforms, while others have started using ROS 2, which is decentralized. ROS 2 itself has not been certified compliant with ISO 26262, but some of the applications built on it are certified to the highest level of that automotive safety standard (ASIL-D).
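For readers who have not used the platform, a minimal ROS 2 node in Python (rclpy) looks roughly like the sketch below; the node and topic names are illustrative, and a real autonomous driving stack would of course consist of many such nodes discovering each other without a central master.

import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class HeartbeatPublisher(Node):
    def __init__(self):
        super().__init__('heartbeat_publisher')
        self.pub = self.create_publisher(String, 'heartbeat', 10)
        self.timer = self.create_timer(0.1, self.tick)  # publish at 10 Hz

    def tick(self):
        msg = String()
        msg.data = 'alive'
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = HeartbeatPublisher()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()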

Making Level 4 autonomous driving software development platforms open-source has enabled companies to collaborate in leveraging the technology across the industry. It has also given academia access to industry-standard platforms on which algorithms can be developed and benchmarked, encouraging participation and contributions that industry can continuously steer towards improvement. Although autonomous driving technology may seem small in scope, when applied it can work wonders at large scale, just like Android, which, after it was made open-source, now powers everything from coffee vending machines through TVs to smartphones.

Thus, the path taken by automotive companies while developing autonomous vehicles as a product involves many uncertainties and challenges, in the business as well as in the technology platform, much of which will never be visible to the world. The last blow of the hammer on a nail is not the only successful one. Twenty years ago, most of us never thought we would have smartphones, yet everyone has one today. Autonomous vehicle technology needs constant and continuous participation in research and development from academia, as well as collaboration between automotive and non-automotive industries, to make vehicles run autonomously in all scenarios and conditions, i.e., SAE Level 5 Autonomous Vehicles.

References:

[1] SAE International, "SAE International Releases Updated Visual Chart for Its 'Levels of Driving Automation' Standard for Self-Driving Vehicles", December 2018. https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles

Author:

Sudhanva started working on Level 4 Autonomous Vehicles in 2016 and currently works on Mapping and Localization and Autonomous Driving Systems at GWM's Indian R&D centre in Bangalore, Karnataka. He holds an MTech in Industrial Automation and Robotics from NIE, Mysore, Karnataka. His interests lie in the fields of robotics and autonomous driving. He is inclined towards travelling, trekking and classical music (he plays the mrudangam).

Published in Telematics Wire
