
Autonomous cars and the legal question

Published: December 10, 2015

The term “autonomous vehicle” was first defined in the Motor Vehicle Management Act, as amended on August 11, 2015. Under the new provisions, permits may now be issued for the temporary operation of such vehicles for test runs and research purposes.

Moreover, on November 22, 2015 there was a driving demonstration of autonomous vehicles at the “Future Growth Engine Challenge Parade,” jointly hosted by the Ministry of Science, ICT and Future Planning, the Ministry of Trade, Industry and Energy, and the Ministry of Land, Transport and Maritime Affairs. As the government has disclosed its ambition to commercialize autonomous vehicles by 2020, subsidizing their development and the establishment of the necessary infrastructure, it appears imminent that autonomous vehicles will become part of our daily lives.

[Image: Honda self-driving car]

While there is no uniform definition, the EU-funded “RoboLaw” consortium of universities has defined an autonomous car as “a vehicle enabled with artificial intelligence and technology that have the capability of operating or driving the vehicle without the active control or monitoring of a natural person.”

These developments also raise serious legal concerns: intellectual property rights in connection with Vehicle Position Indicator (VPI) and directive technologies; personal and location information issues arising from the collection and use of (personal) location information; Internet of Things issues relating to communication between autonomous vehicles; and other legal questions concerning driving licenses, operation, accident liability, and insurance of autonomous vehicles.

If an accident is caused by an autonomous vehicle, the question arises whether the owner or driver must bear the driver’s liability under the Guarantee of Automobile Accident Compensation Act, even though that owner or driver was not actually driving the vehicle. In many countries it is mandatory to have a driver in a vehicle (for example, under the Vienna Convention on Road Traffic) so that the law can govern all of its actions. If autonomous cars become commonplace, laws like these will have to be amended or completely replaced. For example, if an autonomous car is found to be at fault in an accident, the law would have to penalize the “owner” instead of the “driver”.

Another problem that arises with such cars is data protection. For every autonomous car, the manufacturer needs access to the owner’s data, which in some cases is against the law. With increasing automation and possibly more product liability claims arising from (fewer) accidents, a conflict emerges between liability law and data protection law: the driver’s or owner’s interest in data privacy versus the manufacturer’s need to access data in the car. Crunching this big data is also used to improve the autonomous driving experience.

Even partially automated driving requires GPS data, internet access, and data about the car’s condition, and manufacturers also need such data to defend against the liability claims described above. These data are currently not accessible under most countries’ data protection laws, as they are considered part of the personal sphere of the driver or owner. Softening data protection law in this respect may therefore become a prerequisite both for any higher form of automation and for defending manufacturers against unjustified liability claims.

Another law that currently governs the driver in many countries would also need to change. Critical event control is the ability to take decisive, evasive, and/or precautionary action upon the occurrence of an event that may lead to injury to people or animals, or damage to property. It is separate and distinct from navigational control because, when a critical event occurs, getting to your destination (the focus of navigational control) becomes of secondary importance.

At this point, the vehicle needs to act so as to prevent or minimise the injury or damage that may result from the critical event. This action is currently assumed to be the driver’s responsibility, but for autonomous vehicle use to become widespread, it will in due course need to become the system’s responsibility. Some experts, however, hold that human brain-eye coordination is faster than any machine, meaning a human driver can react more promptly upon detecting an animal or even a pedestrian.

So, is it better to change the law and make the owner responsible for critical event control? Experts are not in favour of this: they believe that for this kind of control, a human is still better than a machine. A direct comparison between a human driver and a self-driving car in critical event control is also not possible on public roads, since experts consider such an experiment too dangerous to carry out in real traffic.

This makes it difficult for the law to distinguish between critical event control and automated cruise control. Moreover, every critical event is different: how will the law decide whether the self-driving car was at fault in an accident? A machine cannot be taught to react differently in every situation, which makes self-driving cars somewhat less favourable than human-driven ones.

Another issue will be the issuing of driving licenses. Or would they be needed at all? How will someone be judged fit to operate an autonomous car? This would again require a new set of laws to govern driving in autonomous cars. People are generally taught how to drive from A to B: to steer a vehicle, and to brake and accelerate so as to navigate the road network. Humans are rarely given any kind of emergency-scenario training; some nations require a degree of skid training, but this is the exception rather than the rule. Humans typically need only to be over the age of about 16, 17, or 18, to pass a theoretical test on the rules of the road, and to pass a short practical test (at best involving an emergency stop). The human is then free to drive a metal box at speeds of up to 70 miles per hour (or more) towards other people, whether in other metal boxes or walking by the roadside.

To conclude, certain amendments to legislation will be needed before autonomous cars can become fully functional. The current laws are not sufficient to govern this changing traffic landscape.

By Kriti Ranjan
