24 November 2023 | Magazine

“Now it’s up to practice to fill in the legal framework and bring it to life”: Prof. Anne Paschke on legal issues in autonomous driving

The technological foundations for autonomous driving are well advanced, and automated driving functions are already being tested in practice. However, the next step towards the widespread use of Level 4 vehicles, which drive themselves within a defined operating area without a human driver, has yet to be taken. For autonomous and automated vehicles to be used safely on our roads, the current regulatory framework needs to prove its practicality and new requirements need to be formulated. This concerns, for example, licensing, operating areas and liability. Professor Anne Paschke talks about the obstacles that still need to be overcome, how legal certainty can be established, and how TU Braunschweig is researching these questions. She heads the Institute of Law at Technische Universität Braunschweig and conducts research into the law of digitisation, data (protection) law and mobility law, among other areas.

The law on autonomous driving has been in force since July 2021. What does the law regulate? What are the restrictions?

TU Professor Anne Paschke was elected as an expert to the Commission on Concentration in the Media (KEK) for five years. Photo Credit: Max Fuhrmann/TU Braunschweig

The 2021 law on autonomous driving complements the amendment to the Road Traffic Act passed four years earlier, which regulated automated driving for the first time. Further levels of automation and connectivity in road traffic are now legally possible, as the law allows the driverless use of motor vehicles in certain local areas (so-called defined operating areas).

At the same time, the law sets clear limits for the operation of autonomous vehicles: firstly, an ‘autonomous’ vehicle may only be driven on designated routes, and secondly, the owner of a motor vehicle with an autonomous driving function cannot simply ‘sit back’. Instead, they are obliged to keep the vehicle in a safe condition and to ensure that the tasks of a technical supervisor are fulfilled. The driverless driving operation must be permanently accompanied by a technical supervisor, who can remotely deactivate and enable certain driving functions during autonomous driving. The role of technical supervisor is, in turn, performed by a natural person; it is not yet possible to dispense with humans entirely.

The title of the law is therefore slightly misleading: even under this law, ‘true’ autonomous driving, i.e. purely digital control of the vehicle without human intervention, is not yet permitted in Germany.
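Purely as an illustration of this division of roles, the following sketch shows how the relationship between the autonomous driving function, its defined operating area and the technical supervisor could be modelled in software. The interface and all names are invented for this example; they are not taken from the law or from any real system.

```python
# Hypothetical sketch of the division of roles described above: the vehicle
# drives itself only inside its defined operating area, while a human
# technical supervisor can remotely deactivate the driving function or
# release a manoeuvre proposed by the vehicle. All names are invented for
# illustration and are not taken from the law or any real system.

class AutonomousDrivingFunction:
    def __init__(self, operating_area):
        self.operating_area = set(operating_area)  # the defined operating area
        self.active = True

    def may_drive(self, route_segment):
        # Driverless operation is only permitted while the function is active
        # and the vehicle stays within its defined operating area.
        return self.active and route_segment in self.operating_area


class TechnicalSupervisor:
    """A natural person who remotely accompanies the driverless operation."""

    def deactivate(self, vehicle):
        vehicle.active = False  # remote deactivation of the driving function

    def release_manoeuvre(self, proposed_manoeuvre):
        # In practice the supervisor reviews the manoeuvre proposed by the
        # vehicle; this sketch simply approves every proposal.
        return True


# Example: a shuttle licensed for two route segments, monitored remotely.
shuttle = AutonomousDrivingFunction(operating_area={"Route A", "Route B"})
supervisor = TechnicalSupervisor()
print(shuttle.may_drive("Route A"))   # True: inside the operating area
supervisor.deactivate(shuttle)
print(shuttle.may_drive("Route A"))   # False: remotely deactivated
```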

Do the law and the Autonomous Vehicles Licensing and Operation Ordinance (AFGBV) still meet the requirements of today’s market and technology?

The new regulations are another (interim) step on the way to a new law for a digital transport infrastructure. The interaction between vehicle manufacturers, IT experts, technical supervisors and the pre-programmed vehicle must first prove itself in practice. However, the law on autonomous driving meets the requirements of the market in that it provides legal certainty as to which technical components manufacturers must develop in order to ensure safe autonomous driving in Germany.

On the subject of liability: Is there legal certainty? Is it the manufacturer or the vehicle owner who is liable? Can technical devices such as a black box provide sufficient assistance in troubleshooting and fault finding?

The liability issues have been clarified on paper, but have not yet been tested in practice. The obligations of the parties involved (vehicle owner, technical supervisor, vehicle manufacturer) in the operation of motor vehicles with autonomous driving functions are now listed in Section 1f of the German Road Traffic Act (StVG). At present, the manufacturer is liable for genuine material defects of the vehicle and the owner for the so-called operating risk of the motor vehicle. One positive aspect of this is that potentially injured parties have several options for obtaining compensation for damages and pain and suffering in a lawsuit. However, it remains to be seen whether the liability system will prove its worth in regular human-machine interaction.

It is also assumed that the liability limits of Section 12 of the Road Traffic Act, which have already been raised for highly and fully automated vehicles compared with conventional vehicles, will also apply to autonomous vehicles. However, how fault and misbehaviour in human-machine interaction are to be apportioned and assessed is still being researched. Technical means, such as a “black box”, that help to establish the facts are certainly useful for determining responsibility.

However, when discussing the future distribution of liability, we must not forget that in road traffic, even today, due to existing case law on the contributory negligence of various road users, there is in most cases no 100% certainty as to who is liable for the damage caused and to what extent.

“Offsetting human lives”, “least evil”, qualification according to personal characteristics (old vs. young): Algorithms and machine learning play an important role in autonomous driving. They also make decisions in critical situations and require guidelines for the ethical assessment of accident situations. What developments are emerging?

There is little legal scope for these considerations. The Federal Constitutional Court has clearly rejected a “life versus life” approach in its case law (including the decision on the Aviation Security Act). In an accident situation, an algorithm must not be based on the personal characteristics of a person or on the number of people who would be injured as a result. Instead, the algorithm must be programmed to prevent harm to people.

In its final report in 2017, the Ethics Commission on Automated and Connected Driving presented various theses on these considerations, which met with widespread approval across Europe. The law on autonomous driving takes up these theses in Section 1e (2) of the German Road Traffic Act (StVG): motor vehicles with an autonomous driving function must be equipped with an accident avoidance system which is designed to prevent and reduce damage, which, where damage to different legal interests is unavoidable, weighs up the importance of those interests with the protection of human life taking the highest priority, and which, where a risk to human life is unavoidable, does not provide for any further weighting on the basis of personal characteristics. In addition, the motor vehicle must automatically switch to a minimal-risk condition if the journey could only be continued by violating road traffic regulations.
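To make these statutory priorities a little more tangible, the following sketch expresses them as deliberately simplified, purely hypothetical code. The data structure, function and scores are invented for this illustration; they are not part of the law, they do not reflect how real accident-avoidance systems are built, and the contested question of weighing lives against one another is deliberately left out.

```python
# Purely illustrative, greatly simplified sketch of the priorities in
# Section 1e (2) StVG. All names, fields and scores are invented; real
# accident-avoidance systems are far more complex.
from dataclasses import dataclass
from typing import List

@dataclass
class Manoeuvre:
    name: str
    injury_severity: float        # predicted severity of harm to people (0 = none)
    property_damage: float        # predicted damage to property (0 = none)
    violates_traffic_rules: bool
    # Deliberately no fields for age, sex or other personal characteristics:
    # the law forbids any weighting based on such attributes.

# Fallback required by the law: if the journey could only continue by
# violating road traffic regulations, the vehicle must switch to a
# minimal-risk condition (e.g. a controlled stop).
MINIMAL_RISK = Manoeuvre("minimal-risk condition", 0.0, 0.0, False)

def choose_manoeuvre(options: List[Manoeuvre]) -> Manoeuvre:
    """Pick an option in line with the statutory priorities: protect human
    life and limb first, then other legal interests such as property."""
    lawful = [m for m in options if not m.violates_traffic_rules]
    if not lawful:
        return MINIMAL_RISK
    # Harm to people outranks property damage; the (contested) question of
    # offsetting lives against one another is not modelled here.
    return min(lawful, key=lambda m: (m.injury_severity, m.property_damage))

# Example: swerving damages a parked car but avoids injuring anyone.
options = [
    Manoeuvre("continue straight", injury_severity=0.8, property_damage=0.0,
              violates_traffic_rules=False),
    Manoeuvre("swerve onto the hard shoulder", injury_severity=0.0,
              property_damage=0.4, violates_traffic_rules=False),
]
print(choose_manoeuvre(options).name)   # "swerve onto the hard shoulder"
```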

What points of contention remain in the regulation of autonomous driving?

One of the unresolved questions is what risks society should be prepared to take when an innovation such as autonomous driving also offers outstanding opportunities. According to a survey conducted by the TÜV, almost half of the German population believes that AI for autonomous driving must work absolutely flawlessly. A good quarter would like the technology to be at least safer than human drivers. More than three-quarters of Germans even want safety-critical AI systems to be tested by independent bodies throughout the vehicle’s life.

From a legal perspective, the current legal framework will need to be analysed in the coming years to determine its practical suitability. If necessary, the existing regulations will have to be adapted and revised. The current requirements are aimed at a very high level of safety and therefore entail considerable bureaucratic hurdles. In view of the shortage of skilled workers and the high technical requirements, the question arises as to whether the legal framework that has been created can cope with the reality of the market. To answer this question, however, we need concrete empirical data from industry and public administration, which will only become available once applications for approval are actually submitted and driverless vehicles are in use.

These are just some of the issues that will continue to occupy our research in the coming years. In the transformation hub MIAMy, which is funded by the BMWK, researchers from various disciplines under the leadership of the NFF (Automotive Research Centre Niedersachsen) at TU Braunschweig are investigating the market launch of autonomous vehicles and formulating hypotheses on how the transformation of the automotive industry could succeed.

The technology is advancing, but the ethical and legal framework is still lagging behind. Could you put it that way?

Yes, although this is nothing new. As a rule, legal developments do not keep pace with technological developments. The law regularly lags behind. However, this should not only be seen in a negative light, because legal norms also need to have a certain consistency, so that the values on which they are based are not simply abandoned. There is always a need to balance interests and values, which requires the involvement of many stakeholders.

However, in the context of automated and autonomous driving in particular, it must also be recognised that the legislator has already created a legal framework with the law on autonomous driving and the AFGBV. It is now up to the practitioners to fill in this legal framework and bring it to life. If manufacturers (still) submit few or no applications for approval of their driving functions, the legal framework cannot be put to the test.

There are no autonomous (Level 4) vehicles on the road yet; after an introductory phase, there will be mixed traffic consisting of conventional and autonomous vehicles. Will this be a challenge from a legal point of view?

Definitely yes. During this transition period, we will be dealing with two control systems that are not harmonised with each other. All our traffic law is basically designed for a human driver. The road traffic rules state, for example, that participation in road traffic requires constant caution and mutual consideration. The question now is what caution and consideration are actually supposed to mean for a driverless vehicle. There are also no uniform liability standards for road accidents in mixed traffic: for conventional driving, strict liability backed by compulsory liability insurance is well established, whereas the division of responsibility in the digital traffic infrastructure has not yet been tested in practice. It is likely that the legislator will have to revisit the issue of liability, especially for mixed traffic.

How is it regulated when I drive my Level 4 vehicle across the border into the Netherlands, for example? Is the law harmonised at European level?

In principle, there are international traffic law standards, which are laid down in particular in the Vienna Convention on Road Traffic. It would also be difficult to convey to road users that they need to completely reorient themselves every time they cross a border (apart from a few country-specific regulations). In summer 2022, the European Union also introduced standardised type-approval rules for automated driving systems in fully automated vehicles for the first time with the so-called ADS Regulation (ADS stands for Automated Driving System). However, these vehicles still need to comply with national regulations, for example on the defined operating area or on technical supervision. In the area of autonomous driving, the pace of reform varies widely across the European Union and beyond, so there is still no uniform European or international law in this field. You therefore need to find out what the legal situation is in the country where you want to drive your automated and connected vehicle.

Thank you very much.