On March 15, 2019, the University of Michigan School of Law’s Journal of Law and Mobility (in conjunction with the University of South Carolina School of Law) held the “(Re)Writing the Rules of the Road” conference, featuring top-notch speakers and vigorous debate and discussion on cutting-edge legal issues affecting the autonomous vehicle space.
Speakers included academics, the leaders of California’s and Michigan’s autonomous vehicle initiatives, U.S. Representative Debbie Dingell, Justice David Viviano of the Michigan Supreme Court, Ford’s in-house counsel in charge of its AV initiative, Jessica Uguccioni of the UK Law Commission, Karlyn Stanley of RAND and Daniel Hinkle of the American Association for Justice.
Connected and Automated Vehicles – A Technical and Legal Primer
The conference began with a thought-provoking introductory primer by the University of South Carolina Law School’s Professor Bryant Smith on the key concepts, terms and laws applicable to autonomous vehicles. Smith’s presentation illustrated that even simple terms such as “driver” are defined differently by various governments and insurance policies, and that these varying definitions of “driver” and other key terms are hindering progress in standardizing laws surrounding AVs. There are currently no specific laws preventing Level 1-3 autonomous vehicles from traversing American roads because no current FMVSS specifically addresses autonomous vehicles. In his view, this means that the federal government has not pre-empted the space. Absent regulations preventing them, AVs are allowable on American roads, subject to any state regulations that do not implicate interstate commerce concerns. Smith also discussed the Uniform Law Commission’s work in the AV space and acknowledged the chair of the ULC’s Highly Automated Vehicles Committee, Thomas Buiteweg, who was in the audience and would be a panelist later in the day.1 Smith also mentioned the American Association of Motor Vehicle Administrators’ work in the area. The AAMVA is studying the effects of AV technology on short- and long-range plans within vehicle and driver programs to gain an understanding of the effects on government policies and regulations. It has amassed a number of scholarly articles, news media and research papers on its website.
Driver’s Licenses for Robots? State DMV Approaches to CAV Regulation
The second session was a panel discussion with California’s and Michigan’s leaders overseeing AVs. Dr. Bernard Soriano explained that California’s AV law applies only to Level 3 and above AVs. He also explained California’s efforts to assist the 62 entities licensed in California to work on AVs, which are currently operating 800+ vehicles2, the relatively low number of accidents involving AVs on public roads3 and his vision of the future of AVs4. Soriano discussed the rationale behind California’s requirement of $5 million in liability insurance for any AV that operates on public roads and the strict testing required for any human operators of AVs, whether seated in the vehicle or controlling it remotely.
James Fackler discussed Michigan’s unique relationship with carmakers and how Michigan is seeking to assist its hometown industry however possible. In Michigan, the operator of a Level 3 AV must hold a driver’s license, but the operator of a Level 4 AV need not have a license on hand. A remote operator, however, must hold a driver’s license.
Keynote Speaker
Rep. Debbie Dingell of Michigan was the keynote speaker. She explained her great interest in and excitement about AVs and said that she is working with Republicans on re-booting the AV Start Act in the House, though she has personal apprehensions regarding data privacy. Dingell is deeply concerned about the huge amounts of data that AVs will generate and is very distrustful of the way big companies currently treat data privacy. She argued that the AV Start Act, as it stands now, is unlikely to move forward until that issue is addressed.
Should Automated Vehicles Always Follow the Law? Who or What is the Legal Driver?
The two afternoon panels5 discussed how and whether AVs will need to comply with current rules of the road – rules that were written for drivers who completely control the vehicle. One topic of discussion was the United Kingdom’s commitment to empowering AVs and its exploration of changes to the UK’s rules of the road. One speaker argued that while statutory law may lag behind the technology, he trusts juries to make the right decision. Also discussed was whether an OEM should permit AVs to violate the current rules of the road. Discussion of that subject was fierce, and it was noted that every OEM currently produces cars that “know” the speed limit and could prevent a driver from exceeding it, yet permit the human driver to exceed it anyway. In that regard, the only difference between an AV and today’s car is that an OEM presumably is unwilling to accept the liability of an AV violating the law, whereas it can shift the blame to the human operator of today’s car should a legal violation occur that the car could have prevented.
Another speaker emphatically argued that if a fully autonomous vehicle (i.e., one that requires no input from a human to operate) causes an accident, the manufacturer should be identified as the driver, but the plaintiff would have to prove negligent operation of the vehicle to hold the company liable. Alternatively, the plaintiff could pursue a product liability theory. He suggested that we should not abandon driver liability theories simply because the driver role is being occupied by a company that drives by algorithm. That, too, was met with fierce debate, as both the panelists and the audience argued that a manufacturer cannot conceivably program a vehicle to anticipate and respond (in a split second) to every single driving decision, especially when there are non-autonomous vehicles on the road and no V2V communication. Putting that onus on manufacturers, they argued, would significantly delay production. Another participant argued for a more nuanced approach, more akin to negligence-based liability than strict liability, one that takes into account whether the car and/or its software had been modified in any way since leaving the hands of the manufacturer. Whatever the liability regime, even if our roads become safer with AVs, participants noted that insurance premiums will likely be higher, not lower, because of the increased cost of the cars and the inability of most body shops to repair their complicated technology following an accident. That will likely lead to slower adoption by consumers; at least at first, AVs will likely be leased to fleets and sold only to affluent early adopters.
The conference ended with a robust Q&A session moderated by Prof. Smith. The discussion was so lively that the session ran 30 minutes late – a testament to the enthusiasm and determination with which industry experts are seeking to understand the legal environment for AVs.
For assistance with legal matters affecting the autonomous vehicle industry, please contact the author, Neel Gupta.
1 The ULC is working on a uniform law covering the deployment of automated driving systems (SAE levels 3 through 5). The committee is looking at a number of legal and policy issues, including driver licensing, vehicle registration, insurance, vehicle equipment, and rules of the road, among others. The goal of the committee is to reconcile automated driving with a typical state motor vehicle code.
2 Half of the 800 vehicles in California are operated by either Waymo or Cruise Automation.
3 136 accidents at the time of the presentation, most of which were extremely minor and would not usually have been reported to law enforcement had they involved traditional vehicles.
4 Beginning on April 18, 2019, California will allow vehicles without a physical driver present to operate on its roads. So far, three companies have applied for permission to operate such vehicles, and the only approved company so far is Waymo. The vehicles must be non-commercial and under 10,000 pounds.
5 Because the afternoon sessions were conducted under the Chatham House Rule, participants and audience members agreed not to identify speakers or their organizations.