The Department of Transportation (DOT) has published its first federal guidelines on how to safely integrate self-driving cars onto our roads and regulate them. And the guidelines, which are a regulatory blueprint to “accelerate the HAV [highly autonomous vehicle] revolution,” clearly show the federal government is in favor of self-driving cars.
The guidelines, which you can read all 116 pages of below, are broken into four sections:
- 1. Vehicle safety and performance guidelines
- 2. The roles of federal and state governments
- 3. How the NHTSA will use its existing regulatory tools
- 4. New regulatory tools the NHTSA might need
Let’s explore each of these sections.
15-Point Safety Assessment for Self-driving Cars
The government is actually telling self-driving car developers what federal regulators will expect, which will certainly help guide development. The vehicle performance section of the guidelines details a 15-point safety assessment that regulators will use to determine whether a self-driving car is safe enough. Again, you can read complete details of each point below, but here’s a quick synopsis of each:
1. Data Recording and Sharing
Manufacturers should have a documented process for testing, validation, and collection of event, incident, and crash data, for the purposes of recording the occurrence of malfunctions, degradations, or failures in a way that can be used to establish the cause of any such issues.
Data should be collected for both testing and operational (including for event reconstruction) purposes. For crash reconstruction purposes (including during testing), this data should be stored, maintained, and readily available for retrieval by the entity itself and by NHTSA.
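To give a sense of what that could look like in practice, here’s a rough sketch of an event recorder. It’s purely illustrative: every class name, field, and storage choice is our own assumption, not anything specified in the guidelines.

```python
# Illustrative only: a minimal event recorder of the kind the guidelines describe.
# The field names and storage format are assumptions, not part of the DOT policy.
import json
import time
from dataclasses import asdict, dataclass, field
from typing import List

@dataclass
class DrivingEvent:
    kind: str                 # e.g. "malfunction", "degradation", "crash"
    description: str          # what happened, for later cause analysis
    timestamp: float = field(default_factory=time.time)

class EventRecorder:
    """Collects events during testing and operation and keeps them retrievable."""
    def __init__(self) -> None:
        self._events: List[DrivingEvent] = []

    def record(self, kind: str, description: str) -> None:
        self._events.append(DrivingEvent(kind, description))

    def export(self) -> str:
        # A real system would persist this to tamper-resistant storage and make
        # it available to the manufacturer and to NHTSA on request.
        return json.dumps([asdict(e) for e in self._events], indent=2)

recorder = EventRecorder()
recorder.record("malfunction", "lidar dropout lasting 1.2 s")
print(recorder.export())
```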
[Embedded document: AV Policy Guidance PDF, via Scribd]
2. Privacy
Manufacturers must provide consumers with accessible, clear, and meaningful data privacy and security notices/agreements. These should incorporate the baseline protections outlined in the White House Consumer Privacy Bill of Rights and explain how entities collect, use, share, secure, audit, and destroy data generated by, or retrieved from, their vehicles.
3. System Safety
Manufacturers should take a systems-engineering approach to designing self-driving cars free of unreasonable safety risks. This process should ensure the vehicle will be placed in a safe state even when there are electrical, electronic, or mechanical malfunctions or software errors.
4. Vehicle Cybersecurity
The systems-engineering approach should also minimize cybersecurity threats. The identification, protection, detection, response, and recovery functions should be used to enable risk management decisions, address risks and threats, and enable quick response to and learning from cybersecurity events.
5. Human Machine Interface
The human machine interface (HMI) should be designed for the human driver, operator, occupant(s), and external actors with whom the HAV may interact (other vehicles, pedestrians, etc.). The HMI design should also consider the need to communicate information to pedestrians, conventional vehicles, and automated vehicles. At a minimum, indicators should be capable of informing the human operator or occupant that the HAV system is:
- 1. Functioning properly
- 2. Currently engaged in automated driving mode
- 3. Currently “unavailable” for automated driving
- 4. Experiencing a malfunction with the HAV system
- 5. Requesting control transition from the HAV system to the operator.
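To make those five states concrete, here’s a minimal, purely illustrative sketch of how they might be represented in software. The names and dashboard messages are our assumptions; the guidelines describe what must be conveyed, not how.

```python
# Illustrative only: the five indicator states above, expressed as a simple enum.
from enum import Enum, auto

class HAVStatus(Enum):
    FUNCTIONING_PROPERLY = auto()
    AUTOMATED_MODE_ENGAGED = auto()
    AUTOMATED_MODE_UNAVAILABLE = auto()
    SYSTEM_MALFUNCTION = auto()
    REQUESTING_CONTROL_TRANSITION = auto()

def dashboard_message(status: HAVStatus) -> str:
    """Return the message an HMI might show the operator for each state."""
    messages = {
        HAVStatus.FUNCTIONING_PROPERLY: "Self-driving system OK",
        HAVStatus.AUTOMATED_MODE_ENGAGED: "Automated driving engaged",
        HAVStatus.AUTOMATED_MODE_UNAVAILABLE: "Automated driving unavailable",
        HAVStatus.SYSTEM_MALFUNCTION: "Self-driving system fault",
        HAVStatus.REQUESTING_CONTROL_TRANSITION: "Please take control now",
    }
    return messages[status]

print(dashboard_message(HAVStatus.REQUESTING_CONTROL_TRANSITION))
```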
6. Crashworthiness
Regulators want to make sure self-driving cars will protect passengers in the event of an accident, regardless of who is at fault. Self-driving cars are still expected to meet NHTSA crashworthiness standards.
This chart compares how the 15-point assessment should be applied to SAE Level 2-5 autonomous vehicles. (Credit: NHTSA)
7. Consumer Education and Training
Proper education and training are imperative to ensure safe deployment of automated vehicles. Manufacturers and other parties need to develop, document, and maintain employee, dealer, distributor, and consumer education and training programs to address the anticipated differences between self-driving cars and conventional cars. These programs should cover topics including: a self-driving car’s operational parameters, capabilities and limitations, engagement/disengagement methods, HMI, emergency fall back scenarios, operational boundary responsibilities, and potential mechanisms that could change function behavior in service.
8. Registration and Certification
The NHTSA needs a description of each self-driving car and its components. Manufacturers should also provide on-vehicle means to readily communicate concise information about the capabilities of the self-driving car to human drivers.
9. Post-Crash Behavior
Manufacturers need a documented process for the assessment, testing, and validation of how their self-driving car is reinstated into service after being involved in a crash. For example, if sensors or critical safety control systems are damaged, the vehicle should not be allowed to operate in autonomous mode.
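Here’s a small, hypothetical sketch of that kind of gate: autonomous mode stays disabled unless every safety-critical check passes. The component names and pass/fail logic are our assumptions, not anything prescribed by the DOT.

```python
# Illustrative only: a hypothetical post-crash check before autonomous mode is
# re-enabled. The component names and logic are assumptions.
from typing import Dict

def may_reenter_autonomous_mode(diagnostics: Dict[str, bool]) -> bool:
    """Allow autonomous operation only if every safety-critical check passes."""
    return all(diagnostics.values())

post_crash_diagnostics = {
    "lidar_ok": True,
    "camera_ok": True,
    "brake_actuator_ok": False,   # damaged in the crash
}
print(may_reenter_autonomous_mode(post_crash_diagnostics))  # False -> stay in manual mode
```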
The DOT has abandoned the NHTSA’s levels-of-automation classification in favor of the SAE’s. (Credit: SAE)
10. Federal, State and Local Laws
Traffic laws vary by state, but self-driving cars need to be able to follow all laws that apply to the environment in which they’re operating. There also needs to be a way to update self-driving cars when laws change. Regulators also expect self-driving cars to be able to handle critical situations where human drivers would temporarily violate certain motor vehicle laws for safety reasons, such as crossing double lines on the roadway to travel safely past a broken-down vehicle.
11. Ethical Considerations
This is one of the most debated topics surrounding self-driving cars, and regulators want to ensure ethical decisions are made consciously and intentionally. Algorithms for resolving these situations should be developed transparently using input from Federal and State regulators, drivers, passengers and vulnerable road users, taking into account the consequences of a self-driving car’s actions.
12. Operational Design Domain
Each self-driving car needs to have a clear Operational Design Domain (ODD) that explains the environment(s) in which it is designed to properly operate. The defined ODD should specify road types the self-driving car can handle, ideal geographical areas, speed range, and environmental conditions such as weather and time of day.
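As a purely illustrative sketch, an ODD could be encoded as a simple set of constraints that the vehicle checks before engaging automation. Every field and threshold below is a made-up example, not a DOT requirement.

```python
# Illustrative only: one way an ODD might be encoded and checked at runtime.
from dataclasses import dataclass

@dataclass
class OperationalDesignDomain:
    road_types: set            # e.g. {"divided_highway"}
    regions: set               # geographic areas the system is mapped and tested for
    max_speed_mph: float
    daylight_only: bool
    allowed_weather: set       # e.g. {"clear", "light_rain"}

    def permits(self, road_type: str, region: str, speed_mph: float,
                is_daytime: bool, weather: str) -> bool:
        return (road_type in self.road_types
                and region in self.regions
                and speed_mph <= self.max_speed_mph
                and (is_daytime or not self.daylight_only)
                and weather in self.allowed_weather)

odd = OperationalDesignDomain(
    road_types={"divided_highway"},
    regions={"Pittsburgh_metro"},
    max_speed_mph=65.0,
    daylight_only=True,
    allowed_weather={"clear", "light_rain"},
)
# Snow at night falls outside this ODD, so automation should not engage.
print(odd.permits("divided_highway", "Pittsburgh_metro", 55.0, False, "snow"))
```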
13. Object and Event Detection and Response
Within its ODD, a self-driving car is expected to be able to detect and respond to other vehicles, pedestrians, cyclists, animals, and objects that could affect safe operation.
14. Fall Back (Minimal Risk Condition)
Self-driving cars should be able to transition to a minimal risk condition when a problem is encountered. The cars should be able to detect that their autonomous mode has malfunctioned, is operating in a degraded state, or is operating outside their ODD, and the human driver needs to be made aware so they can take control of the vehicle. There also need to be fall back strategies for cases where the human driver is inattentive, under the influence of alcohol or other substances, drowsy, or otherwise physically impaired.
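Here’s one way such fall back logic could be sketched, assuming a simple three-way decision. The states, checks, and chosen maneuvers are illustrative assumptions only, not anything the guidelines mandate.

```python
# Illustrative only: a sketch of fall-back logic toward a minimal risk condition.
def choose_fallback(system_fault: bool, outside_odd: bool,
                    driver_responsive: bool) -> str:
    """Pick an action when automation can no longer operate normally."""
    if not (system_fault or outside_odd):
        return "continue_automated_driving"
    if driver_responsive:
        # Alert the human and hand over control.
        return "request_driver_takeover"
    # Driver is inattentive or impaired: reach a minimal risk condition on the
    # vehicle's own, e.g. slow down and pull over to a safe stop.
    return "pull_over_and_stop"

print(choose_fallback(system_fault=True, outside_odd=False, driver_responsive=False))
```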
15. Validation Methods
Given that the scope, technology, and capabilities vary widely across different automation functions, manufacturers should develop tests to ensure a high level of safety in the operation of their self-driving cars. Tests should include a combination of simulation, test track, and on-road testing.
The Role of Federal and State Governments
The DOT outlines a Model State Policy that, if adopted, will create a consistent national framework for regulating all levels of automated vehicles and avoid “a patchwork of inconsistent laws and regulations among the 50 States and other U.S. jurisdictions, which could delay the widespread deployment of these potentially lifesaving technologies.”
Here’s how the guidelines break down federal and state responsibilities:
Federal (NHTSA) responsibilities include:
- Setting Federal Motor Vehicle Safety Standards (FMVSS) for new motor vehicles and motor vehicle equipment
- Enforcing compliance with the FMVSS
- Investigating and managing the recall and remedy of non-compliances and safety-related defects on a nationwide basis
- Communicating with and educating the public
- Issuing guidance for vehicle and equipment manufacturers to follow
State governments would be responsible for:
- Licensing (human) drivers and registering motor vehicles in their jurisdictions
- Enacting and enforcing traffic laws and regulations
- Conducting safety inspections, where States choose to do so
- Regulating motor vehicle insurance and liability
In summary, “these general areas of responsibility should remain largely unchanged for HAVs. DOT and the Federal Government are responsible for regulating motor vehicles and motor vehicle equipment, and States are responsible for regulating the human driver and most other aspects of motor vehicle operation.”
Current and Needed Regulatory Rules
The NHTSA has four existing tools – letters of interpretation, exemptions from existing standards, rulemakings to amend existing standards or create new ones, and enforcement authority to address defects that pose an unreasonable risk to safety – and it says it will speed up their use to aid the development of self-driving cars.
The DOT also hinted at regulatory tools that it will explore to better regulate self-driving cars. For example, the system currently allows automakers to self-certify that their technologies meet federal safety standards. The DOT is exploring “pre-market approval” that would require automakers to clear technologies through the DOT before hitting the market.
The DOT is also considering “cease and desist” power that would allow it to tell a manufacturer to immediately stop production if some unanticipated safety concern becomes evident.
President Obama on Self-Driving Cars
President Barack Obama wrote an op-ed for the Pittsburgh Post-Gazette on Monday expressing his support for self-driving cars. The op-ed is reprinted in full below:
Things are a little different today than when I first moved into the White House. Back then, my watch told me the time. Today, it reminds me to exercise. In my first year, I couldn’t take pictures with my phone. Last year, I posted on Instagram from Alaska.
Of course, American innovation is driving bigger changes, too: In the seven-and-a-half years of my presidency, self-driving cars have gone from sci-fi fantasy to an emerging reality with the potential to transform the way we live.
Right now, too many people die on our roads – 35,200 last year alone – with 94 percent of those the result of human error or choice. Automated vehicles have the potential to save tens of thousands of lives each year. And right now, for too many senior citizens and Americans with disabilities, driving isn’t an option. Automated vehicles could change their lives.
Safer, more accessible driving. Less congested, less polluted roads. That’s what harnessing technology for good can look like. But we have to get it right. Americans deserve to know they’ll be safe today even as we develop and deploy the technologies of tomorrow.
That’s why my administration is rolling out new rules of the road for automated vehicles – guidance that the manufacturers developing self-driving cars should follow to keep us safe. And we’re asking them to sign a 15-point safety checklist showing not just the government, but every interested American, how they’re doing it.
We’re also giving guidance to states on how to wisely regulate these new technologies, so that when a self-driving car crosses from Ohio into Pennsylvania, its passengers can be confident that other vehicles will be just as responsibly deployed and just as safe.
Regulation can go too far. Government sometimes gets it wrong when it comes to rapidly changing technologies. That’s why this new policy is flexible and designed to evolve with new advances.
There are always those who argue that government should stay out of free enterprise entirely, but I think most Americans would agree we still need rules to keep our air and water clean, and our food and medicine safe. That’s the general principle here. What’s more, the quickest way to slam the brakes on innovation is for the public to lose confidence in the safety of new technologies.
Both government and industry have a responsibility to make sure that doesn’t happen. And make no mistake: If a self-driving car isn’t safe, we have the authority to pull it off the road. We won’t hesitate to protect the American public’s safety.
Even as we focus on the safety of automated vehicles, we know that this technology, as with any new technology, has the potential to create new jobs and render other jobs obsolete. So it’s critical that we also provide new resources and job training to prepare every American for the good-paying jobs of tomorrow.
We’re determined to help the private sector get this technology right from the start. Because technology isn’t just about the latest gadget or app – it’s about making people’s lives better. That’s going to be the focus of the first-ever White House Frontiers Conference on Oct. 13. And what better place to hold it than Pittsburgh – a city that has harnessed innovation to redefine itself as a center for technology, health care and education.
We’ll explore the future of innovation in America and around the world, focusing on building our capacity in science, technology and innovation, as well as the new technologies, challenges and goals that will shape the next century.
The progress we’ve seen in automated vehicles over the past several years shows what our country is capable of when our engineers and entrepreneurs, our scientists and our students – backed by federal and private investment – pour their best work and brightest ideas toward a big, bold goal. That’s the spirit that has propelled us forward since before the automobile was invented. Now it’s up to us to keep driving toward a better future for everyone.