This post offers a primer on federal legislation, regulation, and policy with respect to automated vehicles (AVs) – that is, motor vehicles equipped with an automated driving system (ADS) – under the Trump and Biden administrations.
Federal Legislation
AVs can be – and in fact are – lawfully deployed and regulated under existing federal statutory law. New legislation may be desired or even desirable, but it is not strictly necessary. Outside of some minor funding provisions, Congress has never passed legislation specific to automated driving. But this means neither that automated driving is illegal nor that automated driving is unregulated.
As background: The National Traffic and Motor Vehicle Safety Act authorizes the US Department of Transportation (USDOT)’s National Highway Traffic Safety Administration (NHTSA) to enact Federal Motor Vehicle Safety Standards (FMVSS) for the performance of new motor vehicles and equipment. Manufacturers must self-certify compliance with these standards.
Regarding deployment: While the Vehicle Safety Act precludes NHTSA from exempting more than 2,500 vehicles per manufacturer per year from the FMVSS, NHTSA itself promulgates these standards – and can therefore change or clarify them in a way that obviates the need for exemptions. (As it has in part.)
Regarding regulation: NHTSA already has statutory authority to enact new safety standards, to conduct investigations, and to pursue recalls. Other modal administrations within USDOT – such as the Federal Motor Carrier Safety Administration (FMCSA) and the Federal Transit Administration (FTA) – also have relevant regulatory authority.
Federal legislation specific to automated driving might nonetheless provide a useful signal to developers and regulators. And many stakeholders do support legislation – provided that it contains what they want and only what they want. Prior federal legislative efforts have accordingly faltered over disputes about federalism, commercial trucking, forced arbitration, regulatory discretion, and industry trustworthiness.
(The automotive and information-technology industries have also faced more pressing legislative and regulatory issues during the Trump and Biden administrations, and this is likely to remain true during President-elect Trump’s second term.)
Senator – and soon-to-be Majority Leader – John Thune has long sought federal legislation that is friendly to automated trucking. He has also continued to support the 60-vote threshold to pass most (though not all) legislation in that chamber.
Roughly every five years, Congress reauthorizes the federal surface transportation programs. The most recent reauthorization is Public Law 117-58 of 2021. Better known as the Bipartisan Infrastructure Law (BIL) or the Infrastructure Investment and Jobs Act (IIJA), it authorizes these and other programs for fiscal years 2022 through 2026. (It also appropriates funding to some of these and other programs, which is as confusing as it sounds.)
Automated driving makes a few minor appearances in BIL and may receive more attention when Congress eventually begins considering BIL’s successor.
Federal Administrative Regulation
USDOT comprises “modal” administrations including (among others) NHTSA, FMCSA, FTA, the Federal Highway Administration (FHWA), and the Federal Aviation Administration (FAA). USDOT has issued a series of AV plans under the Obama, Trump, and Biden administrations.
President-elect Trump intends to nominate former U.S. Representative Sean Duffy to lead USDOT as Transportation Secretary. (Mr. Duffy did not cosponsor the House’s 2017 SELF-DRIVE Act. He apparently supports a bipartisan right-to-repair bill that would have required motor vehicle manufacturers to provide vehicle owners with real-time data relevant to vehicle servicing and repair. As a lobbyist, he had a few clients in transportation and information technology.)
Like the other modal administrations, NHTSA is led by an administrator who is nominated by the President and confirmed by the Senate. At least in theory. In practice, NHTSA was without a Senate-confirmed administrator for the entirety of the first Trump Administration and for all but one year of the Biden Administration.
The centerpiece of NHTSA’s current approach to regulating automated vehicles is its Standing General Order (SGO) requiring some 100 companies to “report to the agency certain crashes involving vehicles equipped with automated driving systems or SAE Level 2 advanced driver assistance systems.” The SGO was first issued in 2021 and updated in 2023.
Project 2025’s chapter on the Department of Transportation criticizes the SGO as a “compulsory and antagonistic approach” to data collection. (More on Project 2025 here.)
NHTSA has also used its broad investigation authorities to scrutinize incidents and allegations involving automated vehicles, including those of Waymo, Cruise, and Zoox. (I use the term “investigation” in a broad sense to encompass a wide range of NHTSA proceedings.)
Automakers are obligated to recall motor vehicles and equipment that, because of design or manufacture, pose an unreasonable risk to safety – even if they complied with the FMVSS. Waymo and Cruise have done so.
Under the first Trump Administration, NHTSA pursued a new rule for an overarching approach to regulating automated vehicles, and FMCSA pursued a new rule for commercial motor vehicles. The Biden Administration has not moved forward with these rules (although FMCSA did eventually seek supplemental information and is at least contemplating a proposal later this year).
NHTSA under then-President Trump also proposed a rule clarifying how FMVSS for occupant protection should apply to vehicles without a human driver, and NHTSA under President Biden ultimately issued a similar rule.
However, it remains unclear whether – and if so how – driverless vehicles lacking mirrors and a few other accoutrements unrelated to occupant protection can be self-certified. Manufacturers have taken several approaches to these questions:
· Google’s Self-Driving Car Project (which would later become Alphabet’s Waymo) asked NHTSA under the Obama Administration to interpret these (and other) provisions of the FMVSS. NHTSA’s answer was mixed – and was partially rescinded under the first Trump Administration.
· After NHTSA committed in 2016 to “ruling on simple HAV-related exemption requests in six months,” several manufacturers indeed submitted exemption requests – albeit complex ones. NHTSA eventually granted Nuro’s request. It sat on requests by GM and Ford until those companies eventually withdrew them – in GM’s case, shortly after its Cruise unit misled the public about a crash.
· Zoox has self-certified its own driverless vehicle as wholly compliant with the FMVSS. (NHTSA appears to be somewhat skeptical of Zoox’s analysis.)
· Waymo seems likely to simply add mirrors to the next generation of its automated vehicles.
If NHTSA chooses to address these questions, it could do so by promulgating a new rule (similar to its occupant-protection rule) or, more quickly, by issuing a new interpretation letter.
Meanwhile, FMCSA is still considering a petition from Waymo and Aurora for an exemption from the requirement that warning devices be placed around (and away from) commercial motor vehicles (CMVs) that are stopped on the side of the road. And the agency may again be asked to address its hours-of-service regulations in the context of increasing automation.
The incoming Trump Administration reportedly plans to seek “a framework for regulating self-driving vehicles.” (However, this linked article incorrectly states that “mass adoption of self-driving cars likely will require a broader act of Congress.”)
USDOT is not the only part of the federal government that plays a role in automated driving. In particular:
· The US National Transportation Safety Board (NTSB) also investigates specific motor vehicle crashes and makes safety recommendations – but it is distinct from USDOT and does not have regulatory authority.
· The US Department of Commerce’s Bureau of Industry and Security is pursuing a rule to “prohibit the sale or import of connected vehicles integrating specific pieces of hardware and software, or those components sold separately, with a sufficient nexus to the People’s Republic of China (PRC) or Russia” – and with a particular focus on automated vehicles.
· The US Department of Justice (DOJ) and the US Securities and Exchange Commission (SEC) have reportedly initiated investigations of Cruise.
Federal Preemption of State Law
Federalism “refers to the division and sharing of power between the national and state governments.” The federal government has limited powers – but those powers do trump state law.
The US Constitution’s Commerce Clause provides the legal basis for the relevant parts of the National Traffic and Motor Vehicle Safety Act – and would likely provide a legal basis for similar legislation specific to automated driving.
In turn, the Vehicle Safety Act expressly provides that “When a motor vehicle safety standard is in effect ..., a State or a political subdivision of a State may prescribe or continue in effect a standard applicable to the same aspect of performance of a motor vehicle or motor vehicle equipment only if the [state] standard is identical to the [federal] standard ....”
This express preemption provision has a few explicit qualifications. States may “enforce a standard that is identical to” an FMVSS. They may insist on a higher level of performance in their own procurement processes. And in litigation under state law, a judge or jury may find that a product is defective even if it complied with every FMVSS – although the US Supreme Court held in a notoriously incongruent pair of cases that NHTSA may in fact preempt such a finding.
The Vehicle Safety Act's preemption also has some implicit qualifications.
States may prescribe standards with respect to aspects of vehicle performance for which there are no FMVSS. And currently, the FMVSS do not explicitly state how an automated driving system must perform – although they do address vehicle aspects that may implicate, relate to, or involve an ADS (such as brakes and lights and occupant protection).
Most importantly, states retain primary authority for ongoing operational safety – at least for noncommercial drivers and driving. (Indeed, the US government has long cited federalism to explain its ambivalence toward certain multilateral treaties on road traffic.) States generally license human drivers. They set rules of the road – though these rules can be influenced by strong federal incentives. They may declare an individual vehicle to be unroadworthy – during registration or a periodic inspection or a traffic stop. They may ticket a driver for not wearing a seatbelt or otherwise operating a vehicle while it is in a dangerous condition.
AV bills that were introduced (and not passed) in previous sessions of Congress directly – if clumsily – attempted to address preemption. But even under the existing statutory framework for motor vehicle safety, automated driving could muddy this picture in two ways.
First, an FMVSS specific to automated driving could arguably preempt some state substantive rules regarding automated vehicles. Whether intentional or incidental, this preemption may be especially relevant to legal developments in the next few years. Any such rulemaking would seem to come within a longstanding executive order directing agencies to consult with state and local officials early in the rulemaking process and to prepare a “federalism summary impact statement.”
Second, and perhaps less imminently relevant, a deficiency in any individual automated vehicle could constitute a defect under the Vehicle Safety Act. Currently, a vehicle that was safe when it was first sold does not become defective just because it is now missing a mirror, has bald tires, is improperly loaded, or is being operated while its occupants are not properly belted. But in the future, an automated driving system could be considered defective if it engages on such a vehicle (just as a human driver would be acting unreasonably by operating it).
Tesla and Elon Musk
It may seem odd that I have yet to mention Tesla. The reason is simple: Tesla does not manufacture, sell, or operate automated vehicles. Right now, there are real AVs carrying real people on real roads, but none of them are Teslas.
The company’s “Autopilot” and so-called “Full Self-Driving” features are functionally classifiable as SAE Level 2 driver assistance. In a dizzying non sequitur, the company warns the human drivers using “Full Self-Driving” to “pay extra attention to the road.”
Nonetheless (and at least for now), Elon Musk is obviously the metaphorical elephant in the metaphorical (and physical) room.
Given this, it is noteworthy that Tesla’s approach to its driver assistance systems has given rise to multiple NHTSA investigations, at least one relevant recall (among the many that automakers undertake), an investigation of that recall, multiple NTSB investigations, and, reportedly, a DOJ criminal investigation.
The overwhelming majority of crash reports under NHTSA’s SGO have been from Tesla (for “Level 2 ADAS-Equipped Vehicles”) – but this is at least in part due to Tesla’s advanced telematics. It is rumored that some of the other companies have adopted more of a head-in-the-sand approach to potentially reportable incidents.
I cannot speak to whether or how the federal roles that President-elect Trump envisions for Mr. Musk implicate federal ethics laws.