Federal Regulations Governing Autonomous Systems in the US

Federal oversight of autonomous systems in the United States operates through a fragmented, sector-specific regulatory architecture in which jurisdiction is determined by application domain rather than by technology type. No single federal statute comprehensively governs autonomous systems; instead, authority is distributed across the Federal Aviation Administration, the National Highway Traffic Safety Administration, the Department of Defense, the Food and Drug Administration, and at least eight other agencies applying existing statutory mandates to autonomous conduct. This page maps that regulatory landscape, identifies the primary legal instruments and enforcement bodies, and establishes the classification boundaries that determine which regulatory authority applies in a given deployment context. The autonomous systems defined reference establishes the technical definitions underlying those classifications.



Definition and scope

Federal regulation of autonomous systems applies to any machine, vehicle, aircraft, maritime vessel, robotic platform, or software-defined system capable of performing tasks with reduced or eliminated continuous human control input. The statutory scope is not defined by a single "autonomous systems" statute but derives from domain-specific enabling legislation: the FAA Reauthorization Act of 2018 for unmanned aircraft, Title 49 of the U.S. Code for surface vehicles, the Federal Food, Drug, and Cosmetic Act for autonomous medical devices, and DoD Directive 3000.09 (first issued 2012, updated 2023) for defense applications.

The scope encompasses both fully autonomous systems — those that select and execute actions without real-time human input — and conditionally autonomous systems that require human confirmation for specific decision classes. Industrial robots operating under fixed programming with no environmental sensing generally fall outside the active regulatory focus of autonomous systems frameworks, though they remain subject to OSHA standards under 29 CFR Part 1910.

The regulatory perimeter is further shaped by the levels of autonomy classification framework, which distinguishes six functional levels for ground vehicles (SAE International J3016) and multiple categories for unmanned aircraft and defense systems.
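The SAE J3016 ground-vehicle levels referenced above can be sketched as a simple enumeration. This is an illustrative sketch only: the level names paraphrase SAE International's published taxonomy, and the helper function is a hypothetical convenience, not part of any regulation.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (a voluntary industry taxonomy)."""
    NO_AUTOMATION = 0           # human performs the entire driving task
    DRIVER_ASSISTANCE = 1       # one assist feature: steering OR speed control
    PARTIAL_AUTOMATION = 2      # combined assist; human monitors continuously
    CONDITIONAL_AUTOMATION = 3  # system drives; human must take over on request
    HIGH_AUTOMATION = 4         # no human fallback within a defined domain
    FULL_AUTOMATION = 5         # no human fallback under any conditions

def requires_human_fallback(level: SAELevel) -> bool:
    # Levels 0-3 keep a human in the takeover loop; Levels 4-5 do not.
    return level <= SAELevel.CONDITIONAL_AUTOMATION
```

The boundary between Levels 3 and 4 is the one most regulatory discussion turns on, since it marks where the human leaves the control loop.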


Core mechanics or structure

The federal regulatory structure for autonomous systems operates through four primary mechanisms: rulemaking, voluntary guidance, certification and approval processes, and inter-agency coordination frameworks.

Rulemaking produces binding regulations published in the Code of Federal Regulations. The FAA's Part 107 (14 CFR Part 107), finalized in 2016, established the foundational operating rules for small unmanned aircraft systems (sUAS) under 55 pounds. The FAA Reauthorization Act of 2024 extended and expanded those provisions. NHTSA administers Federal Motor Vehicle Safety Standards (FMVSS) under 49 CFR, though no FMVSS-equivalent has been finalized specifically for autonomous vehicles at the federal level as of the mid-2020s.

Voluntary guidance plays a structurally significant role. NHTSA's Automated Driving Systems guidance (AV 3.0, 2018 and AV 4.0, 2020) is non-binding but operationally consequential — manufacturers reference it in safety cases submitted to NHTSA. The National Institute of Standards and Technology (NIST) contributes the AI Risk Management Framework (AI RMF 1.0), released January 2023, which is incorporated by reference in federal procurement policy.

Certification and approval applies where autonomous systems interface with safety-critical infrastructure. The FDA's Digital Health Center of Excellence oversees Software as a Medical Device (SaMD) under 21 CFR Part 880, including AI-driven diagnostic systems. The FAA's type certification process under 14 CFR Part 21 governs autonomous aircraft above sUAS classification.

Inter-agency coordination is institutionalized through the National Science and Technology Council's Select Committee on Artificial Intelligence and the Cross-Agency Priority Goal on Trustworthy AI, which coordinates policy alignment across 17 participating agencies.

The autonomous systems technology stack provides the technical substrate context against which these regulatory mechanisms operate.


Causal relationships or drivers

Four structural forces shaped the current fragmented federal framework.

Technology outpaced legislative cycles. Commercial drone deployment reached scale before Congress passed drone-specific statutory authority. The FAA Modernization and Reform Act of 2012 directed FAA to integrate UAS into national airspace by 2015 — a deadline the agency missed by years due to the complexity of the task, ultimately producing Part 107 in 2016.

Sector-specific statutory authority constrained agency action. NHTSA's authority derives from the National Traffic and Motor Vehicle Safety Act, which was written for human-operated vehicles. Without new legislation redefining "driver" and "operator," NHTSA has operated primarily through guidance rather than binding rulemaking for autonomous vehicles.

Defense requirements accelerated autonomous weapons policy. DoD Directive 3000.09's 2023 update responded directly to the deployment of increasingly capable lethal autonomous weapon systems, requiring that such systems allow "appropriate levels of human judgment over the use of force" — a standard driven by operational incidents and international law obligations rather than domestic commercial pressure.

State-level action created pressure for federal preemption. By 2023, more than 40 states had enacted autonomous vehicle legislation (National Conference of State Legislatures, Autonomous Vehicles database), creating a patchwork that industry stakeholders lobbied Congress to preempt with federal standards.

The autonomous vehicle regulatory landscape details the state-federal jurisdictional tensions specific to surface vehicles.


Classification boundaries

Regulatory jurisdiction follows application domain, and the boundaries between domains determine which agency holds primary authority.

Aviation domain: FAA jurisdiction attaches when an autonomous system operates as an aircraft. Small UAS under 55 lbs operating commercially fall under Part 107, which by default confines operations to 400 feet AGL and visual line of sight. Autonomous aircraft at or above the sUAS weight threshold require airworthiness certification under 14 CFR Part 21. Urban Air Mobility (UAM) vehicles — including eVTOL platforms — are being processed under the Special Class Airworthiness provisions of 14 CFR §21.17(b). The FAA drone regulations reference maps the full UAS classification hierarchy.

Surface transportation domain: NHTSA jurisdiction applies to motor vehicles on public roads under 49 U.S.C. §30101 et seq. The SAE J3016 six-level taxonomy (Levels 0–5) is referenced in NHTSA guidance but is not itself codified as federal law. Commercial trucking automation is additionally subject to Federal Motor Carrier Safety Administration (FMCSA) rules under 49 CFR Parts 380–399.

Defense domain: DoD systems operate under Directive 3000.09, which classifies systems as autonomous weapon systems, semi-autonomous weapon systems, or human-supervised autonomous weapon systems. The directive requires Under Secretary of Defense-level approval for systems outside established parameters.

Healthcare domain: Autonomous systems performing diagnostic, therapeutic, or monitoring functions are regulated by FDA as medical devices under 21 CFR. The FDA's 2021 action plan for AI/ML-based SaMD establishes a predetermined change control plan framework that allows iterative updates without full re-submission.

Maritime and subsurface domain: The U.S. Coast Guard (USCG) regulates unmanned surface and underwater vehicles operating in U.S. navigable waters under 33 CFR. No comprehensive federal operating rule for unmanned maritime vehicles (UMVs) equivalent to Part 107 existed as of the mid-2020s.

Industrial and workplace domain: OSHA governs robotic and automated systems in workplaces under the General Duty Clause and 29 CFR Part 1910 Subpart O (Machinery and Machine Guarding). NIOSH produces supplemental safety guidance. The industrial robotics automation services sector operates primarily within this framework.
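The domain-to-agency boundaries above can be summarized as a lookup table. This is an illustrative sketch, not a compliance tool: the domain keys and the notion of a single "primary" agency are simplifications of the overlapping jurisdictions this section describes.

```python
# Primary-agency lookup for the six domains described above.
# Illustrative only: real deployments can trigger several agencies at once.
PRIMARY_AGENCY = {
    "aviation":          ("FAA",   "14 CFR Parts 21 / 107"),
    "surface_transport": ("NHTSA", "49 CFR (FMVSS)"),
    "defense":           ("DoD",   "DoD Directive 3000.09"),
    "healthcare":        ("FDA",   "21 CFR medical device regulations"),
    "maritime":          ("USCG",  "33 CFR"),
    "industrial":        ("OSHA",  "29 CFR Part 1910"),
}

def primary_authority(domain: str) -> str:
    """Return the primary agency and instrument for a deployment domain."""
    agency, instrument = PRIMARY_AGENCY[domain]
    return f"{agency} under {instrument}"
```

A dual-use platform (an autonomous vessel adapted for defense collection, for example) would match two keys at once, which is exactly the boundary problem discussed under "Tradeoffs and tensions."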



Tradeoffs and tensions

Prescriptive rules versus performance standards. Aviation regulators have historically used prescriptive technical standards (specific avionics requirements, structural load limits). Autonomous system capabilities — which depend on trained models, not fixed hardware — resist prescriptive specification. The resulting tension has pushed FAA and FDA toward means of compliance and performance-based standards, which create interpretive flexibility but reduce legal certainty.

Federal preemption versus state innovation. Seventeen states have passed autonomous vehicle testing frameworks that exceed or contradict federal guidance. Federal preemption would produce uniformity but would eliminate state-level experimentation that has generated operational data. The SELF DRIVE Act (H.R. 3388, 115th Congress, later reintroduced) proposed federal preemption of AV safety standards; it passed the House but was never enacted.

Safety certification timelines versus deployment velocity. FAA type certification for a novel aircraft category historically takes 8–12 years. UAM manufacturers seeking commercial certification by the mid-2020s are operating on compressed timelines that regulators have accommodated through Special Conditions and exemptions, generating industry concern about precedent effects on safety culture.

Liability allocation under automation. Traditional tort law allocates liability to the operator or driver. When a Level 4 autonomous vehicle causes harm with no human in the control loop, product liability shifts toward manufacturers — a framework that autonomous systems liability insurance practitioners navigate through evolving policy structures.

Dual-use systems. Commercial autonomous platforms (drones, autonomous vessels) are frequently adapted for defense or intelligence collection applications, placing them simultaneously under FAA/USCG civil authority and DoD/IC policy frameworks. The autonomous systems in defense sector encounters this boundary directly.


Common misconceptions

Misconception: A single federal agency regulates all autonomous systems.
No such agency exists. Jurisdiction is domain-specific. An autonomous drone delivering medical supplies crosses FAA (airspace), FDA (medical device cargo handling in some interpretations), and potentially FMCSA (ground vehicle component) jurisdiction depending on the operational profile.

Misconception: SAE Level 5 autonomy is a legal classification.
SAE J3016 is a voluntary industry taxonomy published by SAE International, not a statutory or regulatory classification. NHTSA references it in guidance documents, but it carries no direct legal force. Regulatory obligations attach to specific operational characteristics, not SAE level designations.

Misconception: Part 107 covers all commercial drone operations.
Part 107 covers small UAS under 55 lbs operating under visual line of sight (VLOS) in specific airspace classes. Beyond Visual Line of Sight (BVLOS) operations require separate FAA waivers under 14 CFR §107.200. Operations over people and moving vehicles were restricted until the FAA's 2021 final rule (86 FR 4314) created four operational categories with distinct equipment and remote ID requirements.
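The Part 107 boundaries in this passage can be expressed as a small screening function. A hedged sketch: the field names and the decision logic are illustrative simplifications of the 55-pound limit, the VLOS default, and the §107.200 waiver pathway, not a legal determination.

```python
from dataclasses import dataclass

@dataclass
class DroneOperation:
    weight_lbs: float   # takeoff weight including payload
    vlos: bool          # visual line of sight maintained
    bvlos_waiver: bool  # FAA waiver granted under 14 CFR 107.200

def part_107_screen(op: DroneOperation) -> str:
    """Rough first-pass screen of Part 107 applicability; not legal advice."""
    if op.weight_lbs >= 55:
        return "outside Part 107 - airworthiness certification path (14 CFR Part 21)"
    if op.vlos:
        return "within baseline Part 107 operating rules"
    if op.bvlos_waiver:
        return "Part 107 with BVLOS waiver"
    return "BVLOS without waiver - not authorized under Part 107"
```

A real screen would also branch on the 2021 operations-over-people categories and airspace class, which this sketch omits.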

Misconception: Voluntary NHTSA guidance has no practical compliance consequence.
While AV guidance documents are non-binding, NHTSA's Standing General Order 2021-01 requires manufacturers and operators of SAE Level 2 and above automated driving systems to report crashes within specified timeframes — a binding reporting requirement issued under existing statutory authority, demonstrating that enforcement tools coexist with non-binding policy frameworks.

Misconception: Federal regulation of autonomous systems is static.
The regulatory environment is structurally dynamic. Executive Order 13960 (2020) on federal AI use was followed by the Biden administration's AI Executive Order (EO 14110, October 2023), which directed NIST, DHS, DOE, and other agencies to develop sector-specific AI safety standards and testing frameworks within 90-to-270-day reporting windows.


Checklist or steps

The following sequence describes the federal regulatory touchpoint identification process for an autonomous system deployment — presented as a process map, not as legal or compliance advice.

Step 1: Determine the operational domain.
Identify whether the system operates in airspace, on public roads, in a workplace, in healthcare settings, on navigable waters, or in a defense context. Each domain activates a distinct primary regulatory body.

Step 2: Identify the applicable enabling statute.
Map the domain to its governing statute: FAA Reauthorization Act (aviation), 49 U.S.C. §30101 (surface vehicles), Federal Food, Drug, and Cosmetic Act (healthcare devices), 10 U.S.C. (defense), or 33 U.S.C. (maritime).

Step 3: Locate the implementing regulations in the CFR.
Identify the specific CFR parts governing the system: 14 CFR Part 107 (small UAS), 14 CFR Part 21 (aircraft certification), 49 CFR FMVSS (vehicles), 21 CFR Part 880 (medical devices), 29 CFR Part 1910 (workplace robots).

Step 4: Identify applicable voluntary guidance documents.
Locate NHTSA AV guidance versions, FDA SaMD action plan provisions, NIST AI RMF 1.0, and any agency-specific AI ethics principles that govern procurement or deployment in the target sector.

Step 5: Check for waiver, exemption, or special condition pathways.
Determine whether standard certification or operating rules require waivers (FAA §107.200 for BVLOS) or special conditions (FAA §21.17(b) for novel aircraft categories).

Step 6: Identify state and local regulatory layers.
Cross-reference the federal framework against state autonomous vehicle laws, state drone preemption statutes, and local airspace restrictions (such as FAA-designated Temporary Flight Restrictions and UAS Facility Maps).

Step 7: Assess reporting and incident disclosure obligations.
Determine whether NHTSA Standing General Order 2021-01, FAA accident reporting rules under 49 CFR Part 830, or FDA MDR requirements (21 CFR Part 803) apply to the operational profile.

Step 8: Confirm cybersecurity framework applicability.
Check whether the system is classified as critical infrastructure under the Cybersecurity and Infrastructure Security Agency (CISA) framework, which activates NIST Cybersecurity Framework (CSF 2.0) alignment expectations. The cybersecurity for autonomous systems reference provides the applicable technical control mapping.
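Steps 1 through 8 above can be sketched as an ordered pipeline that accumulates regulatory touchpoints for a deployment profile. Everything here is illustrative: the profile fields and rules are a simplified, hypothetical encoding of the checklist, not an exhaustive or authoritative mapping.

```python
def regulatory_touchpoints(profile: dict) -> list[str]:
    """Walk the checklist steps and collect applicable federal touchpoints.

    `profile` keys (illustrative): domain, bvlos, on_public_roads,
    critical_infrastructure.
    """
    cfr_by_domain = {                       # Steps 1-3: domain -> CFR location
        "aviation":   "14 CFR Part 107 / Part 21",
        "surface":    "49 CFR FMVSS",
        "healthcare": "21 CFR (medical devices)",
        "workplace":  "29 CFR Part 1910",
        "maritime":   "33 CFR",
    }
    touchpoints = [cfr_by_domain[profile["domain"]]]
    if profile["domain"] == "aviation" and profile.get("bvlos"):
        touchpoints.append("BVLOS waiver (14 CFR 107.200)")         # Step 5
    if profile.get("on_public_roads"):
        touchpoints.append("state AV statutes")                     # Step 6
        touchpoints.append("NHTSA Standing General Order 2021-01")  # Step 7
    if profile.get("critical_infrastructure"):
        touchpoints.append("NIST CSF 2.0 alignment")                # Step 8
    return touchpoints
```

Steps 4 (voluntary guidance) and the remainder of Step 7 (FAA and FDA incident reporting) would extend the same pattern with additional profile fields.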

The broader autonomous systems regulatory landscape — including procurement structures, government contract pathways, and industry certification services — is indexed at Autonomous Systems Authority.


Reference table or matrix

| Domain | Primary Agency | Key Legal Instrument | CFR Location | Binding / Voluntary |
|---|---|---|---|---|
| Unmanned Aircraft (sUAS <55 lbs) | FAA | FAA Reauthorization Act 2018; Part 107 | 14 CFR Part 107 | Binding |
| Unmanned Aircraft (>55 lbs / UAM) | FAA | 14 CFR Part 21 Special Conditions | 14 CFR Part 21 | Binding |
| Autonomous Ground Vehicles | NHTSA | 49 U.S.C. §30101; AV 4.0 Guidance | 49 CFR FMVSS | Mixed (guidance voluntary) |
| Commercial Trucking Automation | FMCSA | 49 U.S.C. §31136 | 49 CFR Parts 380–399 | Binding |
| AI/Autonomous Medical Devices | FDA | FD&C Act; SaMD Action Plan | 21 CFR Part 880 | Binding + Guidance |
| Defense Autonomous Weapons | DoD (OSD) | DoD Directive 3000.09 (2023) | N/A (DoD policy) | DoD policy (binding for DoD) |
| Workplace Robots / Automation | OSHA | OSH Act §5(a)(1); General Duty Clause | 29 CFR Part 1910 | Binding |
| Maritime UMV / Autonomous Vessels | USCG | 33 U.S.C.; 33 CFR | 33 CFR | Partial (evolving) |
| Federal AI Procurement (all) | NIST / OMB | EO 14110 (2023); NIST AI RMF 1.0 | N/A (policy/guidance) | Voluntary (procurement ref.) |
| IC AI Systems | DNI / IC elements | IC AI Ethics Principles (2020) | N/A (IC policy) | IC policy |