As self-driving car technology advances, people might be tempted to have a few drinks and hope their cars get them home safely. But using a self-driving car while under the influence of drugs or alcohol most likely violates the DUI laws in your state.
Currently, there aren't any fully autonomous self-driving cars that the public can buy. Today's so-called self-driving cars—like Teslas—can assist people with many driving functions. But these cars still need a human driver to be alert, awake, and in control of the vehicle. Because the operators of self-driving cars must remain in control, they generally can be convicted of a DUI if they're under the influence of drugs or alcohol.
Before we talk about DUIs in self-driving cars, we need to understand what a self-driving car is. Many cars on the road today have at least some automated driving features designed to make the operator's job easier and safer.
The National Highway Traffic Safety Administration (NHTSA) has adopted six levels of driving automation, numbered 0 through 5. Levels 0 through 2 have little to no self-driving technology. A level 2 vehicle provides continuous assistance with acceleration, braking, and steering, but the driver must stay fully engaged and attentive for safe operation. For the time being, nearly everyone in the United States is driving a vehicle with level 2 automation or below.
A level 3 self-driving vehicle can do most of the driving on its own but still needs a human operator to be attentive and ready to take control. Mercedes-Benz is currently the only manufacturer approved to sell level 3 vehicles in the United States—and only in California and Nevada for now. Even Tesla's driver-assistance systems are classified as only level 2.
A level 4 self-driving vehicle operates with no human assistance, but only within a limited service area. A level 5 self-driving vehicle is fully autonomous and can operate anywhere, under any conditions, without a human driver. People in level 4 and 5 vehicles are merely passengers and don't need to do anything. However, no level 4 or 5 vehicles are currently available for the public to buy.
Now that you know the different types of self-driving cars, you can see that everything available on the market requires a human operator. If that human operator is under the influence of alcohol or drugs, they likely will get a DUI conviction if caught by police.
Every state defines a DUI in essentially the same way. To prove a DUI in court, a prosecutor generally must show the person was:

- driving, operating, or in actual physical control of a vehicle, and
- under the influence of drugs or alcohol.
So the question is whether someone operating a self-driving car falls under this definition.
Of course, the term "under the influence" means the same thing regardless of the type of vehicle. However, self-driving technology raises questions about what it means to be driving, operating, or in physical control of a vehicle.
In most states, you don't have to be actually driving to get a DUI—operating or being in actual physical control of a vehicle is enough. In states that use the "actual physical control" standard, just being intoxicated in a parked car with the electrical system (for example, the radio) or the engine turned on is usually sufficient for a DUI conviction.
So, it's pretty clear that using a self-driving car at level 3 or below counts as operation or physical control. But what about states that require putting the vehicle in motion (actual driving)?
Even a level 3 self-driving vehicle requires a person to turn it on and take some action—like putting the car in drive, pressing the accelerator, or turning the wheel—to get the vehicle in motion. Plus, the operator needs to stay alert and monitor the vehicle as it drives. Because of this level of involvement, a person who's under the influence could probably get a DUI for operating a level 3 vehicle.
Waymo, an automated taxi service available in a few cities, currently operates at level 4 autonomy. In other words, a Waymo vehicle can pick you up and drive you to a destination within its limited service area without a human operator in the vehicle. (Some Waymo vehicles still have a human operator present to monitor the car's performance.) Waymo's rules prohibit drug and alcohol use inside its vehicles, but they don't bar passengers who are already intoxicated.
But can a drunk passenger who tells the Waymo vehicle where to go be considered the operator and consequently exposed to DUI charges?
If there's a Waymo employee in the vehicle's driver's seat, you should be fine. But if there's no employee in the vehicle, it's less certain. It's at least plausible that programming a driverless car to take you somewhere would count as operation, physical control, or driving in some states. And if the vehicle has controls (such as a steering wheel)—even if you don't touch them—it might be more likely that you'd be on the hook for a DUI.
To be on the safe side, it's probably best to avoid riding in an automated taxi while intoxicated unless the vehicle has a human operator or it's clear under your state's laws that doing so is lawful.
When technology advances, it often takes the law months or years to catch up. States currently don't have DUI laws on the books that make exceptions for level 4 or level 5 autonomous vehicles. Until these vehicles are more common, it's unlikely that states will rush to create new laws.
For the time being, courts will have to use current laws to decide whether someone using a self-driving car is an operator or merely a passenger. Under the current laws, courts will likely make this determination by looking at factors such as the level of autonomy of the vehicle, whether the vehicle has a steering wheel and gas pedal, and where in the vehicle the person was sitting.