
Where Are Our Self-Driving Cars?

2021 Tesla Model Y

Tesla made headlines with the beta launch of its Full Self-Driving system in 2020. It came with a disclaimer saying, "It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road." More recently, the California Department of Motor Vehicles said it was "revisiting" its opinion that Tesla's test program does not fall under the department's autonomous vehicle regulations because it requires a human driver, according to the Los Angeles Times.

Tesla's system has impressive capabilities, but despite what the name suggests, it's definitely not hands-free driving. For nearly a decade, automakers and tech companies kept saying that autonomous vehicles were just a few years away. Yet here we are in 2022 and a true self-driving vehicle is still many years away. Right now, there is no car on sale that can drive itself without requiring the driver to pay attention to the road and be prepared to take control of the vehicle. In fact, some automakers have slowed down their timelines.

With a more realistic eye toward the future, here are three reasons why you can't buy a self-driving car today — and one place you're likely to find them first.

1. We've yet to define how safe is "safe enough"

It's difficult to teach a machine to react correctly to the new or unpredictable situations we frequently encounter while driving. An enormous amount of engineering effort has gone into cracking this problem. But how do we determine when a vehicle is safe enough?

In order to be 95% certain that autonomous vehicles match the safety of human drivers, the cars would need to log 275 million failure-free autonomous miles, according to a 2016 report from Rand Corp. And to prove that autonomous vehicles are even just 10% or 20% safer than humans, the requirement jumps to billions of miles. Since 2009, autonomous tech company Waymo's vehicles have driven over 20 million miles.
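One way to see where a figure like that comes from: assuming fatal failures are rare, independent events (a Poisson model) and taking the roughly 1.09 fatalities per 100 million vehicle miles that U.S. human drivers averaged around 2013 as the benchmark, a standard zero-failure confidence bound reproduces the 275 million-mile requirement. The Rand analysis has more nuance; this is only a back-of-the-envelope sketch.

```latex
% Zero-failure demonstration: to be 95% confident that the autonomous failure
% rate is no worse than the human benchmark \lambda_0, the probability of
% logging N miles with zero failures must be at most 5% if the true rate
% really were \lambda_0.
P(\text{zero failures in } N \text{ miles}) = e^{-\lambda_0 N} \le 0.05
\quad\Longrightarrow\quad
N \ge \frac{\ln 20}{\lambda_0} \approx \frac{3}{\lambda_0}

% With \lambda_0 \approx 1.09 \times 10^{-8} failures per mile:
N \ge \frac{3}{1.09 \times 10^{-8}} \approx 2.75 \times 10^{8}
\text{ miles} = 275 \text{ million miles}
```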

Either manufacturers must spend potentially decades testing small fleets, or the public will end up taking part in the testing. The latter is acceptable only if infrastructure is in place to ensure the safety of drivers and pedestrians. Tesla has decided that testing on public roads is a risk it's willing to take. Critics argue, however, that this puts the public at risk.

2. City and state infrastructure isn't in place

Expecting individual autonomous vehicles to operate independently is a recipe for disaster. Each vehicle would have to determine what all the others are doing and predict what they are about to do. Each would rely only on its own limited view of the world, with sensors and cameras that can fail or be obstructed by poor weather or road debris.

Enabling vehicles to communicate with one another reduces the possibility of unpleasant surprises and lets them make communal decisions about speed and safety. Some cars already support this kind of vehicle-to-vehicle communication, but there are no rules in place to guarantee that cars from different manufacturers will be able to talk to one another.

Infrastructure built specifically for autonomous vehicles, such as smart traffic lights and camera systems, could alert vehicles to pedestrians, cyclists and dangerous road conditions and help prevent accidents. Unfortunately, it has yet to be determined who would pay for the necessary upgrades and whether Americans would accept more surveillance on their roads.

3. It isn't clear who is liable when an accident occurs

As long as self-driving features require the driver to be ready to take control, the driver will remain liable for any accidents. Car manufacturers are only liable if there's a fault in their vehicle. But what happens in the future if an autonomous vehicle without a driver causes an accident? Is the manufacturer liable because it designed the system that's at fault?

Some states are trying to address the question. Florida passed a law saying that the person who initiates a trip in an autonomous vehicle is considered the operator; while the law doesn't explicitly establish liability, it lays a foundation for how liability may be addressed. But the process is piecemeal, and so far existing laws haven't faced serious challenges in court.

Until there are laws in place to protect them, it's unlikely automakers will accept the risk of allowing drivers to use self-driving features without requiring them to remain ready to take control of the vehicle.

Which cars will be the first to actually self-drive?

In the near future, the first autonomous vehicles will likely be taxis and cargo trucks. Both industries have remained bullish on autonomy for several reasons.

First, fleet companies can reduce overall employee counts without totally eliminating the human component. Waymo already has a system in which human supervisors can intervene to course-correct an autonomous vehicle when it encounters a situation it can't resolve on its own.

Second, taxi and trucking companies already accept liability for their drivers' mistakes. The money they save on payroll should offset the liability risk if they believe autonomous systems are about as safe as the humans they replace.

Edmunds says

You probably won't be able to buy an autonomous car anytime soon. But expect autonomous fleet services to begin expanding in the near future.

