Two recent incidents involving Tesla autonomous vehicles clarified two things: (A) self-driving cars are still a work in progress, and (B) owners of self-driving cars are still liable for accidents.
In one case, an intoxicated driver who passed out behind the wheel tried to blame his car. He was still arrested for drunk driving. And no, the California Highway Patrol wryly added, the vehicle did not drive itself to the impound lot.
Can self-driving cars be designated drivers?
As more and more autonomous vehicles join us on the highways (including some test trucks), it is becoming apparent that “robot cars” are not foolproof. They may be statistically safer than human drivers, but there are still bugs to work out.
In the first incident, a man was found passed out in his Tesla on the San Francisco Bay Bridge. The occupant, whose blood alcohol was twice the legal limit, told the California Highway Patrol that he was not driving because his car was on autopilot. That did not absolve him. He was arrested and charged with drunk driving.
Tesla (which can track such things) did not verify whether the car’s autopilot was engaged. But the company reiterated that its cars are meant to assist drivers, not replace them. The autopilot feature is meant to be activated only with a fully functioning driver behind the wheel, Tesla said in a statement.
Truly autonomous cars may be the future, perhaps the near future, but we’re not there yet. For now, an intoxicated person is still breaking the law by letting the autopilot take over the controls and still criminally (and civilly) liable for an alcohol-related crash resulting in property damage, injury or death.
What if your vehicle crashes in autopilot mode?
In the second incident, also in California, a Tesla on autopilot slammed into the back of a fire truck stopped on the freeway. Although the car was estimated to be traveling 65 mph at impact, remarkably no one was injured. The National Transportation Safety Board is investigating both the driver and the vehicle.
Tesla presumes a “fully attentive” driver in the cockpit while the autopilot is on. That doesn’t necessarily let Tesla off the hook if the company is sued, as it inevitably will be. Litigation may get very complicated, as owners of autonomous cars, automakers like Tesla, and technology subcontractors who provide components or software point fingers at each other – or at the human operator. Lawsuits involving autonomous cars may involve both product liability and driver negligence claims.
Researchers have already identified a disturbing trend: assistive technologies, such as back-up cameras and crash-avoidance radar, are making human drivers dangerously complacent. It’s hard to concentrate when the car is doing most of the work. It’s tempting to give in to smartphones and other distractions.
One day fully autonomous cars will safely take us where we want to go, like taxis. But the present era of semi-autonomous cars may be the riskiest and most legally muddled.