There Was a Self-Driving Car Fatality in 2016. What Happened?
Humans have always been dangerous behind the wheels of automobiles. That was true sixty, seventy, eighty years ago. Old news and ancient tidings. But there’s a new knot added to the rope: automated vehicles are complicating the deadly business of driving.
According to a report released by the National Transportation Safety Board on September 12, a man died in the 2016 crash of a semi-autonomous Tesla Model S sedan, in part because he relied too heavily on the vehicle's self-driving abilities. The death is reported as the first fatality involving an automobile under semi-autonomous control. In other words, it's the first traffic death a self-driving system played a part in.
As the Chicago Tribune notes:
An inattentive driver’s over reliance on his Tesla Model S sedan’s semi-autonomous driving system and a truck driver who made a left-hand turn in front of the car are both to blame for a fatal crash last year, the National Transportation Safety Board said Tuesday. … Tech company owner Joshua Brown, 40, of Canton, Ohio, was traveling on a divided highway near Gainesville, Florida, using the Tesla’s automated driving systems when he was killed.
The NTSB has cautioned automakers of every stripe to exercise greater care in the manufacture of these vehicles. Safeguards are necessary, the government has warned, to make sure operators don't use self-driving systems outside the conditions they were designed for. Autonomous vehicles, or AVs, have attracted a sizable buzz in Motordom for several years, and the interest is reaching fever pitch.
But the traffic death of Mr. Brown raises uncomfortable questions about this new technology. The vehicle-maker, the celebrated Tesla, claims its semi-autonomous system is made to be used mainly on interstates. But as the Tribune pointed out, Tesla declined to build in protections against using the system on other kinds of roads. There have been software upgrades, but Tesla has not added the appropriate restraints. Government urging, it is hoped, will drive the automakers to deliver better driver assistance. AVs cannot be a crutch for what humans fail to do, nor for the companies that humans own.
“System safeguards were lacking,” NTSB Chairman Robert Sumwalt said. “Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention.”
What do these safeguards mean? What can they ensure, when the Tesla Autopilot system is allegedly able to achieve ninety miles per hour on its own say-so? The NTSB has urged the automakers to "monitor driver attention in ways other than through detecting steering-wheel engagement." Touching the wheel alone is hardly a replacement for driver awareness; if attention has drifted, there is not much time for the driver to react. Tesla has admitted that its system requires pilot alertness. Does the company bring the same alertness to its own policies?
Per The Verge:
“Tesla allowed the driver to use the system outside of the environment for which it was designed, and the system gave far more leeway to the driver to divert his attention to something other than driving,” Board member Christopher Hart said. “The result was a collision that, frankly, should have never happened.”
Our journey to the future is no certain trip. Now more than ever, if AVs are our future, we must ask who will be in the driver's seat. Tesla, like its vehicles, must do better.