
The First Self-Driving Car Fatality Has Occurred—and It Sounds Grisly

When Tesla first released its Autopilot feature, skeptics figured it was just a matter of time before it got someone killed.

Plenty of Autopilot “saves” (where the software correctly detected a threat and acted to prevent an accident) can be found on YouTube. But a video clip that you’re not going to find is of the one poor soul whose Model S drove underneath a semi-trailer in Florida, apparently at highway speed. The Tesla intersected the trailer at a right angle, and the bottom of the trailer was level with the windshield. You can do the math. The lone driver—or is that “passenger”—did not survive.

While the crash is all over the news this morning, it did not happen this week, nor even this month; the fatal crash occurred on May 7th. However, it was only yesterday that Tesla released a statement. If you’re wondering “Why the delay?”, the likely answer is that the police and ambulance crews responding to the crash had no way of knowing what mode the car was in. Tesla, however, followed policy and alerted the National Highway Traffic Safety Administration after the wreck, and now that the agency is opening an investigation, the situation has come to a head.

Tesla’s statement opens with numbers arguing that Autopilot is still, statistically speaking, safer than taking your own chances.

We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.
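To put those figures on a common scale, here’s a quick back-of-the-envelope conversion into fatalities per 100 million miles, using only the numbers quoted above (a single Autopilot fatality makes its rate a rough estimate at best):

```python
# Back-of-the-envelope: convert Tesla's "one fatality every N miles"
# figures into fatalities per 100 million miles for easy comparison.
miles_per_fatality = {
    "Autopilot engaged": 130_000_000,  # "just over 130 million miles"
    "All US vehicles":    94_000_000,  # "a fatality every 94 million miles"
    "Worldwide":          60_000_000,  # "approximately every 60 million miles"
}

for label, miles in miles_per_fatality.items():
    rate = 100_000_000 / miles  # fatalities per 100 million miles
    print(f"{label:18s} ~{rate:.2f} fatalities per 100M miles")
```

That works out to roughly 0.77 fatalities per 100 million Autopilot miles, versus 1.06 for US driving overall and 1.67 worldwide, which is the comparison Tesla is leaning on.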

Here are the known details of the crash:

What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

Tesla then points out that Autopilot is still technically in beta, and spells out the intended role of the driver within the system:

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.
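Tesla doesn’t publish the logic behind those checks, but the escalation it describes (periodic hands-on checks, visual and audible alerts, then a gradual slow-down until hands return) maps onto a simple loop. Here’s a purely illustrative sketch, with hypothetical check counts and states, not Tesla’s actual implementation:

```python
# Illustrative escalation logic for a hands-on-wheel monitor.
# The thresholds below are hypothetical, not Tesla's.

VISUAL_ALERT_AFTER = 3   # consecutive hands-off checks before a visual alert
AUDIBLE_ALERT_AFTER = 6  # consecutive hands-off checks before alert + slowing

def monitor(hands_on_readings):
    """Walk through periodic hands-on-wheel checks and escalate warnings."""
    missed = 0
    for hands_on in hands_on_readings:
        if hands_on:
            missed = 0  # hands detected: reset escalation
            yield "cruising"
        else:
            missed += 1
            if missed >= AUDIBLE_ALERT_AFTER:
                yield "audible alert, gradually slowing"
            elif missed >= VISUAL_ALERT_AFTER:
                yield "visual alert"
            else:
                yield "cruising"

# Demo: driver goes hands-off for eight checks, then grabs the wheel again.
readings = [True, True] + [False] * 8 + [True]
for state in monitor(readings):
    print(state)
```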

Interestingly, the crash victim was apparently known to Tesla and a proponent of electric vehicles:

The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends.

The incident will undoubtedly spark debate about self-driving cars, and about how to weigh Autopilot’s utility against its risks. By now you’ve probably seen the video circulating of a random Tesla driver sleeping behind the wheel in traffic.

How could he possibly have reacted if the car had been traveling at a higher speed and something had gone wrong? Human attention isn’t a light switch; it can take a moment or two to come online. Is Autopilot a good idea when accidents can occur in the blink of an eye?

Lastly, can you take comfort in the numbers showing that there’s a greater chance of being killed without Autopilot? Or is the thought of being killed due to machine error rather than your own too frightening for numbers to sway you?


http://www.core77.com/posts/54362/The-First-Self-Driving-Car-Fatality-Has-Occurred%E2%80%94and-It-Sounds-Grisly