And Elon Musk isn't happy with the way it's being covered
Another Tesla Model S has been involved in an accident with Autopilot engaged, raising more questions about how humans interact with semi-autonomous driver assist systems.
Police say the car smashed into a stationary firetruck at around 100km/h, leaving the driver with a broken right ankle. The car's airbags deployed, while no-one on the firetruck was injured.
The car's owner, a 28-year-old woman from Utah, told police Autopilot was engaged and she was playing with her phone when the accident occurred. According to witnesses, the Model S didn't brake before impact.
Subsequent analysis of the car's systems revealed the driver took her hands off the wheel more than a dozen times during the drive leading up to the crash, only retaking control after visual warnings from the car. About 82 seconds before the crash, she activated Autosteer and adaptive cruise control and took her hands off the steering wheel.
Perhaps unsurprisingly, Elon Musk was quick to criticise reports about the accident. Using his preferred soapbox, Twitter, the Tesla CEO said it's "super messed up" the accident became front-page news, and tried to direct attention to the fact the driver walked away with only a broken ankle from a 100km/h impact.
This crash is the latest in a growing list of accidents involving semi- and fully-autonomous vehicles over the past few months. Along with the highly publicised death of Elaine Herzberg, who was struck by a self-driving Uber in Tempe, Arizona, a software engineer was killed when his Tesla struck a dividing barrier in California.
Late last year, the US National Transportation Safety Board released its report into a fatal Autopilot crash from 2016, in which Joshua Brown's Model S collided with a semi-trailer. It said the "driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations".
This is a common thread running through Autopilot-related accidents: drivers ignoring warnings and demands to retake control, or leaning too heavily on the system's autonomous capabilities.