Autopilot and autonomous driving are coming soon; there’s no doubt about that. What there is some doubt about is how quickly the technology can be implemented and, most importantly, how safe it is. Recent incidents involving the Tesla Model X SUV’s self-driving technology have raised new questions about high-tech features on the roadways, as well as the role of the operator in handling a car with an autopilot feature.
Pennsylvania - A Model X crashed on the Pennsylvania Turnpike on July 1, rolling over after hitting a guard rail. The driver claimed to be using the Autopilot feature at the time. According to Tesla, Autopilot had disengaged after several visual and auditory cues (the steering wheel also vibrates to alert the driver when this happens).
These cues tell the driver to take control of the vehicle or the system will shut itself off, which appears to be what happened in this case. A similar incident occurred in Montana and, according to Tesla, resulted from circumstances much like those of the Pennsylvania crash. No one was hurt in either crash.
Florida - In a crash that came to light in July, the driver of a Tesla Model S was killed when a semi-truck turned in front of him. Autopilot was engaged, the car went under the trailer, and, according to reports, the driver was watching a movie on a portable DVD player at the time of the tragedy.
Tesla Moving Forward In Spite Of Controversy
There is no doubt that driving as we know it is about to change in a big way, and Tesla Motors will be a major player (if not the leader) in that transition. However, the current state of “self-driving” technology is still in its infancy, as CEO Elon Musk has repeatedly stated. (The Autopilot feature is called a beta for a reason, he says.)
Tesla has been quick to defend itself against critics in the past, with particular success drawing on data from onboard computer logs. In the instances mentioned above, the company claims that Autopilot was either not engaged or not used appropriately.
While Tesla is likely right that user error was a factor in these crashes, you can forgive the general public for over-trusting a vehicle that not only drives and steers for you but can change lanes for you, too. What does this mean going forward?
Mainly, that Tesla needs to do a better job of explaining what the feature does and does not do. Another crash, this one in Beijing, even prompted the company to remove the term “self-driving” from its Chinese website, an understandable correction given that the term implies automatic piloting of the vehicle. Tesla will no doubt have issues to address and questions to answer going forward, but it shows no signs of slowing down: the company recently received a massive round of funding to continue its expansion and growth in the electric vehicle industry.