You know you need to be safe behind the wheel, which is why the idea of a vehicle equipped with safety technology appeals to you. Some cars can brake automatically before you hit something. Others can alert you when someone is in your blind spot.
The technology is impressive, but it isn't perfect. Drivers still need to pay attention, or they could end up as victims of crashes that might have been avoided.
Automated vehicles may be on the horizon, but that doesn't mean they will always be safe. According to a Feb. 25 report, a fatal crash occurred involving a Tesla and a driver who may have been distracted. The story reports that the driver was on a Silicon Valley freeway when they lost control and steered into a concrete barrier.
It's believed that the Tesla may have been operating under conditions it wasn't designed to handle, though driver error may have played a role as well. The National Transportation Safety Board (NTSB) has recommended that Tesla make changes so that autopilot turns off when the car encounters conditions the program wasn't designed for. Without additional driver-monitoring safeguards, autopilot could be misused, and more crashes could occur in the future.
In crashes like this, it may be possible to hold a manufacturer or designer liable for resulting injuries or deaths, especially if a product fails to do what it was designed to do. Crashes involving autopilot are still rare and relatively new, but situations like this one show why more research is needed before vehicles like these are brought to the roads.