After a fatal car crash that made multiple news headlines back in May, Tesla Motors Inc. is making major changes to how the Autopilot system works in its vehicles. The revisions will make Tesla vehicles using Autopilot rely more heavily on radar signals to help the cars stay safely on the road. Crucial safeguards to keep drivers engaged at high speeds are also being added.
Tesla’s Autopilot feature, which uses radar, sensors, and cameras to automatically steer vehicles and adjust speed, has faced a great deal of scrutiny since the car accident that killed Joshua Brown in May. Brown was the first driver to die in a crash while Autopilot was engaged. The accident occurred when the vehicle failed to brake automatically because it could not distinguish a truck’s white trailer from the bright sky.
At a press conference in September, Tesla Chief Executive Elon Musk outlined the planned revisions and told reporters they represent a “threefold improvement in safety.” “This is not going from bad to good,” he said. “It’s going from good to, I think, great.” He also acknowledged that the revisions might have prevented the fatal crash in May.
Tesla’s Autopilot system is regarded as a major step toward the future reality of self-driving cars, but Tesla continues to warn drivers that the technology does not make its vehicles fully autonomous. Drivers must be ready to take over control at any time, even while using Autopilot. To enforce this, vehicles alert the driver whenever their hands are not on the steering wheel.
Some driver safety experts feel, however, that encouraging drivers to let the vehicle take control directly contradicts these warnings. If a driver must be prepared to take over at any moment and avoid an accident in case of a device failure, doesn’t that undermine the purpose of the device in the first place?
To prevent a car accident—especially on a busy interstate highway such as those in the Louisville, Kentucky area—action must be taken quickly, often instantly. If a driver is distracted in any way, the result can be disaster, as witnessed in May’s accident. Autopilot seems like a useful feature that could indeed pave the way toward truly safe self-driving cars, but unless the feature itself can be guaranteed safe, making it available to the public may be dangerous.
Consumer Reports, for example, called Autopilot “too much autonomy too soon.” With technology this advanced, it is almost inevitable that drivers will want to take full advantage of it. If the technology cannot yet do what Tesla claims, that fact alone makes it potentially dangerous.
There is also the fact that once a Tesla driver begins to feel comfortable with how Autopilot functions, it becomes easier to grow distracted or to let the car take over entirely. This, too, can lead to accidents. Musk admitted during the press conference that this is a real issue, and that Autopilot accidents are more likely among expert drivers than novices. The company is looking for ways to address it. Currently, the system alerts the driver when their hands are not on the wheel, but more warnings may be needed.
With any luck, these revisions will make Tesla Autopilot safer, and we should know before long: the updates are planned to begin rolling out soon. In the meantime, if you have been involved in a car accident involving a Tesla vehicle and believe the vehicle malfunctioned or was defective, you may be able to recover some of the costs of the accident. Contact Tad Thomas, Louisville, KY product liability lawyer, and his team of personal injury attorneys to find out how.