Pressured by lawsuits accusing it of misleading advertising, Elon Musk has opened up the Full Self-Driving feature to Tesla owners eager to test it. But not all users have read the details of the launch announcement, which warns that the feature is not a replacement for the driver, who must remain vigilant and ready to take the wheel at any time.
Without absolving the reckless driver, a Tesla Model S driving in the recently announced Full Self-Driving mode braked suddenly just as it entered an underpass, the result of a bug triggered in the car's software. The problem, also reported by other Tesla owners, manifests as a "panic" reaction of the computer left in the driver's seat: emergency braking regardless of speed or road conditions.
In this case, the bug struck at the worst possible moment, just as the car entered an underpass at the head of a column of vehicles moving at high speed. Blinded by the transition into the artificially lit tunnel, the drivers behind failed to notice the car stopped in the middle of the road in time, resulting in a multi-vehicle pile-up with several casualties, including a two-year-old child.
When questioned by police, the driver admitted that he had been using Tesla's newly launched Full Self-Driving feature. The accident does not appear to have been caused by careless driving: the Model S braked violently before the driver could intervene to override the control. Moments earlier, the car's computer had activated the left turn signal, suggesting an intention to change lanes.
Announced on November 24, just hours before the incident, Full Self-Driving had been activated on more than 285,000 Tesla cars by the end of last year. Previously, more than 35 traffic accidents involving Tesla cars driven in autonomous mode had been documented, 35 of them directly associated with the activation of Full Self-Driving.
The term “Full Self-Driving” has been criticized by other automakers and industry groups as misleading and even dangerous. Last year, rival autonomous driving technology company Waymo, owned by Google, announced it would no longer use the term.
“Unfortunately, we see some automakers using the term ‘autonomous driving’ in an inaccurate way, giving consumers and the general public a false impression of the capabilities of (not fully autonomous) driver assistance technology,” Waymo wrote in a blog post. “That false impression can cause someone to unknowingly take risks (like taking their hands off the wheel) that could endanger not only their own safety, but the safety of people around them.”