Tesla is currently testing its advanced driver assistance system "Autopilot" in a "Full Self-Driving" beta. Now there are increasing reports from test participants of dangerous, spontaneous braking events that occur for no apparent reason.
Phantom Braking: A Dangerous Problem
Assistance systems are now quite good at taking over many of a driver's tasks, but by no means all of them. Tesla is the most ambitious with its "Autopilot" and has so far built the largest visual database in the industry with its fleet. Now a serious flaw seems to have crept into this sensitive system: as Electrek reports, the company is seeing an increasing number of complaints about serious and dangerous phantom braking from users of the Full Self-Driving (FSD) beta.
The term describes braking events triggered by driver-assistance and self-driving systems for no apparent reason. In most cases the detection systems mistakenly identify obstacles on the road, and the vehicle then attempts to avoid a collision based on this faulty data. The phenomenon was already familiar to Tesla drivers in a much milder form and at a far lower frequency, but the situation appears to have worsened dramatically since the latest software updates.
Increase in Reports
The US National Highway Traffic Safety Administration (NHTSA) recorded a sharp increase in reports from Tesla owners about dangerous phantom braking last month. The descriptions are consistent: the cars perform emergency braking on the open road for no reason, which of course greatly increases the risk of a rear-end collision. Electrek corroborates these accounts, noting that the problems have actually worsened since software update 2021.40.