2.9 million Tesla cars under investigation after owners report crashes in Full Self-Driving mode: Details

The National Highway Traffic Safety Administration (NHTSA) opened another investigation into electric carmaker Tesla's Full Self-Driving (FSD) feature on Thursday. The latest probe stems from customer reports of Tesla cars in FSD mode running red lights or driving on the wrong side of the road, in some cases crashing into other vehicles and causing injuries. In a filing, the regulator said it had begun investigating 58 incidents in which Tesla cars appeared to violate traffic safety laws while FSD was engaged. The incidents resulted in about a dozen crashes and fires and nearly two dozen injuries. The agency said the investigation will cover roughly 2.9 million Tesla vehicles. In the new probe, regulators noted that many of the Tesla drivers involved reported that their cars gave them no warning of the unexpected behavior.

"Although the behavior under investigation occurs most often at intersections, NHTSA's investigation will include any other types of situations in which this behavior may arise, such as crossing into an opposing lane of travel or approaching railroad crossings," the NHTSA said in a statement. According to a report from NBC News, Tesla vehicles running the FSD software often mishandle situations at railroad crossings, continuing to drive even when red lights flash and crossing gates lower.

Tesla has faced several investigations. Earlier this year, NHTSA opened a probe into the company's "Summon" feature, which lets drivers call their cars to their location, after reports of various fender benders in parking lots. Another investigation was launched by the agency in August to determine why Tesla apparently did not report crashes promptly, as required.
The same month, a jury in Miami found Tesla partly responsible for a deadly crash in Florida involving its Autopilot driver-assist technology, which is distinct from Full Self-Driving, and ordered the company to pay the victims more than $240 million in damages. Tesla said it would appeal the decision. The FSD system under investigation is what is known as Level 2 driver-assistance software, which requires drivers to pay full attention to the road. A new version of FSD was released earlier this week. The company is also testing a more heavily upgraded version that would not require driver intervention, something Musk has been promising for years. (With agency inputs)