AutoBeta (AutoBeta.net) 09/20 Report--
Tesla is preparing a major upgrade to its Full Self-Driving (FSD) software, but the head of the National Transportation Safety Board (NTSB) says it may be too early, according to a report from the Financial Associated Press. NTSB Chair Jennifer Homendy believes Tesla's use of the term "Full Self-Driving" is "misleading and irresponsible," because consumers pay more attention to marketing than to the warnings in the owner's manual or on the company's website. In her view, this will lead many people to misuse and abuse the technology, and Tesla should not promote the upgrade until the safety concerns are resolved.
It is worth noting that Musk said last week that FSD Beta v10.0.1 has been pushed to some users and will roll out on a larger scale from September 24. FSD currently sells for $10,000; owners who have already paid for the software can request the new build through the in-car beta request button. Tesla will ask owners whether they are willing to share driving data, and those who demonstrate good driving behavior for seven consecutive days will qualify to test the new software.
According to several media reports, the National Highway Traffic Safety Administration (NHTSA) has asked 12 automakers to provide data on their advanced driver assistance systems to support its investigation of Tesla's Autopilot system. NHTSA asked the automakers to list all crashes in which an advanced driver assistance system was enabled at any time within 30 seconds before the accident. The agency also asked for details about these systems, including how they remind drivers to watch the road and how they monitor driver attention. The investigation into Autopilot was prompted mainly by the large number of prior cases in which Tesla vehicles such as the Model 3 were suspected of crashing with Autopilot engaged, resulting in casualties.
According to documents posted by NHTSA on its website, the investigation covers an estimated 765,000 vehicles, including Tesla's Model Y, Model X, Model S, and Model 3 manufactured between 2014 and 2021. The investigation centers on a possible defect in recognizing road conditions that makes it difficult for the system to detect emergency vehicles parked on the roadside and their warning signals. NHTSA said it opened the investigation after confirming 11 such collisions, in which 17 people were injured and one was killed. "All the vehicles involved were confirmed to have been using Autopilot or Tesla's Traffic-Aware Cruise Control at the time of the collision," the agency said.
In August this year, Tesla CEO Musk said on Twitter that the goal for FSD is to be 1,000% safer than the average human driver. In response, some netizens said the goal may already be met in 99.99% of cases, where the system performs ten times better than a person, but the remaining 0.01% can mean a fatal accident. Even if the data proved it ten times safer than a human driver, they would still not dare to sleep and let the car drive; a single failure leaves no chance to start over.