Tesla's Full Self-Driving has raised concerns among U.S. government regulatory agencies focused on highway and transportation safety. The company's advanced driver-assist system has shipped in its electric vehicles for years, but was more recently unveiled as Full Self-Driving. The current version is labeled as beta software, yet Tesla owners who install FSD can use the feature immediately and without supervision.
Autopilot has been an option for Tesla cars since 2014, and every car made since September of that year came with the necessary hardware preinstalled. Then, in October 2016, Tesla began building cars with even more advanced equipment intended to enable Full Self-Driving: eight cameras, twelve ultrasonic sensors, and a forward-facing radar. The radar only covers the front while the other sensors cover all around the vehicle; radar is reserved for the front because a car travels forward at much higher speeds and needs to see farther ahead than in any other direction. In 2019, another self-driving update brought a more powerful computer, built around a system-on-a-chip designed by Tesla specifically to run FSD.
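To make that coverage asymmetry concrete, here is a minimal Python sketch of a sensor suite like the one described above. Only the sensor counts (eight cameras, twelve ultrasonic sensors, one forward radar) come from the article; the class name, headings, fields of view, and range figures are illustrative assumptions, not Tesla specifications.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    name: str
    heading_deg: float   # direction the sensor points; 0 = straight ahead
    fov_deg: float       # horizontal field of view
    range_m: float       # maximum useful detection range

# Hypothetical layout: counts match the article, but the headings, FOVs,
# and ranges below are illustrative guesses only.
cameras = [Sensor(f"camera_{i}", heading_deg=i * 45.0, fov_deg=90.0, range_m=150.0)
           for i in range(8)]
ultrasonics = [Sensor(f"ultrasonic_{i}", heading_deg=i * 30.0, fov_deg=60.0, range_m=8.0)
               for i in range(12)]
radar = Sensor("front_radar", heading_deg=0.0, fov_deg=45.0, range_m=160.0)

def max_range_toward(heading_deg: float) -> float:
    """Longest detection range among sensors whose FOV covers a given heading."""
    def covers(s: Sensor) -> bool:
        # Smallest angular difference between the sensor heading and the query.
        diff = abs((s.heading_deg - heading_deg + 180.0) % 360.0 - 180.0)
        return diff <= s.fov_deg / 2.0
    suite = cameras + ultrasonics + [radar]
    return max((s.range_m for s in suite if covers(s)), default=0.0)

# Forward look-ahead benefits from the radar; every other direction relies on
# cameras and short-range ultrasonics, matching the rationale described above.
print(max_range_toward(0.0))    # directly ahead
print(max_range_toward(180.0))  # directly behind
```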
Tesla is one of the leading developers of self-driving solutions, but since it is known as a company that moves quickly and takes risks, this creates a problem for government agencies tasked with ensuring vehicle safety. CNBC recently reported on a letter sent from the National Transportation Safety Board (NTSB) to the National Highway Traffic Safety Administration (NHTSA) requesting a review of Tesla's FSD system and policies. In particular, the letter calls attention to a lack of overall preparedness for the increasing number of advanced driver-assist and autonomous systems.
Tesla does refer to its Full Self-Driving feature as beta software, which implies that it is unfinished, not ready for everyone to use, and likely to have bugs. Perhaps the NTSB is right to question whether it is safe enough for unrestricted use. There have been accidents in which the driver was using the Autopilot feature at the time of the collision. Often the driver admits to not being alert enough at the time of the crash, and since Tesla specifies that the human driver should be ready to take over at any moment, these driver-assist features may be over-hyped. Even the names 'Autopilot' and 'Full Self-Driving' imply, to varying degrees, that the driver can take a break. The pace of Tesla's software updates suggests that problems are addressed quickly, and the fact that more incidents haven't occurred shows that Tesla is not being completely irresponsible. That said, ensuring traffic safety is the purview of the government, not of a business.
Autopilot is the term used for the computer control systems in large aircraft that can handle nearly all aspects of flying the plane, including takeoff and landing. By comparison, Tesla's Autopilot is far more limited; even its Full Self-Driving is not as capable on the road as flight computers are in the air. Part of the difficulty is that the ground is much more crowded than the sky. Cars, trucks, trailers, motorcycles, bicycles, and pedestrians are commonplace, along with rigid traffic rules and physical barriers that challenge an autonomous vehicle at almost every turn in the city. Tesla's Autopilot operates on highways and freeways with good accuracy, but it does make errors. Perhaps the jump to FSD on surface streets is too quick, and some regulation may be called for. Everyone wants the autonomous future Tesla advertises to arrive as quickly as possible and remove the burden of long drives through busy traffic, but if some monitoring and approval processes slow progress, that delay may be worth it to save lives and prevent damage.
Source: CNBC