Tesla Full Self-Driving Mode Allegedly Tries to Drive Into Speeding Train

A recent incident involving Tesla’s Full Self-Driving (FSD) Beta has raised significant safety concerns after a Tesla Model 3 allegedly attempted to drive into the path of an oncoming train. The event, captured on video, highlights the potential risks of Tesla’s autonomous driving technology, which remains in beta testing.

Details of the Incident

The incident occurred in Denver, Colorado, during a test drive of Tesla’s FSD Beta by a YouTube channel known for reviewing advanced driver assistance systems. According to the video, the Tesla Model 3, equipped with the latest FSD Beta, was navigating city streets when it attempted to make a left turn in front of a speeding train. Despite detecting the train on its display, the car initiated the turn, forcing the driver to take immediate control and avert a collision.

Tesla’s Full Self-Driving System

Tesla’s FSD Beta is a Level 2 advanced driver assistance system, which means it requires constant supervision from the driver, who must keep their hands on the wheel and be prepared to take over at any moment. The system is designed to handle driving tasks such as lane changes, turns, and traffic light recognition, but it is not fully autonomous.

Public and Ethical Concerns

The incident has sparked a broader debate about the ethics of testing such advanced technology on public roads. Unlike other companies that test autonomous systems in controlled environments, Tesla has opted to deploy its FSD Beta to regular customers, who effectively become testers for the technology. This practice has been criticized for potentially putting both the drivers and the public at risk.

Previous Incidents and Legal Challenges

This is not the first time Tesla’s FSD Beta has come under scrutiny. There have been multiple reports of the system failing to perform safely, leading to near-misses and actual collisions. A notable case involved a Tesla employee who died in a crash while using FSD Beta, although intoxication was a contributing factor in that incident.

Tesla’s Response

Tesla has continuously updated its FSD software, aiming to improve safety and reliability. The company recently rolled out FSD Beta v12, which features end-to-end neural networks intended to enhance the system’s decision-making capabilities. However, despite these advancements, the system remains in beta, and Tesla emphasizes the need for driver vigilance.

The recent incident where a Tesla in FSD mode nearly collided with a train underscores the ongoing challenges and risks associated with autonomous driving technology. While Tesla continues to innovate and improve its systems, the company faces increasing scrutiny over the safety and ethical implications of its testing practices. As autonomous driving technology evolves, ensuring robust safety measures and clear communication with users will be critical to gaining public trust and acceptance.
