The National Transportation Safety Board (NTSB) has said that Tesla needs to resolve “basic safety issues” before expanding semi-automated driving technologies such as Autopilot and FSD to more vehicles on public roads.

The FSD option Tesla offers is limited in its self-driving capabilities and requires the driver to keep hands on the steering wheel, ready to take over at any time. Tesla ultimately aims to deliver a robotaxi capability that would let the driver disengage from driving entirely, but the technology has not reached that level yet.

Accidents (including fatal ones) that appear to have occurred while Autopilot or FSD was in use have been reported sporadically, yet Elon Musk and Tesla maintain that these features are safer than a human driver.

Moreover, with the beta version of FSD, instead of following the conventional development approach of using professional drivers in controlled environments such as test courses, Tesla puts imperfect features in the hands of ordinary customers, effectively using them as guinea pigs.

In response to this situation, new NTSB chair Jennifer Homendy said that Tesla’s “Autopilot” is, as experts have pointed out for several years, merely a driver-assistance feature.

She added that continuing to use a name such as “Full Self-Driving” is “misleading and irresponsible,” and that “some people misuse or deliberately abuse this feature.” She also criticized Tesla for offering the beta version of FSD to the general public for testing on public roads.

These statements by Chair Homendy do not immediately restrict or ban Tesla’s development practices. However, they appear to indicate the direction of the NTSB’s stance toward Tesla under the Biden administration.

If collisions and other accidents involving semi-automated driving increase in the future, the NTSB may push the EV maker to take stricter measures than ever before.
