Another YouTuber has taken on the challenge of testing Tesla’s Full Self-Driving (FSD) system against a deceptive painted fake wall, following a viral video that raised concerns about the technology’s ability to detect certain hazards. Kyle Paul, a content creator, conducted his own experiment and posted the results on Thursday, featuring two Tesla models with different hardware and software configurations.
Paul’s test aimed to evaluate whether Tesla’s FSD could recognize and stop for a fake wall painted to resemble an open road—a scenario made famous by YouTuber Mark Rober’s initial experiment. However, Rober’s test faced criticism because he used Autopilot rather than FSD.
Paul recreated the test under similar conditions, this time using FSD rather than Autopilot. He used a Tesla Model Y with an HW3 computer running FSD version 12.5.4.2, as well as a Cybertruck equipped with the latest HW4/AI4 system and FSD version 13.2.8. The results varied: the Model Y, like Rober’s test vehicle, failed to recognize the obstruction and required manual intervention to avoid a collision. The Cybertruck, however, successfully detected the wall and stopped before impact, demonstrating an improvement with the newer hardware and software.
The test adds to ongoing discussions about the capabilities and limitations of Tesla’s FSD system as it continues to evolve. The results highlight both the progress and the remaining gaps in Tesla’s driver assistance technology, particularly in unusual or visually misleading scenarios. As Tesla refines these features, independent tests like Paul’s offer valuable insight into how the system performs outside controlled environments.