37C3 – Back in the Driver’s Seat: Recovering Critical Data from Tesla Autopilot Using Voltage Glitch

via https://www.youtube.com/watch?v=AgC9OiFrIPk

Tesla’s driving assistant has been subject to public scrutiny, for better and worse: while accidents involving its “full self-driving” (FSD) technology keep making headlines, the code and data behind the onboard Autopilot system remain well protected by the manufacturer. In this talk, we demonstrate our voltage-glitching attack on Tesla Autopilot, which gains us root privileges on the system.

Apart from building electric vehicles, Tesla has gained a reputation for its integrated computer platform, comprising a feature-rich infotainment system, remote services through Tesla’s cloud and mobile app, and, most notably, an automated driving assistant. Powered by a dedicated arm64-based system called Autopilot, Tesla offers different levels of “self-driving”. The “full self-driving” (FSD) option is sold to select customers via in-car purchases and has been the subject of public discourse.

Despite the use of multiple cameras and Autopilot’s machine learning (ML) models, accidents persist and shape reporting on FSD. While the platform security of Autopilot’s hardware protects the code and ML models from competitors, it also prevents third parties from accessing critical user data, e.g., onboard camera recordings and other sensor data, that could facilitate crash investigations.

This presentation shows how we rooted Tesla Autopilot using voltage glitching. The attack enables us to extract arbitrary code and user data from the system. Among other cryptographic keys, we extract a hardware-unique key used to authenticate Autopilot to Tesla’s “mothership”. Overall, our talk sheds light on Autopilot’s security architecture and its gaps.
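Voltage-glitching attacks of this kind generally work by sweeping fault-injection parameters (the timing offset after reset and the pulse width of the voltage drop) until a fault slips past a boot-time security check. The talk does not publish its tooling, so the following is a purely illustrative sketch of such a parameter sweep; `fire_glitch` and `check_root_shell` are hypothetical stand-ins for hardware-specific glue (e.g., driving a crowbar MOSFET on the SoC core supply rail and reading the boot console), simulated here as stubs.

```python
import itertools

# Hypothetical hardware hooks -- in a real setup these would control a
# glitcher and observe the target's boot console. Simulated as stubs here:
# only a narrow parameter window "defeats" the check in this toy model.
def fire_glitch(offset_ns: int, width_ns: int) -> str:
    """Reset the target, pull the core voltage low for `width_ns`
    nanoseconds at `offset_ns` after reset release, return console output."""
    if 4_800 <= offset_ns <= 5_200 and 80 <= width_ns <= 120:
        return "bootrom: signature check skipped\n# "
    return "bootrom: verified boot OK\n"

def check_root_shell(console: str) -> bool:
    """Heuristic: a trailing shell prompt means the glitch landed."""
    return console.rstrip().endswith("#")

def sweep(offsets, widths, attempts_per_point=1):
    """Brute-force the glitch parameter space until a fault lands."""
    for offset, width in itertools.product(offsets, widths):
        for _ in range(attempts_per_point):
            if check_root_shell(fire_glitch(offset, width)):
                return offset, width  # successful glitch parameters
    return None

hit = sweep(offsets=range(4_000, 6_000, 100), widths=range(50, 200, 10))
print(f"glitch window found at {hit}" if hit else "no fault induced")
```

In practice each parameter point is attempted many times (faults are probabilistic), and the search space is usually narrowed first by monitoring the target’s power trace for the moment the check of interest executes.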

Before delving into Autopilot, we successfully executed a Tesla jailbreak of the AMD-based infotainment platform and presented our attack at Black Hat USA 2023. That work enabled custom modifications to the root file system and temporarily allowed the activation of paid car features.

Niclas Kühnapfel
Christian Werling
Hans Niklas Jacob – hnj


#37c3 #Security