A recent court case in China is drawing attention to a growing safety problem with Level 2 driver-assistance systems: some drivers are trying to defeat the very safeguards designed to keep them alert.
The case involves an electric vehicle owner using a Level 2 assisted driving feature similar in concept to Tesla’s FSD (Supervised). These systems can help with steering and speed on certain roads, but they are not self-driving. They require the driver to stay in the driver’s seat, remain attentive, and be ready to take control at any moment. To enforce that, many cars issue repeated reminders and warnings if the system believes the driver isn’t engaged—often by checking for hands-on-wheel input.
According to the court filing, the driver, identified as Wang Mouqun, had a previous drunk driving conviction and wanted to bypass the constant “pay attention” prompts. He allegedly installed an aftermarket “smart driving device” intended to trick the car into thinking the steering wheel was being touched every couple of minutes. That kind of spoofing prevents the usual sequence of visual and audio alerts that would normally escalate when a driver fails to provide hands-on confirmation.
The timeline described in the filing is troubling. Around 00:30 on September 13, 2025, Wang, after consuming alcohol, drove from near a restaurant in Tangqi Town, Linping District, Hangzhou, Zhejiang Province, back to his residential community. At approximately 01:15, he drove again, activated the vehicle's assisted driving function, set a destination, and used the privately installed device to evade the monitoring system. Then, instead of supervising the drive, he moved to the passenger seat and fell asleep.
While he slept, the vehicle continued traveling under driver assistance, falsely “reassured” by the accessory simulating wheel contact. Eventually, the driver-assist feature disengaged and the vehicle came to a stop on a local road. Locals noticed the unusual scene—a car pulled over with a person apparently asleep in the passenger seat—and alerted police.
Authorities later determined that the driver's blood alcohol level exceeded the legal limit. It was also his second drunk-driving offense in less than two years, which exposed him to harsher penalties.
What sets this case apart is not only the drunk driving, but the deliberate effort to defeat a Level 2 driver-assistance system's safeguards, even though Wang had been made aware of the system's limitations when purchasing the vehicle. The court convicted him of driving under the influence and of modifying the system in a way that undermined its safety controls. He received a fine along with a one-and-a-half-month prison sentence, which he is currently serving.
The broader takeaway for EV owners and anyone using advanced driver-assistance features is clear: Level 2 autonomy does not mean hands-free, eyes-off driving. Aftermarket devices designed to defeat driver monitoring can turn a helpful safety technology into a serious hazard—especially when paired with reckless decisions like impaired driving.