A Washington state family has filed a wrongful death lawsuit against Tesla after a crash involving the company's Autopilot system killed a 28-year-old motorcyclist. The lawsuit alleges that Tesla's driver-assistance technology failed to recognize a stationary motorcycle, leading to a deadly rear-end collision in Snohomish County.
The incident occurred in April 2024 on State Route 522, when a Tesla Model S with Autopilot engaged struck Jeffrey Nissen Jr. while he was stopped on his motorcycle. According to the complaint, the vehicle neither slowed nor attempted to avoid the motorcycle before impact. Nissen was pronounced dead at the scene.
Police reports state that the Tesla driver, Carl Hunter, collided with Nissen from behind and continued moving forward after the initial impact, trapping the motorcyclist beneath the car. When Hunter first contacted emergency dispatchers, he reportedly said he did not know how the crash happened. However, investigators later determined that he had been using Tesla’s Autopilot system at the time of the collision and may have been distracted by his phone.
Hunter allegedly told law enforcement that he relied on the vehicle to manage driving tasks. He was later arrested and charged with vehicular homicide. The lawsuit argues that Tesla’s technology and marketing contributed to a false sense of security that encouraged such reliance.
Simeon Osborn, an attorney representing Nissen’s estate, said Tesla’s system creates unrealistic expectations for drivers. “Tesla designed a system that encourages inattention,” Osborn said. “Drivers are led to believe the vehicle is capable of handling situations that it simply cannot, and the consequences can be devastating.”
The lawsuit also points to evidence suggesting that Hunter had disabled or ignored safety alerts prior to the crash, a behavior attorneys say aligns with a phenomenon known as alarm fatigue. Alarm fatigue occurs when users are exposed to frequent alerts—many of which may not require immediate action—causing them to become desensitized or dismissive of warnings altogether.
Dr. Eraka Bath, a professor of psychiatry at UCLA’s School of Medicine, explained that excessive alerts can reduce their effectiveness over time. “When systems constantly beep or issue warnings, users may begin to tune them out,” Bath said. “Unfortunately, that can mean ignoring alerts during moments of real danger.”
Bath noted that similar patterns have been observed in healthcare environments, where clinicians exposed to frequent false alarms may miss or delay responses to critical warnings.
The lawsuit comes amid increasing scrutiny of Tesla’s marketing practices. Earlier this month, a California judge ruled that the company engaged in deceptive advertising related to its Autopilot and Full Self-Driving features, ordering Tesla to stop promoting them as autonomous driving technologies.
Ryan Calo, a professor specializing in technology and tort law at the University of Washington, said Tesla may face challenges defending how its systems are presented to consumers. “It’s difficult to argue that reliance isn’t foreseeable when a product is called ‘Autopilot,’” Calo said.
Tesla has not publicly commented on the lawsuit. The case adds to ongoing debates over the safety, regulation, and public understanding of advanced driver-assistance systems as they become increasingly common on U.S. roads.