A Self-Driving Crash Test

In honor of anyone who just took a bar exam, here's the wholly hypothetical scenario I used at last week's excellent multidisciplinary workshop on road vehicle automation (with slight modifications):

The year is 2020. Several automakers now sell vehicles equipped with highly advanced driver assistance systems. Under optimal conditions, these systems generally guide the vehicle both longitudinally (speed and braking) and laterally (steering) over long distances. Although every owner’s manual states that the driver must remain alert at all times, the media quickly dub these vehicles “the first publicly available self-driving cars.” One automaker, Genius Car Co., shows ads in which a grateful customer recounts how, when she had a heart attack behind the wheel, her Genius NeverCrash automatically pulled to the side of the road, safely stopped, and called an ambulance.

Paul is one of 100,000 eager buyers of the NeverCrash. He feels so confident in his car’s driver assistance features that he doesn’t think twice about driving while drowsy. Late one night, while crossing a long and foggy causeway, he engages his NeverCrash’s automated system, relaxes as that system expertly takes over, and quickly though unintentionally falls asleep. The system tries to alert Paul with various vibrations and sounds, but Paul merely begins dreaming of Los Angeles.

Because Paul does not react, the NeverCrash automatically activates its hazard lights, pulls to the side of the causeway, and stops next to a “No Stopping” sign. Omniscience Inc., the data provider for Paul’s subscription navigation service, has coded this five-foot shoulder as an “emergency safe stop area.” However, at least one competitor’s vehicle—which uses a different dataset—would have attempted to drive to the end of the causeway before stopping.

While stopped, Paul’s NeverCrash is struck by a car driven by Julie. The NeverCrash anticipates the crash but does nothing to prepare. The latest version of the NeverCrash software would have adjusted Paul’s seat and seatbelt; Genius could have remotely added this functionality to Paul’s vehicle, but it has not done so. The crash injures both Paul and Julie (who each have car insurance). It also generates a heated argument between them, the audio and video of which is automatically recorded by Paul's NeverCrash and transmitted to Omniscience.

News of the crash (and several somewhat similar crashes) reaches the public and eventually catches the attention of a partner at the law firm Classy & Classact.

  1. What’s missing? What else would you want to know?
  2. What is our universe of actors?
  3. What contractual and statutory relationships exist?
  4. What are potential legal claims, defenses, and arguments?
  5. What kinds of discoverable documents might exist?
  6. What steps should the parties take?
  7. What steps should their lawyers take?

Analysis is welcome in the comments below.

Photo Credit: Joe Shlabotnik

Comments

In Australia I wouldn't be holding much hope for Julie's legal team.
It'd be interesting to know why Julie crashed into him.
Mat.

Not a lawyer, but I study automated vehicles. Did the NeverCrash record audio and video of the crash? From that we could determine visibility and possibly Julie's speed and any evasive action she may have taken. I'd also like to know details of the argument. Did anyone admit guilt?
When are the papers from the Workshop going to be published?
