Autonomous Navigation: How To Drive Neighborhoods Crazy

By Patrick Lin

In the first of this two-article series, we saw how augmented reality (AR) is causing friction between individual liberty and public interest.  Some parks are requiring AR app makers to obtain a permit before they can “put” virtual objects in those public spaces, given the sudden crowds the apps can cause.

This article looks at the same core dilemma with another technology: automated driving.

In development by private companies such as Mercedes-Benz, Ford, Volvo, Waymo, Tesla, and many others, self-driving cars are operating on public roadways, driving alongside other road users.  This means the products don’t affect only their users, as most other gadgets and services do.  What I do on Facebook, for example, usually can’t injure others directly.  So, like some AR apps, these robot cars straddle a tricky line between the freedom to innovate and public safety.

I won’t talk about crash dilemmas with robot cars, especially the “trolley problem”, since that has proven to be easily misunderstood and distracting.  Instead, I’ll talk about everyday dilemmas—navigation decisions, specifically—that are more common and less dramatic, but still challenging for ethics and even democracy.

To be truly self-driving, these cars will need to select their routes autonomously, especially if no one is in the car; and there’s often not just one correct way to go.  One route could be shorter and faster but involve more intersections and turns, while another is longer but safer.  This dilemma already exists in traffic navigation apps, such as Waze; they usually default to the faster route, even if it saves only a few seconds, unless the user selects an alternate one.  If you’re unfamiliar with the area, you won’t know which routes may be more treacherous than others.
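To make the tradeoff concrete, here is a minimal sketch in Python (not any app’s actual algorithm; the route attributes, penalty weights, and the risk_aware_cost function are all illustrative assumptions).  It shows how a router that scores only travel time can pick a different route than one that converts risk factors into time-equivalent penalties:

```python
from dataclasses import dataclass

@dataclass
class Route:
    """Hypothetical route attributes; real apps use far richer data."""
    name: str
    minutes: float        # estimated travel time
    intersections: int    # intersections and turns along the way
    residential: bool     # passes through a quiet neighborhood

def time_only_cost(route: Route) -> float:
    """The implicit default: fastest route wins, risk is invisible."""
    return route.minutes

def risk_aware_cost(route: Route, minutes_per_intersection: float = 0.5,
                    residential_penalty: float = 3.0) -> float:
    """One possible alternative: convert risk factors into time-equivalent
    penalties, so a slightly slower but safer route can win."""
    cost = route.minutes
    cost += route.intersections * minutes_per_intersection
    if route.residential:
        cost += residential_penalty
    return cost

routes = [
    Route("shortcut through neighborhood", minutes=11.0,
          intersections=9, residential=True),
    Route("arterial road", minutes=12.5,
          intersections=3, residential=False),
]

print(min(routes, key=time_only_cost).name)   # -> shortcut through neighborhood
print(min(routes, key=risk_aware_cost).name)  # -> arterial road
```

The point isn’t these particular weights; it’s that some weighting is always in effect, and “fastest” is itself a value judgment the user never explicitly made.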

This means navigation apps are making risk decisions that users might be unaware of but arguably should be.  And what’s at stake isn’t just the user’s safety.  Waze, for example, has also drawn complaints about “flocking” behavior: swarms of cars sent by its algorithms through quiet neighborhoods never designed for heavy traffic.  This can increase the risk to children playing on those streets, lower property values where road noise grows louder, and create other externalities.
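The mechanism behind flocking is easy to sketch.  In the toy model below (entirely hypothetical: two roads, a linear congestion function, and made-up capacities), each car is assigned whichever route looked fastest at the router’s last traffic snapshot; because the snapshot refreshes only every 25 cars, a whole batch gets sent down the residential shortcut before the router notices it has clogged:

```python
def travel_time(base_minutes: float, cars: int, capacity: int) -> float:
    """Toy congestion model: travel time grows linearly with load."""
    return base_minutes * (1 + cars / capacity)

# Hypothetical roads: a residential shortcut and a bigger arterial.
load = {"shortcut": 0, "arterial": 0}
base = {"shortcut": 10.0, "arterial": 12.0}
capacity = {"shortcut": 20, "arterial": 200}

for car in range(100):
    # The router re-measures traffic only every 25 cars, so each batch
    # of cars is routed on a stale picture of the roads.
    if car % 25 == 0:
        snapshot = {r: travel_time(base[r], load[r], capacity[r])
                    for r in load}
    choice = min(snapshot, key=snapshot.get)
    load[choice] += 1

print(load)  # -> {'shortcut': 25, 'arterial': 75}: the quiet street
             # absorbs more than its capacity of 20 before the router
             # reacts and shifts traffic to the arterial.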

Read the full piece at Forbes.