Can You Program Ethics Into a Self-Driving Car?

When self-driving cars kill, it’s the code (and the coders) that will be put on trial

Illustration: Carl De Torres (a stylized robotic foot on a pedal)

It’s 2034. A drunken man walking along a sidewalk at night trips and falls directly in front of a driverless car, which strikes him square on, killing him instantly. Had a human been at the wheel, the death would have been considered an accident because the pedestrian was clearly at fault and no reasonable person could have swerved in time. But the “reasonable person” legal standard for driver negligence disappeared back in the 2020s, when the proliferation of driverless cars reduced crash rates by 90 percent. Now the standard is that of the reasonable robot. The victim’s family sues the vehicle manufacturer on that ground, claiming that, although the car didn’t have time to brake, it could have swerved around the pedestrian, crossing the double yellow line and colliding with the empty driverless vehicle in the next lane. A reconstruction of the crash using data from the vehicle’s own sensors confirms this. The plaintiff’s attorney, deposing the car’s lead software designer, asks: “Why didn’t the car swerve?”

Today no court ever asks why a driver does anything in particular in the critical moments before a crash. The question is moot as to liability—the driver panicked, he wasn’t thinking, he acted on instinct. But when robots are doing the driving, “Why?” becomes a valid question. Human ethical standards, imperfectly codified in law, make all kinds of assumptions that engineers have not yet dared to make. The most important such assumption is that a person of good judgment will know when to disregard the letter of the law in order to honor the spirit of the law. What engineers must now do is teach the elements of good judgment to cars and other self-guided machines—that is, to robots.
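What might "teaching good judgment" look like in code? The sketch below is a minimal, purely illustrative example, not any manufacturer's actual planner; the maneuver list, cost weights, and the names Maneuver, cost, and choose are all assumptions made for this article's scenario. It encodes one possible approach: treat a traffic-rule violation as a finite cost that can be outweighed when the alternative is harming a person.

```python
# Illustrative sketch only: a toy cost-based planner that can justify breaking
# a traffic rule (crossing a double yellow line) when doing so greatly reduces
# expected harm. All names, weights, and maneuvers here are hypothetical.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_harm_to_humans: float   # e.g., probability of injury times severity
    expected_property_damage: float  # normalized 0..1
    violates_traffic_rule: bool

# Hypothetical weights: harm to people dominates, property damage matters less,
# and a rule violation carries a fixed penalty rather than an absolute ban.
HARM_WEIGHT = 1000.0
PROPERTY_WEIGHT = 10.0
RULE_VIOLATION_PENALTY = 1.0

def cost(m: Maneuver) -> float:
    """Score a maneuver; lower is better."""
    return (HARM_WEIGHT * m.expected_harm_to_humans
            + PROPERTY_WEIGHT * m.expected_property_damage
            + (RULE_VIOLATION_PENALTY if m.violates_traffic_rule else 0.0))

def choose(maneuvers: list[Maneuver]) -> Maneuver:
    """Pick the maneuver with the lowest expected cost."""
    return min(maneuvers, key=cost)

# The opening scenario: brake in lane (too late to matter) versus swerve
# across the double yellow line into an empty driverless car in the next lane.
options = [
    Maneuver("brake in lane", expected_harm_to_humans=0.9,
             expected_property_damage=0.1, violates_traffic_rule=False),
    Maneuver("swerve across double yellow", expected_harm_to_humans=0.0,
             expected_property_damage=0.6, violates_traffic_rule=True),
]

print(choose(options).name)  # prints "swerve across double yellow"
```

Whether a rule violation is modeled as a hard constraint or, as here, as just another term in a cost function is exactly the kind of design choice a deposition like the one above would put on trial.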
