Popular Science

What moral code should your self-driving car follow?

Teaching a robot to make ethical decisions is pretty complicated.
A Google self-driving car

Imagine you are driving down the street when two people — one child and one adult — step onto the road. Hitting one of them is unavoidable. You have a terrible choice. What do you do?

Now imagine that the car is driverless. What happens then? Should the car decide?

Until now, no one believed that autonomous cars — robotic vehicles that operate without human control — could make moral and ethical choices, an issue that has been central to the ongoing debate about their use. But German scientists now think otherwise. They believe it may eventually be possible to introduce elements of morality and ethics into self-driving cars.

To be sure, most human drivers will never

