
The Ethics of Self-driving Cars

Self-driving cars are already cruising the streets today. And while these cars will ultimately be safer and cleaner than their manual counterparts, they can’t avoid accidents altogether. How should a car be programmed if it encounters an unavoidable accident? Patrick Lin navigates the murky ethics of self-driving cars. This lesson plan is suitable for use in middle school and high school.

Video prompt (4:15 min):

Potential discussion questions:

  1. In the situation described, would you prioritize your safety over everyone else’s by hitting the motorcycle?
  2. In the situation described, would you minimize danger to others by not swerving? If so, you would hit the large object and potentially die.
  3. In the situation described, would you take the middle ground by hitting the SUV since it’s less likely the driver will be injured? Compare this to hitting the motorcycle.
  4. What should a self-driving car do?
  5. What is the difference between a “reaction” (human driver’s split-second response) and a “deliberate decision” (driverless car’s calculated response)?
  6. Programming a car to react in a certain way in an emergency situation could be viewed as premeditated homicide. Do you think this is a valid argument? Why or why not?

*This is inspired by a TED-Ed lesson by Patrick Lin.


This lesson plan was contributed by Debi Talukdar.