MIT Moral Machine
Summary: Moral Machine is a ‘platform for gathering a human perspective on moral decisions made by machine intelligence.’ The user is presented with moral dilemmas and must decide which of several possible actions is the more morally acceptable.

See also: Thinking about self-driving cars.

Related: If Buddhist Monks Trained AI