A pair of researchers at The University of North Carolina at Chapel Hill is challenging the findings of the team that published a paper called "The Moral Machine experiment" two years ago. Yochanan Bigman and Kurt Gray claim the results of the experiment were flawed because they did not allow test-takers the option of choosing to treat potential victims equally.
Back in 1967, British philosopher Philippa Foot described the "Trolley Problem," which presented a scenario in which a trolley raced toward people on the tracks who were about to be killed. The trolley driver has the option of diverting onto a side track before hitting them—however, that track is occupied as well. The problem then poses moral quandaries for the driver, such as whether it is more reasonable to kill five people versus two, or whether it is preferable to kill old people versus young people.
Two years ago, a team at MIT revisited this problem in the context of programming a driverless vehicle. If you were the programmer instead of the trolley driver, how would you program the car to respond under a variety of conditions?
The team reported that, as expected, most volunteers who took the test would program the car to run over ...
- animals rather than people,
- old people instead of young people,
- men rather than women, etc.
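To make the critique concrete, here is a minimal sketch (purely illustrative—none of this code comes from either study) contrasting a forced-choice policy, which must always rank one group over the other, with a policy that includes the "treat equally" option Bigman and Gray argue was missing. The function names and the coin-flip interpretation of equal treatment are assumptions for illustration.

```python
import random

def forced_choice_policy(group_a, group_b):
    # Forced-choice framing: the policy must prefer one group,
    # e.g. always sparing the larger group.
    return group_a if len(group_a) > len(group_b) else group_b

def equal_treatment_policy(group_a, group_b):
    # With an equal-treatment option, the policy can decline to rank
    # the groups and instead choose between them at random.
    return random.choice([group_a, group_b])

young = ["child", "child"]
old = ["adult"]

print(forced_choice_policy(young, old))      # deterministic: always the same group
print(equal_treatment_policy(young, old))    # either group, with equal probability
```

The point of the contrast is that a survey offering only the first kind of policy cannot distinguish a respondent who genuinely prefers one group from one who would rather treat both equally.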
https://techxplore.com/news/2020-03-moral-machine-reexamined-forced-choice-reveal.html