Autonomous Vehicle Ethics
I came across an article on Wired today that I found interesting regarding ethics and autonomous vehicles. Essentially, if an autonomous vehicle determines that a crash is unavoidable ... how should it be programmed to handle the situation? This is somewhat akin to the Trolley Problem. The difference, however, is that in the Trolley Problem the decision is made on the fly by a subjective human, whereas an autonomous vehicle will be following a predetermined, deterministic set of rules that effectively decides whose life is least valuable. Does that qualify as premeditated murder? Would the programmers be liable?