
mickdrop t1_j23siv6 wrote

Once you accept something like the 1st law, you also have carte blanche to commit all kinds of atrocities to save more lives. You can justify killing one person to harvest his organs to save five people. That's the trolley problem once again.


farseer4 t1_j243jjl wrote

Actually, you can't. The first law prevents robots from hurting any humans. It would take something like the 0th law to allow a robot to kill people in order to save more people.


HRDBMW t1_j246teq wrote

If through inaction, 5 humans die instead of one, that violates the 1st law.


palparepa t1_j252i6d wrote

But if through action, that one human dies, it still violates the first law. Older robots get completely stumped when facing such a dilemma.
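The dilemma described here can be sketched as a toy rule check (illustrative names and setup only, assuming a trolley-style choice where acting harms one human and inaction harms five):

```python
# Toy model of the First Law dilemma: the law forbids both injuring a human
# and, through inaction, allowing a human to come to harm.

def humans_harmed(act: bool) -> int:
    """Hypothetical trolley-style outcome: acting kills 1, inaction lets 5 die."""
    return 1 if act else 5

def violates_first_law(act: bool) -> bool:
    # Either branch harms at least one human, so both choices violate the law.
    return humans_harmed(act) > 0

# Both options violate the First Law -- the robot is stumped.
print(violates_first_law(True), violates_first_law(False))
```

An older robot, weighing both branches as forbidden, has no lawful move at all; the 0th-law variant discussed above escapes the deadlock only by letting a smaller harm be outweighed by a larger one prevented.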
