farseer4 t1_j23bufr wrote

Once you accept something like the 0th law, you have carte blanche to commit all kinds of atrocities, in the name of a nebulous "greater good".

4

mickdrop t1_j23siv6 wrote

Once you accept something like the 1st law, you also have carte blanche to commit all kinds of atrocities to save more lives. You can justify killing one person to harvest their organs to save five people. That's the trolley problem once again.

3

farseer4 t1_j243jjl wrote

Actually, you can't. The first law prevents robots from hurting any humans. It would take something like the 0th law to allow a robot to kill people in order to save more people.

1

HRDBMW t1_j246teq wrote

If, through inaction, five humans die instead of one, that violates the 1st law.

3

palparepa t1_j252i6d wrote

But if, through action, that one human dies, it still violates the first law. Older robots get completely stumped when facing such a dilemma.

2

HRDBMW t1_j24bad1 wrote

Yes. And I do feel that way. But I also have rules I live by. One of the David Tennant Doctor Who episodes (I think A Good Man Goes to War?) touches on who needs those rules, and why.

And no, I'm not Dexter.

1