
HRDBMW t1_j22ts9p wrote

I read Asimov so long ago that I've forgotten most of it. But what I remember most is the Three Laws, and then the Zeroth Law, as R. Daneel dedicates his existence to the survival of mankind. I think that philosophy changed me and directed me: I don't matter, other individuals don't matter, not if we threaten the survival of humanity, which is paramount... THAT matters.

4

farseer4 t1_j23bufr wrote

Once you accept something like the 0th law, you have carte blanche to commit all kinds of atrocities, in the name of a nebulous "greater good".

4

mickdrop t1_j23siv6 wrote

Once you accept something like the 1st law, you also have carte blanche to commit all kinds of atrocities to save more lives. You can justify killing one person to harvest his organs to save 5 people. That's the trolley problem all over again.

3

farseer4 t1_j243jjl wrote

Actually, you can't. The first law prevents robots from hurting any humans. It would take something like the 0th law to allow a robot to kill people in order to save more people.

1

HRDBMW t1_j246teq wrote

If, through inaction, 5 humans die instead of one, that violates the 1st law.

3

palparepa t1_j252i6d wrote

But if, through action, that one human dies, it still violates the first law. Older robots get completely stumped when facing such a dilemma.
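
To make the deadlock concrete, here's a toy Python sketch (my own framing, nothing from the books): under the First Law alone, both acting and not acting count as violations in a trolley-style case, so no option is permitted. Only a Zeroth-Law-style override, which I've modeled here as simply minimizing total deaths, leaves the robot a legal move.

```python
# Toy model of the dilemma (hypothetical framing, not Asimov's own logic).
# First Law: a robot may not injure a human, or through inaction allow
# a human to come to harm -- so BOTH branches below violate it.

options = {
    "pull the lever": {"deaths_by_action": 1, "deaths_by_inaction": 0},
    "do nothing":     {"deaths_by_action": 0, "deaths_by_inaction": 5},
}

def violates_first_law(deaths_by_action, deaths_by_inaction):
    return deaths_by_action > 0 or deaths_by_inaction > 0

permitted = [name for name, o in options.items() if not violates_first_law(**o)]
print(permitted)  # [] -- every option is forbidden; the robot is stumped

# A Zeroth-Law-style override (modeled as "minimize total deaths")
# breaks the deadlock, at the cost farseer4 points out above: it
# licenses harming individuals for the greater good.
best = min(options, key=lambda name: sum(options[name].values()))
print(best)  # "pull the lever"
```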

2

HRDBMW t1_j24bad1 wrote

Yes. And I do feel that way. But I also have rules I live by. One of the David Tennant Doctor Who episodes ("A Good Man Goes to War", I think?) touches on who needs those rules, and why.

And no, I'm not Dexter.

1