[deleted] t1_j74hh9v wrote

Cancel the AI project, some dude on reddit can predict by zip codes. Well, I guess that one is done! (joking!)

Feelings are important? Yes, they are, and that is why we should have real humans, with real families and real life experience, acting as judges and juries; my reasoning follows.

But the Tech sector DOES employ people who fit the culture, just not in the way you suggest. Take a wild guess at how many people employed in Silicon Valley vote the same way, feel the same about Trans issues, feel the same about gun control, feel the same about Christianity, feel the same about abortion.

THIS is the key problem: the AI is being developed and maintained exclusively by this group. Let's say they make up half of the population, so where does that lead?

I feel AI is incredible, but I really think it needs to be given bounds: building better mousetraps (or cars, planes, energy generation, crop rotation, etc.), NOT making decisions directly for human beings.

−1

Fake_William_Shatner t1_j77j8u5 wrote

>Take a wild guess at how many people employed in Silicon Valley vote the same way, feel the same about Trans issues, feel the same about gun control, feel the same about Christianity, feel the same about abortion.

They vote the way educated people tend to vote. Yes -- it's a huge monoculture of educated people eschewing people who ascribe the workings of light switches to fairy magic.

>THIS is the key problem,

No, it's thinking like yours that is the key problem when using a TOOL for answers. Let's say the answer to the Universe and everything is 42. NOW, what do you do with that?

>NOT making decisions directly for human beings.

That I agree with. But not taking advantage of AI to plan better is a huge waste. There is no putting this genie back in the bottle. So the question isn't "AI or not AI"; the question is: what rules are we going to live by, and how do we integrate with it? Who gets the inventions of AI?

It's the same problem with allowing a patent on DNA. The concept of the COMMON GOOD, and where this goes in the future, has to take priority over "rewarding" someone who owns the AI device some geek made for them.

1