Comments


sheerun t1_j1y5y65 wrote

For the near future, AI will continue to be a tool. It's like your children owning a metal foundry while you are still blacksmithing.

1

[deleted] t1_j1yah3f wrote

[deleted]

1

Bodhigomo OP t1_j1z0g1o wrote

I agree, but I didn’t say that. I said AI leaves just before humanity reaches its pinnacle. Big difference.

1

a4mula t1_j1y13hg wrote

Machines do not take jobs from people. Machines do not do anything; machines are inert. I get that I'm having to say this a lot lately, a dead horse being beaten.

But it's clear people still aren't getting it.

Jobs may be reduced as a result of interactions of humans and machines. Entirely fair assessment. Needs to be discussed.

Jobs will never be reduced because of a machine. Machines are inert. They do nothing without a Human.

We're not the proud fathers; these machines are not living. They cannot replace us. They have no sentience, no understanding, no intelligence, no awareness, no consciousness, no free will or desire or want.

And to that tired line of "yeah, but they will":

Come back then. But don't hold your breath, because we can't even accurately describe any of the words being used, let alone embed them in machines.

−4

GeneralZain t1_j1y41sy wrote

room temp IQ take.

this is almost as compelling as "we are already in the singularity because dreams!"

or maybe "robots aren't real so they can't take jobs" is more your speed?

yikes.

5

a4mula t1_j1y4etr wrote

The difference is I could argue for already being in the singularity.

Unlike intelligence, sentience, or awareness, there are actually decent definitions for it. They vary, but one is a point at which the pace of technological growth is faster than any human can keep up with.

To that, I'd say we're well within the bounds of the definition.

The other words? Anyone who thinks that more data, different modalities, or different algorithms can introduce these concepts just doesn't understand them. That's okay; none of us do. But there are some, like Chalmers with his Hard Problem, or Claude Shannon with information theory, or just plain common sense, that should tell us that if we cannot define our own consciousness, we're probably not in a position to replicate it.

−1

icest0 t1_j1y66nd wrote

It's like you're using ChatGPT to answer, but the prompt is "convince people to believe that machines and AI won't replace people, but do it like you have 30 IQ, act like most people won't understand, and talk like old people in a religious Facebook group."

1

a4mula t1_j1y73jr wrote

So did you understand?

No? Didn't think so. Let's try again:

>Hello assistant. I'd like to explain the importance of not anthropomorphizing machines so that even a trained monkey could understand, if that's not available, perhaps just in terms that your average Redditor might accept.

Sure, I can certainly try to explain the importance of not anthropomorphizing machines in a way that is easy to understand and acceptable to a wide audience.

Anthropomorphizing machines means attributing human-like qualities or characteristics to them. This can happen when we treat machines as if they have thoughts, feelings, or intentions like humans do.

There are a few reasons why it is important to avoid anthropomorphizing machines.

First, it can lead to unrealistic expectations about what machines can do. If we think of a machine as having human-like qualities, we may expect it to behave in ways that are not possible given its actual capabilities. This can lead to frustration and disappointment when the machine doesn't perform as we expect it to.

Second, anthropomorphizing machines can also lead to a lack of caution in how we use them. If we think of a machine as being capable of human-like decision-making, we may be less careful about how we program it or how we use it, which can lead to unintended consequences.

Finally, anthropomorphizing machines can also create social and ethical problems. If we start to think of machines as being similar to humans, we may be less likely to consider the ethical implications of using them in certain ways or for certain tasks.

Overall, it is important to remember that machines are just tools designed to perform specific tasks. They do not have the same capabilities or consciousness as humans, and it is important to treat them accordingly.

1

[deleted] t1_j1y9tke wrote

[deleted]

1

a4mula t1_j1ya90r wrote

It's Reddit, it's what those that can't, do. I get that I'm an asshole, it's not a secret everyone but me knows. I also get that it's not always what you say, but how you say it.

I'm working on it. But fuck me if, as I attempt to clarify my language, it doesn't just get more and more difficult to comprehend.

It seems a strange paradox of information: the more concise and less ambiguous it becomes, the less people seem to get it.

1