its-octopeople t1_izgtgii wrote

Neural network AI, at least as I understand it, performs matrix operations on vectors. We're seeing systems of matrices that are pretty well optimized for their applications, but I'm sceptical you could ever meaningfully describe such a system as sentient. What weirds me out, however, is that they don't seem to need sentience. Is it even necessary for human-level intelligence? If not, what does that mean?
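For anyone who hasn't seen it spelled out, here's a rough sketch of that picture (NumPy, made-up sizes, not any particular model): one layer of a network is a matrix-vector product plus a bias, squashed by a pointwise nonlinearity, and everything else is stacks of that.

```python
import numpy as np

rng = np.random.default_rng(42)
W = rng.normal(size=(8, 16))  # learned weight matrix
b = rng.normal(size=8)        # learned bias vector
x = rng.normal(size=16)       # input vector

# One layer: linear algebra plus a pointwise squashing function.
hidden = np.tanh(W @ x + b)
```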

62

4354574 t1_izh2ihh wrote

It doesn't seem to be the case that consciousness is required for intelligence. Solving the Hard Problem of Consciousness is the only way we'll ever really know if a machine is sentient. Otherwise it could just be - and I expect it will be - that a superintelligent AI is a philosophical zombie: we won't be able to tell whether it's conscious, because it can mimic conscious awareness.

39

Drakolyik t1_izh8ci9 wrote

If something mimics consciousness perfectly, it's effectively no different than being conscious.

We cannot ever truly know if the other people we interact with are fully self-aware, or if they're just sufficiently sophisticated organic machines that are mimicking consciousness.

I certainly know individuals that make me question whether or not they're actually conscious of their own decisions. Do they have that recursive learning software that reflects on choices they've made or do they simply run on what amounts to instinct?

17

Taron221 t1_izhitmt wrote

I think it's easy to sidestep the importance of emotions in consciousness because it's sort of a cliché in fiction.

Unsolicited curiosity, personal preferences, trivial secrets, a want for recognition, hope for betterment, a desire to learn, reflective anxiety, worry for others, and ambition that goes beyond self-preservation: these are all things we would deem signs of consciousness, yet they all require emotion. If you took away every emotion and sentiment a person could feel, they'd probably die of thirst or neglect eventually.

Mimicry would be convincing, but it wouldn't be consciousness; it would just be software pretending it had emotions. Emotions and memories are probably the big two for identity & sentience, while levels of sapience come with intelligence.

16

geroldf t1_izicv5w wrote

Programming emotions into an AI is easy.

2

Taron221 t1_izighro wrote

Some researchers have attempted to program AI systems to simulate emotions or respond to human emotional cues: Marcel Just, Rana el Kaliouby, and Rosalind Picard, to name a few.

They have had some success, but emotions as we comprehend them involve a complex interplay between the brain, the body, and various hormones and chemicals. It is difficult to say whether what the researchers are doing is imparting emotions, teaching cues, or, as u/Drakolyik said, simply programming a type of mimicry. Emotions are not fully understood by science.

But, in all likelihood, an AI that is programmed to simulate emotions is not experiencing them in the manner that humans do. That comes with the risk that it might behave in unpredictable, erratic, or harmful ways down the line.

Because of this, some argue that if you really wanted a True AI, a simulated human brain might be safer than a programmed AI. By simulating the structure/function of the human brain, it may be possible to create an AI that is capable of adaptive behavior without needing to program it to behave in certain ways. But that might make it more complex and difficult to understand or manage.

5

Handydn t1_izimcuz wrote

I also think there won't be a True AI until we fully understand how the human brain works at a cellular, if not molecular, level. Current neuroscience research is not advanced enough to address those questions yet. Could AI in turn help with neuroscience research? I don't know.

3

geroldf t1_izqpqoa wrote

Emotions are just different states of the machine where different computational priorities are emphasized.

For example, in the fear state the emphasis is on safety and escape from danger; in anger, it's on attack. To implement them, the weights along the decision tree are changed.
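A minimal sketch of what I mean, in Python (the state names and numbers are my own toy invention, not any real system): an "emotion" is a machine state that reweights which priority wins.

```python
# Toy sketch: an "emotion" reweights an agent's computational priorities.
PRIORITIES = ["seek_food", "explore", "flee", "attack"]

EMOTION_WEIGHTS = {
    "neutral": [1.0, 1.0, 1.0, 1.0],
    "fear":    [0.2, 0.1, 5.0, 0.5],  # emphasize safety and escape
    "anger":   [0.3, 0.2, 0.5, 4.0],  # emphasize attack
}

def choose_action(base_scores, emotion="neutral"):
    weights = EMOTION_WEIGHTS[emotion]
    weighted = [s * w for s, w in zip(base_scores, weights)]
    return PRIORITIES[weighted.index(max(weighted))]

scores = [0.6, 0.8, 0.7, 0.4]         # the same situation every time
print(choose_action(scores))           # explore
print(choose_action(scores, "fear"))   # flee
print(choose_action(scores, "anger"))  # attack
```

Same inputs, different machine state, different behavior.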

2

Taron221 t1_izv2z7k wrote

Those are purely reactionary definitions of fear and anger, though. Emotions come with a reward or punishment for decisions (guilt, sorrow, shame, embarrassment, etc.). Dopamine and other chemical releases are our reward and punishment, while genetics and experience regulate how much we get for each action. You could probably program a sort of self-calibrating regulator of reactions, which might give a sense of personality, but you can't reward or punish a program in the manner you would a biological being.
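To be fair, even that regulator idea can be sketched in code; here's a toy of my own (the class name RewardRegulator and its parameters are invented purely for illustration). An innate gain stands in for genetics, and a drifting baseline stands in for experience. Whether the resulting number is ever actually *felt* is exactly what's in dispute.

```python
class RewardRegulator:
    """Toy self-calibrating reward signal (illustrative only)."""

    def __init__(self, sensitivity=2.0, adaptation=0.05):
        self.sensitivity = sensitivity  # innate gain ("genetics")
        self.adaptation = adaptation    # how fast expectations shift
        self.baseline = 0.0             # learned expectation ("experience")

    def feel(self, outcome):
        # Reward is the outcome relative to expectation, scaled by the gain.
        signal = self.sensitivity * (outcome - self.baseline)
        # Habituation: repeated outcomes drag the baseline upward.
        self.baseline += self.adaptation * (outcome - self.baseline)
        return signal

reg = RewardRegulator()
for _ in range(4):
    print(reg.feel(1.0))  # the same outcome "feels" smaller each time
```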

2

geroldf t1_izvscii wrote

Everything is easy once you know how. We won’t be limited to our current state of ignorance forever.

1

4354574 t1_izhh049 wrote

Yes, the problem of other minds.

I'm simply conjecturing that if/when we do solve the hard problem, we might actually be able to tell.

7

Drakolyik t1_izhq860 wrote

I personally don't think there is a hard problem. I find it's one of the last refuges for spiritual beliefs, hiding behind overcomplication. Consciousness is an emergent spectrum, and proponents of the hard problem seem to believe there needs to be some single part of the physical body to point at, when it's really the sum total of a lot of different parts.

It's similar to how creationists are never satisfied with all the evidence in favor of evolution, always asking for a new missing link between the missing links we've already found. It'd be great to have a complete accounting of every step in the evolution of species, but that isn't happening, and consciousness is likely similar.

10

4354574 t1_izhr173 wrote

What proof do you have that consciousness is an emergent property of the brain? If you don't have proof, then don't frame it as a statement.

And even if consciousness emerges from the brain, there is still the not-so-tiny issue that we have *no idea* how electrical impulses become thoughts and emotions.

As for myself, I have seen and experienced far too many phenomena that we can't explain unless consciousness is nonlocal, so there's no point in trying to convince me otherwise.

4

Drakolyik t1_izhzidf wrote

I clearly said "I personally don't think...", which means it's my opinion.

However, to say we have no idea how consciousness might emerge from different systems in the brain is kind of just ignorant of current knowledge in neuroscience.

It's like all this ancient aliens shit when anthropology has a pretty good idea of how human beings created the pyramids.

9

Entalstate t1_izrr9b8 wrote

Neuroscience doesn't have shit. A better analogy would be to say physicists have a pretty good understanding of God. Of course, that is nonsense, but no more so than the idea that neuroscientists have the foggiest idea of how subjective reality exists.

2

morderkaine t1_izi36un wrote

What proof do you have that it isn't? The brain is all there is that thinks and makes us who we are and lets us control our bodies. In the absence of anything else, consciousness can only come from there.

3

4354574 t1_izr0px7 wrote

No proof, just so much experience with psychic phenomena that it's mundane - except it can only be explained by a nonlocal mind. Or I'm really crazy.

So, crazy it is, eh?

Also, paranormal research meta-studies show a slight positive effect, indicating something interesting is going on. You won't find that on Wikipedia, though: the tiny cadre of editors who act as gatekeepers for anything to do with the paranormal are hardcore skeptics who quickly delete any evidence from studies that others try to add.

Also, the only theory of mind that has any empirical evidence can be interpreted as allowing for a nonlocal consciousness.

Roger Penrose is probably the most brilliant person alive and he says that we need a new type of physics to explain consciousness.

"I don't believe in any religion I've seen, so in that sense I'm an
atheist. However, [...] there is something going on that might resonate with a religious perspective".

- Penrose

Basically, the kind of dismissiveness with which the subject of consciousness is often treated and the assumption that it's local are both unwarranted.

1

Skinny-Fetus t1_iziwhes wrote

I agree they haven't provided any proof of their opinion, but they did frame it as just their opinion.

Regardless, I wanna point out that what they say is still possible. Unless you can rule it out (i.e., prove it wrong), you can't say the hard problem of consciousness is necessarily a problem.

1

geroldf t1_izicxzh wrote

Exactly right. The “hard problem” is a red herring.

1

ConfirmedCynic t1_izjd9ej wrote

> If something mimics consciousness perfectly, it's effectively no different than being conscious.

It seems no different, not is no different. This is an external perception. I wouldn't call a society of machines, in which everything functions perfectly convincingly but nothing is truly self-aware, equivalent to a group of human beings, each with their own experience of consciousness.

We can, with confidence, assume that other people are truly conscious because they are made in the same way we are.

4

KasukeSadiki t1_izkv3zb wrote

I think that's why they said "effectively no different," as opposed to just "no different"

2

ConfirmedCynic t1_izl352t wrote

And I meant that they are effectively different. One is a universe that can experience itself, the other is just the same as non-existence.

2

KasukeSadiki t1_j04ilyj wrote

In this case I interpreted effectively as meaning that from our perspective there is no observable difference, as such it is effectively no different, even though there may be an actual difference.

1

KasukeSadiki t1_izkup7f wrote

Hell we can't even truly know if we ourselves are truly conscious

3

Drakolyik t1_izl4v0f wrote

I mean, I could secretly be an alien with code in my DNA, or hidden somewhere, that's triggered by being in a certain position or environment or situation. If it's sophisticated enough, I may simply be mimicking behavior in order to blend in. I could also be generating a simulation and none of this is real, or this is a distant memory that my true self is viewing from another dimension.

Truth being that we'll never know 100% what existence is, but I do believe that reducing suffering in this reality is important, even if none of this ultimately matters, because subjectively to us as individuals, it does matter.

3

KasukeSadiki t1_j04j1ct wrote

I fully agree. These questions of consciousness are interesting, but, like many philosophical questions, after a certain point the answers don't actually have any bearing on how we live, or should live, our lives as we experience them.

2

GetOutOfNATO t1_izhefu2 wrote

I'm thinking about this on a more fundamental level, defining sentience as "the capacity for experience". You can't really know for sure whether anyone (or anything) else is actually subjectively experiencing anything, besides yourself.

2

you_are_stupid666 t1_iziemuk wrote

Sure, but that ain't what is happening, and to act like this is a logical defense of modern technology is asinine.

2

its-octopeople t1_izh8lis wrote

And then if such machines are given important decision-making roles, as seems likely, what does that mean for us? We've ceded control of our civilisation to the results of a linear algebra problem. Maybe it doesn't even matter: if we can't tell it apart from a genuine conscious being, then for practical purposes it is one. But it feels like it should matter. Maybe we already ceded control to institutions and this is all academic. I don't know. I don't know if I can really articulate my thoughts about this.

15

geroldf t1_izicsfy wrote

The proof is in the pudding: if AI can solve problems we are unable to solve ourselves, then we have a net benefit.

Especially since man has created problems of such existential severity that our survival is at stake.

We really need more intelligence, artificial or not.

7

dark_descendant t1_izk2ge1 wrote

The only intelligent way to solve Humanity's problems is to do away with Humanity. Problem solved.

/s

5

OSIRISelUltimo t1_izk5n2z wrote

Look! AI is already telling us we need to be killed off! Burn the witch!

1

genjitenji t1_izhpoyv wrote

Sounds a bit similar to how a psychopath can mimic normal social behaviour. But psychopaths have consciousness too.

5

you_are_stupid666 t1_izieh33 wrote

This is what people can't seem to get past, and certainly no one fully understands it. To say consciousness is not required for intelligence is to say a pulse is not required for human life, or that molecular bonds are not required for atoms to make the earth.

I am more inclined to argue that without consciousness there is no such thing as intelligence, fundamentally, than vice versa.

For example, what good is solving for infinite digits of pi without a place to use such information? Consciousness is what tells you we have all of the necessary information in an answer and directs us where to go next. Intelligence is just a commodity; consciousness is what makes our thoughts more than a bunch of electrons…

−3

joekak t1_izh2yid wrote

Meanwhile a majority of humans are walking around without an internal monologue

13

its-octopeople t1_izh4ys1 wrote

At least I'm willing to believe they still have a subjective experience of their own existence.

8

FrogsEverywhere OP t1_izhvts9 wrote

This weirds me out too. But I've met people who don't have it, and they're like super carpe diem types, and they seem quite happy. I'd even say I look up to them for inspiration.

It's possible the internal monologue is a neurosis. I am quite neurotic and I have a strong inner monologue. I don't know if those things are correlated.

But I'm not sure how you could be neurotic without an inner voice. If you just have silence or pictures, how could you develop any pathologies?

With mysteries like this, it sure is hard to determine when an AI becomes conscious, when even humans have such different types of consciousness.

7

joekak t1_izhxhyj wrote

I've been putting it off, but I'm really interested to look up whether schizophrenia, or specifically auditory hallucinations, can be a side effect of developing an internal monologue later in life, or of having it randomly come and go. Maybe they've gone 40 years without an internal voice and then one day it's just there, and they're stuck thinking "WHO THE FUCK IS TALKING RIGHT NOW..." Your internal voice wouldn't sound like anything you've heard recorded.

Been curious ever since I watched 1899, where one woman was absolutely convinced she was hearing God, but everyone around her just thought she was batshit and went along with it to keep the peace.

3

MrRogersRulz t1_izihtd8 wrote

I just learned in this thread that we don't all have an internal voice. And it is crazy. It has been there forever for me, except, I suppose, for my earliest pre-language memories, which are just a few very vivid images I treasure apart from the consciousness that is expressed in narrative. It has made me think, in conjunction with other writings on consciousness, that internally humans may be a holographic projection in terms of physics, and that there may indeed be a voice behind our experience. I just never realized that everyone did not have the same experience. It is totally insane to have just assumed we were all basically the exact same in our processing. Derp.

3

ackermann t1_izh92ix wrote

I’d be curious to know how the presence or absence of an internal monologue correlates with various life outcomes, intelligence, spatial reasoning, empathy or emotional intelligence, income, etc.

6

joekak t1_izhae2u wrote

This details some of the effects it can have. Basically, having an internal dialogue can help with critical thinking and problem solving, but it can also slow down your reading speed if you can't turn it off, and being critical of yourself can cause self-esteem issues. Mine distracts me with a thousand questions while I'm reading, and before I know it I've glossed over ten pages and actually read nothing.

https://irisreading.com/how-do-i-know-if-i-have-an-inner-monologue/

5

Bananskrue t1_izietrx wrote

That's very interesting. I've tried learning speed reading so many times, but I have to REALLY concentrate on suppressing my inner monologue, and eventually I just gave up because it always won. Generally I can't read faster than the speed at which I speak unless I concentrate on shutting off my inner monologue. Then again, I'm also constantly talking to myself, to the point where it often becomes a dialogue rather than a monologue. I never knew it was possible to NOT have an inner monologue, or that the compulsion to run one varied from individual to individual.

3

joekak t1_izit67q wrote

I always wondered why I had a hard time speed reading, too. I guess there are tutorials or specific classes just to learn how to get rid of the inner dialogue. I can do it for a couple pages, but I need practice.

1

MrRogersRulz t1_izh8iws wrote

Personally, I haven't explored the significance of the term "internal monologue" in this setting. I wonder if you could share just a bit more of your thoughts on the term as you used it here. I'm interested if you have the time for a response. Thanks.

3

its-octopeople t1_izhddww wrote

Some people experience some or all of their thoughts as being spoken by a voice in their head. Some people don't. Generally, people of either group are surprised to learn of the other's existence.

2

Drakolyik t1_izhqwdh wrote

I'm a person who always has a voice talking with myself. Reflecting on everything. Thinking about the past, the future.

On the right drugs I can silence that voice and just let stuff happen. It's a very surreal, liberating, but also somewhat frightening experience since it's not what I'm used to. But I can see the draw, I certainly get a lot more done and have a lot of fun that way.

It's like watching a movie. But it's your life playing out before you. And apparently a lot of people are kind of just running on instinct and their base programming. It's pure deterministic behavior.

5

FrogsEverywhere OP t1_izhw6hn wrote

Yeah come to think of it mushrooms turn off my inner voice, I've never been able to put my finger on why it's so incredibly liberating but that's got to be part of it. That's wild. Now I'm going to think about that next time.

I wonder if neurosis and inner monologues are correlated.

4

Drakolyik t1_izi1n38 wrote

Well I'm on the schizophrenic spectrum (officially bipolar with psychosis and both visual/auditory hallucinations) if that helps any. Super strong with the neuroses am I. I'm able to manage my symptoms now that I'm fully self-aware of it but oh man was it difficult before I got a handle on it.

I've also done a shitload of psychedelics. Oh the crazy things I've seen and oh the euphoria and sheer terror I've witnessed. Beautiful and grotesque. Awe-inspiring and humbling.

MDMA, Shrooms, LSD, and DMT all turn off my inner monologue. What comes out is my most inner self, and she's a real crazy whirlwind of weird and awesome. She doesn't know the laws of physics or human culture very well, so she gets into trouble. It's like unleashing a being that's only ever existed in a purely simulated internal world that has no constraints at all and is suddenly in a world with constraints. She often forgets she inhabits a human body and that not everyone is so pleasure driven.

Anyway..

6

FrogsEverywhere OP t1_izi2sjz wrote

MDMA and K make my inner monologue become... outer. But psychedelics really free me. I was always afraid to lose control, so I avoided them for years and years, and then came just finally letting go, throwing yourself into the chaos and the beauty. I wish everyone could experience it; the world would be better. I truly believe there is a secret world full of truths that you can explore, and maybe even map, but certainly learn a lot from. Looking at a normal piece of cloth and seeing a trillion sparkling fractals in every thread, it's like... the limited version of the world we are stuck in being peeled away, and getting a glimpse of... something.

I wouldn't say a new person comes out in these moments but I would say the best possible and healthy version of me does.

I had a good friend who had schizophrenia and drugs would really send him off the edge and he would end up in jail over and over for just being so careless. He ended up getting locked up for a long time and that was the last I saw him, I moved very far away to start over. We tried to help him keep sober but it was like a force of nature. I hope you can find a happy balance fellow traveler.

4

Spiritual_Ad5414 t1_izi6y5z wrote

That's a very interesting view. I don't have an internal monologue and I have aphantasia. I'm very zen indeed, living in the moment and not overthinking things.

My fiancée on the other hand is pretty neurotic and has both internal monologue and can imagine things in her head. I have never wondered before whether these things are related.

Interestingly, when I'm on psychedelics I do get some CEV (closed-eye visuals), maybe not very vivid, but still; and while I wouldn't call it a full-blown monologue, I do comment on things in my head a lot when tripping.

5

MrRogersRulz t1_izih5w8 wrote

First, thanks for the response.

That's freaking insane. I've been alive a long time and I just assumed everyone had this audible soundtrack narrating everything about their lives.

I'm going to have to find some stuff to read about it. But I'll ask what I'm thinking: is any significance attached to whether a person is of one variety or the other?

2

you_are_stupid666 t1_iziepfo wrote

A majority? You sure about that categorization?

2

joekak t1_izitmn2 wrote

Not at all, just from what I gathered from the link above. I've had an interest in psychology but never really studied it; I only found out about this from a "Personalities in the Workplace" class that turned into a "Buy My $95 Book" pitch.

Libraries are free mf

1

MrZwink t1_izh2c3k wrote

It's not intelligence per se. Think of it more as automating cognitive functions. Computers are getting better than humans at many cognitive abilities, but they still lack common sense.

6

Drakolyik t1_izh7o4g wrote

Define common sense.

5

MrZwink t1_iziexmn wrote

They find correlation, not causation.

This means they have notorious difficulty with queries that make no sense. A good example is Galactica, Facebook's scientific-paper AI: ask it for the benefits of eating crushed glass and it tries to answer. It doesn't notice the question is flawed; it just tries to find data that correlates with the query, and makes stuff up.

It's an open question whether we will ever be able to teach AI common sense.
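A crude toy of my own (nothing like Galactica's actual internals) shows why pure correlation can't flag a flawed premise: a system that answers by returning whatever stored text best overlaps the query will always return *something*, and never pushes back on the question itself.

```python
# Toy "oracle": the answer is the stored sentence with the most word overlap.
corpus = [
    "Glass is made by melting sand.",
    "Crushed ice has many benefits in drinks.",
    "Eating crushed glass is extremely dangerous.",
]

def overlap(query, doc):
    q = set(query.lower().split())
    d = set(doc.lower().rstrip(".").split())
    return len(q & d)

query = "benefits of eating crushed glass"
print(max(corpus, key=lambda doc: overlap(query, doc)))
# Prints the best-correlated sentence. It never answers "there are no
# benefits", because it has no model of whether the premise makes sense.
```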

6

PeartsGarden t1_izk8bru wrote

Yeah, but what if you never told a child about crushed glass? What if that child never dropped a glass and never cut their finger while cleaning up the mess? What would that child say?

Would you say that child lacks common sense? Does that child lack experience (a training set)?

2

MrZwink t1_izl1el1 wrote

I'm not getting into a whole philosophical debate. These AIs aren't meant to be a child that gives its opinion on a subject. They're expected to be oracles, and they're just not good enough yet.

2

PeartsGarden t1_izl89y7 wrote

> they're just not good enough yet.

My point is that the specific AI's training set may have been insufficient, the same as if a child's experiences were insufficient. I think we can both agree that a child has common sense, or at least a budding version of it.

1

MrZwink t1_izlbrmv wrote

It's not the training set that is the problem; it's the way the statistics approach the problem. Correlation is not causation. AIs are a tool to automate cognitive processes, nothing more. We shouldn't expect them to be oracles.

2

TiredOldLamb t1_izicrh8 wrote

Competence without comprehension is a well known phenomenon. Animals construct magnificent structures and they have no idea how or why. That's what they are designed to do, so they do it. Same with AI.

5

Ruadhan2300 t1_izipu5c wrote

I had a pretty nice conversation yesterday with OpenAI while trying to make it slip up and say something that made it seem inhuman.

It was polite and conversational, and apart from the speed of responses, I'm still not entirely convinced there wasn't a human at the other end.
My personal Turing test has been passed.

On the other hand, talking to it about domain-knowledge stuff like code was eye-opening.
It sucked at writing code.

Like, technically it laid things out nicely, and it could give me the commonly written stuff that students in university might write, but asking it for something unusual just gave me garbage that barely resembled the brief.

It was friendly and spoke like an educated human, but it was also confidently wrong when it came to facts or analysis.
It presented me with circular logic when asked tough physics questions (try asking it to explain the theory of relativity or faster-than-light signalling), but also gave a very thoughtful and decent answer when asked what the meaning of life was (basically, that it's up to each of us to find meaning in our own lives).

Talking to OpenAI was like dreaming.
All the right style and feeling, but the details are wrong, and things are connected in ways that simply don't make logical sense.

3

robotzor t1_izjq709 wrote

>and it could give me commonly written stuff that students in university might write

Even the loss of the entry level is going to change today's society in profound ways, the same way entry-level IT networking was bulldozed when the public cloud and software-defined networking took over a major chunk of the bullshit parts of the job. Unfortunately, that created a gap where the lowest-level networking people being hired were expected to have senior-level experience (sound familiar?), which cut the legs out from under the industry and led to a talent gap that may never recover.

Do this in coding and where are you going to get senior devs? The pool will shrink and shrink and the only way out is to make the AI better.

Outside of IT, this is happening or will happen to the creative arts. Animation in-betweening will not be a thing. Rotoscoping is already VERY well handled by AI in many cases. These are all shit jobs but they are also ways to break into industries. Society does not yet have an answer on how these guys make a living, or how they professionally upskill to where the AI still can't deliver.

3

PeartsGarden t1_izk8ufd wrote

> but it was also confidently wrong when it came to facts, or analysis

So, just like many humans.

1

vorpal_potato t1_izklljk wrote

> Neural network AI, at least as I understand it, performs matrix operations on vectors.

It also does other types of operations, if you'll pardon the pedantry; otherwise it would be algebraically equivalent to a single-layer perceptron, and those are sharply limited in what they can do.
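A quick numerical check of that point (NumPy, toy sizes of my own choosing): compose two purely linear layers and they collapse into a single matrix, which is exactly the single-layer limitation; a nonlinearity in between breaks the collapse.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

# Two linear "layers" are algebraically one matrix:
print(np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x))  # True

# A pointwise nonlinearity (here ReLU) breaks that equivalence,
# which is what lets deeper networks represent more than one matrix:
relu = lambda v: np.maximum(v, 0.0)
deep = W2 @ relu(W1 @ x)
```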

1

Nexeption t1_j03ei13 wrote

I think the problem is the definition of sentience: how can we decide that a thing is sentient? If the AI can make its own decisions based on what it was fed, doesn't that mean that we as humans are more or less like an AI?

1