
69inthe619 t1_j9flqfe wrote


khantwigs t1_j9fnnyx wrote

Yeah, which you can't technically differentiate, so...


69inthe619 t1_j9fp3k4 wrote

Technically, differentiating is guaranteed: the machine was programmed to react to sensors, so there is either (1) code or (2) an algorithm. A machine does not take tangibles, 0s and 1s, and create intangibles, like depression. One of these things is not like the other.


dragonblade_94 t1_j9g02dl wrote

I think you need to clarify exactly what you mean by 'feeling,' as it seems like you keep swapping between physical pain and emotion, or at least treating one as if it requires the presence of the other.

I would also like to ask how exactly you would define a 'machine.' If we could recreate a human cell by cell, with full creative liberty over how the brain is built and functions, would it be a machine? Would this homunculus have 'feeling,' despite being functionally programmed by its creators? Is the qualification of a machine that it must be built from inorganic materials?


69inthe619 t1_j9g8r93 wrote

You can feel pain, a physical sensation. You can feel sad, an emotion. Not interchangeable, but they are both things you feel. One is tangible, one is not. A machine, or AI network, cannot feel pain; it can describe pain. Sensors and programming/algorithms acting as if they are nerves does not equal nerves. And believe it or not, the AI also cannot catch the flu and feel nauseous, because it doesn't have any parts that can be nauseous, nor does it have any living cells for the virus to infect and thereby make it nauseous. Can AI say it is sad about something? Of course. But can it be clinically depressed? No. AI chooses based on inputs/outputs; we do not choose to feel what we feel. That decision was made for us, not by us. And if you could build a human cell by cell, yes, that would be a "machine", a biological machine, and that would make you God, which would beg the question: wtf are you doing on reddit bruh?!


dragonblade_94 t1_j9gbwsi wrote

Your argument is honestly just a half-dozen ways of rephrasing "machines cannot feel" without really positing why any of your points lead to that conclusion.

>You can feel pain, a physical sensation. You can feel sad, an emotion. Not interchangeable, but they are both things you feel. One is tangible, one is not.

I feel the 'tangibility' question is vague to the point of being moot in this context. Both being stabbed and grieving over a loss are the brain processing signals caused by stimuli. I would classify one as a far more complex brain activity than the other, but I don't believe there's an inherent difference beyond which part of the brain is doing the processing. I can see a philosophical argument being made for their separation, but you would need to go into much more depth to explain why exactly the difference is valid.

> Sensors and programming/algorithms acting as if they are nerves does not equal nerves

Why? Legitimately, why would an apparatus that does the exact same job as nerves not be equivalent other than material makeup? Given your response to the homunculus example, I have to assume you don't consider the choice of materials to be important to the question, so I'm curious as to your reasoning.

> the AI also cannot catch the flu and feel nauseous, because it doesn't have any parts that can be nauseous, nor does it have any living cells for the virus to infect and thereby make it nauseous

I'm not sure I see the point here. Commenting on robotics being resistant to disease doesn't really mean much in a discussion about sentience.

>AI chooses based on inputs/outputs; we do not choose to feel what we feel. That decision was made for us, not by us.

From a purely deterministic perspective, this is exactly how humans operate as well: everything we think and feel is caused by chemical reactions and stimuli bound by the laws of physics. But that doesn't prevent us from feeling emotion; it just implies that the feeling was inevitable.

>And if you could build a human cell by cell, yes, that would be a "machine"

If I'm interpreting your argument correctly, is your view that the existence of an intentioned creator is what distinguishes an 'unfeeling' being from one that feels?


69inthe619 t1_j9gwo6j wrote

Extraordinary claims require extraordinary evidence. The burden is on you to provide any evidence whatsoever that the only requirement for being able to feel and express emotions, both tangible and intangible, love or pain or both simultaneously, is a sensor and raw data. By your logic, the only thing the world needs to permanently solve all depression everywhere is raw data on happiness, because we already have senses, and your logic says those are exactly equal to sensors. E = mc². If you can turn mass into energy, the opposite is also true: you can turn energy into mass. That is what "=" means.
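(The rearrangement the commenter invokes here is easy to demonstrate numerically; a minimal Python sketch, with a purely illustrative 1 kg mass that is not part of the original comment:)

```python
# Minimal numeric check of E = m * c**2 read in both directions.
c = 299_792_458.0          # speed of light in m/s (exact by definition)
m = 1.0                    # an illustrative mass in kg

E = m * c**2               # mass -> energy: ~8.99e16 joules
m_back = E / c**2          # energy -> mass: the rearranged equation

assert abs(m_back - m) < 1e-9   # "=" is symmetric: we recover the mass
print(f"E = {E:.3e} J, recovered m = {m_back} kg")
```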


dragonblade_94 t1_j9h51sq wrote

You're shoving a lot of words in my mouth.

>The burden is on you to provide any evidence whatsoever that the only requirement for being able to feel and express emotions, both tangible and intangible, love or pain or both simultaneously, is a sensor and raw data.

I'm not here to play burden tennis with you, especially not in a heavily philosophical debate based on theoretical technology. Nor did I ever list the requirements of emotion as "a sensor and raw data." The basis of my position is the idea that there is nothing inherent and exclusive about the human body that requires natural reproduction to be made manifest. This was the intent behind my homunculus example; I want to know what you consider to be the defining difference, whether it be the existence of a soul, a creator, free will, whatever.

>By your logic, the only thing the world needs to permanently solve all depression everywhere is raw data on happiness, because we already have senses

From a causal deterministic standpoint, yes, it would be theoretically possible to control a person's emotions using controlled stimuli. This idea falls adjacent to Laplace's demon, a concept describing a theoretical entity that knows the state of every atom in the universe and can therefore perfectly tell the future. Such an entity could in theory use the same knowledge to make adjustments in a way that changes a person's thought process.

The problem here is twofold. First, we simply don't have the level of technology needed to fully map and understand the structure of the human brain; to affect the brain toward a precise outcome, we would need to know exactly how it works down to the smallest scope. Second, we would need a computer or other entity capable of storing and simulating not only a given brain, but every possible interaction and stimulus that could affect it (essentially the universe). Short of some fantastical technological revelation, this makes perfect prediction and control virtually impossible.
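(A minimal sketch of that deterministic picture, with a made-up update rule standing in for physics; nothing here models a real brain:)

```python
# Toy deterministic "mind": an observer who knows the initial state and
# every stimulus can replay the physics and predict (or steer) the
# outcome exactly. The update rule is invented for illustration.

def step(state: int, stimulus: int) -> int:
    """Stand-in for a deterministic physical law."""
    return (31 * state + stimulus) % 1_000_003

def run(initial_state: int, stimuli: list[int]) -> int:
    state = initial_state
    for s in stimuli:
        state = step(state, s)
    return state

lived = run(42, [3, 1, 4, 1, 5])      # what the "brain" actually does
predicted = run(42, [3, 1, 4, 1, 5])  # the demon's replay of the same inputs
assert lived == predicted             # same state + same stimuli -> same outcome
```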

What we can do, though, is crudely alter the chemicals in the brain and body to simulate different states of mind. Medications such as antidepressants contain chemicals that, when received by the receptors in your brain, forcibly shift your brain function. Chemicals and electrical impulses would be our equivalent of internal 'data.'

>and your logic says those are exactly equal to sensors

Again, I never said they were exactly equal, but rather that they can be created to serve the same purpose. I wouldn't even consider this controversial; the existence of bionic eyes and cochlear implants, allowing blind and deaf people to see and hear, grounds this in present reality.


69inthe619 t1_j9ix0tn wrote

This is not a philosophical or scientific debate, or even a debate at all, if you consider organic and inorganic matter to be all the same. If they were the same, there would be no difference, but there is. By refusing to acknowledge that there is anything special about life, which we have yet to find anywhere else in the universe, you have oversimplified to the point where you are trying to recreate the Mona Lisa by finger painting. A $50 bag of inorganic matter from an abandoned Radio Shack clearance bin cannot be magically converted by an inorganic AI to have the necessary qualities of organic matter while remaining inorganic and spontaneously producing feelings and emotions, and let's not forget the leap you take by jumping from the simple input of data to generating sentience.

Are you going to say the only thing sentience requires is the same data that emotion requires? If you think that, congratulations, you know absolutely nothing about quantum physics, where everything exists as only a probability wave. It is there, where quantum probability intersects with the organic matter of your brain in the physical world, that sentience is possible. Sentience is not something that came to be just for fun; life does not expend energy making unnecessary things, like five-legged humans. It evolves what it needs to survive out of sheer do-or-die necessity. And sentience is absolutely essential for navigating a universe where, at the foundation, there is only a probability that something is, and also a probability that it is not. You had better be able to handle any outcome simultaneously so you can do basic things, like make a split-second decision that determines whether you escape certain death and live to reproduce, or die. The one thing life cannot afford is death before reproduction. In that context, sentience is the only thing separating successful reproduction, and the continuation of the cycle of life, from death before reproduction and the end of life. But hey, sentience will just happen if you read enough books, right?

And no, a quantum computer does not recreate the quantum mechanics happening in the organic matter of your brain, so that is not a save for you. Also no, quantum computing is not the answer to every problem everywhere, because there is an entire set of mathematical problems where the number of pathways to the answer explodes so fast that computing it exactly is not practical to begin with (e.g., the traveling salesman problem). That blow-up makes computers compute practically forever, and that means no soup for you or your AI. And this doesn't take into account all of the equations and constants baked into the universe that prevent it from being computable in any practical way. Think pi: 3.14…, a number whose digits continue infinitely and which shows up across the universe in every single sphere in existence. And gosh darn it, there is that infinity thing that makes computers compute infinitely. Now you have to start all over, again.
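(For scale, a minimal brute-force sketch of the traveling salesman point, with made-up coordinates: the exact answer is computable for a handful of cities, but the number of tours grows factorially, which is the blow-up being described:)

```python
# Brute-force traveling salesman: exact for tiny inputs, hopeless at
# scale because the number of distinct tours grows as (n-1)!.
from itertools import permutations
from math import dist, factorial

cities = [(0, 0), (2, 1), (5, 3), (1, 4), (6, 0)]  # made-up coordinates
n = len(cities)

def tour_length(rest):
    route = (0,) + rest  # fix city 0 as the starting point
    legs = zip(route, route[1:] + route[:1])  # includes the return leg
    return sum(dist(cities[a], cities[b]) for a, b in legs)

best = min(permutations(range(1, n)), key=tour_length)
print(f"best tour: 0 -> {best}, length {tour_length(best):.2f}")
print(f"tours for n={n}: {factorial(n - 1)}; for n=20: {factorial(19):,}")
```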

If acting like it and being it are no different, then Han Solo is a living, breathing, actually-thinking human being and not Harrison Ford, because I have seen the Star Wars movies and he is acting like it is real, so by golly it is just as real as reality, just like the AI acting like it has feelings is just as real as having feelings. Who taps out in a pain-endurance contest between you and your AI that read about pain in a book so it can provide a pained response to a hypothetical pain? I imagine you have a 0.001% chance of outlasting the AI, but not because of you, or even the AI; you get that 0.001% because the quantum element of this universe makes it impossible for there ever to be 100% certainty of any future outcome. But hey, at least you have sentience, just like that bag of scrap from Radio Shack (once you get around to watering it with data).
