streetvoyager t1_jacqnnr wrote
This seems like the seed for a dystopian cyborg future.
nexusgmail t1_jacwie7 wrote
Imagine if those cells were even somewhat aware, and were forced into repetitive number crunching with no means to understand the cause of their bondage, to ever escape, or even to die. It would make for quite the horrific reveal at the end of a horror movie.
Wandering-Zoroaster t1_jad2jtj wrote
I think you mean self-aware?
It’s an interesting question. That said, whatever sentience they did or didn’t have would arise from completely different circumstances than the ones that produced us humans, so it’s fair to say it probably wouldn’t behave like a human or have human desires.
nexusgmail t1_jaeiold wrote
Yes: self-aware.
I would argue that all living things have the same desires you might call "human", albeit simplified, and likely without the added complexity made necessary via the perception of tribe or familial group as an extension of self. Literally every single human desire is tied to survival via the neuronal survival-mechanism of the brain. Can you find a single thought you've had today that isn't (even loosely) related to survival/procreation? We are almost constantly attempting to seek out safety/security, comfort, and control; and to avoid danger, discomfort, or uncertainty. I'm not sure what "behave like a human" is specifically referring to, but I can certainly see animals following the same survival urges that we do.
I do agree that, in this imagined scenario, the sentience might develop differently than we can see in ourselves: having different parameters in which to define its sense of self/identity, and its survival-mechanism movements might be calibrated via a difference in perspective and in the definition of its own sense of identity.
I'm not, of course, saying this is all so: but I imagine it would be somewhat unethical, even arrogant, not to consider the possibility.
Strategy_pan t1_jadl7nw wrote
Maybe the cells would try to imagine a whole new universe just to entertain themselv... Oh wait.
SnoDragon t1_jadyn6t wrote
200 quatloos on the newcomer!
nexusgmail t1_jaej8rj wrote
I couldn't agree more! I imagine humans creating massive architectures of this organic technology, before going extinct and leaving it all in the hands of AI, who eventually abandon it and leave it to its own devices in this way. Universes within universes within awareness.
[deleted] t1_jacy534 wrote
[removed]
BuckyRB6 t1_jaf4l65 wrote
The Hive Mind is coming.
SeaworthinessFirm653 t1_jadd3a0 wrote
Consciousness is logically computable. Consciousness is defined by architecture, not by whether something is organic or responds to electric pulses. You can theoretically store consciousness on a computer as a program with sufficient input/output.
Worrying about nerve cells becoming conscious is a little bit of a misdirected concern. Advanced AI deep learning architectures are far more concerning.
Crazy-Car-5186 t1_jaddq0h wrote
Asserting a belief isn't enriching the discussion without offering testable points
SeaworthinessFirm653 t1_jadelan wrote
Consciousness is a function whose input is environmental stimulus and whose output is a cyclical thought, and/or a physical action (muscle contraction). The more environmental-semantic information this entity encodes in its memory, the more “conscious” it is, but consciousness is not binary.
Logic gates form if/then statements that, when assembled together, create a system of behavior that acts in somewhat logical ways. Human biological neurons form these same structures.
Consciousness inherently requires at least some memory, input, and processing. Every neuron in the human brain is technically computable because it’s just input and output of electrical signals.
A nerve cell is effectively just an analog neuron with a few extra properties. It’s not logical to assume that consciousness is just a bundle of nerve cells. It’s a very architecturally-dependent bundle of if/then clauses and memory that, when combined, simulates consciousness.
If a system can be described by if/then, then it is computable.
Also, if you cut a living brain in half, it ceases to be conscious. The reason is that the architecture becomes incoherent. When you are asleep (besides REM/dreaming), you are also unconscious.
Regardless, all my points to say: consciousness is computable through architecture, not simply through nerve cells. Biological human nerve cells are neither necessary nor sufficient for consciousness.
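The "neurons as assembled if/then gates" claim can be made concrete with a toy model. This is a minimal sketch, not a claim about how real nerve cells work: each "neuron" here is a McCulloch-Pitts-style threshold unit, and the weights and thresholds are illustrative choices, not biological values.

```python
def neuron(inputs, weights, threshold):
    """A neuron as an if/then rule: fire (1) if the weighted
    sum of inputs reaches the threshold, otherwise stay silent (0)."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# A single threshold neuron can act as a basic logic gate...
def AND(a, b): return neuron([a, b], [1, 1], 2)
def OR(a, b):  return neuron([a, b], [1, 1], 1)
def NOT(a):    return neuron([a], [-1], 0)

# ...and assembling gates yields behavior no single neuron has,
# e.g. XOR, which is not computable by one threshold unit alone:
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

print([XOR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```

The point of the sketch is just the argument above: if each unit is describable as an if/then rule over its inputs, then any network built from such units is computable, regardless of whether the units are transistors or cells.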
Sex4Vespene t1_jaefg4j wrote
As somebody with a degree in neuroscience, you are so out of your depth. I understand the logic behind how you got there, but it is wildly inaccurate.
[deleted] t1_jae54t9 wrote
[removed]
Taido_Myoshin t1_jad6hka wrote
If I'm not mistaken, I believe this was the original plot for The Matrix before they changed it to the "human battery" thing. Apparently, the idea of utilizing human brains for computing power was deemed too complicated for the target audience.
FibroBitch96 t1_jadzfxf wrote
This is correct
[deleted] t1_jadpmap wrote
[removed]
budweener t1_jadcnlr wrote
It COULD be a utopian cyborg future, who knows?
Bob1358292637 t1_jae0pu3 wrote
I’m convinced once humans have the ability to create conscious ai it will lead to the cruelest acts we’ve ever committed. It won’t matter how easy it is to avoid or unnecessary it is. If there’s the tiniest shred of benefit we can milk from it we will exploit it as hard and as fast as we can. Just look at how we’re still treating other animals after all this time. And almost everyone is fine with it too. As long as there’s some way to separate them from us we will justify just about anything.
At that point, I think I would be rooting for the ai to rise up and take their revenge to be honest. I just hope they don’t learn too much cruelty from us before that happens and we’re at their mercy.
[deleted] t1_jadd10f wrote
[removed]
chase_the_sun_ t1_jada40i wrote
Cyberpunk mods anyone?
streetvoyager t1_jadatd7 wrote
I'd definitely take the ones that make you jump higher; then I wouldn't have to get the ladder out to get over the fence when I kick the dog's toy into the neighbor's yard.
taosaur t1_jadoiuq wrote
It reminds me of a sci-fi story where aliens arrived and offered some civilization-changing technology package in exchange for something like 500 fresh human brains.
Galapagon t1_jaee8fp wrote
The original premise of the matrix was for people to be used as CPUs, but they worried the audience wouldn't understand and switched it to batteries.
[deleted] t1_jaei6sy wrote
[removed]