Comments


3SquirrelsinaCoat t1_jacoxz2 wrote

The difference in energy consumption is a big selling point, if theories turn into reality. It takes only 12 watts to power a human brain, which is jaw-dropping efficiency, particularly compared to the energy required for machine learning training. If energy efficiency is an inherent part of OI, this would be a huge step forward and possibly a viable platform for real AGI.
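For a rough sense of the gap, here is a back-of-envelope comparison, a minimal sketch in Python. The 12 W figure is the one quoted above; the ~1,287 MWh GPT-3 training estimate is a widely cited external number, not from the article, so treat the result as illustrative only.

```python
# Back-of-envelope comparison (illustrative figures only).
BRAIN_POWER_W = 12                     # watts, as quoted in the comment above
HOURS_PER_YEAR = 24 * 365

brain_kwh_per_year = BRAIN_POWER_W * HOURS_PER_YEAR / 1000   # ~105 kWh

GPT3_TRAINING_KWH = 1_287_000          # ~1,287 MWh, one widely cited external estimate

print(f"Brain energy per year: {brain_kwh_per_year:.0f} kWh")
print(f"Brain-years of energy per training run: {GPT3_TRAINING_KWH / brain_kwh_per_year:.0f}")
# -> roughly 12,000 brain-years of energy for a single large training run
```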

114

WackyTabbacy42069 t1_jae2lcb wrote

I mean, is it really considered artificial general intelligence at that point if we're just shoving neurons into a computer? Wouldn't it just be intelligence in general?

At the point of putting neurons into a computer, we've effectively created a new species of cyborg life. I see it as just being a new life form if it's based on living neurons

60

3SquirrelsinaCoat t1_jae3n3f wrote

Arguably, true AGI is a new life form, whether it is on silicon or meat. I don't believe that the current versions of machine learning will lead to AGI for a few reasons, one of them being energy. If we get better energy efficiency (and maybe it scales, idk), then we can go full steam toward AGI because a huge hurdle is removed. But if we could somehow remove that hurdle and build AGI using our existing tools, I would still class it as closer to life than to machine. The autonomy of the thought and a real desire to exist (not a pretend one like what is farted out by the Puppet Known as ChatGPT) are evidence of life - but that's me.

29

rigidcumsock t1_jaeqawy wrote

I feel like you haven’t used ChatGPT or read up on it much if you think it purports in any way to be autonomously intelligent…

There’s zero “desire to exist”. It will tell you straight up it doesn’t feel or think, and is only a program that writes.

But go ahead and trash on a tool for not being a different tool I guess lmao

6

3SquirrelsinaCoat t1_jaer6kr wrote

I know exactly what it is. And I chose my words intentionally.

−9

rigidcumsock t1_jaerpb9 wrote

> The autonomy of the thought and a real desire to exist (not a pretend one like what is farted out by the Puppet Known as ChatGPT)

Then why are you claiming that ChatGPT pretends to have “autonomy of thought” or a “real desire to exist”? It’s just categorically incorrect.

10

3SquirrelsinaCoat t1_jaetk90 wrote

There have been plenty of demonstrations of that tool being steered into phrasing that is uniquely human. The NY Mag reporter, or someone like that, duped it into talking relentlessly about how it loved the reporter. Other examples are plentiful of it projecting a sense of self to the user, largely because the user does not understand what they are using.

There is a shared sentiment I've seen in the public dialogue, perhaps most famously from that google guy who was fired for saying he believed a generative chat tool was conscious (that was Google's LaMDA, not chatgpt) - a narrative that something like chatgpt is on the verge of agi, or at least on a direct path toward it. A data scientist or architect or whatever may look at it and think, yeah, I can kind of see that: if it becomes persistent and tailored, that's a kind of agi. But the rest of the world thinks terminator, hal, whatever the fuck fiction. And because chatgpt has this tendency toward humanizing its outputs (which isn't its fault, that's the data it was trained on), there is an implied intellect and existence that the non-technical public perceives as real, and it's not real. It's a byproduct, a fart if you will, that results from other functions that are on their own valuable.

−9

rigidcumsock t1_jaeu0ye wrote

You’re waaaaay off base. Of course I can tell it to say anything— that’s what it does.

But if you ask it what it likes or how it feels etc it straight up tells you it doesn’t work like that.

It’s simply a language model tool and it will spell that out for you. I’m laughing so hard that you think it pretends to have any “sense of self” lmao

10

3SquirrelsinaCoat t1_jaeurbg wrote

>Of course I can tell it to say anything— that’s what it does.

No that's not what it does. I'm leaving this. I thought you had an understanding of things.

−7

rigidcumsock t1_jaeuwep wrote

I’m not the one claiming a language model AI pretends to have a sense of self or desire to exist, but sure. See yourself out of the convo lol

9

caman20 t1_jaczmp4 wrote

Great, now we just won't get computer viruses anymore but have 2 worry about computer dementia instead. I can see Norton selling anti-dementia software for the low price of $11.99 a month.

65

Infinite_Derp t1_jadnnxm wrote

You’ll get computer viruses, they’ll just also be transmissible to people.

26

caman20 t1_jado29b wrote

Oh God, you're right. More fears 2 work out with my therapist.

9

greenappletree t1_jacmxmw wrote

Very cool stuff that can be used for drug screening and discovery - the hope is to reduce/eliminate animal models for certain things and at the same time increase the ability to mimic what is being studied. With that said, it worries me that growing a brain organoid bigger and more complex raises the possibility of it becoming an ethical problem - we don't need a conscious biocomputer.

46

quitepossiblesure t1_jacy1dy wrote

>you may live to see man-made horrors beyond your comprehension

30

driku12 t1_jadf2gf wrote

obligatory comment about the weakness of flesh and the certainty of steel

29

Gari_305 OP t1_jaci5t9 wrote

From the Article

>Brain organoids are a type of lab-grown cell-culture. Even though brain organoids aren’t ‘mini brains’, they share key aspects of brain function and structure such as neurons and other brain cells that are essential for cognitive functions like learning and memory. Also, whereas most cell cultures are flat, organoids have a three-dimensional structure. This increases the culture's cell density 1,000-fold, meaning that neurons can form many more connections.

Also from the Article

>OI’s promise goes beyond computing and into medicine. Thanks to a groundbreaking technique developed by Nobel Laureates John Gurdon and Shinya Yamanaka, brain organoids can be produced from adult tissues. This means that scientists can develop personalized brain organoids from skin samples of patients suffering from neural disorders, such as Alzheimer’s disease. They can then run multiple tests to investigate how genetic factors, medicines, and toxins influence these conditions.
>
>“With OI, we could study the cognitive aspects of neurological conditions as well,” Hartung said. “For example, we could compare memory formation in organoids derived from healthy people and from Alzheimer’s patients, and try to repair relative deficits. We could also use OI to test whether certain substances, such as pesticides, cause memory or learning problems.”

23

----Zenith---- t1_jacse9m wrote

I genuinely don’t get why humans even WANT to try doing this shit after all the movies we’ve seen.

How does it not end in catastrophe?

16

Thatingles t1_jad0l6c wrote

If aliens landed on earth and gave us a big, shiny red button marked 'Do not press. Ever' and then departed without explanation, I am super confident that we would press the button.

20

yohohoanabottleofrum t1_jad5vmn wrote

My God, can you imagine the wars, and political posturing over the button. Forget laser cannons and death stars, that's all they'd have to do to kill us...

11

Grwwwvy t1_jae9y4c wrote

If Jimmy Neutron gets to have a dog, then I get to have friends.

Also Asimovian robots are way more chill than most people I know.

1

Beyobi t1_jacue75 wrote

This technology is nothing new. Here's an old video on how they make these chips, and how incredible the possibilities are.
Rat neurons fly simulated drone

14

Accelerator231 t1_jad15n5 wrote

I wonder how it's even taught.

You can train rats with food and heroin. How would you punish or reward a bunch of nerves? How would you even be sure it can interpret data correctly?

4

Beyobi t1_jad5ntc wrote

I'm guessing the nerves are grown in some kind of logic gate orientation and that is how it can be used in digital circuitry. On or off. Off or on. That's its purpose. To flip the switch on or off. No reward, no punishment. Only duty.

9

True_Sell_3850 t1_jadmmwk wrote

So neurons just crave stimulation; they don't particularly care what stimulation it is. I believe the way they train them is to try to get the culture to do something, and if it doesn't do it, they put it into a dark room with zero stimulation for a period of time, then take it out. I think they combine that with the typical training of neural networks.
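For anyone curious how that could look in software, below is a minimal sketch of such a closed-loop scheme, assuming a purely hypothetical stimulation interface: none of the class or method names come from a real API, and published work (e.g. Cortical Labs' Pong experiments) reportedly uses predictable stimulation as the "reward" and unpredictable or withheld stimulation as the "punishment".

```python
import random
import time

class OrganoidInterface:
    """Hypothetical wrapper around a multi-electrode array (illustrative only)."""

    def read_activity(self):
        # Placeholder: a real system would return spiking activity per electrode.
        return [random.random() for _ in range(64)]

    def stimulate(self, pattern):
        # Placeholder: a real system would deliver the electrical pattern.
        pass

def train_step(organoid, target_behavior, silence_seconds):
    """One 'carrot and stick' cycle: reward with structured stimulation,
    punish by withholding stimulation (the 'dark room' idea above)."""
    activity = organoid.read_activity()
    if target_behavior(activity):
        organoid.stimulate(pattern=[1, 0] * 32)   # reward: predictable input
    else:
        time.sleep(silence_seconds)               # punishment: no input at all

# Toy usage with an arbitrary success criterion.
organoid = OrganoidInterface()
for _ in range(10):
    train_step(organoid,
               target_behavior=lambda a: sum(a) / len(a) > 0.5,
               silence_seconds=0.1)
```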

3

Enzo-chan t1_jadk9no wrote

Brain cells in the chip: kill me please!

14

MattDLR t1_jadklo0 wrote

Between this and the chat AI, can we please stop trying to create an evil overlord AI?

12

Impressive-Ad6400 t1_jadqwco wrote

-What's that smell?

-My computer died.

-Oh, God.

11

SixteenthRiver06 t1_jadvvsk wrote

Shit, if y’all think this is cool, you should check out Servitors in Warhammer 40k. It’s not lab-grown though, more like punishment for criminals or heretics. Same concept haha

10

LuneBlu t1_jad2adf wrote

In no way can this backfire... Can it? From this to toying with the idea of reflecting sunlight to slow global warming, we are playing with ideas that have potentially serious implications, with only a limited understanding of them.

9

greenmachine11235 t1_jad8ud0 wrote

There's always a risk in new tech. There was a risk in developing the internal combustion engine (see climate issues), there was a risk in developing the computer chip (see guided weaponry), and others, but just those two examples fundamentally altered human society for the better. The argument that new tech has risks and so should not be explored is stupid; without new tech, humanity stagnates with no hope of solving the problems facing the world today.

4

OriginalCompetitive t1_jadojse wrote

It’s possible to think tech in general is good while also believing certain specific technologies should be avoided. Nobody thinks the Nazi experiments on human eugenics were a good idea, for example.

2

Coreadrin t1_jadrvcq wrote

This is the tack I would have taken if I had written the original matrix movie.

Humans as batteries is ridiculous. Humans as a hijacked complex quantum neural network? Hell yeah.

9

Treat_Street1993 t1_jactel9 wrote

Well heck, why not. Artificial intelligence programs running on organic computers. I know we said we weren't going to make replicants, but heck, they're just so darn neat is all. I mean yeah, of course we'll program them to feel pain and fear, why wouldn't we? It just wouldn't be an authentic personality otherwise. Oh and yes, absolutely we're working on replicating psychosis in an AI. Heck, maybe we can even just throw the code for that into the next mandatory automatic secret update, you know that would totally make them even more authentic. No, I don't think we really have time to test it, we have deadlines to meet. Future's looking bright!

6

fapalicius t1_jadeb7m wrote

Terminator will be more like us than I imagined.

6

ramdom-ink t1_jadb7cu wrote

Combined with DNA data storage, we are definitely heading towards the Singularity. The concept of the universe as a simulation is even more plausible, month by month…

5

fapalicius t1_jade808 wrote

How many levels deep is it, do you think? Like a simulation in a simulation in a simulation... Maybe we're just creating a new level.

2

ramdom-ink t1_jads5pi wrote

Already there…time moves in circles, nonlinear.

1

MyCleverNewName t1_jadkv46 wrote

Very interesting and exciting stuff and not existentially terrifying at all! 😬

5

daveprogrammer t1_jadp3xr wrote

This was newly-implemented technology on Star Trek: Voyager.

5

criticalpwnage t1_jadzkmo wrote

This isn’t horrifying at all. Nope.
^This ^lengthened ^comment ^brought ^to ^you ^by ^this ^subreddit's ^rules

5

UtCanisACorio t1_jaek91s wrote

Cool, so they'll have no idea whether there's a consciousness there and/or whether it is in constant, excruciating pain.

4

SCZ- t1_jad0tif wrote

So... Would I need to feed my computer for it to work? Can I accidentally starve my computer to death? Can it get infected with ACTUAL viruses? So many questions.

3

VoodooPizzaman1337 t1_jadm0f3 wrote

I've got a great idea that's going to make us all rich!

It's going to involve some human trafficking and a tech priest though.

3

Inspirata1223 t1_jadrusl wrote

"In the grim darkness of the far future there is only war" In all seriousness this is pretty cool technology.

3

buggin_at_work t1_jadiedf wrote

I feel like there are some serious ethics concerns with this

2

LifeEnvironment1377 t1_jadj3kn wrote

Is Fallout just predicting the future? Robobrains are finally coming to life.

2

Obvious_Zombie_279 t1_jadp8c2 wrote

Coincidentally, they’ve also unveiled an amazing, new nutritious food supplement called Soylent Green.

2

T-Money8227 t1_jadpxra wrote

Great! Just what we need. Computers with mental health issues.

2

Abedsbrother t1_jae2umm wrote

Getting major Deus Ex vibes from this. Only played the Jensen games, thinking of Hyron from Human Revolution.

2

TommyTuttle t1_jadzdmi wrote

You know people have long been dreaming of doing this with real actual brains of people whose bodies have given out, right?

1

Exigency_ t1_jae132i wrote

Can't wait to have my computer beg me to kill it every morning.

1

frequenttimetraveler t1_jae8dyo wrote

> Despite AI’s impressive track record, its computational power pales in comparison with a human brain.

This is not true; no human can do what a 1TB model can do. There doesn't seem to be any limit in sight to scaling and extending AI models, as opposed to humans and other brains.

Organoids don't have the anatomy, layering, and connectivity that our brain does. A giant unstructured clump of neurons is not necessarily smarter (see elephants or dolphins).

> Dr Brett Kagan of the Cortical Labs

This team used an organoid to 'learn' to play the game of Pong last year, but there is a lot left to be desired in that paper. The extent to which it 'learned' (rather than adapted slightly) is debatable, as are the consequences of their findings.

1

reallyrich999 t1_jae9s18 wrote

Cool and weird at the same time. I think I'll call it Coird.

1

WildGrem7 t1_jaebueh wrote

Does this mean I won’t have to buy a video card for 1k+ to game?

1

RazzDaNinja t1_jaesjdu wrote

We are one step closer to real life Servitors. Praise be the Omnissiah

1

stalinmalone68 t1_jaey95s wrote

I’m excited by this. I see no possible way this could go horribly wrong in a sci-fi horror kind of way.

1

reliable_specs t1_jaf2hut wrote

It would be truly remarkable if we could make biocomputers powered by human brain cells. The potential applications of this technology are vast, but what will be the source of the human brain cells used in creating the organoids?

1