wind_dude t1_j9rwt70 wrote
Reply to comment by currentscurrents in [D] To the ML researchers and practitioners here, do you worry about AI safety/alignment of the type Eliezer Yudkowsky describes? by SchmidhuberDidIt
>Quantum neural networks are an interesting idea, but our brain is certainly not sitting in a vat of liquid nitrogen, so intelligence must be possible without it.
Look at the links I shared above.
Recreating actual intelligence, which is what the definition of AGI was six months ago, will not be possible on logic-based computers. I have never said it's not possible at all. There are a number of reasons it is not currently possible, the main one being that we don't have a full understanding of intelligence, and recent theories suggest it's not logic-based as previously theorised, but quantum-based.
Look at the early history of attempting to fly: for centuries, humans strapped wings to their arms and tried to fly like birds.
currentscurrents t1_j9rxyne wrote
Most of these links are highly philosophical and none of them address the question of how the brain would usefully retain qubit stability at body temperature.
The evidence they present is very weak or non-existent, and the New Scientist article acknowledges this is not the mainstream neuroscience position.
Meanwhile, there are heaps of evidence that electrical and chemical signaling is involved; fiddling with either of them directly affects your conscious experience.