wisintel

wisintel t1_japj0au wrote

How do you, this lady writing about octopuses, or anyone else “know” that? No one knows how consciousness works. No one really understands how LLMs convert training data into answers. So how can anyone say so definitively what is or isn’t happening? I understand different people have different opinions, and some people believe that ChatGPT is just a stochastic parrot. I can accept anyone having this opinion; I get frustrated when people state this opinion as fact. The fact is, no one knows for sure at the moment.

9

wisintel t1_japg1ua wrote

The whole premise is flawed. The octopus learned English, and while it may not have the embodied experience of being a human, if it understands concepts it can infer. Every time I read a book, through nothing but language I “experience” an incredible range of things I have never done physically. Yes, the AI is trained to predict the next word, but how is everyone so sure the AI isn’t eventually able to infer meaning and concepts from that training?

12

wisintel t1_j7wrbjc wrote

What’s wrong with being passionate and excited about the future? Even if you’re wrong, what people believe or don’t believe on this forum has zero impact on the real world. For me it’s like buying a lottery ticket. It’s highly unlikely I’ll win, but I’m paying for the time I get to spend imagining what it would be like if I did win.

12

wisintel t1_j21exke wrote

I think initially full dive just needs something like Neuralink to connect into the nerves that send signals to the brain. So connect to the optic nerve, the auditory nerve, etc., and then replace the signals being sent from those sense organs. I don’t think the bitrate of those nerves is terribly high. I can’t imagine how we get full-brain full dive like NerveGear.

3