Cult_of_Chad t1_ir1kzmo wrote

How do you know and why is your certainty so high?

1

wind_dude t1_ir1mymg wrote

It's part of my job to know. As a software engineer working on information extraction, I'm constantly evaluating SOTA ML/AI and reading research papers. For example, if you look at LLMs like OpenAI's GPT-3, it just predicts the next token in a string. It still falls flat on its face if you offer it multiplication. Narrow AI and specialised domain-specific NLP still do better than a lot of these LLMs for many tasks like NER and classification, albeit at the cost of annotation, but with the benefit of performance and accuracy.
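To illustrate the point about next-token prediction (this is a toy bigram model for demonstration only, not how GPT-3 actually works internally): a model that just learns which character tends to follow which will happily complete an arithmetic prompt with plausible-looking symbols, but it isn't doing multiplication.

```python
# Toy character-level bigram "language model": it counts which character
# follows which in the training data, then greedily predicts the most
# frequent follower. Purely illustrative; real LLMs use transformers.
from collections import Counter, defaultdict

training = "12*12=144 11*11=121 10*10=100".split()

counts = defaultdict(Counter)
for example in training:
    for a, b in zip(example, example[1:]):
        counts[a][b] += 1

def predict_next(ch):
    # Greedy decoding: most frequent follower of ch seen in training.
    followers = counts[ch]
    return followers.most_common(1)[0][0] if followers else None

def complete(prompt, max_len=10):
    out = prompt
    while len(out) < max_len and predict_next(out[-1]):
        out += predict_next(out[-1])
    return out

# The "model" pattern-matches surface form; it does not compute 13 * 13.
print(complete("13*13="))  # plausible-looking digits, but not 169
print(13 * 13)             # exact, "narrow" computation: 169
```

The completion continues with characters that look like training data, while the direct arithmetic is trivially exact, which is the gap between statistical sequence prediction and a specialised computation.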

The other models, like protein folding and DNA sequencing, are narrow AI. It's very likely AI models will help solve fusion, quantum theory, etc., but they will be specialised, narrow AI models created and worked on in collaboration with domain experts.

2

Cult_of_Chad t1_ir1rnq7 wrote

Humans are not very good at multiplication. Hell, I'm completely innumerate due to a learning disability, yet I don't think anyone who's ever met me would call me stupid. Do I not possess general intelligence?

Even an imperfect brain such as mine would do wonders if I could think a million times faster, correlate all the contents of my mind and have high bandwidth access to constant cognitive updates and all existing knowledge...

2