
Tiamatium t1_jbdk7us wrote

Few things, depending on what you mean by "it". If you're talking about AGI, then I could come up with a small list actually:

  1. Funding and the cost of AI relative to the work it produces. If we realize that an AI with the intelligence of a mouse or a dumb dog can do everything and anything we need, and it's rather simple to create an AI like that but a lot harder to create one with human-level intelligence, there simply won't be any financial incentive to create a smarter AI. Frankly, I see this as the most likely possibility.

  2. Large-scale military conflict in East Asia, say if China invades Taiwan or North Korea invades the South. Our chip manufacturing capabilities are concentrated in that one small region, and this is in a way Taiwan's insurance policy.

  3. Now this is the interesting stuff. It's perfectly possible that consciousness is more complex than we think. There are a few very well-respected scientists who believe consciousness might be a result of weird quantum effects (in a way, a biological quantum computer), in which case our AI is further from AGI than most people think. It's important to note that quantum effects emerge all the time in biochemistry, for example in the unholy union of physics, chemistry, and biology known as photosynthesis, where every step of the process, from the moment energy is collected in the antenna complex, relies on quantum effects.
