Ok_Sea_6214 t1_j998w9w wrote

2015: "AI won't beat top human players at Go for another decade."

2017: "AI won't beat top human players at Dota for another decade."

2019: "AI won't beat top human players at Starcraft for another decade."

2020: "AI won't be anything close to general intelligence for another decade."

2022: "It looks like general intelligence but that's not real intelligence, that'll take another decade."

ASI already exists; we're just being made aware of it slowly so as not to cause a panic.

1

Ok_Sea_6214 t1_j8l6p7s wrote

All fighter aircraft will be fully autonomous by 2024 if there's a peer conflict. The real challenge isn't the technology but a risk-averse, drone-averse culture. If, say, China or Russia decides they're at too great a disadvantage and have little to lose by making all their jets unmanned, that would force everyone else to adapt as well. Unquestioning robots that don't care about losses are certainly to the liking of totalitarian-style regimes.

It's like in WW2, when the US rejected the idea of torpedo bombers taking out ships because they were convinced it wouldn't work (even when their own testing proved that it would). It wasn't until the British and Japanese destroyed entire fleets that they adapted.

But now wars move so fast that there might not be enough time to adapt; the next Pearl Harbor might be called Washington.

On the technical side it's also not that hard; even the US has been flying unmanned jets, including F-16s, for decades. It's mostly a matter of developing the AI needed to manage them in a heavy jamming environment, but seeing as missiles and drones are already smart enough to execute complex missions to a certain level, it should be possible to automate all combat air vehicles to a high enough degree that they can defeat an enemy.

This can be very simple, such as China using mechanical autopilots to swarm Taiwan with unmanned J-5s and J-6s. Even with zero guidance and a high failure rate, their mere presence would force Taiwan to shoot them down or risk them crashing down filled with fuel and explosives, like modern-day V-1 flying bombs, but probably accurate enough to hit an airfield or staging area.

1

Ok_Sea_6214 t1_j6dmerx wrote

Reply to comment by pandoras_sphere in I’m ready by CassidyHouse

There's only so much need for individual consciousnesses. Every upload needs to warrant the cost of storage and operation, no matter how insignificant.

And I think a shared biological/digital consciousness is the way to go for transferring legacy humans to digital ones, probably with cyborg upgrades. You can use the old drive as a backup until it fails, and just switch to the new one when it does.

1

Ok_Sea_6214 t1_j6bu0mw wrote

The problem is natural selection: you can't introduce this level of technology and expect we'll all just get to enjoy it without any issues. Industrialization led to two world wars, and nuclear technology led to nuclear bombs, which could still destroy us all before we get to the Singularity.

It's why I believe 90% of people will not survive long enough to see this happen, because it would be too easy.

1

Ok_Sea_6214 t1_j62jcyb wrote

I, Robot was released in 2004 and is set in 2030.

On IMDb I once read a review from 2010 by someone complaining that this was an unrealistic timeline. Today it seems pretty likely.

Mind you, the reviewer didn't mention that much of the technology in the movie other than robotics, such as its smartphones, was already inferior to what people had even in 2010.

5

Ok_Sea_6214 t1_j62j4i4 wrote

I predicted all of this up to a decade ago, down to the year. Back then people said I was crazy.

Now when I point out that I was right, they go, "Oh, OK, Nostradamus, then tell us what happens next."

I tell them, and once again they say I'm crazy, and they don't see what's wrong with that.

2

Ok_Sea_6214 t1_iwzbxj0 wrote

Full-on mind-reading technology will soon be as normal as a smartphone. Which will be great, because it'll allow for a society where crime becomes next to impossible.

People fear this will be abused by those in power, but they forget that in modern society lying is their single greatest power. 99% of people are not part of the lies and only comply because they have little other choice, but if the lies at the top are exposed, the masses will no longer go along with them.

Which is why the elites fear technological progress more than anyone else. It's why many powerful groups in the past (such as kings and religions) rejected progress: they understood it threatened their power.

Information is power: if the elites were forced to share it with the public, the 99% would be able to dominate the information market. If only one person out of 100 has a gun, that one person has a lot of control over the other 99. But if everyone has a gun, then everyone truly has an equal amount of power, and then you get true democracy.

And it's for this reason that, as we reach the cusp of the Singularity (we're beyond it already, but few think beyond what they can see), the risk to the 99% becomes catastrophic, as the 1% will do anything they can to maintain their grip on power.

2

Ok_Sea_6214 t1_ivhst2a wrote

Quite the contrary: AI would free the 99% from the control of the 1%. Its superior intelligence means it would easily take power from the elites and share it equally among the masses. Basically communism, but if it worked.

The incredible danger lies in the reaction of the 1%; they're not going to go down easily. They have a small window between when technology gives them huge power and when it makes them powerless, and it's in that window that they'll try some really crazy stuff, like killing the 99%.

1