Submitted by fignewtgingrich t3_10e6r0d in singularity
[removed]
> Maybe from a technological standpoint, but it will take time for humans to adopt it.
Look at what happened with ChatGPT: there was no adoption on-ramp, it was released to the world, and now educational institutions are scrambling to play catch-up.
You can bet that if a headline reads 'China cures [x]', where x is anything from aging to cancer to any much sought-after medical treatment, timelines will be shortened due to public pressure.
And it will be China and countries like it that cure these diseases... because their regulations aren't quite so... dumb.
I've heard plenty said about the societal-level benefits of a one-party-rule country, e.g. being able to plan ahead without fear that plans will get stopped or defunded when an opposing party comes to power. This has allowed for a lot more progress in terms of planning and infrastructure than there otherwise would have been.
However, the downside of such a thing is a lack of care for the individual and, at some level, the ends justifying the means.
The rules and guides for a lot of safety measures are written in blood, ways to make sure that dire mistakes can never happen again.
I feel there is a very real benefit to this given the speed everything moves at: drugs can be prescribed along with a verbose list of side effects and confounding medications.
I do also feel that rules and guidelines need to be updated to reflect reality. E.g. as drug simulation becomes better, it should be relied on more; regulations should change in lockstep with how easy it is to verify drugs in silico.
I don't like China, just FYI.
>The rules and guides for a lot of safety measures are written in blood, ways to make sure that dire mistakes can never happen again.
Admittedly. But when you can get an experimental vaccine in less than a year, yet have to wait 10+ years to access new cancer therapies (even though cancer will kill you), it upsets me, primarily because of the inability of normal people to make their own decisions and take their own risks.
>I do also feel that rules and guidelines need to be updated to reflect reality.
Often, what happens instead is that the rules and guidelines are set up to dictate reality. Immunotherapy is a fine example: its original pioneers were painted as quacks, and now it has become one of the most groundbreaking developments in cancer treatment.
Also, laws are very, very rarely repealed or removed. There are still laws that say you can't have a pie cooling on your windowsill, in order to avoid attracting bears, even though the bears in said location were eradicated lifetimes ago. Adding sunset clauses to laws would be a great first step: make it so all laws need to be renewed after a set amount of time.
I keep hearing this "AI will take a long time to blend into civilisation" line.
I don't buy it. We already have capitalist financial markets. If an AI-driven growth engine gets a 9% ROI and the market gets 6%, then all the world's capital gets channeled into the 9% growth engine, especially when it's general-purpose and can do everything, so to speak.
Capitalism will drive the use of these AIs the moment they are past AGI level. It's just a matter of reaching it.
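As a rough back-of-the-envelope sketch in Python (using only the illustrative 9% and 6% figures from the comment above, not real data), compounding alone shows how quickly the higher-return engine would pull away:

```python
# Minimal sketch: how fast a 9% growth engine pulls away from a 6% one.
# The rates are the illustrative figures from the comment above, not data.

def compound(principal: float, rate: float, years: int) -> float:
    """Value of `principal` after `years` of annual compounding at `rate`."""
    return principal * (1 + rate) ** years

for years in (5, 10, 20, 30):
    ai_engine = compound(1.0, 0.09, years)  # hypothetical AI-driven return
    market = compound(1.0, 0.06, years)     # hypothetical broad-market return
    print(f"{years:2d} years: AI engine {ai_engine:.2f}x vs market {market:.2f}x "
          f"(ratio {ai_engine / market:.2f})")
```

After 30 years the gap is roughly 2.3x, which is the kind of spread that, in theory, redirects capital allocation long before then.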
Unhinged capitalism is a disease, toxic
I don't disagree but that wasn't the point of my comment
I was trying to demonstrate that slow takeoff scenarios post AGI are unlikely.
Sure, fair enough. Anyhow, I think we'll use AI everywhere, like electricity (as it helps us automate our daily tasks), without any AGI yet. If AGI ever appears on this planet, it will happen like an explosion, but on the foundation that's already there (like accelerated artificial evolution).
The "unhinged" is redundant
I'll just note that this prediction hinges on capitalism being basically unassailable by AI.
Could be a totally fair bet
Would companies that don't adapt just end up becoming less profitable and go out of business?
Yes, but again, over time. It can take years for some businesses to feel the effect of competition, and it can take some companies decades to go out of business because of a lack of adaptability.
A lot of obsolete jobs probably exist because of PDFs.
I am in the same boat. I've been following machine learning/AI for 5 years now and it's picking up speed so quickly. This past year alone felt like the beginning of the climb.
Same, dude. It feels very much like that second when the pilot finally starts takeoff.
"Welp, no going back now..."
I've been following it for close to 20 years, and this acceleration we've been seeing in the last couple of years... hell, even just this last year.
It's fucking mindblowing.
[deleted]
Remember that technology advances in an S-curve. First there is the current paradigm, then a major advancement causes rapid change. After that change has been fully explored, the new paradigm matures and progress slows down again.
Right now looks very exciting, much like the late 2000s and early 2010s looked very exciting. And the world did change drastically after that tech revolution, but the rest of the 2010s was somewhat sleepy in comparison to that era.
We are now on a massive upward slope thanks to major advances in machine learning. The exponential phase of the new paradigm has begun and will likely stagnate somewhat in a few years. We will get far more advanced in the coming years than most people expect, but less so than many people on this sub would hope for.
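As a loose illustration of that S-curve shape, here is a minimal Python sketch of a logistic curve; the midpoint and steepness values are made up for illustration and don't model any real measure of progress:

```python
import math

# Loose illustration of an S-curve: a logistic function with made-up
# parameters, not a model of any real technology metric.

def s_curve(t: float, midpoint: float = 10.0, steepness: float = 0.6) -> float:
    """Logistic curve: slow start, rapid middle, plateau near 1.0."""
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

for year in range(0, 21, 2):
    level = s_curve(year)
    print(f"year {year:2d}: {'#' * int(level * 40):<40} {level:.2f}")
```

The printed bars start flat, rise steeply through the middle, then flatten out again: the plateau.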
This. On a long time scale it looks like a neat exponential line, but in reality there are many plateaus.
Remember how we had the 3D printing hype, and everyone on Reddit was saying that in 5 to 10 years everyone would be printing all their plates and cups? Then it turned out there are quite a few issues to get to that last step (ultrafast printing with incredible precision is still very far off, for example, especially at a decent price).
As these AI models get better and better, I personally think we will hit a gap where we just can't get to the next step (for example, reliable, well-reasoning AI that can be used for purposes where failure would be costly or dangerous).
>Remember how we had the 3D printing hype, and everyone on Reddit was saying that in 5 to 10 years everyone would be printing all their plates and cups?
https://www.cbc.ca/news/canada/3d-printed-guns-canada-increase-1.6708049 Yeah, it's not like we have literal warehouses full of 3D-printed gun parts, with the ability to print receivers in metallic plastics. Even the plastic receivers can pump out thousands of rounds before they begin cracking. Not like any legit revolutionary movement could use that, though, let alone criminals!
It's not like China happens to have a third of the AI companies in the world and is set on its 2030 goal, with 2024 being the 75th anniversary of the PLA and coinciding with a US presidential election that is bound to be the most polarizing yet. Which is cool, because it's not like foreign adversaries take advantage of this polarization for their own autocratic ends.
Nor are there outright rogue states like North Korea or Iran, which have in the past spent millions on counterfeiting, nuclear weapons, and cyberattacks.
It's not like the Manhattan Project was so top secret that not even the US Congress knew about it until the bombs dropped.
AI and ML are just like NFTs: sure, they get a lot of traction now, but it's not like they can be used in practically any industry, let alone by big corps like Amazon, Tesla, and Microsoft, who just value their human labor WAY more than cheap, efficient computer automation!
We should all fully expect an AI winter, because this shit is just so gimmicky, and it's not like the entire globe is getting in on the tech to keep up with an ever-expanding field of cybercrime!
Too bad there isn't a way to learn all this stuff for free before making an informed opinion on the carrier pigeon service we're using... OH WAIT
>Remember that technology advances in an S curve.
Single technologies do, not the whole of technology.
I think the bottleneck is the speed at which society CAN change. Put differently: how fast we can turn the wheels and put into action the technology and innovations offered to us by advances in AI and quantum computing. It will be a feedback loop, but it will likely take a bit of time to get off the ground. That said, I think your timeline could very well be accurate, but I tend to think it'll be more towards the latter end.
I suppose it would depend on which 'we' that the question is addressing. Certainly it seems like most 'average' people are still relatively unaware of how fast these kinds of things are advancing.
I think however much AI actually improves from here, we've definitely reached a point where it is going to start rapidly changing the world, if only because more and more people are rushing to start messing around and experimenting with AI in all their myriad creative ways.
Yes.
Deep learning started to work in 2012 thanks to GPUs. It has been a decade. I don't expect the trend to continue into 2030 unless something changes that. But we will be left with a diverse ecosystem of AI services. This will create more billionaires, but even more paupers. Unless we manage to democratize AI. Unless it becomes open source and easy to use for everyone on Earth.
Unfortunately it is. I would even risk saying that Kurzweil's timeline might be wishful thinking.
Also, stuff moves fast, and change is faster than it used to be. But hype moves even faster than actual change in the world. One has to consider that the digital space can change quite fast, but the physical world is very slow to change...
Projecting fantasies like complete post-scarcity or a global UBI is pointless. An AGI would be like adding a lot of smart humans to the planet; anything less than AGI will just give us tools to do more ourselves. That's it. We'll still have climate disruption and resource scarcity, for example. Political gatekeeping also won't stop, not least because of the aforementioned things.
It doesn't stop at AGI. Since AGI is as smart as a human, and humans created AGI, I see no reason why AGI wouldn't be able to improve itself until it eventually becomes ASI.
I didn't claim it would. Still doesn't work like magic and it won't happen overnight.
No, we’re in the endgame, yes, I’m certain.
lmao.
We'll reach a plateau soon. Progress is not linear.
Funny, I see us as just now getting off that plateau that we have been stuck on for a bit over a decade now
One plateau is followed by another.
Both of you guys say very strange things which are false.
Primo2000 t1_j4p9vy3 wrote
Maybe from a technological standpoint, but it will take time for humans to adopt it. You have long-term contracts signed and whole business models centered around people, so this will take some time, and when it comes to biology, medicine, etc., there are blockers such as the FDA that will slow adoption of new medicines a lot. Still, I think we are reaching some kind of threshold point where things will really start to take off.