Submitted by TheHamsterSandwich t3_yqkxx7 in singularity
stofenlez557 t1_ivot8cf wrote
Why do people on this sub seem so much more confident in their predictions than everyone else? It feels like more than 90% of people here believe there's a very good chance of AGI appearing before 2040, while in real life and other subreddits it's probably the complete opposite. What is it that people on this sub know that nobody else in STEM is aware of?
nblack88 t1_ivp4dac wrote
There are three things to unpack here that I think will better answer your question:
- Bias. Many people who believe the singularity will occur also believe it will occur sometime around 2045. It is commonly believed that AGI is a necessary precursor to the singularity, and many of the popular experts in the field believe we'll have AGI of some sort between 2035 and 2045. There's a member of this subreddit who helpfully chimes in with a list of each expert and their predictions. Wish I could remember their name, so I could tag them. Bias also works in the opposite direction. Negative bias permeates every facet of our culture, because we have a 24/7 news cycle that perpetuates that bias to make money. We believe everything is getting worse, but in the long term it's actually getting better.
- Predictions. We're pretty useless at predicting events 20 years or more into the future. 10 years is exceedingly hard. I was alive in '96. I didn't imagine smartphones in '06. I thought it would take longer. There's a lot of evidence people can cite to support their positions for or against the date for AGI. Truth is, nobody knows, so pick whichever one aligns with your worldview, live your life, and see what happens.
- Choice. Speaking as someone who believes the technological singularity is coming...it's more fun. Can't tell you when, or how. It just means I live in a more interesting world when I choose to believe we're headed toward this thing. Nobody in STEM is any better at predicting the future 20 years out than anyone else. So each group could be right or wrong. Probably both.
cloudrunner69 t1_ivouzd1 wrote
Multifaceted exponential growth. The S curves are feeding off the S curves.
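The "S curves feeding off S curves" idea (popularized by Kurzweil) is that each technology paradigm saturates, but a successor with a higher ceiling takes over, so the *sum* keeps climbing roughly exponentially. A minimal sketch of that claim (the curve count, spacing, and growth factor here are illustrative assumptions, not anyone's actual model):

```python
import math

def logistic(t, midpoint, rate=1.0, ceiling=1.0):
    """A single S-curve (logistic function) centered at `midpoint`."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

def stacked_s_curves(t, n_curves=5, spacing=4.0, growth=2.0):
    """Sum of successive S-curves, each paradigm plateauing at a higher
    ceiling than the last, so the total keeps rising even as each
    individual curve saturates."""
    return sum(
        logistic(t, midpoint=i * spacing, ceiling=growth ** i)
        for i in range(n_curves)
    )

# Each individual S-curve flattens out, but the stacked total keeps growing.
for t in range(0, 21, 4):
    print(f"t={t:2d}  total={stacked_s_curves(t):7.2f}")
```

Whether real technological progress actually behaves this way is exactly what the rest of the thread disputes; the sketch only shows that the claim is internally coherent.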
ihateshadylandlords t1_ivov2ns wrote
> Why do people on this sub seem so much more confident in their predictions than everyone else?
I think it’s because this place (like most subreddits) is an echo chamber.
>What is it that people on this sub know that nobody else in STEM is aware of?
Good question. In my opinion, I think people put too much stock into early stage developments. I think there’s a good chance that most of the products/developments that get posted here daily won’t go that far.
phriot t1_ivp1rfd wrote
> I think people put too much stock into early stage developments.
Also, I'd say that thinking the Singularity will ever happen pretty much implies belief in the Law of Accelerating Returns. Combine that belief with the excitement over early progress you mention, and it's not surprising that people here are highly confident that AGI will happen any day now.
Personally, I do think we're coming to a head in a lot of different areas of STEM research. It certainly feels like something is building. That said, I work in biotech, so I know how slow actual research can be. FWIW, my guess is AGI around 2045, small error bar into the 2030s, large error bar headed toward 2100.
AsuhoChinami t1_ivpziqa wrote
How the fuck is this place an echo chamber? le intelligent, mature, rational skeptics like yourself are in the majority here. Does the fact that 10 percent of people here disagree with you really get your panties in that much of a god damn twist? Does absolutely everyone here have to be on your side? It doesn't count as an echo chamber if everyone has what you deem to be the correct opinion, huh?
Russila t1_ivp90e4 wrote
I just go off of what the best experts in the field are saying. Listening to a lot of Lex podcasts and reading articles on it, a lot of experts seem to be saying 10-15 years. It's an appeal to authority argument, but I think if anyone knows what they are talking about, it's the people working on it.
phriot t1_ivpb5ya wrote
There could be a selection bias happening here, though. Researchers more excited about progress may be more likely to be willing podcast guests than those who are more pessimistic.
Russila t1_ivpdbd9 wrote
This is true. But if we scrutinized and doubted every single thing we heard, we wouldn't believe anything is true. There is a fallacy for every possible argument that can be made.
Do I think it will happen in 10-15 years? Based on what researchers are currently saying, yes. Could that change when new information is brought to light? Yes. We should base expectations on existing evidence and change them when that evidence shifts. Hopeless optimism and hopeless pessimism help no one.
Regardless, we should continue to accelerate the coming of AGI as much as possible, in my opinion. Its potential uses far outweigh its potential downsides.
phriot t1_ivpht4o wrote
>Do I think it will happen in 10-15 years? Based on what researchers are currently saying, yes.
Most of what I have read on the subject links back to this article. Those authors quote a 2019 survey of AI researchers with ~45% of respondents believing in AGI before 2060. The 2019 survey results further break that down to only 21% of respondents believing in AGI before 2036.
I'm truly not trying to be argumentative, but I really think that it's less "a lot of AI researchers think AGI will happen in 10-15 years," and more "a lot of Lex's podcast guests think AGI will happen in 10-15 years."
Don't get me wrong, I love Lex as an interviewer, and I think he gets a lot of great guests. Doing some digging: out of 336 episodes, maybe ~120 have had anything substantial to do with AI (based on the listed topics, titles, and guests). Some of those episodes were duplicate guests, and in others the guests were non-experts. (There were a lot more AI people featured in earlier episodes than I remember.) This does represent more data points than the survey I reference by about 4X, but I didn't keep track of all of the predictions given during my initial listens. I'll take your word that the consensus is 10-15 years, but that still isn't a huge data set.
Russila t1_ivpiy2i wrote
This is true and here's the thing. It happens when it happens. None of us are divination wizards or prophets. We can only try to make guesses based on existing evidence.
What I do see very consistently across the board is people bringing AI timelines down. That makes me more optimistic, I think.
AsuhoChinami t1_ivpywk4 wrote
Thanks for letting me know that you have absolutely no idea what you're talking about and that I should block you. I hope I never come across one of your absolutely horrid, moronic posts ever again.
phriot t1_ivq1gwt wrote
To the commenter that blocked me:
I can only see your comment if I'm not logged in, because you chose to run away instead of participate in a conversation. I am, in fact, not a moron, and would have probably changed my way of thinking if you could have shown me how I was wrong. Now, neither of us will get that chance. Have a nice day.
AsuhoChinami t1_ivq00fz wrote
Despite what disingenuous people like nblack would have you believe, "experts" who believe in 2020s AGI are absolutely not uncommon.