Submitted by questionasker577 t3_10nn3k3 in singularity

I remember what 2003 to 2013 felt like—and it seemed like breakneck speed:

  • Invention of iPhone and subsequent massive improvements of mobile applications
  • Creation and widespread adoption of social media (Facebook, Snapchat, Instagram, etc.)
  • Massive improvements in video game graphics and console hardware
  • Use of mediums to consume media such as YouTube that had never existed before
  • Massive growth of the internet

If you had asked me in 2013 how I thought 2023 would have looked, I would have thought we would be much further and have a whole new set of technologies that we use in our everyday lives.

Now, I sit in 2023 disappointed at the lack of progress. I still use an iPhone. I still use Instagram and YouTube. My video games don’t look much better than they did in 2013.

CRISPR, AI, and blockchain have all been buzzwords for the past 5 years, but they haven't yet crept into many of our lives in any meaningful way (with the exception of what OpenAI has been releasing).

So, what gives? And why is 2033 going to be much different than 2023?




GayHitIer t1_j69qvvx wrote

S-curves: quick progress, then maturity, then a new paradigm. We are at the start of a new S-curve.


AsuhoChinami t1_j69ri9c wrote

Right. This isn't anything new. The 1980s probably had more technological growth than the 1970s, for example.


Nearby_Personality55 t1_j6bieyj wrote

Web 3.0 and AI are reminding me a bit of the Wild West tech environment of the '80s, speaking as someone who was around back then.


Ortus14 t1_j6apflo wrote

For clarity, it's cascading S-curves: S-curves on top of S-curves, on top of S-curves, on top of the big daddy S-curve, which started with the Big Bang, when complexity began increasing with the formation of elements, etc.


questionasker577 OP t1_j69u310 wrote

Does AGI necessarily abide by an S-curve given how unique it is relative to other technologies? I’m struggling to think through this


GayHitIer t1_j69ukyk wrote

Higher S-curves.


questionasker577 OP t1_j69uzmg wrote

What does that mean?


GayHitIer t1_j69xhbh wrote

Exponentially bigger advancements. AGI will be a huge S-curve, maybe so big it will just look like a straight line.


questionasker577 OP t1_j69xxlr wrote

So an exponential growth curve rather than an S-curve?


Hotchillipeppa t1_j69yw79 wrote

It might still technically be an "S-curve", but the curve up with AI will be so high that the S shouldn't come down for a long while.


Good-AI t1_j6dzdrk wrote

It might be a J curve. The first and the last one.


GayHitIer t1_j69yran wrote

Combination of the two. But still s curves.


Nmanga90 t1_j6b2rkf wrote

Definitely an S-curve still. Looking back at progress, it had been very, very slow until now. Basically, the invention of the transformer changed everything.


Ortus14 t1_j6apv1z wrote

All technology abides by S-curves, all life (including AI), and all evolution.

In evolution the start of a new S-curve is called punctuated equilibrium.

In computational theory it has to do with breaking out of a "local maximum". In game theory it may be referred to as breaking out of an "equilibrium".

It's important to note that these are all cascading S-curves. That is to say, smaller S-curves on top of larger S-curves, which themselves are on top of larger S-curves. If you ever think progress is slowing down, zoom out.
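The cascade is easy to sketch numerically: a sum of logistic functions, each with a later midpoint and a larger ceiling, looks stagnant during one curve's plateau but much faster once the next curve kicks in. (A minimal illustration; the midpoints, ceilings, and rates below are made-up numbers.)

```python
import math

def logistic(t, ceiling, midpoint, rate):
    """One S-curve: slow start, rapid growth, saturation at `ceiling`."""
    return ceiling / (1 + math.exp(-rate * (t - midpoint)))

def cascaded(t):
    """Three stacked S-curves (paradigms), each larger and later than the last."""
    return (logistic(t, 1, 10, 1.0)
            + logistic(t, 10, 20, 1.0)
            + logistic(t, 100, 30, 1.0))

# During paradigm 1's plateau, year-on-year progress looks stagnant:
print(cascaded(15) - cascaded(14))
# Once the next curve kicks in, year-on-year progress is roughly 10x larger:
print(cascaded(25) - cascaded(24))
```

The "zoom out" advice falls out directly: any single curve flattens, but the sum of the cascade keeps climbing.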


Prayers4Wuhan t1_j6dg0dd wrote

This is the right answer and if anyone is interested it’s also why index investing works.

The entire economy does not grow exponentially with every company benefiting. Many companies fail and go to zero.

By buying the whole market, you don't have to guess who will fail or where the next S-curve will come from.


AsuhoChinami t1_j69p5xj wrote

Because it was. 2006 to 2012 (rise of smartphones, social media, streaming, transition of the internet from an occasionally useful tool to an addiction) was the last period of rapid change. 2012 to 2022 was a preparation period for the next period of rapid growth that began in 2022.


berdiekin t1_j6adymm wrote

I see the 2010s as a decade of maturing the technologies of the late 2000s.

Suddenly humanity had this massive influx of online users, and with them mountains of data, everyone was now taking pictures, filming, streaming, ... and sharing it online through social media.

Data sets exploded, so much so that a new branch of data management was called into being: Big Data. When I was in uni around 2010, that was one of the hottest topics, because all these companies now had stupendous amounts of data but were unsure how to process it or even what to do with it.

On the commercial side there was hope it could be used to better target ads, to better predict what customers want.

Talk (more like whispers) was starting to circulate that maybe, just maybe, these grand new datasets could help us get better AI systems; perhaps some day have systems that were better than humans at things like image recognition.

What I mean to say, in short, is this: The 2010s taught us how to process lots of data. And we're now starting to see that bear fruit.


sgjo1 t1_j69ymno wrote

I think much of the progress was under the radar and out of public view. OpenAI was founded in 2015 and only just reached mainstream consciousness with their product launches. Also in 2015, I met people working on LLMs and NLP at Google, and it appears Google was reluctant to release some of this tech, but now they want to since they want to compete with OpenAI.


Practical-Mix-4332 t1_j6a7l0b wrote

Also, biotech and DNA sequencing tech have been improving exponentially. The cost of full genome sequencing is down to $100, from $10,000 in the early 2010s and millions in the early 2000s.
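Put in halving-time terms (a rough back-of-the-envelope using the figures above, and assuming a smooth exponential decline):

```python
import math

def halving_time_years(cost_start, cost_end, years):
    """Average time for the cost to halve, assuming smooth exponential decline."""
    halvings = math.log2(cost_start / cost_end)
    return years / halvings

# $10,000 (early 2010s) down to $100 (early 2020s) over roughly 10 years:
print(round(halving_time_years(10_000, 100, 10), 2), "years per halving")
```

That comes out to a halving roughly every year and a half, about as fast as the classic Moore's-law doubling cadence.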


Inevitable_Snow_8240 t1_j6absx0 wrote

Video games most definitely look much better than 2013, lol. Like a whoolllllle lot better. Even since 2018 there have been huge leaps. Try looking for YouTube videos that illustrate this.


Akimbo333 t1_j6b7zsh wrote

Oh yeah I agree


Steven81 t1_j6cfkit wrote

I still think that a 16-year-old game like Crysis has no business looking as good as it does today: a semi-open world with a fully destructible environment, tactics from your opponents, one of the best jungle environments to this day, and probably some of the best explosions to this day...

Contrast it with 1991 games. Graphics and videogame mechanics have definitely slowed, by a whole lot. There was no new paradigm to follow the one that brought us to the mid-'00s...

Having said that, it is to be expected. The industry has matured; you need exponentially more money for very little return (both in results and in money from said results). It's the prototypical S-curve.

The next step in videogames will only happen when we change the medium (say, being in them instead of controlling them through a screen). By then there would be a fresh reason to progress fast... we are not there yet.


kindasad22 t1_j6d169l wrote

2018? Nothing looks better than RDR2 and God of War yet.


Specialist-Pie8423 t1_j6ec1yl wrote

Honestly what's amazing to me is that God of War (2018) already looks a little aged, with how light bleeds through certain objects. The lighting in God of War Ragnarok looks much better. Also don't forget about Cyberpunk 2077 and the 3090 and 4090 graphics cards.


awhitesong t1_j6d2k9o wrote

Nvidia DLSS, DLDSR, Unreal Engine 5, Ray tracing, Motion capture (see Horizon Forbidden West), Mobile gaming, Open world boom, Streaming, VR, Cross play, etc. Gaming has indeed improved a lot.


iNstein t1_j6awpms wrote

Could you tell me what technologies changed in the field of chemical engineering from 2003 to 2013 vs what changes occurred between 2013 and 2023? How about in mining? Metallurgy? Shipping? Logistics? Battery tech? Automation? Farming? Agriculture? Computational analysis? Food science? Marine biology? Packaging tech? Geology? Construction? Etc., etc.

You only know the tiny, tiny, tiny little bit that you directly experience every day. When that gets disrupted, then of course you notice; smartphones and internet apps grab your attention. You do not see all the other stuff that happens in the background in all these other fields, because you are not looking and wouldn't understand even if you did.

You also conveniently left out stuff that you should have noticed, like the huge advances in space tech (reusable rockets), electric cars, battery tech, renewables, cheaper solar and wind, and obviously AI, which is head and shoulders more important than anything else that has ever been done.


r0cket-b0i t1_j6c48w9 wrote

> ...what changes occurred between 2013 and 2023? How about in mining? Metallurgy? Shipping? Logistics? Battery tech? Automation? Farming? Agriculture? Computational analysis?

Exactly that! An extreme case of cherry-picking.

If I judge by the evolution of the toaster over the past 20 years, oh boy, the progress is really shit: a toaster still toasts 2 slices of bread (while our exponential expectations were about toasting 1 billion slices), it still does not talk, does not walk, and does not please you sexually, and it still requires electricity. Zero progress. How can one expect anything to come out of AI, LEV, or fusion if toasters are still like that?


TheOGCrackSniffer t1_j6coq3m wrote

If we're being technical, toasters from 20 years ago pale in comparison to some of the toasters we have today.


gangstasadvocate t1_j6d3rro wrote

Really? I thought there have always been bad and good toasters: the restaurant-quality ones, with good clarified butter, for the industrial type of toast, and then the residential ones.


visarga t1_j6atn4r wrote

Three generations ago, people managed without electricity, fridge, TV and running water. Two generations ago we got TVs and computers but no internet. The last generation grew up with internet. And now kids can have AI. Physical changes dominate in the first part and informational changes in the second.

But some products are mature and excellent, so we can't expect progress there. You can't improve audio quality with a higher sampling rate; 44 kHz is sufficient. And Retina displays are already at the limit of visual acuity. Videos above 60-120 fps are already too smooth to tell any improvement. Other devices have been great for decades: house appliances, etc. Food can't be improved, since we've been optimising it for too long. Digital content is already post-scarcity; we can find anything, and now we can generate anything. So AGI will have to deliver something else on top of these things; the low-hanging fruit has been picked.
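The 44 kHz figure comes from the Nyquist criterion: a given sample rate can represent frequencies up to half of itself, and human hearing tops out near 20 kHz. A quick check, using the standard 44.1 kHz CD rate:

```python
def nyquist_limit_hz(sample_rate_hz):
    """Highest frequency a given sample rate can represent (Nyquist criterion)."""
    return sample_rate_hz / 2

HUMAN_HEARING_LIMIT_HZ = 20_000  # rough upper bound of human hearing

# The CD-standard 44.1 kHz rate already covers everything we can hear:
print(nyquist_limit_hz(44_100))  # 22050.0, comfortably above 20 kHz
```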


DarkCeldori t1_j6bdehu wrote

Food can still be improved. Bread and fruit that remains good for months or years. Also polyphenol content can be increased and sugars replaced with natural zero calorie sweeteners through genetic engineering. Animals could be modified to be high in healthier monounsaturated fat rather than saturated fat.

As for AI, there are technologies such as robots and humanoid biodroids, as well as full-immersion VR through connections to the brain; enabling regeneration, curing cancer and aging, allowing for brain transplants. Also true nanomachine tech: appliances and devices made with true nanomachine tech can self-repair, self-clean, and last for billions of years if they have energy provided.


ajahiljaasillalla t1_j6bjco4 wrote

I think in Japan tomatoes have been modified by CRISPR to contain more GABA acid, and those went onto the market a year ago. Those tomatoes were the first CRISPR-modified food product on the market. Also, soy beans have been modified to contain more oleic acid. So the food industry is going to be changed by CRISPR, as it's easier and more precise to use than older GMO technologies, and it counts as gene editing rather than GMO.


pyriphlegeton t1_j6cqopu wrote

Fyi, the last "A" in "GABA" already means "acid" so "GABA acid" would be redundant. :)


PreferenceIll5328 t1_j6d7nd0 wrote

120 fps is nowhere near the limit of what we can tell, and 60 fps is downright choppy.


Trumaex t1_j6atywq wrote

> My video games don’t look much better than they did in 2013.


Do you still play video games from 2013? Or maybe you have older hardware? Just compare anything released then to any AAA game released recently on ultra settings, with real-time ray tracing.

I'm in the gamedev space... and the tech in 2013 doesn't even come close to what's available right now, for free. Just look at Unreal Engine 5 demos or even Unity HDRP demos.

In 2013 I couldn't put on my VR headset and play Half-Life in it.

And so on, and so forth...

Overall I have a feeling it's not a matter of progress but a matter of your perception of progress. Maybe you were a teen in 2003-2013? Those years usually feel golden to us, but it's just a cognitive bias.


TinyBurbz t1_j6bcp0w wrote

>I'm in the gamedev space... and the tech in 2013 doesn't even come close to what's available right now, for free. Just look at Unreal Engine 5 demos or even Unity HDRP demos.

We have been in a bit of a lull until recently. Ray tracing is neat and all, but it was lipstick on a pig. Fact is, graphics have been improving a lot slower than they used to.

As of UE 5.1, however, Lumen and Nanite have changed everything.


GanjARAM t1_j6b4h2m wrote


mcilrain t1_j6bcq9o wrote

That tech isn't going to be used in a substantial way on anything that supports 8th gen consoles.


LightVelox t1_j6dh7yu wrote

It is already used in games like Fortnite, and pretty much any game made in Unreal 5.1 could use it without losing too much performance.


genshiryoku t1_j6a85jx wrote

Because Moore's Law largely stopped around ~2005, when Dennard scaling stopped being a thing, meaning clock speeds have hovered around 4-5 GHz for the last 20 years.

We have started coping by engaging in parallelism through multi-core systems, but due to Amdahl's Law there is a diminishing return associated with adding more cores to your system.

On the "instructions per cycle" front we're only making slow linear progress, similar to other non-IT industries, so there's not a lot of gain to be had there either.

The reason why 2003-2013 feels like a bigger step is because it was a bigger step than 2013-2023. At least from a hardware perspective.

The big innovation we've made however is using largely parallelized GPU cores to accelerate machine learning on the extremely large data sets large social media sites have which has resulted in the current AI boom.

But yeah, you are correct in your assessment that computer technology has largely stagnated since about ~2005.
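The diminishing return from adding cores is worth making concrete. Amdahl's Law says that if a fraction p of a program parallelizes perfectly, the speedup on n cores is 1 / ((1-p) + p/n). A quick sketch (the 90%-parallel figure is just an illustrative assumption):

```python
def amdahl_speedup(p, n):
    """Amdahl's Law: overall speedup when a fraction `p` of the work
    parallelizes perfectly across `n` cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Assume 90% of a program parallelizes; watch the returns diminish:
for cores in (2, 8, 64, 1_000_000):
    print(cores, "cores ->", round(amdahl_speedup(0.90, cores), 2), "x")
# Even infinite cores cannot beat 1 / (1 - p) = 10x.
```

The serial 10% caps the speedup at 10x no matter how many cores you throw at it, which is exactly why multi-core was a coping strategy rather than a continuation of the old clock-speed gains.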


hopelesslysarcastic t1_j6ag8v6 wrote

So what is your opinion on the next 10-15 years given your comment? Just genuinely curious as I haven’t heard this argument before and it’s fascinating


genshiryoku t1_j6ahc38 wrote

I think the next 5 years will be a period of explosive AI progress, but sudden and rapid stagnation and an AI winter will follow after that.

The reason I think this is because we're rapidly running out of training data as bigger and bigger models essentially get trained on all the available data on the internet. After that data is used up there will be nothing new for bigger models to train on.

Since hardware is already stagnating and data will be running out the only way to make progress would be to make breakthroughs on the AI architectural front, which is going to be linear in nature again.

I'm a Computer Scientist by trade and while I work with AI systems on a daily basis and keep up with AI papers I'm not an AI expert so I could be wrong on this front.


visarga t1_j6arwxp wrote

Generating data through RL, like AlphaGo, or "Evolution through Large Models" (ELM), seems to show a way out. Not all data is equally useful to the model; for example, problem and task solving is more important than raw organic text.

Basically, use an LLM to generate and another system to evaluate, in order to filter for the useful data examples.


DarkCeldori t1_j6bc0b8 wrote

The brain can learn even with little data. A baby that grows up in a mostly empty room and hears its parents' voices still becomes fully competent within a few years.

If AI begins to use brain-like algorithms, data will not be a problem, given that evolution has already done millions of years of training.


PreferenceIll5328 t1_j6d98c1 wrote

The brain is also pre-trained through billions of years of evolution. It isn't a completely blank slate.


DarkCeldori t1_j6db3zz wrote

IIRC only ~25 MB of design data for the brain lies in the genome, which is insufficient to specify ~100 trillion connections. Most of the brain, particularly the neocortex, appears to be a blank slate. Outside prewiring, such as the overall connectivity between areas, it appears the learning algorithms are the special sauce.

There are plenty of animals with just as much baked in, and they show very limited intelligence.


GoSouthYoungMan t1_j6c4zym wrote

But the brain appears to have massively more effective compute than even the largest AI systems. The Chinchilla scaling laws suggest we need much larger systems.
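For reference, the Chinchilla result (Hoffmann et al., 2022) found that compute-optimal training uses roughly 20 tokens per parameter. A back-of-the-envelope sketch (the 20:1 ratio is the paper's rough rule of thumb; the parameter counts below are just illustrative):

```python
def chinchilla_optimal_tokens(params):
    """Chinchilla rule of thumb: ~20 training tokens per model parameter."""
    return 20 * params

# Bigger models "want" proportionally more data, which is why scaling up
# runs into the data-availability wall discussed above:
for params in (1e9, 70e9, 1e12):
    print(f"{params:.0e} params -> {chinchilla_optimal_tokens(params):.1e} tokens")
```

A 70B-parameter model already wants on the order of 1.4 trillion training tokens, so "much larger systems" implies much larger datasets too.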


DarkCeldori t1_j6cxa7w wrote

I don't think the brain's prowess lies in more effective compute, but rather in its more efficient algorithms.

IIRC, mimicking the brain's sparsity allowed ANNs to get 10x to 100x more performance. And that is just one aspect of brain algorithms.


BehindThyCamel t1_j6aj21s wrote

Do you think that 5-year period of progress will include training models on audiovisual material (movies, documentaries, etc.), or are we too far technologically from the capacity required for that, or is that not even a direction to pursue?


DarkCeldori t1_j6bge0g wrote

Moore's law is about the miniaturization of transistors and the doubling of transistor count. It is true we also used to get significant clock speed increases that we no longer do, but Moore's law didn't stop; it only slowed down, from every 18 months to every 2.5 years or something like that. This happened last decade as a result of constant delays in the development of extreme ultraviolet lithography equipment, but that is now solved and it is back to every 18 months, IIRC.

And thanks to Moore's law and Koomey's law continuing, we have seen constant increases in energy efficiency and computational power.

We are indeed still facing some significant issues. Some parts, such as SRAM, which is vital for cache sizes, have stopped scaling, IIRC. Also, it seems the reduction in cost per transistor has slowed or perhaps even ended recently. Microsoft estimated they wouldn't get a cost reduction from moving to newer, smaller transistors and thus chose to do two versions of the Xbox, a cheap one and an expensive one, from the start.

If cost reduction is not solved, we could be in serious trouble, as clearly a doubling of transistors requires at least a halving of transistor cost to be viable.


NefariousNaz t1_j6ad1ad wrote

Technological change and disruption in consumer products and use was massive between 2003 to 2013. From 2013 to present it was more maturing and defining of products and services that were already introduced.

We might be entering a new phase of massive consumer disruption with the likes of OpenAI's chat AI, Stable Diffusion, and other related tech.


r0cket-b0i t1_j6c316u wrote

Why did 2003 to 2013 feel like more progress? Because of OP's lens / myopic view / confirmation bias.

Let's look at the facts and debunk it:

  1. Invention of the iPhone: randomly pick any PDA, say the Cassiopeia A-10, released in 1997, six years before 2003. To the people who owned one, the iPhone was just "another evolution" and for the longest time "not a real smartphone", because Nokia could run Java apps you could install without needing an app store. What does that mean? The iPhone nailed UX, amplified by the power of brand and marketing; the revolution started far earlier than people think.
  2. Social media revolution... ICQ gained widespread popularity in 1998.
  3. Video game improvements have been happening every year since video games existed, from 3dfx Voodoo cards to real-time shadows; there was no exact slowdown or acceleration, it's pretty much a compound growth graph.
  4. Video streaming? QuickTime, or the first video calls? Again, broadband speeds increased similarly to video game graphics; this trend looks no different whether you zoom out from 2003 or 2013.
  5. Growth of the internet: you can see for yourself that there is only a minor difference, with mostly just North America getting close to maximum penetration.

This is very clearly a narrow view, focused on consumer electronics and experience. I am openly criticizing it not because I want to hype short singularity timelines, but simply because it is not factual; it's cherry-picked, apples vs. oranges, and derived from industry-focused bias.

- 2013 to 2023 things that actually happened, from CRISPR to CPUs fabricated at a nanometer scale five to ten times smaller than in 2013, are massive progress.

- 2016: AI wins at Go. This felt like humans landing on the moon or going to space for the first time.

- See the number of records in fusion energy development. One can make a very long list of things that are evolutionary (like the iPhone) but also revolutionary and unexpected (like AI progress, fusion, etc.) for every decade, and if anything there is an acceleration of progress, not the other way around.

I would love to be proven wrong but in a constructive manner not in biased tunnel-vision way lol ;-)


Ishynethetruth t1_j69xni9 wrote

I know how you feel. It's because of greed. Why risk introducing a new product when you can change its colour and sell it again? You have YouTube tech reviewers making the same video about the same product for the last 5 years. The only company trying is Meta; you can hate them all you want, but they are the only company right now taking a risk for the future.


TopicRepulsive7936 t1_j6a14pn wrote

People will give you all kinds of strange answers but the truth is that it's the difference between subjective and objective change and your chosen perspective on it all.


IronJackk t1_j6azhpr wrote

Things transitioned from analogue to digital and the internet became mainstream.


monsieurpooh t1_j6c2f75 wrote

IMO 2015 is when the big shift happened, which is after 2013.

I argued with my machine-learning friend about neural networks. She claimed that neural networks were "for losers" and not getting anywhere because they required too much data. This was on the heels of their passing a critical test which, in the past, had been postulated as a "test for AI consciousness": captioning an image. Basically, constant goal-post moving.

It was also on the heels of AlphaGo's victory, which was deemed by most CS experts at the time as impossible or improbable in the near future.

tl;dr 2015 was the year AI proved all the naysayers wrong IMO. And it came after 2013.


CypherLH t1_j6cumit wrote

The only thing that matters right now is development in AI. It's moving faster right now than any tech development I have seen in my lifetime, and I've been following tech closely since the late '80s. And we're clearly getting into the sharp exponential phase of the S-curve on current AI model development.

The closest comparison I can think of is the internet in the mid-'90s, when it was massively improving month by month, doubling home connection speeds every 6 to 12 months, etc. Current AI development seems faster than that, and more consequential... the amount of progress just in 2022 alone was simply stunning. And now, less than a month into 2023, we already have a text-to-music model demonstrated. This year is going to be wild.


sunplaysbass t1_j6avlw2 wrote

2023 / 2024 will be a turning point


mcilrain t1_j6bceho wrote

> My video games don’t look much better than they did in 2013.

8th gen consoles were underpowered so other than a bump in resolution from 720p -> 1080p there wasn't much of an appreciable difference. 9th gen consoles didn't sell well so games are still made to support 8th gen consoles.


giveuporfindaway t1_j6bhwgc wrote

To be honest, I don't think any of the above was ever ground-shaking. Facebook is just a website (not sure why anyone ever called it anything else). The biggest day-to-day change in my life was getting Netflix, so I didn't need to drive to the video store anymore. But that's hardly revolutionary. I suspect between 2023 and 2033 we'll start to see significant labor displacement. I also suspect porn/VR will have an interesting impact on the dating market.


roland333 t1_j6bypb9 wrote

Check out the history of DeepMind. That should turn that frown upside down.


Gilded-Mongoose t1_j6c1ett wrote

2003 - 2013 was the breakthrough / proliferate-into-the-mainstream era. 2013 - 2023 was the fine-tuning era, developing improvements less visibly or less universally applied.


Buttafuoco t1_j6c1mvx wrote

Quantum computing is hitting production today; we will see some pretty impressive tech this next decade.


PlanktonBeginning361 t1_j6c45x5 wrote

To be fair, we’ve dialed in the quality of all those products by quite a bit. What was experienced before was simply the birth of the internet in peoples pockets which is very hard to beat. We’re leagues ahead of ten years ago in performance.


MrCensoredFace t1_j6c7al3 wrote

We did progress a lot. Tech has become way more accessible and faster, and software is reaching new levels of power with AI. Basically there has been a lot of refinement, which is why you couldn't notice.


glutenfree_veganhero t1_j6cgt1s wrote

The list you made was applications of already-known paradigms. The last 10 years were us stumbling into the unknown and developing new tools. So, in a sense, more real progress.


Redditing-Dutchman t1_j6cqbvs wrote

There was just a lot of progress in consumer electronics during that time, so it's very visible.

There has been lots of progress in the last few years for certain diseases, for example multiple sclerosis, which is now decently treatable with a new procedure. But the medical and industrial fields are quite 'invisible' in daily life for most people, hence you don't really notice the progress.


bemmu t1_j6cw333 wrote

Agreed there were more consumer-facing things affecting our lives in your earlier timespan.

I'd include these at least in your timespan:

  • 2016: SpaceX succeeds in landing a rocket booster, making access to space more affordable.
  • 2017: Transformer-based deep learning models, which made GPT-3 and ChatGPT possible and are also used in Stable Diffusion.
  • 2019: Oculus Quest makes VR a lot more mainstream.
  • ~2020: AlphaFold can now predict how proteins fold, a problem which seemed intractable before and will likely lead to many medical breakthroughs.

Also during the last few years electric cars have become much more popular.


SmoothPlastic9 t1_j6cws6e wrote

It's a baseline, like the internet, which is basically controlling people's lives.


Jaded-Protection-402 t1_j6cy50b wrote

You only see the superficial things in the foreground; in the background, things did improve, by a lot!


Ohigetjokes t1_j6diitd wrote

Your feelings are a terrible metric for objective observation.


Villad_rock t1_j6dl529 wrote

All the things you mentioned needed decades of research. They didn’t just appear out of nowhere between 2003 and 2013.

I think touchscreens and the internet were invented in the 60s and 80s.

First you have research and invention in all kinds of fields, and when it matures and comes together you get an application explosion.

It doesn’t mean that you had more progress during 2003-2013.

We most likely had less scientific progress than in the decades beforehand.

The same is happening with new technologies like AI, biotech, VR, nanotech, etc.

Every important bit of progress behind closed doors will lead to an explosion when everything matures and comes together.

Doesn’t mean when the explosion happens that we actually will have more progress during that time.

People always say fusion is 20 years away, but we did make progress, namely in materials science.

Fusion needs innovations in materials science, which can take decades because the materials you need are pretty high-tech.

Don’t be a fool and think when fusion happens and change the world that we suddenly made progress, would be disrespectful of the decades of work beforehand which was more important and harder.


mquintos t1_j6dlrf4 wrote

Human rights, police accountability, decreased racism (all within the last 5 years)


No_Airline_1790 t1_j6dy7di wrote

Relativity. New innovations and inventions may have seemed lacking to you, but technology moved at a dizzying pace from 2013 to 2023.


questionasker577 OP t1_j6eiulo wrote

I’m sure you’re right but it just doesn’t quite feel like it


No_Airline_1790 t1_j6f7lid wrote

What version of the iPhone are you using? Go look at the technology in that phone today compared to even 5 years ago.

There are things your phone can do that it couldn't do 3 years ago. Technology isn't about feeling.


questionasker577 OP t1_j6fc5fs wrote

The iPhone has had marginal improvements with each recent iteration, mainly the camera.

The iPhone, in its earlier versions, had massive improvements with each new release.


Exel0n t1_j6bs8m5 wrote

The blockchain hype sucked a lot of money into that scammy, non-productive blockchain ecosystem. It's partially to blame; had that money gone to AI, the world would be much more different.

Just think about the FTX scam: how many billions could have gone to AI instead of funding the lifestyle of that scammer and his accomplices?

The blockchain hype also caused GPU prices to skyrocket, starving AI development, which also uses GPUs, with far more expensive GPUs and often non-existent stock due to miners grabbing all the supply.


raylolSW t1_j6bwj4y wrote

Almost 10 years later, I consider AC: Unity to be the best-looking game...


Kaje26 t1_j6b47mj wrote

Because it was? Because unfortunately scientific progress is reaching a road block?


No_Ninja3309_NoNoYes t1_j6a7t2u wrote

AI will be bigger in 2033, but I am afraid that it will run out of steam. The neural networks that are built today are like ladders to the moon. We need rockets and some sort of fuel. But I bet that if someone figures it out, it will seem pretty obvious in hindsight.

The rest is politics and tradition. Almost no one can compete with Silicon Valley. Some governments try, but it is not a priority for them.


Phoenix5869 t1_j6a6nfu wrote

Most people don't like to admit it, but technology is slowing down significantly. Cures for aging, or even treatments for it, nanobots, curing blindness / paralysis / Alzheimer's, lab-grown / artificial organs, or even regrowing teeth or a cure for the common cold are still decades away despite years upon years of 'breakthroughs'.

I'm sure if you asked someone in 2013 what they thought we would have in 2023, they would give an answer that seems ridiculous to us today. But no, 2023 is the same as 2003 except for smartphones, tablets, better computers, and a few primitive gene therapies. What does that say about what 2043 or even 2053 will be like?

And if you don't believe me, go ask basically any expert in the relevant field(s) how far away even the simplest of these technologies are. You probably won't like what they tell you.


DarkCeldori t1_j6beu4r wrote

What are you talking about? CaAKG (calcium alpha-ketoglutarate) preliminary data appears to show it reverses epigenetic age by years, and epigenetic changes appear to be the cause of aging. Resveratrol basically halts age-related changes in gene expression in the heart, keeping it young indefinitely.

Sinclair is bringing blindness treatment to clinical trials within 1 or 2 years iirc.

Alzheimer's progression was halted by melatonin in one case study; in another, it also halted Parkinson's. Regrowth of teeth is already in animal trials. As for organs, it is likely we can use embryonic development for that and do humanized chimeras in pigs; the research is already quite advanced.

Cancer: within years, a company doing transfusions from cancer-immune humans to normal humans will bring a product to market. There are also nanoparticle sponges from another company that appear highly effective.

True nanobots are likely to be the result of advanced synthetic biology using unevolvable designs. Recently, AI has allowed for zinc-finger design, which will enable editing the genome at arbitrary points, greatly accelerating progress. Also, AI has been able to predict many existing proteins and design novel ones with novel functions, IIRC; just exactly what we need for nanobots.


Phoenix5869 t1_j6evlad wrote

>CaAKG (calcium alpha-ketoglutarate) preliminary data appears to show it reverses epigenetic age by years, and epigenetic changes appear to be the cause of aging. Resveratrol basically halts age-related changes in gene expression in the heart, keeping it young indefinitely.

OK this is good, I didn't know that

>Sinclair is bringing blindness treatment to clinical trials within 1 or 2 years iirc.

Good, but unfortunately many promising blindness treatments fail in human trials. I hope it works, obviously, but I'm just warning you it might not.

>Alzheimer progress was halted by melatonin in one case study in another it also halted parkinsons.

That's great, but again, many promising treatments fail later on.

>Regrowth of teeth is already in animal trials.

Regrowing teeth has been in clinical trials for decades

>As for organs it is likely we can use embryonic development for that and do humanized chimeras in pigs, the research is already quite advanced.

True, and from what I remember we are already using pig hearts as a scaffold, growing a patient's own cells onto it to avoid rejection, and putting it into the patient.

>Cancer within years a company doing transfusions from cancer immune humans to normal humans will bring a product to market. There are also nanoparticle sponges from another company that appears highly effective.

Hopefully this works. In future we could work out how to make someone immune to cancer via gene editing etc

>True nanobots are likely to be the result of advanced synthetic biology using unevolvable designs. Recently, AI has allowed for zinc-finger design, which will enable editing the genome at arbitrary points, greatly accelerating progress. Also, AI has been able to predict many existing proteins and design novel ones with novel functions, IIRC; just exactly what we need for nanobots.

I'm not saying nanobots will never happen, but we've been working on them for decades with little progress made.

Progress in genome editing is good, but please try to remember that this is still in its infancy.

Yeah, AI is helping a lot; they already designed a potential treatment / cure for a currently incurable and untreatable lung condition.