Comments

Fantastic-Climate-84 t1_it3gsnv wrote

2021 - https://builtin.com/hardware/moores-law

2020 - https://www.google.com/amp/s/www.wired.com/beyond-the-beyond/2020/03/preparing-end-moores-law/amp

2019 - https://www.google.com/amp/s/www.cnet.com/google-amp/news/moores-law-is-dead-nvidias-ceo-jensen-huang-says-at-ces-2019/

2018 - https://www.google.com/amp/s/steveblank.com/2018/09/12/the-end-of-more-the-death-of-moores-law/amp/

2017 - https://www.computer.org/csdl/magazine/cs/2017/02/mcs2017020007/13rRUypGGeJ

2016 - https://www.nature.com/news/the-chips-are-down-for-moore-s-law-1.19338

2015 - https://www.economist.com/the-economist-explains/2015/04/19/the-end-of-moores-law

2014 - https://www.businessinsider.com/great-graphic-is-this-the-end-of-moores-law-2014-1

2013 - https://www.pcworld.com/article/457384/the-end-of-moores-law-is-on-the-horizon-says-amd.html

2012 - https://techland.time.com/2012/05/01/the-collapse-of-moores-law-physicist-says-its-already-happening/

2011 - https://www.google.com/amp/s/www.forbes.com/sites/alexknapp/2011/03/30/the-end-of-moores-law/amp/

2010 - https://www.google.com/amp/s/techcrunch.com/2010/08/23/the-end-of-moores-law-a-love-story/amp/

2009 - https://archive.nytimes.com/bits.blogs.nytimes.com/2009/05/22/counting-down-to-the-end-of-moores-law/

2008 - was a good year, people were pretty optimistic

2007 - https://www.reuters.com/article/us-intel-moore-idUSN1846650820070919

2006 - https://www.indybay.org/newsitems/2006/05/18/18240941.php (had to switch over to Bing to find results this far back)

2005 - https://slate.com/technology/2005/12/the-end-of-moore-s-law.html#:~:text=Dec%2020%2C%2020053%3A15%20PM%20Until%20recently%2C%20Moore%E2%80%99s%20Law%2C,a%20chip%20could%20hold%20a%20few%20dozen%20transistors.

2004 - some guy named Moore had an issue with the Ten Commandments being in front of state buildings and legal battles ensued. Can't find anything in two minutes here.

2003 - https://dl.acm.org/doi/10.1109/MC.2003.1250885

2002 - https://spectrum.ieee.org/the-death-of-moores-law-will-spur-innovation

This is a long way of saying "is it that time of year already".

531

ItsAConspiracy t1_it3sum3 wrote

Maybe it depends on your definitions of "Moore's Law" and "end." From the article:

> In the 15 years from 1986 to 2001, processor performance increased by an average of 52 percent per year, but by 2018, this had slowed to just 3.5 percent yearly – a virtual standstill.

I'm feeling that. I got my first computer around 1986 and those first fifteen years were incredible. A new computer was way faster than one just a couple years old. RAM and disk space were growing like crazy.

Ain't like that anymore. I bought a Macbook Pro eight years ago and it doesn't even seem slow. New ones that cost about the same have the same RAM and just double the storage. This is not the Moore's Law we enjoyed in the '90s.
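
For a rough sense of the gap between those two growth rates, here is a minimal compounding sketch in Python (nothing from the article itself, just the arithmetic implied by the quoted figures):

```python
# Compare the two annual improvement rates quoted above, compounded over 15 years.
YEARS = 15

fast = 1.52 ** YEARS   # 52% per year, the 1986-2001 era
slow = 1.035 ** YEARS  # 3.5% per year, the circa-2018 figure

print(f"52% per year for {YEARS} years:  ~{fast:,.0f}x total")   # roughly 530x
print(f"3.5% per year for {YEARS} years: ~{slow:.2f}x total")    # roughly 1.7x
```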

80

Fantastic-Climate-84 t1_it3ulbr wrote

I get it, really I do. I'm just being glib. As I've gone through a few of these articles, each has made some variation of that point.

Weird Al had a song called "It's All About the Pentiums" which, in the late nineties, called out how a computer you took home from a box store was already out of date by the time you opened the box.

The consumer isn't on the bleeding edge anymore, but that doesn't mean the end of Moore's law. It's way better for us at this point. That the consumer isn't being "punished" or forced to upgrade by the advances in tech is a great thing. The consumer being the backbone of the tech industry was never going to last, but we're nowhere near dead in the water yet.

36

ItsAConspiracy t1_it3vr0x wrote

I wouldn't say it's better. Those years were tremendous fun. You could keep running your old stuff if you wanted, but if you had money you didn't because the new stuff was so much better.

20

Fantastic-Climate-84 t1_it3w9fz wrote

>if you had the money

17

Xist3nce t1_it4a2gy wrote

A 3-story house was worth what my car is now. Kids could afford a car on a part-time job. You could support a family of 4 comfortably on minimum wage. Everyone could have had money then.

13

SatanLifeProTips t1_it5lx02 wrote

A microwave oven was $1700 in 1970s money. Now it's $39.97 at Walmart. Cars and houses got expensive. Everything else got insanely cheap. A new T-shirt is five bucks!

And you can furnish a home for $0 from Craigslist's free section. If you are handy with a paintbrush you can actually furnish a home quite nicely. Moving out in the '80s had you living with cinder block furniture (stolen from a local construction site). Now some students can equip a suite and live large.

Once your rent is covered everything else is easy. Live by a place with good mass transit and you don’t need a car. I live in a dense city with light rail. Modern E-scooters have 16km of range, can cook along at 50kph or faster (illegally but no one cares) and you can take them on the train. It’s brilliant. Wear good rain gear and you can commute faster than your car.

5

Xist3nce t1_it5ni8o wrote

How much money do you think it costs to live in a modern place? Find me one where you can afford the rent on $7.50 an hour and I'll acknowledge it. My grandpa got his house for a month of work. I'm not even allowed to get a house because I don't make enough, even though the mortgage is way lower than my rent.

1

SatanLifeProTips t1_it5omb6 wrote

Here a basement suite or a one-bedroom apartment will set you back $1200-$1700/mo, but min wage is $15.65/hr CAD. That's just minimum wage, however, and few work for that. Even McDicks is paying $20+ or you can't get anyone. And medical is free.

Buying a place is going to need a decent career. Housing is super expensive to buy in the cities and places with great transit.

America’s $7.50 min wage is basically 3rd world poverty. But that is a system designed to trap people in poverty.

1

Xist3nce t1_it5uqrq wrote

Bingo there. The only way out is to make an absurd amount of money; unfortunately, if you have to work 40-50 hours to make ends meet, it's hard to give up rest to work on skills. It's all a trap, and the tipping point is coming.

1

Fantastic-Climate-84 t1_it4bq8r wrote

Their follow-up statement was that they didn’t have the latest and greatest, just enjoyed that it was out there. That said, inflation is a raging birch, I’m with you there.

0

ItsAConspiracy t1_it3xcvk wrote

I was making eight bucks an hour for most of that time, but it was still fantastic.

Now it doesn't matter how much money you have, you're still not going to buy that kind of performance leap every couple years. Everything's just gonna stay about the same, with small incremental improvements.

That's the end of Moore's Law. We're going to be stuck with pretty much the same computers we have now, until someone invents a whole new computing technology that's not based on silicon chips.

4

Fantastic-Climate-84 t1_it3y2x5 wrote

Dude, now you’re being glib.

Families couldn't afford a new computer every two years to keep up with schools, and people struggled to get new laptops for university. That you were able to afford it — shit, so was I — doesn't make it ideal.

The way we work with computers has already changed dramatically over the last five years. It's now possible to work from your phone! You can hook up an adapter to an HDMI cable and run that to a TV, use Bluetooth devices for mouse and keyboard, and off you go.

I do 90% of my work from a tablet today. For the kind of work I do, I would never have dreamed that possible.

You’re choosing to ignore the dynamic swing occurring, which is another element to every. Single. One. Of these articles.

1

ItsAConspiracy t1_it3zkfc wrote

Dude, I was making like a buck and a half over minimum wage. Don't tell me how awful Moore's Law was for people without money. I barely had any and thought it was fantastic. In any case, doesn't matter whether we like it or not, point is that it's gone.

As for phones, I have an iPhone 6s and my girlfriend has a 13, and they're not all that different.

But sure, people are still engineering clever new things. That's great, but it's not Moore's Law, which was an absolute tsunami of raw new computing power every year.

3

Sylvurphlame t1_it4h16i wrote

> As for phones, I have an iPhone 6s and my girlfriend has a 13, and they’re not all that different.

To an extent, I think that's because software developers have to account for people having older phones. Apps don't fully utilize the performance capability of smartphones because they have to assume somebody has a three- or four-year-old device.

Also, I kinda feel like if you're not noticing a difference between an iPhone 6S and a 13, either you just don't ask much of your phone or your girlfriend is severely underutilizing hers. :)

3

Fantastic-Climate-84 t1_it411q9 wrote

Now you’re just being dishonest.

> As for phones, I have an iPhone 6s and my girlfriend has a 13, and they’re not all that different.

Really? Really.

> But sure, people are still engineering clever new things.

And what handles the computations and functions behind those new things? The absolute powerhouses that sit in our pockets — well, not yours, but other pockets.

Again, that you say you barely had enough money but were buying a new computer/processor/GPU every two years — because that's what it took to keep up from the 2000s to about 2016 — tells me you're not being honest.

I’m hopping off this comment train.

2

ItsAConspiracy t1_it44kxf wrote

I didn't say I bought a new computer every two years. I said people with money did. Doesn't mean I sat around being depressed about it. I was still super excited to see it all happening, and I got to experience it when we upgraded at work, in addition to the few upgrades I managed at home.

And all this is a side issue to that measly 3.5% annual improvement we have now.

But please, yes, hop off, this is getting unpleasant.

2

ItsAConspiracy t1_it5362d wrote

Yeah that's great, but that's just regular technological progress. Of course that will continue. That's not the same as Moore's Law, which was a doubling of performance every 18 to 24 months over a long period of time. If there had been a Moore's Law for cars, they'd get millions of miles per gallon by now.
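
As a sanity check on the car analogy, a quick sketch of the same doubling arithmetic (the 1971 baseline of ~13 mpg is my own hypothetical starting point, not a figure from the thread):

```python
# If fuel economy had doubled every two years since 1971, Moore's-law style:
start_year, start_mpg = 1971, 13          # hypothetical early-1970s car
end_year = 2022
doublings = (end_year - start_year) / 2   # one doubling per two years

mpg_now = start_mpg * 2 ** doublings
print(f"{doublings:.1f} doublings -> ~{mpg_now:,.0f} mpg")  # roughly 600 million mpg
```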

1

Fantastic-Climate-84 t1_it5554g wrote

The point was that, even with pistons, adding more doesn’t mean better performance.

No wonder you don't see a difference when you're still using tech that's almost a decade old. Try keeping up, and you'll notice a difference.

That said, crazy that your MacBook and phone are still working and usable, hey? Sure is rough for the consumer these days. You couldn't use a ten-year-old computer back in 2008, let alone a phone.

Bleeding edge cuts both ways. AI, drones, tablets replacing laptops, laptops replacing desktops, phones being the bulk of where we compute, but you're still complaining.

−1

ItsAConspiracy t1_it5ejgn wrote

Sure there's a difference. But in terms of sheer compute it's still just 3.5% annually, according to OP's article. That's not Moore's Law. Tech progress continues but Moore's Law is still dead until we get a whole new chip tech. It's not complaining to just recognize reality.

1

Key_Abbreviations658 t1_it7ng3r wrote

But if you didn't, you still had the same computer. It's not like your computer got worse; you just had much better options.

1

Plastic-Wear-3576 t1_it4qf2m wrote

Eh. Computer speeds have definitely improved in other ways. SSDs can make an otherwise slow computer fast.

It's like in video games. Games today don't really look much better in terms of textures than games from 5 or 6 years ago.

But lighting has improved immensely.

People will find ways to continue to improve, physical limits be damned.

2

Fantastic-Climate-84 t1_it4sj1h wrote

Totally agree with you.

Even if transistor count were stagnant and material science null, the design of the chipsets has gotten way more efficient. The boards are more efficiently designed, GPU and other system memory bottlenecks are just gone, and kids these days don't even talk about GHz anymore.

Say what you will about the games themselves, but I've been able to play Civ 6 on my phone for a few years now. To me, a gamer who remembers Civ 2 not running on a computer that cost twice as much as my current phone, it's kinda magical.

2

Plastic-Wear-3576 t1_it4szic wrote

I ran into a scenario years ago when StarCraft 2 came out. I bought it, and it completely crushed my computer beneath its boot.

Convincing my parents I all of a sudden needed a new computer was a stressful conversation.

Nowadays you just expect a game to run on your PC unless you have an older PC and the game is a true ship of the line nuts to butts eye watering game.

2

Fantastic-Climate-84 t1_it4ud7k wrote

> Nowadays you just expect a game to run on your PC unless you have an older PC and the game is a true ship of the line nuts to butts eye watering game.

Even then, today you just get the console version instead haha

I was selling computers when Doom 3 came out. That game made us a lot of commission. StarCraft 2, too. Kids like you were a big reason for our bonuses!

2

Evethewolfoxo t1_it3whfg wrote

I believe we're kinda stuck in the consumer market and nearing the edge… for CPUs, and at least temporarily.

However, no one can deny that GPUs have done nothing but improve year after year. I think that's our current frontier in the consumer market while companies figure out tech like DLSS, RTX, and making transistors tinier.

7

ItsAConspiracy t1_it407ad wrote

Yeah GPUs are a bright spot. But partly it's because they're massively parallel and can just keep getting bigger and more power-hungry.

Another bright spot is neural chips, which aren't so much about Moore's law as getting better at specialized machine-learning architectures.

9

metekillot t1_it5nwvr wrote

Computer technology is only about a century old. I'm sure that 100 years after they cast the first metal sword, people thought they were nearing the limits of metallurgy.

2

ItsAConspiracy t1_it78x1n wrote

We're definitely not nearing the limits of computation in general, just silicon chips specifically. We went from mechanical relays to vacuum tubes to silicon chips, now we need something else for the next big leap.

1

Cabana_bananza t1_itajcxw wrote

I think we will see the forerunners of computronium over the next twenty years. You have big companies like Element 6 (De Beers) working with others on creating better carbon semiconductors and researching their use in computation.

The precision with which they can manipulate the diamonds as they grow them has improved by leaps and bounds over the past 40 years, from the large X-ray diamond plates for satellites in the '80s to today's ability to control and inlay imperfections in the diamond structure.

It's starting to resemble what I picture when I think of Kurzweil talking about computronium.

1

[deleted] t1_it4jq4w wrote

[deleted]

5

frankyseven t1_it4n9ci wrote

Umm, you realize that the M processors from Apple are incredibly fast and efficient, faster than anything else on the market. The new processors are a massive leap forward in processing and power management.

3

danielv123 t1_it4ps0y wrote

Faster? No. More power efficient? Yes. Amazing chips.

5

frankyseven t1_it4r0jg wrote

The M1 was faster than the i7 and whatever the AMD chip is called when it was released. Maybe not on paper, but there were plenty of tests done around the time it was released showing that it was the fastest on the market, and that was the Air, not the Pro. Now, some of those speed gains are due to the OS being optimized for the chip and all of the other hardware, but it was still the fastest on the market.

Regardless, Apple is one of the few companies really pushing the cutting edge with their computers.

−2

MadDocsDuck t1_it4woul wrote

Yes and no. The real problem here is already in the way that you perceive their marketing material, because the i7 hasn't been Intel's top chip in each generation for quite some time. Then you have to consider the different wattages of the laptops compared (especially if you compare a MacBook Air, which is more focused on efficiency), because the "regular" chips vary vastly in power target and thus performance. And then there are the desktop chips, which are a whole different story to begin with. And on top of all that come the asynchronous release cycles, so when Apple releases something in June but this year's competing products haven't released yet, they are essentially comparing against year-old technology.

Then there is the whole issue of selecting the software for the benchmarks. Not just the OS makes a difference but also the individual programs you select.

Don't get me wrong, I like the chips and I wish more companies would focus on efficiency like Apple did with the M1 chips (although I hear it's a different story with the M2 chips now). But every company will select the test suites to be as much in their favour as possible, and when you compare the Mac platform to Windows there is always that inherent difference that programs are just not the same between the two.

7

danielv123 t1_it6gsi8 wrote

Yes, the Apple chip won hands down in workloads they added hardware acceleration for, like some video editing workflows. It doesn't make the CPU faster in general though. There is a reason why you haven't seen data centers full of M1 Macs like with the old PlayStations.

2

ChicagoThrowaway422 t1_it7foec wrote

The MHz battles of the 90s were insane. You'd have the fastest computer for maybe a month before someone else's parents bought one, then you'd have to wait three years before your parents could afford a new one, putting you briefly back at the top of the pile.

2

Sniffy4 t1_it6g1og wrote

SSDs have been a far bigger performance improvement in user experience over the last 10-12 years than any gains in CPU speed.

1

Apokolypze t1_it6jzv1 wrote

MacBooks aside, enthusiast PC hardware has been pushed along by the massive gaming industry. The 30-series GPUs from Nvidia were a massive generational leap forward from the 20-series. Both Intel and AMD are making big strides in the CPU space with 13th gen and Zen 4 respectively. Talking of RAM, DDR5 is finally, actually here, and in a big way.

Running a system (especially RAM) from 8+ years ago in the gaming space, while technically feasible, could not compare to the capabilities of a modern enthusiast system.

1

Halvus_I t1_it8p43f wrote

DirectStorage too. First on consoles, now PCs.

1

ReddFro t1_it42s06 wrote

Neat, but if you're trying to say we've been saying this same shit for 20 years, I'll point out from reviewing your own article list that the older stuff talks about Moore's law ending "soon", while the newer stuff says it's already dead. In fact, several of the older articles even point to right around now as the end (one from 2011 said the early 2020s; another in 2013 said it's dead in about 10 years).

8

Fantastic-Climate-84 t1_it44stu wrote

They all really have the same theme.

“New tech isn’t as much of an improvement over last years model”.

The point I'm making is that this article comes out every single year, saying the same thing as last year's article. Sometimes it's from CEOs who want to slow down how much they invest in R&D, sometimes it's from investors prophesying the end of the growth of IT stocks, sometimes it's the end users themselves.

Quantifiably, Moore's law has literally ended. The law pertained to the actual count of transistors on a chip, and how small we can make them. The "law" stated we would be doubling the count every — well, it was originally every year, but he later revised it — two years. That's not the defining element of how we use tech anymore, and chips are so small, with rapidly changing design and dynamics, that it's still, functionally, in place. We haven't stagnated on transistor count; it's still going up, and with varying ways of building the chips into systems, tech has in no way plateaued.
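
To put numbers on that, here is a minimal sketch of the "doubling every two years" arithmetic, using the Intel 4004's roughly 2,300 transistors (1971) as a starting point; the closing comparison is my own illustration, not something from the article:

```python
# Naive Moore's-law projection: transistor count doubles every two years from 1971.
BASE_YEAR, BASE_COUNT = 1971, 2_300   # Intel 4004, roughly

def projected_transistors(year: int) -> float:
    """Transistor count if the two-year doubling had held exactly."""
    return BASE_COUNT * 2 ** ((year - BASE_YEAR) / 2)

for year in (1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f}")
# The 2021 projection comes out around 8e10, the same ballpark as the biggest
# chips actually shipping around then, which is the point: raw count kept climbing.
```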

10

ReddFro t1_it5cih7 wrote

The way my brother states it: the new Moore's law is that the number of ways Moore's law is defined doubles every 18 months.

6

Fantastic-Climate-84 t1_it5j7kz wrote

Love it, that sounds pretty accurate.

Honestly I look forward to next years article for the same argument and the same conversations about the same topic.

2

MonkeyPawClause t1_it4uzii wrote

Maybe they applied Moore's law to the pricing instead of the chips for a change of pace.

2

Yashugan00 t1_it72nfd wrote

I roll my eyes every time I see a tech journal write this every few years. It's been predicted since I graduated.

2

kushal1509 t1_itd7v5l wrote

Tech journos: "well now since you're about to reach 3nm transistor size, Moore's law is going to die right?"

Scientist: "nope we will just stack the transistors over each other."

TJ: "No!! my "expert" predictions are wrong...again!!"😭😭

2

OptimisticSkeleton t1_it6lc3o wrote

Except that we really are approaching a place where any further reduction in transistor size creates electrical problems. This happens at 1nm and below, I believe. Companies are working on 2nm transistors right now.

1

___Price___ t1_it8g957 wrote

It's been that way for years, so they have altered chip design and layout.

Even the idea of stacking, or cube-shaped models, etc.

Moore's law will be dead when we have 1nm chips laid out in the most efficient form with superconductors, running in tandem with a quantum chip and artificial brain tissue; even then, is that the actual limitation? Could antimatter silicon create faster computations by energy relativistically running backwards in time? We are still working on theories of different states of matter, quantum loop theory is still pretty new, and saying Moore's law is dead is saying science has figured it all out and manufacturing has caught up.

2

Benton_Tarentella t1_itlsw2i wrote

Well, no, Moore's law would not be dead only if a full plateau were reached. The law is concerned with the rate of increase, so if progress slowed down (or sped up) significantly from doubling every two years, that would be the end of Moore's law, regardless of whether it is theoretically possible to continue.

1

___Price___ t1_itltunn wrote

That's making the assumption that exponential growth will stop.

Exponential growth would only stop because of economic limitations or a theoretical barrier.

As of right now it's nowhere near dead.

1

Rubiks443 t1_it3dome wrote

My friend is getting his PhD in chemical engineering and they are literally working on what will replace silicone. Theoretically if we made a chip out of this terazulene rather than silicone it would be more than one million times faster and more efficient. He is the second person in the world to create the molecule! However, the research takes time let alone the research to implement it into semiconductors, but it’s cool to see the future. https://barybingroup.ku.edu

67

wilcroft t1_it3gbif wrote

Just FYI: silicon = the semiconductor material; silicone (with an "e") is the rubber your spatula is made from.

Also, with no disrespect to your friend’s work, we already have many materials we could use over Silicon, but silicon is cheap, abundant, and relatively easy to refine. Many of the other materials aren’t, so even if they’re faster or more power efficient they’ll never see the light of day because they’re economically infeasible.

61

pinkfootthegoose t1_itigayl wrote

They will switch to silicon carbide transistors before this, because the industry and tools are already in place for silicon.

1

nezeta t1_it3czrp wrote

Moore's law has been claimed to be "ending" for a decade. It's actually becoming the same kind of myth as "fossil fuels will run out".

In the latest process (3nm) the thinnest pitch is 22nm, so we'll still see several more generations from TSMC, Intel and Samsung.

31

Jaohni t1_it3jhvp wrote

Correct me if I'm wrong, but hasn't "nm" as a naming scheme been kind of misleading since 157nm immersion lithography failed?

Like, before then, nanometers were a measure of distance between transistors, and a smaller distance meant a faster calculation that also used less energy and could be made more cheaply, because the transistors would require less silicon for the same calculation.

But as FinFET started coming onto the scene, you could essentially raise the transistor into a third dimension, which adjusted the performance profile of that transistor and allowed you to gain an "effective nanometer reduction." So things like TSMC 16nm and onward weren't really "nanometers" anymore, but essentially an abstract number roughly indicating performance relative to previous generations. That's also why Intel 10, for instance, is roughly as dense, in terms of literal nanometers, as TSMC 6/7, but doesn't necessarily perform the same in all instances.

IMO Moore's Law as originally described (a doubling of transistors, and with it performance, roughly every two years) is dead, but the "Layman's Moore's Law", that "computation will advance geometrically and we'll be able to acquire higher levels of performance for the same money", is still well alive.

There are plenty of interesting and technically challenging ways to improve performance, such as 3D stacking (IBM, AMD), disaggregation (AMD, Apple), heterogeneous compute (Arm, Intel) and so on, without even going into the upcoming AI accelerators that will take advantage of improved multi-threading / parallel compute to shore up the lack of raw single-threaded improvements we've seen of late. So as a tech enthusiast I'm absolutely hyped for upcoming products, but I don't think it's quite right to say that Moore's Law is still alive as it was originally used.

19

danielv123 t1_it4qmd1 wrote

Yes, which is why the nodes now have other names but are colloquially grouped by nm. TSMC, for example, has N4, which is just a variant of their 5nm process. They also have different suffixes which run slightly different settings on the same machines to optimize for clocks, power, etc.

5

JehovasFinesse t1_it6njlu wrote

I’ve learned more in this comment than I probably would in a lecture

2

Immortal_Tuttle t1_it3igg1 wrote

We can't really go much lower than that; quantum tunnelling is a real issue at that size. However, we can still go up with the layer count, but then we have issues with cooling. So what is relatively easy with memory is a big problem with compute.

17

Dyz_blade t1_it3h56c wrote

There are also quantum computing processors, which just took a step nearer to mass production with a high rate of efficiency.

5

old_adage t1_it3g648 wrote

Moore's law has ended for the definition of "computer power per dollar": https://en.wikipedia.org/wiki/FLOPS#Cost_of_computing

I work at a large service provider. Before 2016ish, compute costs were largely irrelevant since each hardware generation would make any investment in software or hardware optimization in the previous generation moot - resource consumption and compute costs were both increasing exponentially.

This is no longer the case: now we have major investments in software and workload-specific hardware to keep costs linear while handling exponential resource consumption.
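
For anyone who wants to sanity-check that against the linked FLOPS-per-dollar numbers, here is the doubling-time arithmetic as a small sketch; the sample points below are placeholders to show the formula, not values taken from the link:

```python
import math

def doubling_time_years(year_a: float, perf_a: float, year_b: float, perf_b: float) -> float:
    """Years per doubling of performance-per-dollar, assuming smooth exponential growth."""
    return (year_b - year_a) * math.log(2) / math.log(perf_b / perf_a)

# Placeholder samples (performance per dollar in arbitrary units), purely illustrative:
print(doubling_time_years(2000, 1.0, 2010, 100.0))   # 100x over 10 years -> ~1.5 years/doubling
print(doubling_time_years(2016, 100.0, 2022, 250.0)) # 2.5x over 6 years  -> ~4.5 years/doubling
```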

3

djoncho t1_it6yrtx wrote

Sorry to be the bearer of bad news, but fossil fuels are finite and we aren't producing more, so they will run out if we keep using them. That's a weird use of the word "myth", at the very least.

On the other hand, Moore's law may or may not keep holding, depending on technological advances.

2

Zustrom t1_it78zsu wrote

I think they might have meant the alarmist sense of fossil fuels running out, as if there's only a few years' worth left, kinda thing.

1

refusered t1_it5kfro wrote

Moore’s law lol

Moore predicted exactly what’s happening so technically it’s still Moore’s law yeah

1

Mastasmoker t1_it7ou6h wrote

If you believe fossil fuels will never run out, I've got some news for you: there's nothing replacing them. It took the Earth millions of years to create what we have.

1

blackraven36 t1_it3ht7v wrote

Moore’s law has been killed by bad headlines every week for the last decade.

Computing power isn't entirely about how many transistors you can put into a chip. Anything from memory bandwidth to multi-threading has a substantial effect on performance.

Focusing on just the size of the transistor is like saying "I doubled the displacement of my engine." You'll see performance increase, but that's not the be-all and end-all of improving power.

20

howlinghobo t1_it3t4xg wrote

That's exactly what Moore's Law is though

>Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years.

16

hoehater t1_it3kyfj wrote

Does this mean lazy developers will finally need to learn how to write streamlined efficient code?

15

CPHfuturesstudies OP t1_it31flz wrote

Submission Statement:

Most of us are familiar with Moore’s Law. Gordon Moore, who first observed that computer capacity grew exponentially, was talking about the density of transistors in integrated circuits, but others (among them the famous futurist Ray Kurzweil) have later reformulated the ‘law’ to refer to the growth of processing power compared to cost – how many calculations per second a computer chip can perform for every USD 100 it costs.

This makes the law robust to changes in technology, extending it backwards to before the invention of the integrated circuit, and possibly forwards into the future when new technologies may replace the current silicon-based chip.

This article was first published in FARSIGHT - Futures Reviewed. A quarterly publication from Copenhagen Institute for Futures Studies.

7

cazzipropri t1_it4f84n wrote

I'm so tired of people not understanding Moore's Law, mostly intentionally.

I was hoping this article broke the trend. It didn't.

4

Reddituser45005 t1_it55s58 wrote

“Barring the most graphics-heavy computer games, a half-decade old computer today can easily handle most of the things it is required to do, whether at home or at work”

That is the essence of what is about to change. The desktop computer paradigm has been stagnant for a decade or more but that is about to change and it will require significantly more processing power. The current menu/icon driven approach we see in most desktop Office programs will be replaced by AI driven assistants that are generations ahead of the Siri model in use now. I predict the end of the “Office PC” is on the way sooner rather than later. The idea of embedded intelligence has been around for a while. I think we are seeing it emerge now

4

Cr4zko t1_it5e6yl wrote

I dunno. I had a computer with a 2006 processor (Intel Core 2 Duo) and I used it until February when it blew up.

6

Reddituser45005 t1_it5ff4v wrote

That's exactly the point. Computers have gotten faster, but the way we use them hasn't changed. We are just running prettier and more graphically detailed versions of the software we used in 2006. That is about to change.

2

Cr4zko t1_it5gd7i wrote

But no one will want those AI-powered computers, simply because they would be impractical and expensive. You'd have to retrain people to teach them how to even use them (remember MS Office 2007? I do!). The desktop hasn't changed since 1995 for a reason. I might be wrong, though... it still doesn't sound very accessible.

5

Reddituser45005 t1_it80rfg wrote

If it leads to greater efficiency, corporations will not hesitate to pay. As an example, in my job as a project engineer I am preparing for a Site Acceptance Test on a multi-million-dollar upgrade to a pharmaceutical packaging and inspection line. That involves reams of documentation: User Requirements Specifications, Functional Design Specifications, Field Acceptance Testing, Site Acceptance Testing, process SOPs, technical and user manuals, and validation requirements, all linked together by a traceability matrix that shows how it all ties together. This is thousands of pages of documents that need to be summarized for different reports and departments at various stages of the process. There are templates and standards for each report, and they all require pulling information from different sources and referencing it correctly. Having an AI that could automatically track all that information and populate documents according to the templates and standards would be huge. Most corporate document work involves interactions with other docs, drawings, and spreadsheets. Automating even a small percentage of that is worth millions in labor savings (and will involve layoffs across multiple industries).

1

PreservedLemonhead t1_it33cpl wrote

> Ultimately, we must ask how much processing power a typical user really needs. Barring the most graphics-heavy computer games...

Well, that is part of the problem, isn't it? GPUs are still far from delivering photorealism, and as the latest generation of NVIDIA cards shows, the only solution so far is more power, more heat, and more cooling to support it.

And if anybody truly cracks quantum computing, some of the limits we've grown accustomed to with traditional CPUs would suddenly disappear, and open up whole new frontiers in AI (supposedly).

I dunno...as long as things get hotter, heavier, and consume more power it would be nice to figure out how to cram more into the same physical and energy constraints.

3

Glodraph t1_it3bgdp wrote

Or they could just... change materials? IBM already built a graphene transistor. I think they could find way better materials to use, which could bring higher performance with better efficiency.

2

[deleted] t1_it43d7t wrote

The issue is not computer capacity; it's the shit pile of stinky software everyone enjoys.

3

Fallacy_Spotted t1_it4vbc3 wrote

We have had overpowered processing for decades. That excess has led to the hot trash, efficiency-wise, that software has become. If optimized, it would be hundreds of times better. There are a few exceptions to this, like the iterative processes at the core of the internet, but other than that it is in a sad state. Corporations care little for revisiting "solved" problems.

3

ramriot t1_it5repy wrote

Strictly speaking, Moore's law ended a while back, unless you allow fiddling with the factors. It was initially a doubling of transistor count every year, which held until 1975; then it was revised to every 2 years, which held until around 2010, when it again started lagging.

2

drmcsinister t1_it5wu93 wrote

I worked at AMD around the 2005 timeframe making flash memory. We were at 65nm critical dimensions and my boss was wondering how we’d ever get smaller. He knew we would, but his point was that it was like a cross between a magic show and the sun rising: it’s going to happen, but nobody knows how. That was nearly 20 years ago, and we are still adhering to Moore’s Law.

2

samcrut t1_it3rzhv wrote

Moore's Law has been cancelled more times than Kanye and Trump combined.

1

PolyDudeNYC t1_it412pj wrote

Serious question, but is there a reason why chips couldn't go three-dimensional?

1

94746382926 t1_it4twon wrote

They are trying, and memory chips are already 3D, but as someone else mentioned, heat is a big issue. I believe there is work being done on microchannels that run through the silicon itself to cool 3D processors, but I'm not sure how far along that is.

2

Seeker_00860 t1_it4btvl wrote

With multi-core processors isn't performance being enhanced? Instead of shrinking size, the industry just has to move towards adding more cores and expanding the real estate for performance. Plus AI is the new thing now.

1

Pert02 t1_it4kc8d wrote

Increasing parallelism is not a straightforward thing to implement; it really varies depending on the application and how the software is implemented.

Also, increasing processor size can result in heating problems, among other issues that need to be considered.

1

carlylewithay t1_it5auhw wrote

Moore’s law is the inverse of the flying car. The year we get a flying car Moore’s law will end.

1

bluefrostyAP t1_it6d3hz wrote

I’ve seen this headline about 500 times over the years

1

sir_duckingtale t1_it6xd4l wrote

There are those guys doing some form of tech demos on really old hardware..

Can‘t remember the name

It‘s not always about the hardware, but what you do with it

1

MrDinkh125 t1_it6xr3l wrote

What if the world uses quantum computers before 2030?

1

Mormegil1971 t1_it72gt8 wrote

Enter the quantum computers. Which we still will be using to watch porn.

1

Admirable_Ad1947 t1_it79lp9 wrote

The most advanced quantum computer has like 40 qubits and needs to be kept near absolute zero to work; we're a ways off from that.

1

garchoo t1_it74g96 wrote

IMO there are still other optimizations possible that will advance us from a consumer perspective. For me, the biggest jump in computing lately was changing from an HDD to an SSD. My phone doesn't feel like it needs more processing, just more battery.

1

rdkilla t1_it7mwe9 wrote

We lost one aspect of Moore's law: we stopped making bigger silicon wafers. It means we lost one axis of cost reduction that used to be easy to tug on, and it means we need more and bigger factories to keep up with demand, with some associated cost. We are at the dawn of 3D packaging. We are at the dawn of heterogeneous material composition (think processors made of chiplets in different materials: high-frequency chiplets, high-power chiplets, high-efficiency chiplets, photonics, etc.). We will achieve many orders of magnitude improvements in real computing power for a long time to come, but it won't always be cheap.

1

more_beans_mrtaggart t1_it3a1tm wrote

CISC chips have reached the end of the curve. Other chips, such as RISC, not so much.

−5