UniversalMomentum t1_j9l3r1e wrote

Quantum sounds useful for some stuff, but realistically silicon can do so much and will still improve. The limitations right now are clearly in programming, not really the chips.

People have this way of thinking MORE is always better/useful, but it's not. The easiest thing that gets the job done is the most useful. The simplest design that does the job is better than the complex design that does more than you need. Getting that through to most people is hard, getting it through to a bunch of future tech fans is even harder.

The path of least resistance is the truly proven strategy, and that also means the path of least complexity. It's like how simplifying a math problem is a better use of logic than leaving it as complex as possible, except applied to engineering and cost of operation.

−1

DeepState_Secretary t1_j9m7cnh wrote

That’s not the point of quantum computers.

It's not about making faster classical computers. It's that quantum computers could potentially solve problems and do things that classical computers cannot practically do, regardless of how good they get.
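A rough back-of-the-envelope illustration of the kind of gap that comment is pointing at, using Grover's unstructured search as the textbook example (the snippet below is just query-count arithmetic, not a quantum simulation):

```python
# Grover's algorithm finds a marked item in an unstructured list of N
# entries in about (pi/4)*sqrt(N) quantum queries, versus roughly N/2
# classical queries on average. Illustrative arithmetic only.
import math

def classical_queries(n: int) -> int:
    """Expected queries for a classical brute-force search."""
    return n // 2

def grover_queries(n: int) -> int:
    """Approximate quantum queries for Grover's search: ~(pi/4)*sqrt(N)."""
    return math.ceil((math.pi / 4) * math.sqrt(n))

for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"N={n:>13,}  classical≈{classical_queries(n):>11,}  grover≈{grover_queries(n):,}")
```

The gap widens with N, which is why the speedup matters for problems too large to brute-force classically, and not for making everyday computing faster.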

4

Fallacy_Spotted t1_j9mfab1 wrote

To be honest, better hardware has enabled worse software to an astounding degree. So much of it is a hot mess compared to the truly critical stuff like BIOS, network switch, and compiler code.

3

SnapcasterWizard t1_j9nh1hr wrote

Hey, what do you mean I don't need to ship an entire browser-stack just so my chat application can render shit with javascript?!!?

2

Literature-South t1_j9lfxq2 wrote

We're at the point where chips are so small that we're running into errors caused by quantum tunneling. Silicon is really at its limit, and Moore's law has slowed considerably because of it.

1

Deadboy00 t1_j9lh8vl wrote

True. But engineers have come up with some clever ways to get around it and still offer performance gains.

Quantum computing is for problems that don’t have a clear solution. Classical computing isn’t going anywhere even as we look far into the future.

3

MINIMAN10001 t1_j9mwubg wrote

I mean, the whole point of "Moore's law is dead" is that Moore's law is, in fact, dead. It wasn't the end of scaling, but the end of the self-fulfilling prophecy: the rate of scaling the industry targeted for decades has run its course.

It's not the end of transistor scaling, just the end of Moore's law. The golden age has come to a close, and odds are the respective companies have already spent years working on what they consider the path forward.
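The original cadence is easy to put in numbers. A minimal sketch, assuming the classic two-year doubling period (the period is the assumption here; Moore's own formulations varied between one and two years):

```python
# Moore's-law cadence: transistor count doubling roughly every two years.
# Purely illustrative arithmetic, not a density model.
def transistors(start: float, years: float, doubling_period: float = 2.0) -> float:
    """Projected transistor count after `years`, given a doubling period."""
    return start * 2 ** (years / doubling_period)

# Starting from 1 million transistors, 20 years of two-year doublings
# gives 2**10 = 1024x, i.e. about a billion transistors.
print(transistors(1e6, 20))
```

Even a modest stretch of the doubling period compounds quickly, which is why a "slowing" Moore's law looks so different from the historical curve after a decade or two.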

AMD is looking at stacking compute with memory; Nvidia is looking into AI-based image upscaling.

2

mannaman15 t1_j9n912q wrote

Happy cake day! Also our user names are closer than any other I’ve seen on here. 🍻

2