
old_adage t1_it3g648 wrote

Moore's law has ended under the definition of "compute power per dollar":

I work at a large service provider. Before 2016 or so, compute costs were largely irrelevant: each new hardware generation made any investment in optimizing software or hardware for the previous generation moot, since resource consumption and compute power per dollar were both increasing exponentially.

This is no longer the case: now we have major investments in software and workload-specific hardware to keep costs linear while handling exponential resource consumption.
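The arithmetic behind this can be sketched with a toy model (the growth rates below are made up for illustration, not the actual figures from my employer): when price/performance improves as fast as demand grows, annual spend stays flat; when price/performance stalls, spend tracks demand.

```python
# Toy model of the two regimes. Numbers are illustrative only;
# the point is the shape of the cost curve, not the values.

def annual_cost(demand_growth, price_improvement, years):
    """Yearly cost = (compute demanded) * (price per unit of compute)."""
    costs = []
    demand, unit_price = 1.0, 1.0
    for _ in range(years):
        costs.append(demand * unit_price)
        demand *= demand_growth          # exponential resource consumption
        unit_price /= price_improvement  # hardware cheapens each generation
    return costs

# Pre-2016ish: price/performance improved as fast as demand grew,
# so total spend stayed roughly flat and optimization didn't pay off.
moores_law_era = annual_cost(demand_growth=1.5, price_improvement=1.5, years=8)

# Post-2016ish: price/performance is nearly flat, so without software and
# workload-specific hardware optimization, spend grows with demand.
stagnant_era = annual_cost(demand_growth=1.5, price_improvement=1.05, years=8)

print(moores_law_era[-1])  # stays ~1.0
print(stagnant_era[-1])    # grows roughly (1.5/1.05)^7, an order of magnitude
```

With matched growth rates the two exponentials cancel exactly; once the denominator stalls, cost itself becomes the exponential, which is why optimization work that used to be wasted effort now pays for itself.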