
routerg0d t1_j29ebzz wrote

The problem is that at around 10 nm, chips quit getting more efficient power-wise. Below that, the amount of power needed to drive the chip began rising quickly. A 3 nm chip will need more power than a 5 nm chip. Apple saw this and moved to RISC-based chips because RISC requires less power on the same-sized chip process.

GPUs are really hitting a wall because of this as well.

2

vorpal_potato t1_j2erw46 wrote

TSMC claims that their 3 nm process uses 20-30% less power than a comparable chip made on their 5 nm process, and they have a version bump in progress which they say will use only half as much power as a 5 nm chip at equal performance. The sub-10 nm processes have pretty consistently delivered better power efficiency in practice, which is the opposite of what you said. (Maybe you meant to make some more nuanced point about e.g. dynamic current vs. leakage current?)
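
To make the dynamic-vs-leakage distinction concrete, here's a back-of-the-envelope sketch using the textbook first-order CMOS power model (dynamic power roughly αCV²f, leakage roughly V·I_leak). All numbers below are invented for illustration, not TSMC figures; the point is only that a shrink can cut dynamic power enough that total power still falls even when leakage current rises.

```python
# Toy first-order CMOS power model (textbook approximation; all numbers invented,
# not TSMC data). Dynamic power scales with C * V^2 * f; leakage with V * I_leak.

def chip_power(c_eff_farads, v_dd, freq_hz, i_leak_amps):
    """Return (dynamic_w, leakage_w, total_w) for one hypothetical chip."""
    dynamic_w = c_eff_farads * v_dd ** 2 * freq_hz   # alpha*C*V^2*f, activity factor folded into c_eff
    leakage_w = v_dd * i_leak_amps                   # static power from leakage current
    return dynamic_w, leakage_w, dynamic_w + leakage_w

# Hypothetical "old node" vs "new node": the shrink cuts switched capacitance and
# supply voltage, but leakage current per chip goes up.
nodes = {
    "old node": chip_power(c_eff_farads=30e-9, v_dd=0.90, freq_hz=3.0e9, i_leak_amps=20.0),
    "new node": chip_power(c_eff_farads=21e-9, v_dd=0.80, freq_hz=3.0e9, i_leak_amps=40.0),
}

for name, (dyn, leak, total) in nodes.items():
    print(f"{name}: dynamic={dyn:.0f} W  leakage={leak:.0f} W  total={total:.0f} W")
```

With these made-up inputs, total power drops from roughly 91 W to roughly 72 W even though leakage doubles, which is consistent with newer nodes still coming out ahead on overall efficiency.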

Also, that's not even remotely why Apple switched from Intel chips to their own. They switched because Apple had managed to beat Intel at microarchitecture, the way the chip works inside. The instruction set doesn't really make much difference except in the instruction decoding portion, which isn't that big a part of the chip these days. The RISC/CISC distinction used to be real, but now it's as outdated as falconry and fax machines.

2