
ItsAConspiracy t1_it407ad wrote

Yeah, GPUs are a bright spot. But that's partly because they're massively parallel and can just keep getting bigger and more power-hungry.
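To make that concrete, here's a minimal sketch of my own (illustrative only, nothing vendor-specific): one trivial CUDA kernel runs across a million elements at once, so throughput scales by launching more threads on a bigger, hungrier chip rather than by making any single thread faster.

```cuda
// Sketch: GPU scaling comes from data parallelism -- the same tiny function
// runs on every element at once, so more cores/power means more throughput.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scaleAdd(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // each thread owns one element
    if (i < n) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                      // ~1M elements
    float *a, *b, *out;
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;   // ~4,096 blocks in flight at once
    scaleAdd<<<blocks, threads>>>(a, b, out, n);
    cudaDeviceSynchronize();

    printf("out[0] = %f\n", out[0]);            // 3.0
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```

Compile with `nvcc` and the work spreads over thousands of blocks of 256 threads each; nothing about any individual thread needs to get faster for the chip as a whole to.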

Another bright spot is neural chips, which aren't so much about Moore's law as about getting better at specialized machine-learning architectures.
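And a hedged sketch of what "specialized" buys you (again just my illustration; real NPUs/TPUs hard-wire arrays of multiply-accumulate units rather than running CUDA): the hot loop of a neural net is low-precision multiply-accumulate, so a chip that does only that gets faster without any help from transistor shrinks.

```cuda
// Sketch: the core neural-accelerator operation -- an int8 dot product
// accumulated in int32, i.e. the multiply-accumulate an NPU does in hardware.
#include <cstdio>
#include <cstdint>
#include <cuda_runtime.h>

__global__ void int8DotProduct(const int8_t* w, const int8_t* x, int32_t* out, int n) {
    int32_t acc = 0;
    for (int i = threadIdx.x; i < n; i += blockDim.x)
        acc += (int32_t)w[i] * (int32_t)x[i];            // the MAC step
    __shared__ int32_t partial[256];                     // reduce partial sums across the block
    partial[threadIdx.x] = acc;
    __syncthreads();
    for (int s = blockDim.x / 2; s > 0; s >>= 1) {
        if (threadIdx.x < s) partial[threadIdx.x] += partial[threadIdx.x + s];
        __syncthreads();
    }
    if (threadIdx.x == 0) *out = partial[0];
}

int main() {
    const int n = 1024;
    int8_t *w, *x; int32_t *out;
    cudaMallocManaged(&w, n); cudaMallocManaged(&x, n);
    cudaMallocManaged(&out, sizeof(int32_t));
    for (int i = 0; i < n; ++i) { w[i] = 1; x[i] = 2; }
    int8DotProduct<<<1, 256>>>(w, x, out, n);
    cudaDeviceSynchronize();
    printf("dot = %d\n", *out);                          // 2048
    cudaFree(w); cudaFree(x); cudaFree(out);
    return 0;
}
```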

9

metekillot t1_it5nwvr wrote

Computer technology is only about a century old. I'm sure that 100 years after they cast the first metal sword, people thought they were nearing the limits of metallurgy.

2

ItsAConspiracy t1_it78x1n wrote

We're definitely not nearing the limits of computation in general, just of silicon chips specifically. We went from mechanical relays to vacuum tubes to silicon chips; now we need something else for the next big leap.

1

Cabana_bananza t1_itajcxw wrote

I think we will see the forerunners of computronium over the next twenty years. Big companies like Element Six (De Beers) are working with others on creating better carbon semiconductors and researching their use in computation.

The precision with which they can manipulate diamonds as they grow them has improved by leaps and bounds over the past 40 years, from the large X-ray diamond plates for satellites in the '80s to today's ability to control and place imperfections within the diamond structure.

It's starting to resemble what I pictured when Kurzweil talked about computronium.

1