wen_mars

wen_mars t1_iv8r7f8 wrote

> Guessing mostly,
>
> If you take a bunch of computers that is <1 yo and the best os & software you can find. The software choices often work fine, but are actually not so optimised. Sometimes they are brutal in their resource requirements.
>
> Then you take a bunch of computers >5 yo. And you install the best os & software you can find. The software choices apply many code optimisations that actually take substantial advantage of the full set of hardware features.
>
> It's another reason why old hardware is amazing and always worth keeping, repairing and maintaining and even, actively using privately, professionally or commercially.

This is not true. The actual reasons why old hardware still works just fine are that CPUs have not improved all that much in single-threaded performance over the last decade or so, and RAM does not meaningfully impact performance unless you have too little of it. The only big change has been the transition from HDDs to SSDs; loading times and boot times have improved a lot because of it.

CPUs now have more cores than before, but most software does not take advantage of them.
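To illustrate the point: a CPU-bound loop in a single process only ever occupies one core, no matter how many the machine has. Getting a speedup requires the software to be written to split the work, as in this sketch using Python's `multiprocessing` (the prime-counting task is just a hypothetical stand-in for any CPU-bound workload):

```python
# Illustrative only: serial vs multi-process execution of a CPU-bound task.
# The serial version uses one core; the parallel version can use all of them.
from multiprocessing import Pool
import os

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division (deliberately CPU-bound)."""
    lo, hi = bounds
    total = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            total += 1
    return total

if __name__ == "__main__":
    limit = 20_000
    serial = count_primes((0, limit))  # runs on a single core

    workers = os.cpu_count() or 1
    step = limit // workers
    chunks = [(i * step, limit if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with Pool(workers) as pool:
        parallel = sum(pool.map(count_primes, chunks))  # one chunk per core

    print(serial == parallel)  # same answer either way; only the wall time differs
```

The results are identical; only the wall-clock time changes, and only if the program was explicitly structured to distribute the work.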

5

wen_mars t1_iv4savp wrote

> The reference prices for RTX 3090 and RTX 4090 are $1400 and $1599, respectively.

Use realistic prices and the results look very different.

> Depending on the model, its TF32 training throughput is between 1.3x to 1.9x higher than RTX 3090.
>
> Similarly, RTX 4090's FP16 training throughput is between 1.3x to 1.8x higher than RTX 3090.
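A quick worked comparison shows why the price assumption dominates. The $1400 and $1599 reference prices come from the quoted post; the 1.5x throughput ratio is an assumed value inside the quoted 1.3x-1.8x range, and the $800 used-market price for the RTX 3090 is a hypothetical placeholder, not a measured figure:

```python
# Illustrative perf-per-dollar comparison. Reference prices ($1400/$1599)
# are from the quoted benchmark post; the 1.5x ratio and the $800
# used-market price are assumptions for the sake of the example.

def perf_per_dollar(throughput, price):
    """Relative training throughput per dollar spent."""
    return throughput / price

ratio_4090 = 1.5  # assumed, within the quoted 1.3x-1.8x FP16 range

# At reference prices, the 4090 comes out ahead on perf/$:
ref = perf_per_dollar(ratio_4090, 1599) / perf_per_dollar(1.0, 1400)

# At an assumed $800 used price for the 3090, the comparison flips:
used = perf_per_dollar(ratio_4090, 1599) / perf_per_dollar(1.0, 800)

print(round(ref, 2), round(used, 2))
```

Under these assumptions the 4090 delivers about 1.31x the throughput per dollar at reference prices, but only about 0.75x against a cheap used 3090.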

8

wen_mars t1_itz0zey wrote

Without breaking our current understanding of the laws of physics: they haven't figured out time travel, so they run a simulation of Earth to predict the future, find solutions to problems before they happen, and simulate the consequences of those solutions.

4

wen_mars t1_irs1e4o wrote

I think AI will get good enough within a decade or two, but it will take another few decades for the demographics to shift. People who are already in good relationships will likely prefer to stay in those relationships, and some people will prefer a human despite AI being better and more available.

1

wen_mars t1_irk6ae6 wrote

AI and a graphics tablet aren't mutually exclusive. You can sketch with the tablet and add as much detail as you want, and then let the AI do the rest.

You're putting a lot of words into my mouth but I'll address your last two paragraphs. AI's ability to follow directions has improved tremendously over the past several years. I think it will continue to improve and get close to AGI-level performance on a wide range of tasks this decade. For actual AGI my guess is next decade.

1

wen_mars t1_irgoqj3 wrote

I think there are a lot of people whose income does not increase fast enough to keep up with their rising cost of living, especially now with inflation so high and Powell having declared war on employment. From their perspective it can look as if the middle class is being erased. I don't believe that generalization is valid, but it will be interesting to see what happens to jobs when AGI makes humans obsolete.

3

wen_mars t1_ire9yvm wrote

We don't fully understand how biological neurons work, and mapping the physical layout of a brain doesn't tell us how it works. Other big limiting factors in AI performance are training data and the ability to evaluate task performance. We don't have a simulation environment that accurately replicates the life of a worm, and we don't have millions of years of accumulated training data simulating evolution.

5