Qumeric

Qumeric t1_jeg5t31 wrote

Three years ago you could have argued, with *exactly* the same arguments, that something like GPT-4 was impossible.

2

Qumeric t1_jefml1n wrote

I did not pick anything specifically; I just copied data from where I had seen it recently. How am I distorting facts if I simply provide data without ANY interpretation?

Okay, let's use 1950. Working hours per year in the U.S. fell from 2,000 to 1,750, a 12.5% reduction. Most developed countries did even better: France, for example (and it is not the best country in this respect), went from 2,200 to 1,500, a 32% reduction. Germany is one of the best; Germans work about 45% less than they did in 1950.
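
As a quick sanity check, here is the same arithmetic in a few lines of Python; it just redoes the divisions on the figures quoted above (Germany's raw numbers aren't quoted, so it is left out):

```python
# Redoing the arithmetic on the figures quoted above (annual working hours,
# 1950 vs. recent), just as a sanity check.
hours_1950_vs_now = {
    "U.S.":   (2000, 1750),
    "France": (2200, 1500),
}

for country, (then, now) in hours_1950_vs_now.items():
    reduction = 1 - now / then
    print(f"{country}: {reduction:.1%} fewer annual hours than in 1950")
# U.S.: 12.5%, France: ~31.8% (the ~32% above)
```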

I do not deny the productivity-pay gap; I dispute your claim that "we always end up getting more productive and working the same amount or more". That is simply not true.

Although yes, we could work much less than we do now; we have enough technology for 20-hour work weeks or even less.

0

Qumeric t1_jefcf5k wrote

26

Qumeric t1_jefc25g wrote

Reply to AI investment by Svitii

The usual reasons. Those companies already have very high market capitalizations. They are leading now but could lose the lead later.

I am not saying it is a bad idea, but it is not necessarily an amazing one.

1

Qumeric t1_jees0js wrote

This is not true.

According to Our World in Data, the average American worked 62 hours per week in 1870. By the year 2000, this had declined to 40.25 hours per week, a decrease of over 35%. As of July 2019, the average American employee on US private nonfarm payrolls worked 34.4 hours per week, according to the U.S. Bureau of Labor Statistics.

0

Qumeric t1_je0gp4p wrote

No, 1% per year is not linear growth. Growth of X% per unit of time is more or less the definition of exponential growth.

Ask ChatGPT :)

I think what you described is formally also exponential growth, for somewhat complicated mathematical reasons, but only coincidentally.

Informally, you described the exponential growth of the rate of growth.
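
To make the difference concrete, here is a minimal sketch; the 1% rate and 100-year horizon are just illustrative numbers:

```python
# Minimal sketch: constant 1% growth per year compounds into an exponential curve,
# while linear growth adds a fixed amount each year. Numbers are illustrative only.
years = 100
start = 100.0

exponential = start * 1.01 ** years        # grows by 1% of the *current* value each year
linear = start + 0.01 * start * years      # grows by 1% of the *starting* value each year

print(f"exponential after {years} years: {exponential:.1f}")  # ~270.5
print(f"linear after {years} years:      {linear:.1f}")       # 200.0
```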

8

Qumeric t1_je071bs wrote

nitpick: people sometimes misunderstand exponential growth in the following way: they think exponential means extremely fast. That is not necessarily the case; for example, computer performance has been growing exponentially for almost 100 years and is arguably still growing exponentially.
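
Rough illustration of how "exponential" and "fast" come apart (the growth rates below are made up for the example):

```python
import math

# Exponential growth can be slow: the doubling time for a constant growth rate r
# per period is log(2) / log(1 + r). The rates below are illustrative.
for rate in (0.01, 0.10, 0.40):
    doubling = math.log(2) / math.log(1 + rate)
    print(f"{rate:.0%} per period doubles in about {doubling:.0f} periods")
# 1% -> ~70 periods, 10% -> ~7, 40% -> ~2: all exponential,
# but they feel very different in speed.
```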

answer in spirit: GPT-4 and Codex are making many people who work on technology much more productive.

29

Qumeric t1_j0r1n3w wrote

Not so far. Reading PDFs probably doesn't work great right now, but it works well enough for many cases and will definitely be improved.

I think the main problem right now is that an LLM's memory (its context window) is short, so to actually learn a full textbook, it has to be fine-tuned on it. That is inconvenient and expensive, but I am pretty sure it can be made much better.
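
Rough back-of-envelope sketch of what I mean; the book length, tokens-per-word ratio, and context size below are assumptions, not measurements:

```python
# Back-of-envelope sketch of why a full textbook does not fit in the context window.
# All numbers here are rough assumptions for illustration, not measurements.
textbook_words = 150_000      # assumed length of a typical textbook
tokens_per_word = 1.3         # rough rule of thumb for English text
context_window = 8_000        # assumed context size of a 2023-era model

textbook_tokens = int(textbook_words * tokens_per_word)
print(f"textbook: ~{textbook_tokens:,} tokens")                     # ~195,000
print(f"fits in one context: {textbook_tokens <= context_window}")  # False
print(f"chunks needed: {-(-textbook_tokens // context_window)}")    # 25
```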

I would say we will see something like this in 3 years or less.

5

Qumeric t1_iwhbhef wrote

First, the article is pretty bad; it doesn't seem like high-quality journalism.
Second, there are different ways of calculating FLOPS. It depends on the kind of numbers (8-bit, 16-bit, etc.) and on the benchmark. Frontier (the top-1 supercomputer) has 7.5 exaflops on the HPL-MxP (mixed-precision) benchmark, and Google has a 9-exaflop cluster for AI tasks (probably at 16 bits?).

9