Denamic t1_jachsju wrote

They kept the COVID-shortage prices after the COVID shortages ended. What did they expect?


qubedView t1_jadpzql wrote

Meanwhile, post-Ethereum-bomb, the market has been saturated with rock-bottom-priced second-hand high-end cards.


A-Delonix-Regia t1_jacexdx wrote

Makes sense, Nvidia has been ridiculous with their 40 series.


Lamacorn t1_jad3grs wrote

And market is flooded with resales right now.


RandomXDXDXDXXX t1_jacpkbc wrote

Revert prices to non-inflated levels and gamers will start buying GPUs again. I'm still on the 1080 Ti, holding out until the 4060/4060 Ti drops to anything close to a price that makes sense just for games. Even the rumored $499 price point for the 4060 Ti makes me hesitant to pull the trigger, so I'll continue waiting it out.


Wooden_Sherbert6884 t1_jactaf7 wrote

I also feel like Nvidia is sort of pressured into putting higher prices on their models since these new cards are fucking beasts. If you have a 4090 there is no reason to think you won't be able to run new releases at 1440p ultra at 60+ fps, especially considering DLSS 3 exists. Even the "mid-range" GPUs like the 4060 will just destroy everything. That wasn't the case during the 600, 700, and 900 series: if you wanted to play at max settings you had to get an 80-class card, and with anything else there were compromises to be made. Now the cards are so powerful you don't even need the best and hottest stuff on the market, so in the name of infinite growth they just keep pumping up the price.


shenrougu t1_jad1ukg wrote

Eh, the exact same thing could have been said about the 10 series. Seven years later and the 1080 still holds up really well.

No DisplayPort 2.1, plus being stingy with VRAM, doesn't make the current gen good long term tbh. I get that you're saying the long-term value proposition is good, but I don't think so.


Odysseyan t1_jae7bpx wrote

You can also buy an electric flyable drone with a pilot seat and 30 minutes of range. Faster than a car, can go anywhere, so why not get one? Oh yeah, the fucking price is about $200,000, and a car gets the job done at a fraction of the cost and is better regulated.

What I'm saying is: power is cool and all, but if it's not affordable and costs more than the rest of the PC combined, then who the fuck is gonna buy it? Should I pay my rent, or get this GPU? No wonder no one is buying those cards.


monchota t1_jacieap wrote

They are going to have to do some serious price cuts.


theblitheringidiot t1_jackcfk wrote

Still too expensive. I'm good with my 1440p monitor and my 20-series card.


ppface t1_jadaf98 wrote

Imagine that. Money is getting tighter for people, yet they keep raising prices by leaps and bounds. Maybe they should fire their executives, since literally any jackass could see that their plan wasn't going to work.


SidewaysFancyPrance t1_jadgygk wrote

Unit shipments may have dropped, but based on pricing, I bet their revenues/profits aren't being hammered by 35%.
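A quick back-of-the-envelope sketch of that point, with purely made-up numbers (the 35% shipment drop is from the thread; the price increase is an assumption for illustration):

```python
# Toy numbers: units down 35%, average selling price up 40%.
# Revenue barely moves even though shipments cratered.
units_before, units_after = 100, 65      # -35% shipments
price_before, price_after = 500, 700     # +40% average price (assumed)

rev_before = units_before * price_before  # 50,000
rev_after = units_after * price_after     # 45,500
change = (rev_after - rev_before) / rev_before
print(f"revenue change: {change:.0%}")    # about -9%, not -35%
```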


ppface t1_jadjdiu wrote

I don't think it's sustainable. Okay, fewer units at a higher margin, same profits, right? But a few years down the road, fewer units means fewer customers, right?

Fewer customers for them also means fewer customers buying PC games.

Which means less incentive and funding to the publishers and developers of PC games.

Which means a reduction in the quality and/or quantity of PC games.

Which reduces the incentive to pay the bloated prices to stay in PC gaming.

Which means fewer units shipped.

I think a drastic reduction in GPU sales should be a cause for alarm for the entire PC gaming industry and beyond. Maybe it's just because of the glut of used cards. Maybe it's the prices. Probably both, and other reasons altogether.

I think that if the number of units shipped stays deflated, they're gonna have some major problems.


REPOST_STRANGLER_V2 t1_jaderj1 wrote

Profits are still up for both companies. It's all about pushing prices to the limit and then reducing them; it's harder to increase prices, easier to drop them.


ppface t1_jadh0mi wrote

For now. I'll guarantee they lost market share. How many young people can afford their stuff? How many people have decided to just go with consoles that are much more affordable? I wouldn't be shocked if they've all but lost the 20-somethings. Between mining inflation, then COVID shortages, then more mining inflation, the market has been unfriendly for close to 10 years.


REPOST_STRANGLER_V2 t1_jadj23v wrote

nVidia stock is up 65% over the last year; they seem to know what they're doing. The same thing was said when the 3000 series was massively over MSRP, and it didn't hurt nVidia. If anything, it actually helped them get even bigger profits.

You say the market has been unfriendly for 10 years, but in the last 5 years nVidia stock went up 300%. Back in 2012 nVidia stock was around $3; now it's sitting at $236.90. They're doing fine.


ppface t1_jadkcql wrote

I was referring to the graphics card market for consumers, not nVidia's share value on the stock market.

I mean, Tesla has a higher market cap than Ford, Honda, BMW, GM, Daimler, VW, and Toyota COMBINED. So sorry if I don't exactly trust what tech bros are throwing money at.


DevoidHT t1_jad01el wrote

Prices always go up but never down no matter the situation


mdk2004 t1_jadexlj wrote

You can always save money on wages to offset your rising expenses.


Daedelous2k t1_jad8oui wrote

Funny that people are waiting for the crypto-based prices to come down. Keep trying, fuckos, we aren't letting you anchor that new price.


DividedState t1_jadldrm wrote

I would buy one, but I don't accept the price. So, that is a self-made problem.


ValuableYesterday466 t1_jadsecw wrote

GPU mining went away and gamers can't afford to pay the prices that the manufacturers have decided to start charging.


OrdyNZ t1_jaf1qun wrote

Or just refuse to because it's a ripoff.


skep-tic t1_jadwbaf wrote

Personally I'm waiting for the whole RTX fad to end (or the next gimmick I guess) before I invest in another card. I just don't see the need to upgrade when all I get is better lighting effects that make my games run slow.


littleMAS t1_jadiq77 wrote

Next they will be for generative AI datacenters; we always find a use for these things.


JubalHarshaw23 t1_jae2dgw wrote

That was when the fake shortages were still in full swing and a top-end card cost more than two months' salary.


RuairiSpain t1_jac8up5 wrote

Yeah, but given the explosion in AI PR, that will change this year and next. Nvidia is about to have a windfall of GPU sales for ChatGPT-like training.


bamfalamfa t1_jac9pk6 wrote

nobody is going to be buying gpus for flimsy ai text training


A-Delonix-Regia t1_jacethn wrote

Nearly no one will do that. There are millions of gamers, and what? A few tens of thousands of people who are interested enough in AI content generation to buy a new GPU?


RuairiSpain t1_jacukww wrote

ChatGPT's underlying model has on the order of 175 billion parameters; to hold that in memory you'd need several 80GB nVidia cards, at about $30,000 each. As AI models grow they'll need more RAM, and cloud is the cheapest way for companies to time-share those costs.

It's not just training the models; querying them also needs those in-memory calculations. I'm not expecting gamers to buy these cards. But scale up the number of users querying OpenAI, Bing x ChatGPT, Google x Bard, and all the other AI competitors, and there will be big demand for large-RAM GPUs.
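The memory math here can be sketched quickly. The parameter count and per-card capacity below are assumptions for illustration (roughly GPT-3 scale and an A100-class 80GB card), not official figures:

```python
# Back-of-the-envelope VRAM estimate for serving a large language model.
# Counts weights only; activations and KV cache would add more.

def vram_needed_gb(n_params: float, bytes_per_param: int) -> float:
    """Memory to hold the weights alone, in GB."""
    return n_params * bytes_per_param / 1e9

N_PARAMS = 175e9   # assumed parameter count (GPT-3 scale)
CARD_GB = 80       # assumed per-card memory (A100-class)

for label, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    gb = vram_needed_gb(N_PARAMS, nbytes)
    cards = -(-gb // CARD_GB)  # ceiling division
    print(f"{label}: {gb:.0f} GB -> at least {cards:.0f} cards")
```

Even at 8-bit precision the weights alone overflow a single 80GB card, which is why serving happens on multi-GPU cloud nodes.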


nerd4code t1_jado5gs wrote

GPUs are, in general, way beyond overkill for NNs, which is what you're talking about. NNs can use the massive data-parallelism and linear-algebraic trickery offered by GPUs, but the data format you use tends to hit a sweet spot right around 8-bit floating point, while video cards tend to focus on 16-bit and up, usually with the ability to do 32-/64-bit FP and 32-/64-bit integers as well, and the units and busses for those will at the very least eat power. Newer NVidia cards do have tensor units attached so they can do 8-bit stuff without un- and re-packing, but that's a comparatively tiny afterthought in the card's design, and at least as far as I've seen that unit is usually shared between pairs of thread execution units.

What you'd really want is to focus on, say, 32-bit integer add/sub/dereference and 8-bit FP multiply-accumulates in their own, non-shared units/lanes, and any special acceleration you can do for convolution will help some as well. Which is why TPUs exist as a standalone thing.
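The 8-bit point can be made concrete with a toy symmetric quantization sketch (pure Python, not any specific library's scheme): dropping weights from 32-bit floats to 8-bit integers cuts memory and bus traffic by 4x, at the cost of a bounded rounding error.

```python
# Minimal symmetric int8 quantization: map floats into [-127, 127]
# with a single per-tensor scale factor, then reconstruct.

def quantize_int8(weights):
    """Return 8-bit integer codes plus the scale used."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.81, -0.52, 0.13, -1.27, 0.07]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q)        # integer codes, 1 byte each instead of 4
print(max_err)  # reconstruction error bounded by scale / 2
```

Real inference stacks use finer-grained scales (per channel or per block), but the storage and bandwidth savings are the same idea.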