Comments

3G6A5W338E t1_iwbe3g2 wrote

PSU requirements, case length requirements, cooling requirements, wall power socket requirements, electricity costs and fire hazard status.

Never mind the price.

4090 is not worth the hassle.

127

booga_booga_partyguy t1_iwbj859 wrote

Ya know, this reminds me of the early-mid 90s when you had to upgrade your PC to play the latest game because your current one couldn't run it.

Replace "game" with "NVIDIA GPU" and you've more or less got the same thing all over again!

36

depressedbee t1_iwcdmeo wrote

>PSU requirements, case length requirements, cooling requirements, wall power socket requirements, electricity costs and fire hazard status.

Nvidia could've asked for collateral instead.

10

Shaunvfx t1_iwcfx6w wrote

Love mine, no issues. Does wonders for VR simulation and pegging everything else at 144hz at 1440p.

6

gymbeaux2 t1_iwd4zgo wrote

How dare you have your own experiences and opinions!

3

Shaunvfx t1_iwd8qqj wrote

Seriously, I mean I guess I shouldn’t be surprised at the downvotes. It doesn’t impact my 4090FE though haha.

1

MasterOfTheChickens t1_iwdasia wrote

Same experience here on a 4090 Strix. Works fine, does VR like a champ (my use-case) and fits my Obsidian 7000D well. My main complaint will be that it isn't an EVGA card.

4

skinnyzaz t1_iwe9a78 wrote

Yeah, same. I have two: the MSI Suprim Liquid and the Gigabyte OC Gaming. No issues with either, and I love the 4K FPS and VR experience. Couldn't imagine going with anything slower at this point. I even have the Gigabyte card inside a Meshroom S case with no issues.

0

Shaunvfx t1_iwebpp6 wrote

Crazy, I'm surprised it fits in the Meshroom.

0

RZ_1911 t1_iwboag4 wrote

Lame excuses again. They reduced the contact area compared to the 150W 8-pin connector while increasing the power transfer, and then they wonder why the contacts melted?

50

RocketTaco t1_iwcea74 wrote

Mini-Fit is so insanely overspecified for the requirements of a PCIe unit load (which is why PSU manufacturers can offer Y-cables that use the same physical connector for two unit loads without issue) that it could have worked. The problem is they went entirely the other way and ran the Micro-Fit connector as close to the edge as they possibly could, then sold people adapters that introduce design elements not considered when Molex designed the thing.

What they should have done is just shrink the PCIe 8p connector to Micro-Fit and take the opportunity to repurpose the wasted main pins from compatibility with the 6p, so that it would have an actual 4+4 power pins. You'd get a 30% reduction in size from the switch from 4.2mm to 3.0mm pitch and a 30% increase in power from using the fourth pin, for a total of 80% increase in power density, and still tons of headroom for unknowns. At that point, the area occupied by the connectors would be no issue at all. But they decided to go for that 270% instead, and here we are.
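A rough back-of-envelope sketch of that arithmetic (treating connector footprint as pin columns × pitch and using the nominal 150W 8-pin / 600W 12VHPWR ratings; exact Molex housing dimensions differ a bit, but the numbers land near the 80% and 270% figures above):

```python
# Back-of-envelope power-density comparison (assumed footprint = pin columns * pitch).
MINIFIT_PITCH_MM = 4.2   # Mini-Fit Jr. (classic PCIe 8-pin)
MICROFIT_PITCH_MM = 3.0  # Micro-Fit 3.0 (12VHPWR family)

def density(watts, columns, pitch_mm):
    """Watts per mm of connector width (a crude proxy for power density)."""
    return watts / (columns * pitch_mm)

pcie_8p   = density(150, 4, MINIFIT_PITCH_MM)   # 3 power pins, 150 W spec
micro_4p4 = density(200, 4, MICROFIT_PITCH_MM)  # hypothetical 4+4 pin version: 150 W * 4/3
hpwr_12v  = density(600, 6, MICROFIT_PITCH_MM)  # 12VHPWR: 6 power pins, 600 W

print(f"hypothetical Micro-Fit 4+4: +{(micro_4p4 / pcie_8p - 1) * 100:.0f}% density")  # ~ +87%
print(f"actual 12VHPWR:            +{(hpwr_12v / pcie_8p - 1) * 100:.0f}% density")    # ~ +273%
```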

9

RZ_1911 t1_iwcfii5 wrote

It's not THAT overspecified. The stable limit is somewhere around 75W per pin (roughly 210W for an 8-pin). Even the old 8-pin melted at 100W per pin (300W per connector). Now imagine: they shrank the contact area to half that of the old 8-pin connector and said 100W per pin and 600W per connector, guaranteed 🤣

2

RocketTaco t1_iwcid9p wrote

Mini-Fit does not melt at 100W unless significantly damaged. Molex's own spec for eight circuits onto a PCB is 10A per circuit at 16AWG (120W) for 30C temperature rise over ambient. You could run 360W on 3+3 pins before you even brush up against the safety margins which, again, are given with a third more heat since they assume all eight circuits are carrying load. Conversely, I've seen cards with two 8p connections burn one up at under 200W loading, because the receptacle was bent and didn't make proper contact.
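For reference, a quick sketch of the numbers being cited here (taking the 10A-per-circuit figure at face value; the real derating depends on wire gauge, temperature rise, and how many circuits are loaded):

```python
# Headroom implied by a 10 A per-circuit rating at 12 V.
RATED_AMPS_PER_CIRCUIT = 10.0   # Molex figure cited above (16AWG, 30C rise, all circuits loaded)
SUPPLY_VOLTS = 12.0

def connector_watts(power_pins, amps=RATED_AMPS_PER_CIRCUIT, volts=SUPPLY_VOLTS):
    """Continuous power for a connector with `power_pins` 12 V circuits at the rated current."""
    return power_pins * amps * volts

print(connector_watts(1))  # 120 W per circuit
print(connector_watts(3))  # 360 W on the 3 power pins of a PCIe 8-pin, vs. its 150 W spec
```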

3

t0s1s t1_iwdbbik wrote

I'm reminded of my year or so of running KnC Miner Neptunes back around 2015. Similarly designed to run too close to the limits of a 4-pin peripheral plug and socket, they caused several fires, or destroyed the miner or the PSU through melted connectors.

Even running relatively high end Corsair AX1200 supplies didn’t guarantee me a stable run, and at least one set of cables was destroyed that year.

I don’t ever want to run it like that again - the 4090 likely won’t be for me, even if it fits my case.

1

HomeIsElsweyr t1_iwd12cq wrote

I saw one test where they put 1,500 watts through them; they don't melt.

2

R011_5af3_yeah t1_iwbwie4 wrote

3080 and 3090 GPUs went up in price after the 4090 launch. They had been trending down for months. That's how bad it is.

32

diacewrb t1_iwcbv6a wrote

Nvidia's long con has finally been revealed.

14

chrisdh79 t1_iwc70xs wrote

From Tom's Hardware: Gabriele Gorla, Director of Engineering at Nvidia, told Igor's Lab that Nvidia buys its 4-to-1 12VHPWR (four eight-pins to one 12-pin) power adapters from two companies: Astron and NTK. While both adapters are designed to the specifications defined by the PCI-SIG standards body, they are still quite a bit different 'inside' as they use slightly different contacts. Astron apparently uses double-slot spring contacts, whereas NTK sticks to a long single-slot spring contact that has lower resistance and is easier to detach.

According to Igor's Lab, Zotac and Gigabyte have said the adapter from NTK is less prone to failure even after multiple mating cycles. Astron argues, according to Nvidia, that its adapter performs in accordance with specifications (i.e., its resistance is below 1.5 mOhm). Meanwhile, as Igor's Lab points out, Astron's 12VHPWR adapter has its thick 14AWG wires rigidly soldered to 2mm^2 solder pads, which is a point of failure, especially for contacts on the edges of the adapter.
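To put that 1.5 mOhm figure in context, a hedged sketch of per-contact heating at the connector's rated load (this assumes an even current split across the six 12 V pins, which is exactly what a bent or poorly seated contact destroys; the 15 mOhm value is a hypothetical degraded contact, not a measured one):

```python
# I^2 * R heating in one contact, assuming 600 W split evenly across six 12 V pins.
TOTAL_WATTS = 600.0
VOLTS = 12.0
POWER_PINS = 6

amps_per_pin = (TOTAL_WATTS / VOLTS) / POWER_PINS   # 50 A total -> ~8.3 A per pin

def contact_heat_watts(resistance_ohms, amps=amps_per_pin):
    """Heat dissipated in a single contact: P = I^2 * R."""
    return amps ** 2 * resistance_ohms

print(f"{contact_heat_watts(0.0015):.2f} W at 1.5 mOhm (in spec)")   # ~0.10 W
print(f"{contact_heat_watts(0.015):.2f} W at 15 mOhm (degraded)")    # ~1.04 W in one tiny contact
```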

While Igor's Lab details how to distinguish between adapters from Astron and NTK, it is impossible to tell which of them will ship in a particular box with a GeForce RTX 4090 inside. Meanwhile, the report says that Nvidia will keep using 3-to-1 12VHPWR adapters from Astron and NTK supplied with GeForce RTX 4080, but will only use 2-to-1 12VHPWR adapters from NTK with GeForce RTX 4070 Ti boards.

18

7oom t1_iwbvu9v wrote

r/unexpectedmagiceye

2

AlltheCopics t1_iwcd5pl wrote

glad you are living life to the fullest

0

Leo_Heart t1_iwbjq8t wrote

What the hell am I even supposed to play with this thing? There are no games that even come close. Yes, I realize I can play 4K 144Hz or whatever, but 4K is a meme for PC. You're sitting too close to the screen to utilize it well.

−3

kyuubixchidori t1_iwbx71k wrote

I have my PC hooked up to a 55-inch 4K TV. If I'm not using it for work, it's treated as a kickass Xbox experience. And with monitors getting bigger and bigger, I definitely think 4K is relevant to PC games.

That being said, there's still no reason to spend over $800 or so on a GPU right now, and if you do, it's because you have extra disposable income, not because you need it.

8

qtx t1_iwg6nvf wrote

Sitting at arm's length from a 55-inch TV isn't good. You might not notice it right now, but you'll definitely feel it later on and wish you never did it.

−1

kyuubixchidori t1_iwg7d4k wrote

I'm probably 8 feet from the screen. I should have said that, but that's what I meant by "Xbox experience." I have a desk that I use with an ultrawide, but 90% of my gaming is done on the couch.

1

farmertrue t1_iwbyhp8 wrote

The only thing that comes to mind as far as gaming goes is PCVR. Games like No Man's Sky and MSFS struggle hard with even the 3090 Ti. The 4090 still can't max out graphics or resolution, but it is a huge difference. Plus, other PCVR titles are now able to run at a 120Hz refresh rate and/or higher resolutions.

With the next-gen PCVR headsets already here, and many more around the corner, some will struggle to be used with even a 3080 on the lowest settings.

For desktop gaming though, very very few games.

But hopefully people are not buying the 4090 just for gaming. Hopefully they are purchasing it for features such as the dual NVENC encoders, the increase in energy efficiency, niche DLSS 3.0 support, the huge increase in CUDA cores for scientific research and other compute work, video editing/recording, AV1, perhaps live streaming at high resolution, or a mix of all of the above with gaming.

4

Stock-Freedom t1_iwbuvx9 wrote

I disagree. I use a 4K LG OLED 48” at just a couple of feet as a desktop monitor (used primarily for racing sims). It is absolutely apparent that the increase in resolution and card power makes for a better gaming experience. 4K 120 FPS gaming requires a beefy rig.

And the situation is even more demanding with VR titles.

3

oMadRyan t1_iwcbclx wrote

Odyssey G9 users disagree with you :) 5120x1440 is roughly 4K in pixel count.

3

Bozzz1 t1_iwcdxpx wrote

2K and 4K look quite a bit different on an 80-inch TV.

3

diacewrb t1_iwcc47a wrote

> What the hell am I even supposed to play with this thing?

Fireplace simulator, now with extra heat and possibly flames as well for authenticity.

1

Akrymir t1_iwcz71l wrote

That's not even remotely true. You can be as far as 3.5 feet from a 27-inch monitor and still benefit from 4K; it's only at 8K that the extra resolution becomes useless. The real argument is that games don't come close to saturating 4K graphically, so the gains from 1440p to 4K are not as significant as they should be.
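The viewing-distance claim comes down to angular resolution. A minimal sketch using the common pixels-per-arcminute rule of thumb; the cutoff shifts a lot with the acuity figure you assume (the stricter 0.5-arcminute assumption is roughly where the 3.5-foot figure comes from):

```python
import math

def max_useful_distance_in(diag_in, horiz_px, aspect=(16, 9), arcmin_per_px=1.0):
    """Distance (inches) beyond which one pixel subtends less than `arcmin_per_px` arcminutes."""
    width_in = diag_in * aspect[0] / math.hypot(*aspect)   # screen width from the diagonal
    pixel_in = width_in / horiz_px                         # pixel pitch
    return pixel_in / math.tan(math.radians(arcmin_per_px / 60.0))

# 27-inch 4K panel
print(f"{max_useful_distance_in(27, 3840, arcmin_per_px=1.0):.0f} in  (1 arcmin, 20/20 rule)")  # ~21 in
print(f"{max_useful_distance_in(27, 3840, arcmin_per_px=0.5):.0f} in  (0.5 arcmin, stricter)")  # ~42 in, about 3.5 ft
```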

1

HomeIsElsweyr t1_iwd3p1d wrote

On a 27-inch at 1080p I can pick out pixels at 70-90cm depending on the image; I can definitely tell whether it's UHD or QHD.

1

Samsbase t1_iwbl1fq wrote

Whilst I understand the sentiment, 4K at a normal sitting distance really does make a difference over even 1440p. You'd probably notice the difference even at 8K+. Now, a lot will depend on what size you are spreading that 4K over. Put that on a massive TV and yeah, you're probably right. But cramming 4x more pixels into a normal 27-32" PC monitor and doubling the DPI will be noticeably better.

Now, actually driving a 4K monitor at 144Hz+ pretty much maxes out (or exceeds) most cables, which is kinda funny. You really have to choose your monitor and cables quite carefully to do any more than 4K 60Hz at the moment.
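A rough sketch of why 4K 144Hz pushes cables so hard (active pixels only; real links also carry blanking and protocol overhead, so the true requirement is somewhat higher, and DSC changes the picture entirely; the link budgets are the commonly quoted effective payload rates):

```python
# Uncompressed video data rate vs. common link budgets (effective payload, Gbit/s).
def video_gbps(width, height, hz, bits_per_px=24):
    """Raw active-pixel data rate in Gbit/s (ignores blanking and encoding overhead)."""
    return width * height * hz * bits_per_px / 1e9

links_gbps = {
    "HDMI 2.0 (after 8b/10b)": 14.4,
    "DP 1.4 HBR3 (after 8b/10b)": 25.9,
    "HDMI 2.1 FRL (after 16b/18b)": 42.7,
}

need = video_gbps(3840, 2160, 144)   # ~28.7 Gbit/s before blanking/overhead
for name, budget in links_gbps.items():
    print(f"{name}: {'ok' if budget >= need else 'needs DSC / chroma subsampling'}")
```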

0

juh4z t1_iwbsoee wrote

>You'd probably notice the difference even at 8K+

No.

Also, the bigger the screen, the more difference a higher resolution makes lmao

7

Hezkezl t1_iwbumkd wrote

I’ve got a 12700K, 32 gigs of DDR5 RAM, a 3090 TI, and a 32 inch Samsung Odyssey Neo G7 monitor that can do 4K at 165 Hz.

The only thing that lets me tell the difference between 1440p and 4k is how high my FPS counter is in games. Beyond that, sitting at a normal distance away from the monitor it’s completely impossible for me to see the difference between 4k and 1440p. Now, at 1080p? Yeah, I can see a difference… But with 1440p at 32”? Nope. Identical.

Unless I scoot forward to be about 2 feet away from the monitor and I do a quick comparison as I switch between resolutions, then I can maybe tell a little bit of something… But with a moving image aka a game that’s being actively played? There is no way.

There’s been numerous tests by people to see if they can tell the difference, and it’s pretty much a crapshoot. Most recent one I can think of is LTT with an 8K monitor trying to see if their staff can tell if it’s native 8K or 4K that’s being upscaled to 8K, and the majority of them had no idea which was which even after being told which one they were seeing. The only way that some of them could slightly tell was with the obviously lower FPS.

Don’t confuse 1080p with 8K, which seems to be what your post is getting at. A 1080p image upscaled to 8K on a native 8K monitor would definitely look different than an 8K image… But that doesn’t mean 8K is worth it or is something to aim for or anything else. 1440p is perfectly fine.

EDIT: I'm 20", 27", and 45" away from my monitor depending on if I'm leaning forward, sitting normally, or leaning back.

6

joomla00 t1_iwbw7kz wrote

How far away are you sitting? I have a 27" 4K monitor about 3 feet away. I can definitely tell the difference going from 1080p to 1440p to 4K. It could be a non-native res scaling issue. I was hoping there wouldn't be a difference so I could get more FPS, but it always looked a little bit off at 1440p, and crisp at 4K. But this was with a slower RPG-ish type game, so a lot of still backgrounds and talking heads.

4

Hezkezl t1_iwbweuy wrote

It should be about 3 feet away. I’ll measure it in about eight hours or so after I wake up, just turned my computer off for the night and I’m on mobile right now.

1

joomla00 t1_iwbz0nf wrote

No worries mate

2

Hezkezl t1_iwc6hdm wrote

Went and measured just now, I'm about 27 inches away from my screen sitting normally, 45 inches when leaning back and about 20 inches leaning in.

I've legitimately never noticed a difference in any of those positions. Maybe I need better glasses, if you swear you can tell a difference on your 27" one from about 36 inches away? I'm only in my 30s, so if you're significantly younger than that, or you don't need glasses, that might be why lol

3

User9705 t1_iwblghv wrote

I actually do that now with a 3080ti

0

3G6A5W338E t1_iwbqv82 wrote

Eventually, it'll make a perceivable difference in some game.

But, by then, there might be better yet cheaper hardware available.

0