Comments


TheDankerGod t1_izxccnh wrote

I couldn’t give less of a shit about ray tracing, so this is great news to me. AMD stepping up their game.

98

makedesign t1_izxu24n wrote

Layman here - what am I giving up with ray tracing?

18

Neoptolemus85 t1_izym4g4 wrote

Ray tracing is used to give physically-accurate real-time lighting and reflections by simulating the bouncing of light through an environment. Without ray tracing, games have to fake things like sunlight illuminating a room through a window, or mirrors, puddles and other reflective surfaces.

The thing is, games have become really good at faking those things, so for a lot of people the difference is only noticeable when viewing side-by-side comparisons, and not really when actually playing in-game.

Ray tracing could be much more than just better lighting and reflections, but it's still a niche capability for enthusiasts with high-end gear. We won't really see the full potential of ray tracing until it becomes standard for the majority of gamers, which is years away.
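The "simulating the bouncing of light" idea above boils down to casting rays and testing what they hit. Here's a hypothetical toy sketch in Python of the core primitive, a shadow ray against a single sphere (nothing like a production renderer, all names made up for illustration):

```python
import math

def sub(a, b): return [x - y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(v):
    length = math.sqrt(dot(v, v))
    return [x / length for x in v]

def hit_sphere(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None if it misses."""
    oc = sub(origin, center)
    b = 2 * dot(direction, oc)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-6 else None

def shade(point, normal, light_pos, occluder):
    """Lambertian shading with one shadow ray: the basic ray tracing step."""
    to_light = norm(sub(light_pos, point))
    center, radius = occluder
    # Shadow ray: if the occluder sphere blocks the path to the light,
    # the point is in shadow.
    t = hit_sphere(point, to_light, center, radius)
    if t is not None and t < math.dist(point, light_pos):
        return 0.0
    return max(0.0, dot(normal, to_light))

# A ground point facing straight up, with a light directly overhead.
p, n, light = [0, 0, 0], [0, 1, 0], [0, 10, 0]
lit = shade(p, n, light, ([5, 5, 0], 1.0))       # occluder off to the side
shadowed = shade(p, n, light, ([0, 5, 0], 1.0))  # occluder directly above
print(lit, shadowed)  # → 1.0 0.0
```

A real engine fires millions of these rays per frame (plus reflection and bounce rays), which is why the hardware cost is so high.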

31

BIGSTANKDICKDADDY t1_izzeqhm wrote

>The thing is, games have become really good at faking those things, so for a lot of people the difference is only noticeable when viewing side-by-side comparisons, and not really when actually playing in-game.

In some side by side comparisons you may not notice any difference or even an advantage to the "non-RT" image. Offline baking allows us to perform extremely high quality path traced lighting and shadowing, taking hours and hours to illuminate a scene, then store that result on disk and load it back in when the game is played. The downside is that all of the geometry we use to perform those calculations must remain static! Because you aren't able to perform those calculations at runtime you can't allow the player to modify the scene and break the lighting/shadowing you baked into it. Modern processors have made complex physical interaction very achievable but utilizing offline lighting techniques means you can't make wide-scale use of them for interactivity.

Real-time ray tracing is a massive boon, not just to visual fidelity, but to interactivity in game environments going forward. It also alleviates a lot of the manual effort we spend faking lighting in environments to look as if we did have RT available. It will be interesting to see the first game that doesn't offer a "non-RT" version because it was built from the ground up using RT and didn't incorporate any older workflows and techniques.
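The bake-then-load workflow described above is "compute once offline, look up at runtime". A deliberately simplified Python sketch of the trade-off (a dict standing in for a lightmap texture; all names hypothetical, and real bakes work per texel, not per point):

```python
def lighting(point, light, blocked):
    """Stand-in for an hours-long offline path trace:
    inverse-square falloff, zero if an occluder blocks the light."""
    if blocked(point):
        return 0.0
    d2 = sum((p - l) ** 2 for p, l in zip(point, light))
    return 1.0 / d2

def bake(points, light, blocked):
    # Offline bake: compute once, store the result "on disk" (here, a dict).
    return {p: lighting(p, light, blocked) for p in points}

light = (0, 4)
wall_x = 2  # a wall at x=2 shadows everything beyond it
blocked = lambda p: p[0] > wall_x

points = [(1, 0), (3, 0)]
lightmap = bake(points, light, blocked)
print(lightmap[(1, 0)] > 0)    # lit point: True
print(lightmap[(3, 0)] == 0.0) # shadowed point: True

# Runtime: the player "destroys" the wall...
wall_x = 100
# ...but the baked data is stale: (3, 0) still reads as pitch black.
print(lightmap[(3, 0)] == 0.0)  # True - the bake can't react
```

That staleness is exactly why baked geometry has to stay static, and why real-time RT unlocks destructible/movable environments.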

10

HugeHans t1_j01ma66 wrote

I think ray tracing is fantastic, but I also think that realistic doesn't always mean better in this context. If I were trying to sell my home and took pictures for the ad with a really good camera, the pictures would be very realistic, but someone could make my home look much better in Photoshop by tweaking the colors and contrast. One is realistic; one is what people like.

2

BIGSTANKDICKDADDY t1_j01nrnh wrote

Ray tracing isn't necessarily about creating "realistic" scenes. It's about creating a realistic model for the behavior of light. It doesn't preclude color grading or any other stylistic flair you want to inject into a final render. Fortnite is a cartoony game that looks pretty great with raytracing. Epic's Matrix demo adopts the same "cold blue" aesthetic from the films. Disney/Pixar use path tracing in their 3D works but nobody would say "Up" or "Frozen" look "realistic", you know?

1

werfenaway t1_izzqiww wrote

Ray tracing saves game development time by sparing developers from having to do all the tricks to make the lighting look comparably good.

2

alc4pwned t1_j02eady wrote

Idk, Cyberpunk with raytracing on vs off is a big difference.

1

panzerfan t1_izxudzz wrote

About a third fewer frames, plus increased overhead. A pretty stiff penalty, even now.

16

jordoneus121 t1_izy478n wrote

Maybe on an amd gpu. It's typically around a 40% loss on an nvidia card. Ray tracing done well is gorgeous and absolutely worth the cost imo.

6

CazRaX t1_izy6tu2 wrote

Most people play at 1080p or 1440p, ray tracing is nice but most won't see the difference or care.

−8

BarackaFlockaFlame t1_izy92od wrote

I 100% see the difference with ray tracing at 1080p. Spider-Man looks incredible with ray tracing on: seeing the real-time reflections on glass windows as you swing by, distorting the way they would on real glass. The lighting is also so much more realistic in how it affects things around it. It was nice for a bit, but then I want higher frames for the combat, so I turn it off. I'd love to have it always on, but my PC can't reliably do over 60fps with ray tracing at the quality settings I'd want.

5

coworker t1_izyefjv wrote

I agree with the other guy. I have a 3080 and play at 1440p on a G9. I couldn't tell a sizable difference in Cyberpunk with RT on or off.

−1

BarackaFlockaFlame t1_izyfwvu wrote

Cyberpunk has dog shit ray tracing, to be fair. That game looked the same to me as well, except that with ray tracing off it performed much better. I'm not arguing against skipping ray tracing; I'm just saying that in games that utilize ray tracing well, you definitely can notice at 1080p. There just aren't many games that use it effectively. Try Quake II RTX on Steam: it's free and shows you what ray tracing should look like. They also just released a free update to Portal that gives it a graphical overhaul using ray tracing. I'd recommend giving them a go if you want to actually see some good uses of it. Even at 1080p it's crystal clear that there's a difference in how light interacts with the game world.

1

panzerfan t1_izy4txr wrote

I am somewhat on the fence when it comes to this. If people have OLED or QLED, then yeah, go for the raytracing eye candy. The problem to me is that it is a tiny minority.

−10

KeyWerewolf5 t1_izye8la wrote

What do oled or qled have to do with raytracing?

5

panzerfan t1_izyegub wrote

I don't see why you would really embrace raytracing if your monitor can't handle the dynamic contrast.

−6

durielvs t1_izyh5xu wrote

Ray tracing is not the same as HDR. RT is used to render real-time reflections (on water, for example) and much more complex lighting in general, even on a display without great dynamic range. It has much more to do with the physics of light than with pretty colors.

7

KeyWerewolf5 t1_izyoynu wrote

But ray tracing isn't about that. It's about more accurate simulation of light, which leads to more realistic mirrors, bounce lighting, etc. Your resolution or HDR/dynamic contrast will certainly help a bit, but no, the benefits of ray tracing are apparent on all screens. Watch every Toy Story on the same screen and tell me the lighting doesn't look better in each sequel. Same thing.

2

BobisaMiner t1_j0h9m3d wrote

This is not even remotely true. Sure, it'll look much better on an OLED (QLED is not in the same league, and the term is mostly Samsung marketing), but ray tracing looks good on all panel types.

It's cutting edge tech for pc gaming, of course it's going to be for a small group.

1

SatanLifeProTips t1_izymoiq wrote

I would say that. Then I enabled ray tracing and wow. I’d happily trade a few FPS for some beautiful liquid eye candy. Having light behave like real light is amazing. Shadows, reflections, god rays. It’s beautiful.

Especially on a 65” OLED.

6

ArgonTheEvil t1_izzd8f7 wrote

I saw what Cyberpunk looked like fully RT’d out and I’ve been putting off playing the game until I can experience it in its full glory at a good frame rate. Then there’s games like Dying Light 2 where it goes from looking exactly like the first game without ray tracing, to a hell of a next gen game with it.

My 3070 just can’t hack it at 3440 x 1440 with all the RT features turned on, and the 7900XTX only seems to be at 3090-3090Ti levels with RT.

The 4080 is about double the performance of my 3070, but given where the 4090 stands, the most I’d pay is $900. So I guess I’m waiting til then or next gen, and seeing if AMD does better next round.

5

SatanLifeProTips t1_izzkawd wrote

Cyberpunk wasn’t the best example, but it ran great most of the time on my 3070 OC at 4K. Got a good CPU to match? I’m running a 5800X and it seemed fine.

I do have the factory overclocked version (no overclocking enabled beyond the factory setup), so maybe something inside ran better? I know it has faster memory. I also built the mother of all cooling systems with 3D printed ducting feeding the CPU and GPU, so I was always under 60°C even under full stress. That helped keep it at full clock speed.

There were certainly times when cyberpunk ran like dog shit but I chalked that up to bad coding. The game itself was hit and miss at best for quality.

1

ArgonTheEvil t1_izzp2fl wrote

I’m using a 5800X3D and while my 3070 isn’t overclocked, it’s running perfectly cool even under full load, so it’s boosting to its max within stock power limits.

But I’m aiming for RT turned on and all the way up, 60fps 1% lows, and DLSS at no lower than quality setting. I don’t know what qualifies as “great” for you but my 3070 wasn’t delivering my version of great last time I tried it in Cyberpunk. Most other games it’s plenty or more than enough, but I’ll hold off playing a game like Cyberpunk until I can do it with all the eye candy.

1

rosesandtherest t1_j007f8a wrote

lol you should see comments on any other sub or tech site, the gpu is doa trash

1

blucasa t1_izycjz4 wrote

Could not agree more, shit is sooo blown out of proportion it's not even funny.

0

TunaOnWytNoCrust t1_izxihcu wrote

Ray tracing is rad but if it fucks with frames it's getting turned off, and whoop there goes Nvidia's best selling point.

42

captain_nibble_bits t1_izxntu6 wrote

It reminds me of when anti-aliasing came out. It massacred frames at first, and people were saying the same thing: not worth the trouble. It became standard anyway. Same thing with RT. Give it time.

33

PicnicBasketPirate t1_izxost6 wrote

Kinda amusing that anti-aliasing isn't really worth the performance hit as you get to larger and larger resolutions.

19

CazRaX t1_izy6zvv wrote

Well, yeah, it is obvious that once it no longer affects performance it will become more used. As of right now and for the near future RT will not be worth it unless you sink massive money into top end parts.

6

TunaOnWytNoCrust t1_j0025r2 wrote

I will give it time, but I've decided that it's not worth my money until it's acceptable. Also I usually turn off anti aliasing because it also hurts frames and takes hardware resources that could ironically be used to prevent aliasing.

1

Sevinki t1_j01g9kc wrote

I mean, that just depends. For singleplayer games I turn it on if I can still stay above 60 fps, because RT does look amazing. For multiplayer it's off, and most settings are medium or low for max fps.

1

Scruffy42 t1_izxborb wrote

RT on... Is it on? Wait. No. Yes. I can't tell.

38

Jason-Bjorn t1_izxxvt2 wrote

In some games it’s very easy to tell, like Cyberpunk, because of the sheer amount of glass and lights. But yeah, if it’s not an environment like that, it’s definitely harder to tell, although I find it makes people’s faces look way more real.

15

Scruffy42 t1_izxz1z0 wrote

I was joking above, but in Cyberpunk there was a bug where it might not apply video settings even after restart when I tried. So I wasn't actually sure. :-D Actually, Hitman 3 is where I see it most.

7

Daveed13 t1_izyqajg wrote

RT is what really makes Pixar movies look way better than video games.

It's really noticeable and, IMO, the last hurdle to better-looking games.

With that said, games that use RT nowadays use it for only PARTS of the visuals.

When games are able to use it for all effects at the same time (reflections, all light rays on all surfaces, all shadows, etc.), then you WILL notice it.

It will also ease the work for devs! :)

1

Megane_Senpai t1_j006nxm wrote

I am both a game developer and a gamer, and I'd say, at least for now, I'd happily put in some more work optimizing our game for a wider audience and a broader range of devices, simply for the profits.

However, I too hope that one day RT becomes a standard feature on mid-range cards and above, with hardware advancements that lower its performance cost (to around 20-30% of fps instead of 60-80%).

1

uniquely_ad t1_izxbx60 wrote

Graphics card pricing is insane now; hopefully this will heat up the competition.

28

marbles61 t1_izx9sk4 wrote

Hoping this brings NVIDIA cards down as well…they need the competition.

22

FirstSnowz t1_izx8w4f wrote

Was waiting to pull the trigger on one of these until I heard what the verge thought about it - now I know I can buy safely!

18

Dark_Clark t1_izxh1t1 wrote

The impact of ray-tracing is completely overblown for nearly all games.

17

RechargedFrenchman t1_izyd1ga wrote

The visual fidelity impact**

The absolutely tanking your framerate impact is pretty hard to undersell.

9

alc4pwned t1_j02feuy wrote

..because most games don’t support it. In the ones that do, it’s usually pretty impressive. Raytracing looks really nice in Cyberpunk. Elden Ring is supposed to be getting an RT update soon, I bet that’ll look amazing.

2

Dark_Clark t1_j02s6dt wrote

It looks good in Cyberpunk, but that’s like the very very top of the heap. And even in a game like Metro Exodus, if you lower the ray tracing settings, the difference is pretty much negligible. Control is supposed to be this incredible showcase of ray tracing but it just doesn’t look that good. Looks better? If you A/B them, yeah. But compared to an upgrade in resolution or frame rate, it’s just not really that important.

1

ben1481 t1_j02tfgy wrote

What exactly are you expecting? It adds more realistic lighting and reflections. Do you think it's going to make things photorealistic overnight? Graphics have always evolved in steps, and this is yet another step.

1

Dark_Clark t1_j03muh4 wrote

I agree with you. This is the future. However, at this point the cost is immense and the benefit is very small. That's what I'm saying: not that ray tracing is bad, but that it's just not that big of a deal right now, given its cost.

1

T-Wrex_13 t1_izxm3n4 wrote

Most of the games I play have very little ray tracing, and instead care more about rasterization anyway. While this will likely change in the future, it's not likely to change this generation. I'll take the win on rasterization and the lower price any day - just like I did when I bought the 6900XT

16

Sensitive_Pizza6382 t1_izxjgzz wrote

Alright, can someone explain it to me like I'm 5?

12

justanothergamer_ t1_izy0cf8 wrote

A graphics card is a computer part that helps the game look pretty. Ray tracing makes light look pretty. One company, NVIDIA is making cards that have really pretty light reflections but theyre expensive. The other company, AMD, makes cards that are cheaper and make the game run pretty really well, especially on really expensive, big screens. But their pretty light reflections are not as good. But the pretty lights also make the game stuttery so we don’t even have it on a lot of the time.

39

GelbeForelle t1_izxjwyq wrote

AMD is bad at Raytracing, but their graphics cards are cheaper for similar performance.

33

0krizia t1_izxtc9o wrote

Knowledge bias - you’re still not explaining like he’s a 5-year-old.

AMD’s new graphics card is about as good as NVIDIA’s second-best graphics card. The only exception is one specific graphics setting in some games where the RX 7000 series is loosing

−38

Lachimanus t1_izy5hpb wrote

Losing.

One has to add that the second-best NVIDIA card is still more expensive than the AMD one, and the AMD card beats it in everything but ray tracing.

11

themiddlethought t1_izxt4yl wrote

Not as good at ray tracing, but current-gen games aren't using it well, so your mileage may vary.

4

alcatrazcgp t1_j01iljj wrote

The 7900 XTX matches, or is sometimes better than, the 4080 in raster performance while being $200 cheaper. However, its RT performance is two years old, about matching the 3090 Ti.

2

justanothergamer_ t1_izxzi3y wrote

Believe me, I have a 3080 and I turn off ray tracing in most games. Not worth the performance hit. I’d rather play 144Hz 1440p

5

tripcy t1_izy00z2 wrote

If you play mostly single-player games, especially those that support ray tracing, the RTX 4080/4090 are better. If you play mostly competitive multiplayer games, the RTX 4090/4080 are still better, but not by as much. In other words, NVIDIA won't be dropping the price of their cards any time soon. In fact, they're likely to re-up the price of the 4080 in countries where they had previously dropped it.

I was really rooting for AMD to have a big win here, and I'm sadly disappointed. I am, however, excited about AMD's next generation, as the 7000 series cards are essentially the first of a brand-new architecture that looks very promising. The fact that they've designed things to be scalable should honestly scare NVIDIA, much like AMD's CPUs terrified Intel. I guess I had just hoped AMD would have done better, considering they waited longer to release these cards.

4

CornholeCarl t1_izym0v8 wrote

I never hear people talk about how AMD is a full generation behind NVIDIA in RT: their new GPUs are on par in RT with NVIDIA's 3000 series cards. So are they worse than the 4000 series? Yes, but they're progressing at basically the same rate. Also, not every game supports RT, and although I recognize it's becoming more and more popular, most people I know couldn't care less about RT and just want the highest frames possible. And who the fuck wants to spend $1k+ on a GPU so they can play games at 60 fps, or am I missing something?

4

5hifty t1_izysnu3 wrote

Is ray tracing worth the $200 premium?

3

cainy1991 t1_izz2d3w wrote

Debatable. I got a 3070 for RT performance...

There is only one game where I actually liked the RT effects; in literally every other game I just turned off RT to save frames.

So since the 3000 series launch I've played a total of one game with RTX... IMO RT ain't worth dick.

3

ben1481 t1_j02swhn wrote

The fact people upvoted your comment is embarrassing. Raytracing and similar technologies have been (and still are) the future. The problem is your card just can't handle it properly. More accurate lighting and reflections are not going anywhere, they'll only improve and become more mainstream.

−1

cainy1991 t1_j042u3j wrote

I never denied any of that.

But at this point in time it ain't worth a price premium.

1

ssuuh t1_j01914v wrote

No.

But I think many more gamers can simply afford it now, as we've grown up and work.

I wouldn't bother, but I'm a software engineer and wanna play around with RTX.

1

ben1481 t1_j02sluf wrote

Depends on the games you play and how 'valuable' $200 is to you.

1

Dakeera t1_izz8gjp wrote

RT performance isn't far behind; it's a generation behind. In fact, they made better generational improvements than NVIDIA when you compare against the 3090/Ti.

3

Skynet-supporter t1_izxxz5g wrote

I bet nvidia will drop that 200 soon

2

Lachimanus t1_izy5nk1 wrote

Not really necessary. Their market share is like 10:1.

People are buying names, not actual performance.

You see it with the ridiculous number of Apple devices out there.

2

SUPRVLLAN t1_j00csqz wrote

…but Apple’s performance has consistently been top tier.

1

Lachimanus t1_j00njld wrote

Sure, it has been at the top. But they definitely put a hefty markup on their name, and you have to consider what else you're paying for to be part of the Apple ecosystem.

Just look at their crazy way of handling repairs.

1

Skynet-supporter t1_izyknu4 wrote

Well, I tried Android first and it was total crap. Software glitches, bad support, etc., and still a locked OS, not open source. So I know why I buy Apple.

−1

Lachimanus t1_izysufm wrote

For people who want to do more than use a camera and messaging, yes.

But 90% of the buyers waste lots of money.

0

gaymer67 t1_izyih14 wrote

You lose my money when I need a mux switch to play a game.

2

stellarblackhole1 t1_izymzxx wrote

Way more interested in workload performance; I don't really game anymore. What I do for fun is data modeling and development stuff. That, and I want to get back into CAD for 3D printing.

2

693275001 t1_izz87ag wrote

Ray tracing is such a sham, and it's impressive that Scamvidia somehow tricked the PC gaming world into thinking it's valuable enough to pay 20-30% more for.

1

ben1481 t1_j02toz5 wrote

Tell me you don't know what you're talking about without telling me.

0

Bootytonus t1_izz8oyz wrote

I don't care for ray tracing; I'll be sticking with team red.

1

no_user_name_person t1_izzjpux wrote

Looks like a bad deal. It draws more power than the 4080. Weaker productivity capabilities mean less resale value. The reference cooler isn't great, and AIB models put it very close to 4080 cost. Plus it'll probably be scalped on launch day while there's endless 4080 supply.

1

rct1 t1_j002xp6 wrote

Any YTers got good benches with a 2080 Ti @ 1440p?

I run an ultra wide 3440x1440p @ 100Hz. My Freesync range is 48-100Hz.

DLSS and ray tracing at 60fps or better is pretty much doable on any game at this res. I can push 4K/30/60 pretty well too if I need to.

People hate on the 2080 Ti's cost, but I had a top GPU for the lockdown, and most every piece of software has been updated to work better on my card. I still beat a 3070 in most benches, and for my money, I've been happy with it. Go into settings, put it on ultra, and play. If that means DLSS Quality, then fine, it does.

This is Nvidia fine wine lol. In 2 years, maybe new titles will require DLSS performance. In 4 years, I’ll buy a 4090 used. I’m cool.

1

lakerssuperman t1_j01qf8l wrote

So I don't game anywhere near what some of you do. I just don't have the time with kids and stuff taking priority, so I'm watching more from the sidelines. RT sure seems like a prestige feature that can look better, but not always across the board.

It seems a bit like 4k and HDR in movies. I've watched some movies where the uptick in resolution was pretty noticeable and the HDR did stand out, but only via direct comparison. Watching the 1080p SDR scaled to 4K still looks awesome and you wouldn't know you're missing much, even with the most pronounced 4K vs 1080p difference. On top of that you need a TV that can really do HDR to see the difference.

My point is, RT doesn't feel like a slam-dunk, across-the-board must-have in every game, and therefore I struggle to see why it's absolutely essential, especially when the raster performance of the cheaper card is so good.

1

mandelmanden t1_j01v261 wrote

Good performance, but far more than most people are even close to needing.

Until everything is ray traced, performance will just be weird. With RT it can just barely hit playable framerates non-upscaled on my monitor (I don't have 4K and don't want it), but for everything that's not RT it grossly overshoots. 3440x1440 would be hundreds of frames in everything, and my monitor is 75Hz... I'm not going to replace it.

So, let's see what the 7800 XT and the 4070 can do. Though I guess I'll just wait another generation again.

1