TheDankerGod t1_izxccnh wrote
I couldn’t give less of a shit about ray tracing, so this is great news to me. AMD stepping up their game.
makedesign t1_izxu24n wrote
Layman here - what am I giving up with ray tracing?
Neoptolemus85 t1_izym4g4 wrote
Ray tracing is used to give physically accurate real-time lighting and reflections by simulating how light bounces through an environment. Without ray tracing, games have to fake things like sunlight illuminating a room through a window, or mirrors, puddles and other reflective surfaces.
The thing is, games have become really good at faking those things, so for a lot of people the difference is only noticeable when viewing side-by-side comparisons, and not really when actually playing in-game.
Ray tracing could be much more than just better lighting and reflections, but it's still a niche capability for enthusiasts with high-end gear. We won't really see its full potential until it becomes standard for the majority of gamers, which is years away.
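If you're curious what "simulating the bouncing of light" actually boils down to, here's a stripped-down toy version of the idea in Python (one sphere, a bright "sky", completely made-up numbers; a real engine runs something like this on the GPU for millions of rays every frame):

```python
import math, random

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def sphere_hit(origin, direction, center, radius):
    """Distance along the ray to a sphere, or None if the ray misses it."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def trace(origin, direction, depth=0):
    """Follow one ray, bouncing it off the sphere until it escapes to the sky."""
    if depth > 4:
        return 0.0                                   # too many bounces: give up
    t = sphere_hit(origin, direction, (0, 0, -3), 1.0)
    if t is None:
        return 1.0                                   # ray escaped: hit the bright sky
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(tuple(h - c for h, c in zip(hit, (0, 0, -3))))
    # Diffuse bounce: scatter in a random direction biased around the surface
    # normal, and halve the result because the surface absorbs some of the light.
    bounce = normalize(tuple(n + random.uniform(-1, 1) for n in normal))
    return 0.5 * trace(hit, bounce, depth + 1)

# A renderer shoots one of these per pixel (usually many, then averages them).
print(trace((0, 0, 0), (0, 0, -1)))
```

Every extra bounce is another round of intersection tests against the whole scene, which is a big part of why the hardware cost is so steep.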
BIGSTANKDICKDADDY t1_izzeqhm wrote
>The thing is, games have become really good at faking those things, so for a lot of people the difference is only noticeable when viewing side-by-side comparisons, and not really when actually playing in-game.
In some side-by-side comparisons you may not notice any difference, or may even see an advantage to the "non-RT" image. Offline baking allows us to perform extremely high quality path-traced lighting and shadowing, taking hours and hours to illuminate a scene, then storing that result on disk and loading it back in when the game is played. The downside is that all of the geometry we use to perform those calculations must remain static! Because you aren't able to perform those calculations at runtime, you can't allow the player to modify the scene and break the lighting/shadowing you baked into it. Modern processors have made complex physical interactions very achievable, but relying on offline lighting techniques means you can't make wide-scale use of them for interactivity.
Real-time ray tracing is a massive boon, not just to visual fidelity, but to interactivity in game environments going forward. It also alleviates a lot of the manual effort we spend faking the lighting in environments to look as if we did have RT available. It will be interesting when we see the first game that doesn't offer a "non-RT" version because it was built from the ground up using RT and didn't incorporate any older workflows and techniques.
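If it helps to picture the static-geometry problem, here's a deliberately silly 1D toy in Python (all names and numbers made up, nothing like real engine code):

```python
# A light sits at x=0 shining toward +x; when the level was authored, a wall
# at x=5 blocked everything beyond it.

# --- Offline bake: the expensive work happened hours before the game shipped ---
baked_lightmap = {x: 1.0 if x < 5 else 0.0 for x in range(10)}  # 1.0 = lit, 0.0 = shadow

def shade_baked(x):
    """Runtime cost is just a lookup, but the answer assumes the wall never moves."""
    return baked_lightmap[x]

# --- Real-time ray tracing: re-check the light path against the scene as it is *now* ---
def shade_raytraced(x, wall_x_now):
    return 1.0 if x < wall_x_now else 0.0

# The player blows the wall open and the remaining rubble now sits at x=8:
print(shade_baked(6))          # 0.0 -> stale shadow where the wall used to be
print(shade_raytraced(6, 8))   # 1.0 -> the lighting reacts to the changed scene
```

The baked result is only ever as good as the scene it was computed against; the ray-traced version pays the cost every frame but never goes stale.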
HugeHans t1_j01ma66 wrote
I think ray tracing is fantastic, but I also think that realistic doesn't always mean better in this context. Like if I was trying to sell my home and took pictures for the ad with a really good camera, the pictures would be very realistic, but someone could make my home look much better in Photoshop by tweaking the colors and contrast. One is realistic; the other is what people like.
BIGSTANKDICKDADDY t1_j01nrnh wrote
Ray tracing isn't necessarily about creating "realistic" scenes. It's about creating a realistic model for the behavior of light. It doesn't preclude color grading or any other stylistic flair you want to inject into a final render. Fortnite is a cartoony game that looks pretty great with raytracing. Epic's Matrix demo adopts the same "cold blue" aesthetic from the films. Disney/Pixar use path tracing in their 3D works but nobody would say "Up" or "Frozen" look "realistic", you know?
werfenaway t1_izzqiww wrote
Ray tracing saves game development time by sparing developers from having to do all the tricks needed to get things looking comparably good.
alc4pwned t1_j02eady wrote
Idk, Cyberpunk with raytracing on vs off is a big difference.
panzerfan t1_izxudzz wrote
About a third fewer frames, plus increased overhead. Pretty stiff penalty even now.
jordoneus121 t1_izy478n wrote
Maybe on an AMD GPU. It's typically around a 40% loss on an Nvidia card. Ray tracing done well is gorgeous and absolutely worth the cost imo.
CazRaX t1_izy6tu2 wrote
Most people play at 1080p or 1440p; ray tracing is nice, but most won't see the difference or care.
BarackaFlockaFlame t1_izy92od wrote
I 100% see the difference with ray tracing at 1080p. Spider-Man is incredible looking with ray tracing on: you see real-time reflections in the glass windows as you swing by, distorting the way they would on a real glass surface. The lighting is also so much more realistic in how it affects things around it. It was nice for a bit, but then I want higher frames for the combat, so I turn it off. Would love to have it always on, but my PC can't reliably do over 60fps with ray tracing at the quality settings I'd want.
coworker t1_izyefjv wrote
I agree with the other guy. I have a 3080 and play at 1440p on a G9. I couldn't tell a sizable difference in Cyberpunk with RT on or off.
BarackaFlockaFlame t1_izyfwvu wrote
Cyberpunk has dog shit ray tracing, to be fair. That game looked the same to me as well, except with ray tracing off it performed much better. I'm not arguing that skipping ray tracing is dumb; I'm just saying that in games that utilize ray tracing well, you definitely can notice it at 1080p. There just aren't many games that use it effectively. Try out Quake 2 RTX on Steam: it's free and shows you what ray tracing should look like. They also just released a free update to Portal that gave it a graphical overhaul to use ray tracing. I'd recommend giving them a go if you want to see some good uses of it, and even at 1080p it's crystal clear that there's a difference in how light interacts with the game world.
panzerfan t1_izy4txr wrote
I am somewhat on the fence when it comes to this. If people have OLED or QLED, then yeah, go for the raytracing eye candy. The problem to me is that it is a tiny minority.
KeyWerewolf5 t1_izye8la wrote
What do OLED or QLED have to do with ray tracing?
panzerfan t1_izyegub wrote
I don't see why you would really embrace raytracing if your monitor can't handle the dynamic contrast.
durielvs t1_izyh5xu wrote
Ray tracing is not the same as HDR. RT is used to show reflections in real time on water, for example, and much more complex lighting in general, even on a display that doesn't have great dynamic range. It has much more to do with the physics of light than with pretty colors.
KeyWerewolf5 t1_izyoynu wrote
But ray tracing isn't about that. It's about more accurate simulation of light, which leads to more realistic mirrors/bounce lighting/etc. Your resolution or HDR/dynamic contrast/whatever will certainly help a bit, but no, the benefits of ray tracing are apparent on all screens. Watch every Toy Story on the same screen and tell me the lighting doesn't look better with each sequel. Same thing.
BobisaMiner t1_j0h9m3d wrote
This is not even remotely true. Sure, it'll look much better on an OLED (QLED is not in the same league... and it's mostly made up by Samsung), but ray tracing looks good on all panel types.
It's cutting-edge tech for PC gaming; of course it's going to be for a small group.
balazs955 t1_izy2n98 wrote
And nothing of worth was lost.
Megane_Senpai t1_j005d1y wrote
30-80% of your fps, depending on the title and the GPU you use.
rakehellion t1_j0irkjj wrote
Better graphics.
SatanLifeProTips t1_izymoiq wrote
I would have said that too. Then I enabled ray tracing and wow. I’d happily trade a few FPS for some beautiful liquid eye candy. Having light behave like real light is amazing. Shadows, reflections, god rays. It’s beautiful.
Especially on a 65” OLED.
ArgonTheEvil t1_izzd8f7 wrote
I saw what Cyberpunk looked like fully RT’d out, and I’ve been putting off playing the game until I can experience it in its full glory at a good frame rate. Then there are games like Dying Light 2, where it goes from looking exactly like the first game without ray tracing to a hell of a next-gen game with it.
My 3070 just can’t hack it at 3440 x 1440 with all the RT features turned on, and the 7900XTX only seems to be at 3090-3090Ti levels with RT.
The 4080 is about double the performance of my 3070, but given where the 4090 stands, the most I’d pay is $900. So I guess I’m waiting until then or next gen, and seeing if AMD does better next round.
SatanLifeProTips t1_izzkawd wrote
Cyberpunk wasn’t the best example, but it ran great most of the time on my 3070 OC at 4K. Got a good CPU to match? I’m running a 5800X and it seemed fine.
I do have the factory-overclocked version (no overclocking enabled besides the factory setup), so maybe something inside ran better? I know it has quicker memory. I also built the mother of all cooling systems, with 3D-printed ducting feeding the CPU and GPU, so I was always under 60°C even under full stress. That helped keep it at full clock speed.
There were certainly times when Cyberpunk ran like dog shit, but I chalked that up to bad coding. The game itself was hit-and-miss at best for quality.
ArgonTheEvil t1_izzp2fl wrote
I’m using a 5800X3D, and while my 3070 isn’t overclocked, it’s running perfectly cool even under full load, so it’s boosting to its max within stock power limits.
But I’m aiming for RT turned on and all the way up, 60fps 1% lows, and DLSS at no lower than the Quality setting. I don’t know what qualifies as “great” for you, but my 3070 wasn’t delivering my version of great the last time I tried it in Cyberpunk. In most other games it’s plenty or more than enough, but I’ll hold off on playing a game like Cyberpunk until I can do it with all the eye candy.
rosesandtherest t1_j007f8a wrote
lol, you should see the comments on any other sub or tech site: the GPU is DOA trash.
blucasa t1_izycjz4 wrote
Could not agree more, this shit is sooo blown out of proportion it's not even funny.