TrollErgoSum t1_jaa2qv9 wrote

It's called Supersampling and basically gives the computer more options for what to render for any given pixel.

If you "render" at a higher resolution so that each final pixel is 4 pixels in the supersample then the computer gets 4 reference points for what color that pixel could be instead of just one. It can then average those 4 values and get a cleaner value for that final pixel.

When you have high-contrast areas (black against white, for example) the computer can pick a cleaner average between the two areas (shades of gray) instead of only choosing between white and black.
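A minimal sketch of that averaging step in Python/NumPy, assuming a grayscale image and a plain 2×2 box filter (real renderers work per color channel and often use fancier sample patterns than this):

```python
import numpy as np

def downsample_2x(img: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of supersampled pixels into one final pixel."""
    h, w = img.shape
    # Group rows and columns into pairs, then average over each 2x2 block.
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A hard black/white edge rendered at 2x resolution...
hi_res = np.array([
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
], dtype=float)

# ...averages down to shades of gray along the edge: [[0, 255], [127.5, 255]]
print(downsample_2x(hi_res))
```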

272

rhamled t1_jaaem3q wrote

I love the eli5 hardware version of your comment. Just how we trick rocks with electricity to get to that sophistication makes me happy.

63

dbx999 t1_jabobdn wrote

So like anti-aliasing, but through time

2

circlesun22 t1_jabonhr wrote

Ok now explain it to me like I’m 2

9

iTwango t1_jabpcrh wrote

If you want to draw a colour, but only have a red crayon and a green crayon, it's a lot harder to make something that looks nice than if you have ten different red crayons and ten green crayons in between.

29

BaLance_95 t1_jabq7cd wrote

Let's take the question further. What are the benefits of rendering more frames than your monitor can display?

2

holl0918 t1_jabyc46 wrote

There is no visible difference, but the input lag (the time between a mouse click and the result appearing on screen) is reduced.

16

liuthemoo t1_jabqasc wrote

It's possible you could experience smoother-feeling input

8

MINIMAN10001 t1_jac3glg wrote

Let's just make up some numbers.

Imagine your screen refreshes at 100 Hz, so 10 ms per frame. With a perfect sync, your input will be delayed by that 10 ms.

But what if you ran 200 fps, 5 ms per frame? Well, now your input is only delayed by 5 ms, because a frame is being drawn every 5 ms and your GPU will only hold on to the newest frame, created every 5 ms, to submit to the monitor.

This latency is additive with any latency from your mouse/keyboard to the computer, as well as your monitor's processing time, known as input latency and tagged as "Lag" in TFT Central reviews.

For example, their Acer Nitro XV273 X review noted that for that particular monitor they could only estimate potentially 0.5 ms of input latency (marked as 0, since the estimate was not an actual measurement), plus 2 ms of grey-to-grey response time, giving it a total input latency of 2 ms. The average range one may see goes from 3 to 8 ms.

Also worth noting that 100 ms of processing time on a television isn't unusual, which is why TVs are generally not recommended for gaming use.
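The same arithmetic as a rough Python sketch, just to show where the 10 ms vs 5 ms numbers come from; it's a simplification that ignores vsync queueing, driver buffering, and the rest of the pipeline:

```python
def worst_case_delay_ms(render_fps: float, display_latency_ms: float = 0.0) -> float:
    """Delay ~ one render frame time, plus any fixed display latency."""
    render_frame_time = 1000.0 / render_fps  # ms between completed frames
    return render_frame_time + display_latency_ms

# 100 Hz monitor, rendering at 100 fps vs 200 fps:
print(worst_case_delay_ms(100))  # 10.0 ms
print(worst_case_delay_ms(200))  #  5.0 ms

# Add the ~2 ms of display latency measured for the monitor above:
print(worst_case_delay_ms(200, display_latency_ms=2.0))  # 7.0 ms
```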

2

SifTheAbyss t1_jaeyvp7 wrote

Input lag will be decreased by up to 1 frame of what the monitor can display.

Say you have a 60 Hz monitor and render at 120 fps: the completed images are sent 0.5 frames earlier (counting with 60 fps as the base). Render at 300 fps and the images arrive in 1/5th of a 60 fps frame time, so you win the remaining 0.8 frames.

As you can guess, this is usually not worth it though, as in that last example the GPU does 5 times the work for a marginal gain.

If you have a monitor with FreeSync/G-Sync, completed frames get sent immediately, so there's no need to render above the monitor's refresh rate.
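A tiny Python sketch of that math (a simplification of the reasoning above; real pipelines add sync and buffering on top):

```python
def frames_saved(monitor_hz: float, render_fps: float) -> float:
    """Fraction of one monitor frame of latency saved by rendering faster."""
    base_frame = 1.0 / monitor_hz     # one monitor frame, in seconds
    render_frame = 1.0 / render_fps   # time to finish one rendered frame
    return (base_frame - render_frame) / base_frame

print(frames_saved(60, 120))  # -> 0.5, image ready half a frame earlier
print(frames_saved(60, 300))  # -> ~0.8, 4/5 of a frame earlier at 5x the GPU work
```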

2

WjeZg0uK6hbH t1_jabsgtu wrote

Assuming the time between frames is constant, there is no advantage. Having the GPU output more frames than the monitor can show just means the monitor will ignore the extra frames. The GPU will do more work and heat your room up faster, so if your room is chilly it might be an advantage, depending on what other kind of heating you are using. Most games have a frame limit, vsync, or freesync setting, which in their own ways limit the frame rate to something appropriate.

−1

AetherialWomble t1_jaby9nq wrote

That's fundamentally wrong. With more frames, the information displayed on your screen will be newer.

For the sake of simplicity, let's say you had a 1 Hz screen and a GPU producing 1 fps. By the time a frame appears on your screen, it's already 1 second old. You get to see the frame that was generated 1 second ago.

Now, if you had 4 fps, the frame you see would only be 0.25 seconds old.

Linus had a video a while back comparing 60 Hz at 60 fps vs 60 Hz at 240 fps. The difference is MASSIVE.

https://youtu.be/OX31kZbAXsA

10

MrJTM t1_jad2pzq wrote

I think in your second example you would have horrendous screen tearing.

1

AetherialWomble t1_jad52zv wrote

With a 1 Hz screen, screen tearing would be the least of your problems :)

6