SifTheAbyss t1_jee3x3g wrote

Multiplying by a negative just turns around the direction of whatever you're doing.

Using the classic fruit example: adding gives you more apples, while adding "negative apples" takes apples away. Imagine you can draw a card that says "you have to hand over 1 apple".

Now, adding 1 of these cards really just removes 1 apple. It's like a negative apple.

What happens if we remove 1 of these cards? You end up with 1 more apple than you had before, so taking away a negative works out the same as adding a positive, and multiplying by a negative just repeats that flip in direction.
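A tiny sketch of that card bookkeeping (my own illustration with made-up numbers, not anything from the comment itself): each "hand over 1 apple" card counts as -1, and removing one leaves you exactly 1 apple better off.

```python
apples = 5
cards = [-1, -1, -1]              # three "hand over 1 apple" cards, each worth -1

# Adding a card (adding a negative) lowers what you effectively have:
print(apples + sum(cards))        # 2

# Removing a card (subtracting a negative) leaves you 1 more than before:
cards.remove(-1)
print(apples + sum(cards))        # 3
```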

1

SifTheAbyss t1_jaeyvp7 wrote

Input lag is reduced by at most 1 frame at the monitor's refresh rate.

Say you have a 60 Hz monitor and render at 120 fps: the completed images are ready 0.5 frames earlier (counting a 60 fps frame as the base). If you render at 300 fps, each image only takes 1/5 of a 60 fps frame to produce, so you gain the remaining 0.8 of a frame.

As you can guess, this usually isn't worth it, since in that last example the GPU does 5 times the work for only a marginal improvement.

If your monitor has FreeSync/G-Sync, finished frames are displayed as soon as they're ready, so there's no need to render above the monitor's refresh rate.
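Here's roughly that arithmetic written out (my own sketch with the same 60 Hz / 120 fps / 300 fps numbers; `latency_saving` is just an illustrative helper, not a real API).

```python
def latency_saving(monitor_hz: float, render_fps: float) -> float:
    """Rough upper bound on input lag saved, in monitor frames, by rendering
    faster than the monitor refreshes (capped at 1 full frame)."""
    monitor_frame = 1.0 / monitor_hz        # e.g. ~16.7 ms at 60 Hz
    render_frame = 1.0 / render_fps         # time to produce one image
    saved = (monitor_frame - render_frame) / monitor_frame
    return max(0.0, min(saved, 1.0))        # can never save more than 1 frame

print(latency_saving(60, 120))  # 0.5 -> half a frame earlier
print(latency_saving(60, 300))  # 0.8 -> 4/5 of a frame earlier
```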

2

SifTheAbyss t1_iyc5rk4 wrote

Because multiplication is shorthand for repeated addition.

2 × 3 is 3 + 3, 3 × 5 is 5 + 5 + 5, and so on.

So 1 + 2 × 3 gives you 1 + (3 + 3). That 1 is just a standalone number; it doesn't tell you anything about how many times you're supposed to add 3, so it can't be folded into the multiplication, which is why the multiplication happens first.
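A quick sketch of that idea (my own illustration; `times` is a made-up helper): multiplication done purely by repeated addition, with the leftover 1 staying outside the loop.

```python
def times(a: int, b: int) -> int:
    """Multiply a by b using only repeated addition."""
    total = 0
    for _ in range(a):
        total += b
    return total

print(times(2, 3))      # 6, i.e. 3 + 3
print(times(3, 5))      # 15, i.e. 5 + 5 + 5
print(1 + times(2, 3))  # 7 -> the 1 never enters the repeated addition
```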

−1