czbz t1_j6pbd0x wrote

Afaik digital cameras don't generally use subpixels. Each pixel, roughly speaking, can only detect one of red, green, or blue light, because it's covered by a filter that blocks the other colours. So if a pixel only detects the red part, how can we see whether or not there was a green thing there when we look at that pixel in the image? A computer has to guess the colour at that precise spot using information from the neighbouring pixels.

That guessing is called 'debayering' (or demosaicing). It means the image effectively captures black-and-white (brightness) texture at a much higher resolution than variations in colour. Generally that fits well enough with what we want to look at and how we see things.
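Here's a toy sketch of that guessing, filling in the green channel by bilinear interpolation (assuming an RGGB Bayer layout; the function name and the `mosaic` array are made up for illustration):

    import numpy as np

    def demosaic_green(mosaic):
        # Toy bilinear demosaic of the green channel.
        # Assumes an RGGB layout: green filters sit where
        # (row + col) is odd. At red/blue sites, green is
        # guessed as the mean of the 4 green neighbours
        # (image edges skipped for brevity).
        h, w = mosaic.shape
        green = mosaic.astype(float)
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                if (y + x) % 2 == 0:  # a red or blue site
                    green[y, x] = (mosaic[y - 1, x] + mosaic[y + 1, x] +
                                   mosaic[y, x - 1] + mosaic[y, x + 1]) / 4.0
        return green

Real cameras use considerably smarter interpolation than this simple averaging, which is why the guessing is rarely visible.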

Our eyes are more sensitive to green light than to anything else, so camera sensors are made to match: half the pixels are sensitive to green, a quarter to red, and a quarter to blue.
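To picture that layout, here's a minimal sketch that builds the standard RGGB Bayer pattern and counts the filter colours (the function name is made up; only the RGGB arrangement is from the comment above):

    import numpy as np

    def bayer_pattern(h, w):
        # Label each sensor pixel with its filter colour,
        # assuming the common RGGB arrangement.
        p = np.empty((h, w), dtype='<U1')
        p[0::2, 0::2] = 'R'  # red on even rows, even columns
        p[0::2, 1::2] = 'G'
        p[1::2, 0::2] = 'G'
        p[1::2, 1::2] = 'B'  # blue on odd rows, odd columns
        return p

    p = bayer_pattern(4, 4)
    print(p)
    for c in 'RGB':
        print(c, (p == c).mean())  # G -> 0.5, R and B -> 0.25 each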
