
Neither Digital Nor Color

Fun fact of the day: your new digital camera is neither digital nor color. Let me explain.

The image sensor of your digital camera is a device called a CMOS sensor. (CMOS actually just refers to how the semiconductor is constructed, so there are many kinds of Complementary Metal Oxide Semiconductor devices.) The sensor is made up of a rectangular grid of light-sensing elements called photosites.

A photosite is something that measures photons (fundamental units of light) received in a given period of time. This process, however, is purely analog. The more quanta that hit the photosite, the more its electrical value changes. These values are siphoned out of the sensor and then sent to an Analog-to-Digital Converter. So the sensor itself is just an analog device.

(As an aside, if you were to save the output of the Analog-to-Digital Converter to a file, you would have a camera RAW file.)
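
To make that analog-to-digital step a little more concrete, here is a minimal Python sketch. The 12-bit depth and the full-well capacity are illustrative assumptions, not the specs of any particular camera.

```python
# Minimal sketch: quantizing an analog photosite reading to a digital value.
# The 12-bit depth and "full well" capacity are illustrative assumptions,
# not the spec of any particular sensor.

FULL_WELL_ELECTRONS = 30_000   # assumed charge capacity of one photosite
ADC_BITS = 12                  # assumed ADC bit depth
MAX_DIGITAL_VALUE = (1 << ADC_BITS) - 1  # 4095 for a 12-bit ADC

def quantize(photosite_charge: float) -> int:
    """Map an analog charge level (0..FULL_WELL_ELECTRONS) to a 12-bit number."""
    fraction = min(max(photosite_charge / FULL_WELL_ELECTRONS, 0.0), 1.0)
    return round(fraction * MAX_DIGITAL_VALUE)

# Three photosites that collected different amounts of light:
for charge in (1_200.0, 15_000.0, 29_500.0):
    print(charge, "->", quantize(charge))
```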

The color part takes a little bit more to explain. A photon is a quantum: a fundamental physical particle. String theory aside, the pertinent fact here is that photons do not contain color. It is the vibration pattern of the light (its wavelength) that gives us color, not the photons themselves.

Think of it like this. Photons are like golf balls. We’re just counting how many golf balls we capture in a basket in any given time. The actual wiggles along the path that the balls took to reach us are what we perceive as color.

So, if the photosites are only counting photons, then we’re not capturing color information. To capture color information, we need to employ something called a Bayer filter. A Bayer filter is a mosaic of red, green, and blue colored squares. Typically, they alternate in a regular pattern (except for some Fujifilm cameras) and overlay the photosites. Each photosite has one color filter over it. So, some photosites are red, some are green, and some are blue.

The sensor is typically arranged something like this:

Normal Bayer Filter

If you notice, the repeating block is a 2×2 square of four photosites, and 4 isn’t evenly divisible by 3. So there are typically twice as many green photosites as there are red or blue. This is deliberate: evolution has gifted us with vision that is more sensitive to green than to other colors of the spectrum. By loading up on green photosites, you produce an image that is perceived as sharper.
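
Here is a rough Python sketch of that color assignment, assuming the common RGGB ordering (other layouts such as GRBG exist, and Fujifilm’s pattern is different again). Printing a small patch makes the two-greens-per-block ratio easy to see.

```python
# Sketch of a standard RGGB Bayer layout (other orderings such as GRBG exist).
# Each photosite gets exactly one color filter; green appears twice per 2x2 block.

def bayer_color(row: int, col: int) -> str:
    """Return the filter color over the photosite at (row, col) for an RGGB mosaic."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# Print a small 4x4 patch of the mosaic:
for r in range(4):
    print(" ".join(bayer_color(r, c) for c in range(4)))
# R G R G
# G B G B
# R G R G
# G B G B
```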

Fujifilm X-Trans Sensor

Part of the job of a RAW processor (Capture One, Lightroom, etc.) is to convert photosite values into pixels. If we just looked at a RAW image, it would be black and white with a mosaic pattern to it (because of the Bayer filter). The software interpolates neighboring photosite values into color pixels.
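
As a toy illustration of that interpolation, here is a sketch that averages the neighboring photosites of each color, again assuming an RGGB mosaic. Real RAW processors use far more sophisticated demosaicing algorithms; this just shows the idea.

```python
import numpy as np

# Toy demosaic: estimate full (R, G, B) at one pixel by averaging the
# same-colored photosites in its 3x3 neighborhood. Assumes an RGGB mosaic.

def bayer_color(row: int, col: int) -> str:
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

def demosaic_pixel(raw: np.ndarray, r: int, c: int) -> tuple[float, float, float]:
    """Estimate (R, G, B) at one pixel by averaging same-colored neighbors."""
    sums = {"R": 0.0, "G": 0.0, "B": 0.0}
    counts = {"R": 0, "G": 0, "B": 0}
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr < raw.shape[0] and 0 <= cc < raw.shape[1]:
                color = bayer_color(rr, cc)
                sums[color] += raw[rr, cc]
                counts[color] += 1
    return tuple(sums[k] / counts[k] for k in ("R", "G", "B"))

raw = np.random.randint(0, 4096, size=(6, 6))  # fake 12-bit photosite values
print(demosaic_pixel(raw, 2, 3))
```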

This is why a rendered TIFF file is larger than a RAW file at the same resolution. Each photosite in a RAW photo typically has 12 bits of information. A final TIFF file normally has 24 or 48 bits of data per pixel (8+8+8 or 16+16+16 bits across the red, green, and blue channels). It is this interpolation step that produces a usable image but also balloons the file size.
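
A back-of-the-envelope comparison, assuming a 24-megapixel sensor, 12 bits per photosite, and no compression on either side:

```python
# Back-of-the-envelope size comparison, assuming a 24-megapixel sensor,
# 12 bits per photosite, and no compression on either side.

pixels = 24_000_000

raw_bits = pixels * 12          # one 12-bit value per photosite
tiff8_bits = pixels * 3 * 8     # 8 bits per channel, 3 channels = 24 bits/pixel
tiff16_bits = pixels * 3 * 16   # 16 bits per channel = 48 bits/pixel

for label, bits in [("RAW (12-bit)", raw_bits),
                    ("TIFF (8-bit/channel)", tiff8_bits),
                    ("TIFF (16-bit/channel)", tiff16_bits)]:
    print(f"{label}: ~{bits / 8 / 1_000_000:.0f} MB")
# RAW (12-bit): ~36 MB
# TIFF (8-bit/channel): ~72 MB
# TIFF (16-bit/channel): ~144 MB
```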

A RAW processor does a lot more, but that is the subject of an upcoming post.
