The metric most often used by camera manufacturers and marketers to tout their products has been pixel count. That’s a shame, but it was probably inevitable—it’s easy to measure, and consumers are used to the idea that more is better. However, the number of pixels is a measure of quantity, not quality.
To be sure, in the beginning—the 1990s—there was a great need for more pixels. But by 2000, pixel counts plateaued at 3.3 megapixels. At that number, the sensor was relatively cheap to produce, and the resulting images had just enough resolution for decent 8½- by 11-inch color enlargements from inexpensive inkjet photo printers. For a while, manufacturers competed not on pixel count but on new features, such as geotagging, optical image stabilization, extended zoom ratios, and video. Then pixel escalation resumed with a vengeance. Nowadays, 12-, 14-, and 16-megapixel point-and-shoot digitals are the rule rather than the exception, and cellphones routinely offer 5-, 8-, and even 12-megapixel resolution.
What manufacturers didn’t bother explaining was what a pixel is—and why we should care how many we have. Essentially, at the heart of every digital camera is an image sensor. The lens focuses photons reflected by the scene being photographed onto that image sensor. Etched into the image sensor’s silicon are pixels (short for “picture elements”)—technically, photoreceptor or photodiode sites. Each pixel converts the photons striking it into electrons; the accumulated charge is then read out and interpreted as information about light and color. As in the Postimpressionist style of pointillism, which used thousands of paint dots to create a work of art, the signals from the pixels are processed into a recognizable image. The more pixels, the more information collected and the larger the photo.
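As a loose sketch of that last step—a toy illustration only, with an invented 4-by-4 sensor and arbitrary electron counts, not any real camera's firmware—the per-pixel charge readings can be thought of as an array of samples that processing maps to brightness values, one "paint dot" per pixel:

```python
import numpy as np

# Hypothetical 4x4 sensor readout: electron counts at each pixel site.
# A real sensor reads out millions of such values.
electrons = np.random.default_rng(0).integers(0, 5000, size=(4, 4))

# Firmware-style step: map accumulated charge to 8-bit brightness
# levels (0-255). Each value becomes one "paint dot" of the image.
image = (electrons / electrons.max() * 255).astype(np.uint8)
```

More pixels simply means a larger array of these dots—more information, and a bigger picture.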
But bigger pictures don’t necessarily mean better pictures. In fact, pixel count alone cannot ensure a quality image. If it were otherwise, then a US $139 16-megapixel Nikon Coolpix S3300 point-and-shoot camera would produce pictures as good as those from a $5995 professional Nikon D4 digital single lens reflex (DSLR) with the same number of megapixels. Or, to take the comparison to an even greater extreme, the recently unveiled (and still unpriced) 41-megapixel Nokia 808 PureView smartphone (yes, Virginia, there is a 41-megapixel phone camera) would have image quality similar to that of a $17 500 40-megapixel Phase One 645DF camera. Some obvious differences account for the higher prices of the Nikon D4 and the Phase One 645DF: much faster performance, higher quality construction, more durable body, superior ergonomics, more precise settings and adjustments, interchangeable lenses, and so on. But those factors don’t explain the most important advantage: vastly superior image quality.
If the number of pixels doesn’t directly relate to image quality, what does? Actually, there are several factors that define and determine a digital camera’s image quality: the physical size of the pixels and the image sensor, the filters (and usually the microlenses) bonded to the image sensor, the firmware that processes pixel data, and the camera lens.
But it all centers on the individual pixels.
Pixels on an image sensor are analogous to a bunch of red, green, and blue paint buckets placed side by side. (Red, green, and blue light combine to create all the colors in an image.) The bigger the buckets, the more paint (electrons) they can capture.
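That additive mixing can be shown in a few lines. The 8-bit channel values here are just a common convention for illustration, not anything specific to a particular sensor:

```python
# Additive color: full-intensity red, green, and blue light
# combine to produce white.
red = (255, 0, 0)
green = (0, 255, 0)
blue = (0, 0, 255)

# Sum the three light sources channel by channel, clamping to the
# 8-bit maximum.
mixed = tuple(min(r + g + b, 255) for r, g, b in zip(red, green, blue))
# (255, 255, 255) is white
```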
Here’s where it gets a little tricky, so it’s best to explain by another analogy. Suppose you need to estimate how much rain falls onto a farm, and you have only a minute’s worth of rainfall to do your measurements. Imagine that you spread 100 empty soup cans around the property, capped by funnels that are 10 centimeters in diameter. You might collect only a few hundred drops in each. Now suppose you could double the diameter of each funnel. The amount each can collects quadruples, because collection area grows with the square of the diameter. Extrapolating how much rain falls on the field from the water collected with those larger funnels will yield vastly more accurate results. Here’s why: If the raindrops are really photons, the signal grows in proportion to the number of photons collected, while the noise grows only as the square root of that number. Thus the more rain each soup can collects, the higher the signal-to-noise ratio.
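The square-root relationship can be sketched directly. The photon counts below are arbitrary illustrative numbers, assuming photon shot noise is the dominant noise source:

```python
import math

def shot_noise_snr(photons):
    """Signal-to-noise ratio when photon shot noise dominates.

    The signal grows linearly with the photon count N, while the
    noise grows as sqrt(N), so SNR = N / sqrt(N) = sqrt(N).
    """
    return photons / math.sqrt(photons)

# A pixel (soup can) that collects 400 photons...
small = shot_noise_snr(400)    # sqrt(400) = 20
# ...versus one with double the diameter: 4x the area, hence 4x
# the photons, but only a 2x improvement in SNR.
large = shot_noise_snr(1600)   # sqrt(1600) = 40
```

This is why quadrupling the light gathered only doubles the signal-to-noise ratio: the gain in signal is partly offset by growing noise.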