Having Lots of Pixels in a Sensor Is No Guarantee of High Image Quality
By Jeffrey Nielsen
Look at digital camera ads today, and you’d think the number of pixels was the only factor that mattered in choosing a camera. While the number of pixels is important, many other elements affect the quality of an image.
If you studied a digital camera’s image sensor under a powerful
microscope, you’d see a flat sheet made of several layers of
silicon, with row after row of holes in the top layer. Each hole is
a pixel, sometimes referred to as a “well.” That’s
not a bad way to think about them because the light energy that falls
into those holes is what’s captured when you take a digital
photo.
When you snap a digital
picture, the camera’s lens focuses the image onto all those
pixels on the image sensor. If a pixel is in the bright part of a
scene, a lot of photons land in the well and produce a relatively
big electrical charge. In a darker part of the image, fewer photons are captured in the pixel, resulting in a smaller electrical charge.
The size of the electrical charge in each pixel is the information
that goes into making a digital photograph.
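To make that concrete, here’s a minimal sketch in Python of a single pixel during one exposure. The numbers (how many photons arrive, and what fraction of them register) are illustrative assumptions, not measurements; the point is simply that a brighter patch of the scene leaves a bigger charge behind.

```python
QUANTUM_EFFICIENCY = 0.5   # assumed fraction of arriving photons that register as charge

def pixel_charge(photons_arriving: int) -> float:
    """Return the charge (in electrons) a pixel collects during one exposure."""
    return photons_arriving * QUANTUM_EFFICIENCY

bright_pixel = pixel_charge(photons_arriving=40_000)   # bright part of the scene
shadow_pixel = pixel_charge(photons_arriving=800)      # dark part of the scene
print(bright_pixel, shadow_pixel)   # the bigger number encodes the brighter tone
```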
One thing that’s obvious, then, is that the whole process starts
with the lens. If the image focused on the sensor is lower quality
than the pixels can capture, no amount of additional pixels will help.
As pixel counts go up, a better lens makes an even bigger difference.
The electrical charges produced by the pixels are analog signals,
but a microprocessor only deals with digital signals. In a CCD, after
the data is read off the chip, the information goes to an analog-to-digital converter. In a CMOS sensor, each pixel has its own amplifier, and the analog-to-digital conversion takes place right on the chip.
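In either case, the conversion step looks something like the sketch below: a continuous voltage is mapped onto one of a fixed number of digital levels. The 12-bit depth and one-volt full scale are assumptions for illustration, not figures from any particular camera.

```python
BITS = 12                   # assumed converter depth
FULL_SCALE_VOLTS = 1.0      # assumed voltage that corresponds to a full pixel

def digitize(voltage: float) -> int:
    """Map an analog voltage onto one of 2**BITS discrete digital levels."""
    levels = 2 ** BITS
    code = int(voltage / FULL_SCALE_VOLTS * (levels - 1))
    return max(0, min(levels - 1, code))   # clamp to the valid range

print(digitize(0.75))   # -> 3071, roughly three-quarters of full scale
```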
Increasing the number of pixels doesn’t simply increase the
information coming from the chip. First, more pixels on a chip that
stays the same size means each pixel is going to be smaller. Each
pixel covers a smaller surface area, giving it less of a chance of
catching photons of light during an exposure.
Think of buckets out in the rain. If just a few raindrops fall, the smaller each bucket is, the lower its chance of catching any water. The result is that the image
sensor is less sensitive in low-light situations and more susceptible
to noise (because the smaller amount of light results in a need to
boost the signal, which also magnifies noise).
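A back-of-the-envelope simulation shows why. Halving a pixel’s width quarters its area, so it catches roughly a quarter of the photons, and photon arrival is itself statistical (“shot noise”). The pixel sizes and light level below are assumed for illustration only.

```python
import math
import random

def captured_photons(pixel_width_um: float, photons_per_um2: float) -> float:
    """Expected photon count scales with pixel area; add shot noise on top."""
    mean = pixel_width_um ** 2 * photons_per_um2
    # For large counts, a Gaussian with standard deviation sqrt(mean)
    # is a close stand-in for the true Poisson arrival statistics.
    return max(0.0, random.gauss(mean, math.sqrt(mean)))

random.seed(1)
for width in (6.0, 3.0):                      # a big pixel vs. a small one, in microns
    signal = captured_photons(width, photons_per_um2=50.0)
    snr = math.sqrt(signal)                   # shot-noise-limited signal-to-noise ratio
    print(f"{width} um pixel: ~{signal:.0f} photons, SNR ~ {snr:.0f}")
```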
Smaller pixels reduce the range of brightest to darkest tones that
the sensor can detect, as well. Again, think of our buckets. If there’s
a very bright area in an image, comparable to a downpour on some of
the buckets, smaller buckets would more easily fill and overflow.
If a bucket, or pixel, is full, there’s no way to measure how
much more water (or light) lands on it. Full is full, and then it
stops measuring. Bigger pixels have more steps between empty and full,
allowing them to measure a wider tonal range from total black to pure
white.
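The same bucket picture can be put in numbers. In the sketch below, a pixel simply stops counting once it reaches its full-well capacity, and the ratio between that capacity and the noise floor sets how many stops of tonal range it can distinguish. The capacities and noise floor are illustrative assumptions.

```python
import math

def record(photocharge: float, full_well: float) -> float:
    """A pixel can't report more charge than it can hold."""
    return min(photocharge, full_well)

READ_NOISE = 10.0   # assumed noise floor, in electrons

for full_well in (40_000.0, 10_000.0):           # a big pixel vs. a small one
    highlight = record(60_000.0, full_well)       # a very bright highlight clips in both
    stops = math.log2(full_well / READ_NOISE)     # usable range, in photographic stops
    print(f"full well {full_well:.0f}e-: highlight reads {highlight:.0f}e-, "
          f"~{stops:.1f} stops of tonal range")
```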
Smaller pixels also mean the lens needs to be sharper to resolve details
onto each pixel. If the pixels get smaller, but the lens doesn’t
improve, then adjacent pixels could see the same thing. The result:
No real increase in image information from the increased pixel count.
Another way to design a camera is to keep the pixels the same size
and make the image sensor bigger. This way, the pixels keep their
sensitivity and dynamic range, but unfortunately, manufacturing costs
dramatically increase. In addition, the lens itself has to be physically
larger to project an image that covers the larger sensor and, in turn,
the whole camera has to be bigger.
Making sure light hits the image sensor at the proper angle also is
an issue in digital cameras. In a CCD or CMOS sensor, the light must
hit the pixels almost straight on to be captured properly in the light
well. Some camera manufacturers use an array of micro-lenses molded
over the image sensor, one lens for each pixel, to help aim the light
into the pixels. Lenses that collimate the light so the image hits
the sensor more directly also may help.
The final factor is color. A silicon chip only sees light intensity,
that is, a black-and-white image. To make a color image, most digital
camera sensors have an array of microscopic color filters over the
pixels. The filters are typically in a checkerboard pattern of red,
green and blue, with twice as many green pixels as red or blue; green is where the human eye sees the sharpest detail. The processor combines the readings of each set of red, green, green and blue pixels into the millions of colors we see in the final image.
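Here’s a toy version of that last step. One 2x2 tile of the checkerboard holds one red, two green and one blue reading; real cameras interpolate across many neighboring tiles, but this minimal sketch (with made-up sample readings) just averages the two greens within a single tile.

```python
# One 2x2 tile of the color-filter pattern; the sample values are made up.
tile = {"red": 212, "green1": 180, "green2": 176, "blue": 95}

def demosaic_tile(t: dict):
    """Combine the four filtered readings into a single full-color RGB value."""
    green = (t["green1"] + t["green2"]) // 2
    return (t["red"], green, t["blue"])

print(demosaic_tile(tile))   # -> (212, 178, 95)
```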
How well the software processes an image also affects the image quality
of your digital camera. Poor image processing can result in false
colors, moiré patterns in the highlights or edging around
high-contrast areas of an image. Image processing is another area
where simply adding more pixels won’t necessarily improve the
picture quality.
When designing a camera, a manufacturer has to consider these elements
in addition to the pixel count of the image sensor. Sensor type, lens
design, image processor and processing software, even how much heat
the camera electronics generate, all can affect picture quality. Manufacturers
must weigh each of these items against the price someone will pay
for the camera; hence, they keep looking for new ways to get a better image while keeping cameras affordable.
Fujifilm altered the arrangement of the pixels on its Super CCD chip
to get more pixels in the same area using a honeycomb pattern.
Nikon developed a new image sensor called the LBCAST, which has a
relatively fast readout and lower power consumption, like a CMOS,
but with a simpler circuit design, so it has less electronic noise,
like a CCD. Its first use is in the pro-level Nikon D2h, and it will
most likely show up in more cameras as the technology continues to
develop.
Chip manufacturer Foveon stacked color pixel layers on top of each
other. The design allows the red, green and blue pixels to be kept
in perfect alignment.
In its pro-level digital SLRs, Canon reduces image noise with a three-layer
filter placed between the lens and image sensor (an “optical
infrared-cut low-pass filter”).
Perhaps the ultimate in digital camera design is found in the panoramic cameras on the robotic rovers exploring Mars. These cameras have only a 1-megapixel chip, but NASA calls the images IMAX-quality. The image sensor is
big, about a half-inch square, resulting in pixels about four times
the size of those on a typical 5-megapixel consumer camera. Also,
there are no color filters on the chip itself. To get a color image,
three separate exposures are taken through red, green and blue filters.
Plus, the lenses are custom-designed and ground to the tolerance of
an astronomical telescope.
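The size comparison checks out if you run the arithmetic, under stated assumptions: treat the rover chip as half an inch (12.7 mm) across with 1,024 x 1,024 pixels, and assume a typical 5-megapixel consumer sensor of the era measured about 7.2 mm across its 2,592-pixel width. Both sets of dimensions are assumptions for illustration, not figures from NASA or the camera makers.

```python
def pixel_pitch_um(width_mm: float, pixels_across: int) -> float:
    """Width of a single pixel, in microns."""
    return width_mm * 1000.0 / pixels_across

rover = pixel_pitch_um(12.7, 1024)      # assumed half-inch chip, 1,024 pixels across
consumer = pixel_pitch_um(7.2, 2592)    # assumed typical 5-megapixel consumer sensor
print(f"rover pixel ~{rover:.1f} um, consumer pixel ~{consumer:.1f} um, "
      f"ratio ~{rover / consumer:.1f}x")   # roughly the "four times" in the text
```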
As with any camera, image quality is a combination of the image sensor
working with the lens and the image processing scheme. You’ve
got to love the image quality, but it comes at a price only NASA
can pay.