The scanner breaks the original down into tiny dots
and measures their brightness. However, even color
scanners are, paradoxically, color-blind! In order
to still be able to generate colorful scans, scanners
use a trick: by inserting color filters in the
primary colors red, green and blue, the respective
color components are measured separately. If you
superimpose these three color separations, which
the scanning software naturally does automatically,
you get a color image whose elements are defined
by mixtures of these three primary colors. However,
each one of these individual dots has a homogeneous
color, meaning that the color doesn't vary within
one dot. These virtual "mosaic tiles" (you could
define them as the atoms of the digital image)
are referred to as dots or pixels (a contraction of "picture elements"). The exciting thing about
digital imaging in general is the fact that you
no longer deal with the image as a whole, but rather
with its individual components. The software enables
the image editor to work with each picture element
individually, like changing its color, brightness
or position, or copying its characteristics to
adjacent picture elements.
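To make this concrete, here is a minimal sketch in Python using the Pillow library (an assumption on my part; any image editor exposes comparable operations, and "scan.jpg" is a hypothetical file name) that reads and changes the color of a single pixel:

    from PIL import Image  # Pillow imaging library, assumed to be installed

    img = Image.open("scan.jpg").convert("RGB")  # hypothetical scanned image
    r, g, b = img.getpixel((10, 10))             # read one pixel's color values
    img.putpixel((10, 10), (255, 0, 0))          # change that single pixel to pure red
    img.save("scan_edited.jpg")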
The term resolution is closely related to the
subject of breaking down a paper print or slide
into pixels. Resolution is the magic word these
days and crops up in every kind of communication
on the subject of digital images, scanners or printers.

The abbreviation dpi stands for "dots per inch". It actually
refers to the dots that a printer can print on
paper. In contrast, ppi is the abbreviation for
"pixel (or points) per inch". The following
paragraphs will clearly explain why printer dots
are not the same as image pixels. Although ppi
is actually the more correct term for the resolution
of a scanner - a scanner does, after all, divide
an image into pixels and not printer dots - the
abbreviation dpi has also become an established
term to refer to resolution. For example, you
will come across dpi in scanning software everywhere
where you have to set the scanning resolution,
but also in advertising, the trade press and the
various image editing programs. In order to avoid
confusion, this chapter also uses the term dpi. For example, if a scanner has a resolution of 600
dpi, this means nothing more than that its scan
array can divide one inch of the original into
600 elements. To do so, the scanner has a corresponding
number of light-sensitive receptors, each of which
measures the brightness of a tiny section of the
original. Consequently, a 600 dpi scanner is capable
of dividing 2.54 centimeters of the original (1
inch = 2.54 centimeters) into a maximum of 600 pixels. We speak of a maximum because you naturally don't always
have to scan at "full power". Therefore,
the scan resolution can be reduced as needed using
the software, so that one inch is only divided
into 300, 200 or 72 dpi, for example.
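As a small sketch of this arithmetic (Python; the function name is my own):

    def pixels_for(length_cm, dpi):
        # One inch is 2.54 cm: convert the length to inches, then
        # multiply by the resolution to get the number of pixels.
        return int(length_cm / 2.54 * dpi)

    pixels_for(2.54, 600)  # 600 pixels: one inch at full optical resolution
    pixels_for(2.54, 300)  # 300 pixels: the same inch at reduced resolution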
Optical and interpolated resolution
A distinction must be made between two terms when
referring to the resolution of a scanner. The
optical (or physical) resolution refers to the
actual number of photocells per unit length. In
contrast, the interpolated (or calculated, or
maximum) resolution is the result of a kind of
software trick. Based on the optical resolution,
the scanning software generates additional pixels,
calculated by finding the mean, and inserts them
in between the pixels actually scanned. Used in moderation, this can increase the amount of image data. However, it must be kept in mind that interpolation does not add any genuinely new image information. In other words, an optical resolution of, say, 600 or 1200 dpi generates better results than the same resolution achieved by interpolation.
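A minimal sketch of the idea behind such mean-value interpolation, here for a single row of brightness values (illustrative only, not any scanner's actual algorithm):

    def interpolate_row(row):
        # Insert the average of each neighboring pair between the measured pixels.
        out = []
        for a, b in zip(row, row[1:]):
            out.append(a)
            out.append((a + b) // 2)  # the "invented" pixel is just a mean value
        out.append(row[-1])
        return out

    interpolate_row([10, 20, 30])  # -> [10, 15, 20, 25, 30]: more data, no new detail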
The scan resolution thus defines how many pixels
one inch of the original is divided into. As will
be illustrated later on, this has an effect on
the file size and also on the dimensions of the output image.
Not all dots are created equal
The resolution of a scanner, but also that of
a printer, is often given in dpi, and this is
where things start to get a little complicated.
The main culprit behind the confusion on the subject
of resolution is the fact that the term resolution
and the abbreviation dpi mean something completely
different for a scanner (and therefore the correct
unit should be ppi instead of dpi) than they do
for a printer, and that not all dots are created equal. The reason is the different way they handle color.
A scanner with 600 dpi generates 600 pixels per
inch, each of which can have one of 16.8 million colors. A printer (ink-jet, dot-matrix, laser) with a
resolution of 600 dpi prints a maximum of 600
dots (individual drops of ink or toner dots) on
one inch of paper, each of which can be either:
- one of two colors (black-and-white printer: black or the color of the paper),
- one of four colors (printers with three inks), or
- one of five colors (printers with four inks),
but never one of 16.8 million colors.
This already clearly illustrates that a scanner "dot"
(which, as described above, should correctly be
referred to as a pixel) in no way corresponds
to a printer "dot". In order to avoid
insulting our eyes with poster-like two-tone photos,
the printer must therefore simulate each individual
pixel of the scanned photo (i.e. scanner dpi)
by juxtaposing and overlapping numerous printer
dots (i.e. printer dpi) in the primary colors.
Due to the fact that the printer "consumes"
numerous printer dots in order to render a single
pixel, its effective print resolution is substantially
lower than the specified dpi value.
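As a rough illustration (the 4 x 4 halftone cell is an assumption; real printers use more sophisticated dithering patterns), consider a printer that simulates tonal values by grouping its dots into square cells:

    printer_dpi = 1440
    cell_size = 4                            # hypothetical halftone cell edge
    levels = cell_size * cell_size + 1       # 17 tonal levels per cell (0..16 dots)
    effective_dpi = printer_dpi / cell_size  # only 360 image pixels per inch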
It is well known, however, that there are also
printers which produce better photo quality with
just 300 dpi than high-resolution ink-jet printers
with 720 or even 1440 dpi. The reason is simple:
thermosublimation printers are capable of printing
one of 16.8 million colors for each individual
dot of the digital image, which is why they are
also called "continuous tone" printers.
Thus, a pixel really does correspond to a printer
dot in this case.
The following must be kept in mind when scanning:
If the image is to be output on an ink-jet or laser
printer, it doesn't help much to set the scan
or image resolution to the value of the printer
resolution. If you really wanted to do so, even
a print size of 9 x 13 centimeters on a 1440 dpi
printer would require a data volume of 107 megabytes!
That would be going a little bit too far because,
as described, the printer is incapable of producing
1440 pixels per inch. The effective printer resolution
varies from model to model and can only be determined
more precisely by experiment. However, you'll
usually be on the safe side with a scan or image
resolution of 300 dpi. Less could impair the print
quality and more would not improve it.
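The 107 megabyte figure quoted above can be checked quickly (Python, truncating the pixel counts):

    # 9 x 13 cm at 1440 dpi, with 3 bytes of RGB color data per pixel:
    w = int(13 / 2.54 * 1440)          # 7370 pixels
    h = int(9 / 2.54 * 1440)           # 5102 pixels
    size_mb = w * h * 3 / 1024 / 1024  # about 107.6 megabytes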
In contrast, things are easier if the image is to be sent to a continuous tone printer (e.g. a thermosublimation or dye-donor printer) or produced as a real photo
by a digital-photochemical process: set the image
resolution to the same value as the printer resolution,
which is usually around 200, 300 or 400 dpi, depending
on the model.
How much is too much?
It is generally true that the higher the resolution
of a scanner, the more details it can read off
the original. However, there are natural restrictions
in this case, too. For example, it makes little
sense to scan a tiny original, like a passport
photo, at a huge resolution, like 5000 dpi. The
amount of information the scanner is theoretically
capable of recording just doesn't exist on the
small original. Therefore, scanners with a resolution
of 600 or 1200 dpi are almost always adequate
for everyday use.
Sample depth and color resolution
A pixel is not just a dot at a specific location
on the digital image, but rather also has a specific
color (as mentioned, usually one color from a range
of 16.8 million possibilities). It is not that nature contains exactly 16.8 million colors and that the computer pioneers defined this value for that reason; the number simply follows from the way color is stored.
The color of each dot in the digital image can be
defined by indicating its red, green and blue
value. One byte of memory is available for each
primary color. In turn, one byte corresponds to
eight bits and a bit can be equal to either 0
or 1. This storage capacity enables the definition
of exactly 256 different shades: from 00000000 via 00000001, 00000010 and so on, up to 11111111.
In other words, one byte (eight bits) per primary color can describe 256 shades of that color.
And because a pixel consists of three primary
color values, as explained, this description language can describe 256 x 256 x 256 = 16,777,216, or about 16.8 million, colors. This gamut of colors is logically referred to as "true color". The corresponding devices, such as
scanners, digital cameras or graphics cards, have
a sample depth of 3 x 8 bits = 24 bits.
If all the values for red, green and blue
are set to zero, they result in black, while
255, 255, 255 produces white. If the three
color proportions are equal (such as 150,
150, 150), they define a neutral gray.
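In code, such color definitions are simply triples of byte values (a short Python illustration):

    black = (0, 0, 0)
    white = (255, 255, 255)
    gray  = (150, 150, 150)               # equal proportions -> neutral gray

    shades_per_channel = 2 ** 8           # one byte = 8 bits = 256 shades
    true_color = shades_per_channel ** 3  # 16,777,216 possible colors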
Those of you who've held out up to this point now have
an easy opportunity to discover another mystery
behind the digital image: the relationship between
the image dimensions, resolution and file size.
The best way to illustrate this relationship is
by an example: Scanning a 10 x 15 cm photo at
a resolution of 300 dpi results in (reminder:
1 inch = 2.54 centimeters):
(15 centimeters / 2.54 cm) x 300 = 1771 pixels horizontally
(10 centimeters / 2.54 cm) x 300 = 1181 pixels vertically
Thus, the entire scan consists of 1771 x 1181 = about 2.1 million pixels. Due
to the fact that, as described above, each pixel
requires 3 bytes of memory for the color data,
the file consists of roughly 6.3 million bytes.
If you divide this value by 1024, you get the size in kilobytes (about 6130), and if you divide by 1024 again, you get the customary file size in megabytes (about 6).
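The whole calculation fits into a few lines (Python; the function name is my own, and the pixel counts are truncated exactly as in the example above):

    def scan_size(width_cm, height_cm, dpi, bytes_per_pixel=3):
        # Pixels in each direction, then 3 bytes of RGB color data per pixel.
        w = int(width_cm / 2.54 * dpi)
        h = int(height_cm / 2.54 * dpi)
        return w, h, w * h * bytes_per_pixel / 1024 / 1024  # width, height, MB

    scan_size(15, 10, 300)  # -> (1771, 1181, ~5.98): the 10 x 15 cm photo above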
Now, if the monitor, the image editing software
and the photo-quality printer can work with a
palette of 24 bits or 16.8 million colors at best,
why do high-quality scanners boast about sample
depths of 30 or even 36 bits? The reason is simple:
the greater the sample depth, the better the scanner
can distinguish between fine graduations in the
shadows and highlights. It has one billion shades
at its disposal at 30 bits, and even 68 billion
at 36 bits. Although this richness of color is
ultimately reduced back down to the common 24
bits for further processing, the critical areas
are still rendered with much greater differentiation
due to the more detailed input.
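A sketch of what this reduction can look like (the sample value is invented; one simple approach is to drop the extra low-order bits of each channel):

    # A 36-bit scanner samples 12 bits, i.e. 4096 shades, per primary color.
    raw_red = 3251                # hypothetical 12-bit sample, range 0..4095
    eight_bit_red = raw_red >> 4  # drop 4 bits -> 203, one of 256 shades

    2 ** 30  # 1,073,741,824 shades at 30 bits ("one billion")
    2 ** 36  # 68,719,476,736 shades at 36 bits ("68 billion")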
The value of the dynamic or tonal range also says
something about the quality of high-end professional
scanners, in particular. On film material, the
tonal values are usually measured by the optical
density. They theoretically range from 0D (clear,
transparent film) to 4D (black, fully exposed
film), but are usually around 0.2 to 3.8 in practice.
When it comes to scanners, the dynamic range describes which portion of this density spectrum from 0 to 4 the device can actually distinguish; roughly speaking, it is the difference between the highest and lowest densities the scanner can still tell apart. It can generally be said that the greater
the dynamic range, the better. Unfortunately,
this usually also means considerably higher prices.
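For the curious, the formula behind these density values is straightforward: optical density is the base-10 logarithm of the ratio of incident to transmitted light, and the dynamic range is the difference between the highest and lowest densities a device can distinguish. A brief Python sketch:

    import math

    def density(transmitted_fraction):
        # D = log10(incident light / transmitted light)
        return math.log10(1 / transmitted_fraction)

    density(1.0)            # 0.0 D: perfectly clear film
    density(0.0001)         # 4.0 D: only 1/10000 of the light passes through
    film_range = 3.8 - 0.2  # typical film in practice: a range of 3.6 D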
Some hard facts about scanning
You can use a scanner to make a blurry photo sharp?
False, because the GIGO rule applies to scanning
too, and this rule says: Garbage In, Garbage Out.
Although scanning and image editing software offers
various sharpening filters, they primarily serve
to reduce the slight blur caused by scanning.
No creative tricks in the world can turn a really
blurry original into a sharply focused picture.
The lesson to be learned is that the optimization
functions of a scanner are no substitute for good
photography. Even the best scanner can't recreate
image details which can no longer be made out
in the original, like highly overexposed areas.

The higher the resolution, the better? False, because there is a right resolution for every application. A low resolution is not suitable for printing a poster, while a high resolution would not only be superfluous for displaying an image on an Internet page, but even problematic: the image would require long transmission times and would not fit completely on the screen.

You need a very high resolution to scan color photos, while a low resolution is adequate for scanning black-and-white line art? False, on the contrary! You need a high scan resolution in order to reproduce line art, graphics or text without staircasing (jaggies). In contrast, photos usually have varied content without straight lines and edges, so a low resolution is not as noticeable.

If I want to print a picture on my 720 dpi printer in optimum quality, I also have to scan it at 720 dpi? False, because the 720 dpi printer can't really print 720 pixels per inch - see Not all dots are created equal!

Scan first and take care of optimization later? False, because optimization steps taken during the scanning process, such as contrast correction, sharpening, descreening or sizing, usually produce better quality results, since they can refer to a greater amount of data (such as the 36-bit raw data) than a standard image editing program.