
How cellphones work

by Chris Woodford. Last updated: July 19, 2020.

Walking and talking, working on the train, always in contact, never

out of touch—cellphones have dramatically changed the way we live and


work. No one knows exactly how many little plastic handsets there are in the
world, but the current estimate is that there are over 8.3 billion subscriptions.
That's more than the planet's population! In developing countries, where
large-scale land line networks (ordinary telephones wired to the wall) are few
and far between, over 93 percent of the phones in use are
cellphones. [1] Cellphones (also known as cellular phones and, chiefly in
Europe, as mobile phones or mobiles) are radio telephones that route their
calls through a network of masts linked to the main public telephone network.
Here's how they work.
Photo: Most people now use smartphones as their cellphones, which are actually small computers with
cellphone circuitry built in. Back in the 1990s, cellphones were simpler and could only be used for making voice
calls. Now that networks are faster and capable of handling greater volumes of traffic, smartphones are used as
portable communication centers, capable of doing all the things you can do with a telephone, digital camera,
MP3 player, GPS "sat nav," and laptop computer.

Contents

1. Cellphones use wireless technology


2. How cellphone calls travel
3. How cellphone masts help
4. What cells do
5. Types of cellphones
6. The world of cellphones
7. Cellphones and mobile broadband
8. What's inside your phone?
9. Who invented cellphones?
10. Find out more
11. References

Cellphones use wireless technology

Photo: Cellphones as they used to be. This Nokia dates from the early 2000s and has a slide-out keypad.
Although it has a camera and a few other basic functions, it doesn't have anything like the computing power of
a modern smartphone. Phones like this are sometimes called "handhelds" or "feature phones" to distinguish
them from iPhones and other smartphones.

Although they do the same job, land lines and cellphones work in a completely
different way. Land lines carry calls along electrical cables. Cut out all
the satellites, fiber-optic cables, switching offices, and other razzmatazz, and
land lines are not that much different to the toy phones you might have made
out of a piece of string and a couple of baked bean cans. The words you
speak ultimately travel down a direct, wired connection between two
handsets. What's different about a cellphone is that it can send and receive
calls without wire connections of any kind. How does it do this? By
using electromagnetic radio waves to send and receive the sounds that would
normally travel down wires.

Whether you're sitting at home, walking down the street, driving a car, or
riding in a train, you're bathing in a sea of electromagnetic
waves. TV and radio programs, signals from radio-controlled cars, cordless
phone calls, and even wireless doorbells—all these things work using
electromagnetic energy: undulating patterns of electricity and magnetism that
zip and zap invisibly through space at the speed of light (300,000 km or
186,000 miles per second). Cellphone networks are by far the fastest growing
source of electromagnetic energy in the world around us.


How cellphone calls travel


When you speak into a cellphone, a tiny microphone in the handset converts
the up-and-down sounds of your voice into a corresponding up-and-down
pattern of electrical signals. A microchip inside the phone turns these signals
into strings of numbers. The numbers are packed up into a radio wave and
beamed out from the phone's antenna (in some countries, the antenna is
called an aerial). The radio wave races through the air at the speed of light
until it reaches the nearest cellphone mast.

Photo: Engineers repair a cellphone mast. Photo by Brien Aho courtesy of US Navy.

The mast receives the signals and passes them on to its base station, which
effectively coordinates what happens inside each local part of the cellphone
network, which is called a cell. From the base station, the calls are routed
onward to their destination. Calls made from a cellphone to another cellphone
on the same network travel to their destination by being routed to the base
station nearest to the destination phone, and finally to that phone itself. Calls
made to a cellphone on a different network or a land line follow a more lengthy
path. They may have to be routed into the main telephone network before
they can reach their ultimate destination.
How cellphone masts help
At first glance, cellphones seem a lot like two-way radios and walkie talkies,
where each person has a radio (containing both a sender and a receiver) that
bounces messages back and forth directly, like tennis players returning a ball.
The problem with radios like this is that you can only use so many of them in a
certain area before the signals from one pair of callers start interfering with
those from other pairs of callers. That's why cellphones are much more
sophisticated—and work in a completely different way.

A cellphone handset contains a radio transmitter, for sending radio signals


onward from the phone, and a radio receiver, for receiving incoming signals
from other phones. The radio transmitter and receiver are not very high-
powered, which means cellphones cannot send signals very far. That's not a
flaw—it's a deliberate feature of their design! All a cellphone has to do is
communicate with its local mast and base station; what the base station has
to do is pick up faint signals from many cellphones and route them onward to
their destination, which is why the masts are huge, high-powered antennas
(often mounted on a hill or tall building). If we didn't have masts, we'd need
cellphones with enormous antennas and giant power supplies—and they'd be
too cumbersome to be mobile. A cellphone automatically communicates with
the nearest cell (the one with the strongest signal) and uses as little power to
do so as it possibly can (which makes its battery last as long as possible and
reduces the likelihood of it interfering with other phones nearby).
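If you're curious what "talk to the nearest cell with the strongest signal" looks like in practice, here's a tiny Python sketch. The mast names and signal readings (in dBm, where a less negative number means a stronger signal) are invented for illustration; real handsets follow standardized cell-selection procedures that are far more involved:

```python
# Toy sketch of cell selection: the handset listens to the masts it can hear
# and camps on the one with the strongest received signal.
# Mast names and dBm readings are invented for illustration.

def pick_strongest_mast(signal_strengths):
    """Return the mast whose received signal is strongest (least negative dBm)."""
    return max(signal_strengths, key=signal_strengths.get)

nearby = {"mast_A": -85, "mast_B": -70, "mast_C": -95}  # received power in dBm
print(pick_strongest_mast(nearby))  # -> mast_B
```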

What cells do
So why bother with cells? Why don't cellphones simply talk to one another
directly? Suppose several people in your area all want to use their cellphones
at the same time. If their phones all send and receive calls in the same way,
using the same kind of radio waves, the signals would interfere and scramble
together and it would be impossible to tell one call from another. One way to
get around this is to use different radio waves for different calls. If each phone
call uses a slightly different frequency (the number of up-and-down
undulations in a radio wave in one second), the calls are easy to keep
separate. They can travel through the air like different radio stations that use
different wavebands.

That's fine if there are only a few people calling at once. But suppose you're in
the middle of a big city and millions of people are all calling at once. Then
you'd need just as many millions of separate frequencies—more than are
usually available. The solution is to divide the city up into smaller areas, with
each one served by its own masts and base station. These areas are what we
call cells and they look like a patchwork of invisible hexagons. Each cell has
its base station and masts and all the calls made or received inside that cell
are routed through them. Cells enable the system to handle many more calls
at once, because the same set of frequencies can be reused in different cells,
as long as those cells are far enough apart that their signals don't interfere
(neighboring cells use different frequencies). The more cells, the greater the
number of calls that can be made at once. This is why urban areas have many
more cells than rural areas and why the cells in urban areas are much smaller.
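A toy calculation makes the capacity argument concrete. Suppose a network has a fixed pool of channels and, to avoid interference, shares them among a "cluster" of nearby cells, reusing each channel only in cells far enough apart. All the numbers here (a 420-channel pool, 7-cell clusters) are invented for illustration, not real network figures:

```python
# Toy illustration of frequency reuse: with a fixed pool of channels, splitting
# a city into more cells multiplies the number of simultaneous calls, because
# the same channels can be reused in cells far enough apart not to interfere.
# Numbers are invented; real systems do use reuse patterns such as 7-cell clusters.

def max_simultaneous_calls(total_channels, cluster_size, num_cells):
    channels_per_cell = total_channels // cluster_size  # pool shared within one cluster
    return channels_per_cell * num_cells

# Seven big cells covering the city:
print(max_simultaneous_calls(total_channels=420, cluster_size=7, num_cells=7))   # -> 420
# Same channel pool, 70 smaller cells: ten times the capacity.
print(max_simultaneous_calls(total_channels=420, cluster_size=7, num_cells=70))  # -> 4200
```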

How cellphone cells handle calls

This picture shows two ways in which cells work.

Simple call

If a phone in cell A calls a phone in cell B, the call doesn't pass directly
between the phones, but from the first phone to mast A and its base station,
then to mast B and its base station, and then to the second phone.

Roaming call
Cellphones that are moving between cells (when people are walking along or
driving) are regularly sending signals to and from nearby masts so that, at any
given time, the cellphone network always knows which mast is closest to
which phone.

If a car passenger is making a call and the car drives between cells C, D, and
E, the phone call is automatically "handed off" (passed from cell to cell) so the
call is not interrupted.
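As a sketch of the hand-off idea, the snippet below picks the serving mast at each moment of a journey by choosing whichever signal is currently strongest. The cells and dBm readings are made up; real networks use hysteresis and signaling protocols to decide exactly when to hand a call over:

```python
# Hedged sketch of a hand-off: as a car moves, the handset keeps measuring
# the signal from nearby masts, and the call is passed to whichever mast is
# currently strongest. Readings (in dBm) are invented for illustration.

def handoff_trace(readings_over_time):
    """Return the serving mast at each time step (the strongest one)."""
    return [max(readings, key=readings.get) for readings in readings_over_time]

trip = [
    {"C": -60, "D": -90, "E": -100},  # near cell C
    {"C": -80, "D": -70, "E": -95},   # entering cell D: hand-off
    {"C": -100, "D": -85, "E": -65},  # entering cell E: hand-off
]
print(handoff_trace(trip))  # -> ['C', 'D', 'E']
```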

The key to understanding cells is to realize that cellphones and the masts they
communicate with are designed to send radio waves only over a limited
range; that effectively defines the size of the cells. It's also worth pointing out
that this picture is a simplification; it's more accurate to say that the masts sit
at the intersections of the cells, but it's a little easier to understand things as
I've shown them.

Types of cellphones
The first mobile phones used analog technology. This is pretty much how
baked-bean can telephones work too. When you talk on a baked-bean can
phone, your voice makes the string vibrate up and down (so fast that you can't
see it). The vibrations go up and down like your voice. In other words, they are
an analogy of your voice—and that's why we call this analog technology.
Some land lines still work in this way today.

Most cellphones work using digital technology: they turn the sounds of your
voice into a pattern of numbers (digits) and then beam them through the air.
Using digital technology has many advantages. It means cellphones can be
used to send and receive computerized data. That's why most cellphones can
now send and receive text (SMS) messages, Web pages, MP3 music files,
and digital photos. Digital technology means cellphone calls can
be encrypted (scrambled using a mathematical code) before they leave the
sender's phone, so eavesdroppers cannot intercept them. (This was a big
problem with earlier analog phones, which anyone could intercept with a
miniature radio receiver called a scanner.) That makes digital cellphones
much more secure.
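Here's a toy illustration of why digital makes encryption straightforward: once your voice is a string of numbers, scrambling it is just arithmetic on those numbers. The XOR "cipher" and keystream below are purely illustrative; real networks use proper stream ciphers (GSM's A5 family, for example), not a fixed toy key:

```python
# Toy illustration only: XOR each digitized voice sample with a shared secret
# keystream to scramble it; XOR again with the same keystream to restore it.
# Real cellphone encryption is far stronger than this.

samples = [5, 7, 7, 5, 1, 1, 3, 3, 5]    # digitized voice samples
keystream = [9, 2, 6, 1, 8, 3, 7, 4, 5]  # shared secret numbers (invented)

scrambled = [s ^ k for s, k in zip(samples, keystream)]
restored = [c ^ k for c, k in zip(scrambled, keystream)]  # XOR twice undoes it

print(restored == samples)  # -> True
```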

The world of cellphones


Cellphones have dramatically changed the way the world connects. In the
early 1990s, only one percent of the world's population owned a cellphone;
today, in a growing number of countries people spend more time on their
mobiles than on their landlines. According to the ITU-T, in 2001, only 58
percent of the world's population had access to a (2G) cellphone network; by
2019, that had risen to 98.8 percent. Also by 2019, there were over 8.3 billion
cellphone subscriptions—slightly more than the number of people on the
planet. Cellphones have also powered a big leap in Internet access. At the
end of 2016, mobile (smartphone and tablet) Internet traffic passed desktop
traffic for the first time ever. By the end of 2019, 83 percent of the world's
people had active, cellphone-based, mobile broadband subscriptions, which is
over five times as many as have traditional wired broadband (just 14.9
percent). [2]

Chart: Cellphone subscriptions: The most dramatic cellphone growth has happened in developing countries,
which now represent around 80 percent of subscriptions. Source: Drawn using 28 October 2019 data
from International Telecommunications Union (ITU).

Cellphones are also used in different ways by different people. Back in the
early 2000s, cellphones were used entirely for voice conversations and
sending short "texts" (text messages, also known as SMS messages). A lot of
people owned a mobile phone purely for occasional emergency use; and that
still remains a popular reason for owning a phone (according to the FCC,
about 70 percent of all 911 emergency calls in the United States are made
from cellphones). Today, smartphones are everywhere and people use them
for emailing, browsing the web, downloading music, social media, and running
all kinds of apps. Where old-fashioned cellphones relied entirely on a decent
signal from a cellphone network, smartphones hop back and forth, as
necessary, between ordinary networks and Wi-Fi. Where old cellphones were
literally "mobile phones" (wireless landlines), modern smartphones are
essentially pocket computers that just happen to make phone calls. You can
see just how much phones have changed internally in the photos in the box
below.

Cellphones and mobile broadband


If you want to find out how cellphone networks have evolved from purely voice
networks to form an important part of the Internet, please see our separate
article on broadband and mobile broadband. It also covers all those confusing
acronyms like FDMA, TDMA, CDMA, WCDMA, and HSDPA/HSPA.

What's inside your phone?

Photo: Cellphones past and present. Left: A Motorola V66 from about 2000, a Nokia 106 from about 2010, and
an LG G series smartphone. I will be taking apart the Motorola and the LG.

A broken phone is a wonderful thing if, like me, you enjoy figuring out how
things work. Not surprisingly, there's much more going on inside a modern
smartphone than inside the kind of basic cellphone people used to carry about
20 years ago. Old phones were just phones; smartphones are computers
packed with all kinds of gadgetry, from fingerprint readers to electronic
payment chips. But though phones have changed dramatically, the problems
of designing a new handset are, in many ways, just the same as they always
were: How do you pack all these components into a small enough space,
keep their total weight down, and avoid them overheating? How do you
ensure critical components like microphones, loudspeakers, and antennas
continue to work effectively even when they're miniaturized?

Inside a classic phone

The biggest difference between old phones and new ones is that older ones
have keyboards and small LCD screens, while smartphones
have touchscreens that do away with the need for a keyboard altogether (they
do still need a few buttons for switching the power on and off and controlling
speaker volume). In an old phone, the keyboard's typically one of the
"membrane" kind: instead of moving keys, it has squashy rubber buttons that
push down on electrical contacts on a printed-circuit board (PCB) below.

Photo: Left: The top side of an old Motorola phone keyboard is what's called a rubber membrane, a thin sheet
of rubbery plastic with "keys" that press down to make electrical contact with the circuit board below. Right:
Each key pushes a little round peg against the corresponding part of the circuit board (the small dots). The
keyboard is also packed with LEDs (the eight rectangles with white outlines) that make it light up when you
make or take a call.

Unfortunately, digital gadgets aren't anything like as interesting (or as easy to


figure out) as mechanical things: most of the good stuff happens inside chips,
out of sight, and you can't figure out how a chip works just by looking at it.
Taking the keyboard off, there's very little of interest in the board beneath, but
do notice the gold antenna running all the way around it. That's why a
cellphone like this does not need a long, telescopic (pull-out) antenna.
Photo: The main circuit board from a Motorola V66 phone is directly underneath the keyboard and above the
battery compartment.

The other side of the circuit board is a little bit more interesting:

1. LCD screen, connected to the keyboard unit


by a ribbon cable.
2. Earphone socket.
3. Battery connector
4. Battery charger and cable connector for
hooking up to a computer.
5. Heatsinks/screening for chips on the circuit
board.
6. Piezoelectric buzzer.
7. Buzzer control chip
8. Antenna connector—links a small external
antenna to the gold antenna running round the
circuit board.
Photo: The back of the main circuit board from a Motorola V66 phone.

Inside a smartphone

There's quite a lot more going on inside a smartphone, as you'd expect. I've
not taken the screen apart (it's directly below the circuit board on the right-
hand side), but here are some of the other things you'll find:
Photo: The main circuit board from a more modern LG G-series smartphone.

1. Contact connections between upper (left


photo) and lower (right photo) parts of the
circuit board.
2. Heatsink/screening for processor chips. (The
gray stuff you can see here is thermal paste—
a kind of heat-conducting goo—that helps to
improve cooling.) The power on/off button is
under here.
3. NFC antenna connectors (for contactless
payments).
4. Infrared focusing beam for camera.
5. 13-megapixel rear digital camera.
6. Flashlight/camera flash.
7. Quad-core Qualcomm Snapdragon processor
chip.
8. Micro SD card slot (allows storage to be
extended to 32GB).
9. Micro-SIM card slot
10. Lithium-ion battery (3000 mAh capacity).
11. Entirely plastic case with a "brushed
metal" finish gives the appearance of a metal
case with the fingerprint smudges.
12. Headphone connector.
13. Microphone.
14. USB and charging connector.
15. Loudspeaker.
16. Screwed-down plastic shim protects the
circuit board and components when you open
up the case to change the battery.
17. Screws!
18. More contact connections between upper
and lower boards.

Who invented cellphones?


Photo: Martin Cooper's original radio telephone system (cellphone) design, submitted as a patent application in
1973. Note how the mobile part forms an entirely separate system (shown in blue, on the right) that
communicates with the existing public network (shown on the left in red). Individual cellphones (turquoise on
the extreme right) communicate with their nearest masts and base stations using radio waves (yellow zig-
zags). Patent drawing courtesy of US Patent and Trademark Office.

How did we get from land lines to cellphones? Here's a quick history:

 1873: British physicist James Clerk


Maxwell (1831–1879) published the theory of
electromagnetism, explaining how electricity
can make magnetism and vice versa. Read
more about his work in our main
article on magnetism.
 1876: Scottish-born inventor Alexander
Graham Bell (1847–1922) developed the first
telephone while living in the United States
(though there is some dispute about whether
he was actually the original inventor). Later,
Bell developed something called a
"photophone" that would send and receive
phone calls using light beams. Since it was
conceived as a wireless phone, it was really a
distant ancestor of the modern mobile phone.
 1888: German physicist Heinrich
Hertz (1857–1894) made the first
electromagnetic radio waves in his lab.
 1894: British physicist Sir Oliver
Lodge (1851–1940) sent the first message
using radio waves in Oxford, England.
 1899: Italian inventor Guglielmo
Marconi (1874–1937) sent radio waves
across the English Channel. By 1901, Marconi
had sent radio waves across the Atlantic, from
Cornwall in England to Newfoundland.
Marconi is remembered as the father of radio,
but pioneers such as Hertz and Lodge were no
less important.
 1906: American engineer Reginald
Fessenden (1866–1932) became the first
person to transmit the human voice using
radio waves. He sent a message 11 miles
from a transmitter at Brant Rock,
Massachusetts to ships with radio receivers in
the Atlantic Ocean.
 1920s: Emergency services began to
experiment with cumbersome radio
telephones.
 1940s: Mobile radio telephones started to
become popular with emergency services and
taxis.
 1946: AT&T and Southwestern Bell introduced
their Mobile Telephone System (MTS) for
sending radio calls between vehicles.
 1960s: Bell Laboratories (Bell Labs) developed
Metroliner mobile cellphones on trains.
 1973: Martin Cooper (1928–) of Motorola
made the first cellphone call using his 28-lb
prototype DynaTAC phone.
 1975: Cooper and his colleagues were granted
a patent for their radio telephone system. Their
original design is shown in the artwork you can
see here.
 1978: Analog Mobile Phone System (AMPS)
was introduced in Chicago by Illinois Bell and
AT&T.
 1982: European telephone companies agreed
a worldwide standard for how cellphones would
operate, which was named Groupe Speciale
Mobile and later Global System for Mobile
(GSM) telecommunications.
 1984: Motorola DynaTAC became the world's
first commercial handheld cellphone. Take a
look at a picture of Martin Cooper and his
DynaTAC.
 1995: GSM and a similar system called PCS
(Personal Communications Services) were
adopted in the United States.
 2001: GSM had captured over 70 percent of
the world cellphone market.
 2000s: Third-generation (3G and 3.5G)
cellphones were launched, featuring faster
networks, Internet access, music downloads,
and many more advanced features based on
digital technology.
 2007: Apple's iPhone revolutionized the world
of cellphones, packing what is effectively a
touch-controlled miniature computer into a
gadget the same size as a conventional
cellular phone.
 2013: Cellphones celebrated their 40th
anniversary.
 2020: Cellphone subscriptions reached 8.3
billion. About 80 percent of them are in
developing countries.



Analog and digital


by Chris Woodford. Last updated: July 2, 2020.

Back in the late 1970s, one of the most exciting things you could own

was a digital watch. Instead of trying to figure out the time from slowly
rotating hands, as you had to do with an old-style analog watch, you simply
read the numbers off a digital display. Since then, we've got more used to the
idea of digital technology. Now pretty much everything seems to be digital,
from television and radio to music players, cameras, cellphones, and even
books. What's the difference between analog and digital technology? Which is
best? Let's take a closer look!
Photo: Analog and digital technology: Above/left: This elegant Swiss watch shows the time with hands moving
round a dial. Below/right: Large digital clocks are quick and easy for runners to read. Photo by Jhi L. Scott
courtesy of US Navy.

Contents

1. What is analog technology?


2. What is digital technology?
3. Which is better, analog or digital?
4. What is sampling?
5. Find out more

What is analog technology?


People accept digital things easily enough, often by thinking of them
as electronic, computerized, and perhaps not even worth trying to understand.
But the concept of analog technology often seems more baffling—especially
when people try to explain it in pages like this. So what's it all about?

What does analog actually mean?

If you have an analog watch, it tells the time with hands that sweep around a
dial: the position of the hands is a measurement of the time. How much the
hands move is directly related to what time it is. So if the hour hand sweeps
across two segments of the dial, it's showing that twice as much time has
elapsed compared to if it had moved only one segment. That sounds
incredibly obvious, but it's much more subtle than it first seems. The point is
that the hand's movements over the dial are a way of representing passing
time. It's not the same thing as time itself: it's a representation or
an analogy of time. The same is true when you measure something with a
ruler. If you measure the length of your finger and mark it on the surface of a
wooden ruler, that little strip of wood or plastic you're looking at (a small
segment of the ruler) is the same length as your finger. It isn't your finger, of
course—it's a representation of your finger: another analogy. That's really
what the term analog means.

Photo: This dial thermometer shows temperature with a pointer and dial. If you prefer a more subtle definition, it
uses its pointer to show a representation (or analogy) of the temperature on the dial.

Analog measurements

Until computers started to dominate science and technology in the early


decades of the 20th century, virtually every measuring instrument was analog.
If you wanted to measure an electric current, you did it with a moving-coil
meter that had a little pointer moving over a dial. The more the pointer moved
up the dial, the higher the current in your circuit. The pointer was an analogy
of the current. All kinds of other measuring devices worked in a similar way,
from weighing machines and speedometers to sound-level meters and
seismographs (earthquake-plotting machines).

Analog information

However, analog technology isn't just about measuring things or using dials
and pointers. When we say something is analog, we often simply mean that
it's not digital: the job it does, or the information it handles, doesn't involve
processing numbers electronically. An old-style film camera is sometimes
referred to as an example of analog technology. You capture an image on a
piece of transparent plastic "film" coated with silver-based chemicals, which
react to light. When the film is developed (chemically processed in a lab), it's
used to print a representation of the scene you photographed. In other words,
the picture you get is an analogy of the scene you wanted to record. The
same is true of recording sounds with an old-fashioned cassette recorder. The
recording you make is a collection of magnetized areas on a long reel of
plastic tape. Together, they represent an analogy of the sounds you originally
heard.


What is digital technology?


Digital is entirely different. Instead of storing words, pictures, and sounds as
representations on things like plastic film or magnetic tape, we first convert the
information into numbers (digits) and display or store the numbers instead.

Digital measurements

Photo: A small LCD display on a pocket calculator. Most digital devices now use LCD displays like this, which
are cheap to manufacture and easy to read.

Many scientific instruments now measure things digitally (automatically


showing readings on LCD displays) instead of using analog pointers and
dials. Thermometers, blood-pressure meters, multimeters (for measuring
electric current and voltage), and bathroom scales are just a few of the
common measuring devices that are now likely to give you an instant digital
reading. Digital displays are generally quicker and easier to read than analog
ones; whether they're more accurate depends on how the measurement is
actually made and displayed.
Digital information

Photo: Ebooks owe their advantages to digital technology: they can store the equivalent of thousands of paper
books in a thin electronic device that fits in your pocket. Not only that, they can download digital books from the
Internet, which saves an analog trek to your local bookstore or library!

All kinds of everyday technology also work using digital rather than analog
technology. Cellphones, for example, transmit and receive calls by converting
the sounds of a person's voice into numbers and then sending the numbers
from one place to another in the form of radio waves. Used this way, digital
technology has many advantages. It's easier to store information in digital
form and it generally takes up less room. You'll need several shelves to store
400 vinyl, analog LP records, but with an MP3 player you can put the same
amount of music in your pocket! Electronic book (ebook) readers are similar:
typically, they can store a couple of thousand books—around 50 shelves
worth—in a space smaller than a single paperback! Digital information is
generally more secure: cellphone conversations are encrypted before
transmission—something easy to do when information is in numeric form to
begin with. You can also edit and play about with digital information very
easily. Few of us are talented enough to redraw a picture by Rembrandt or
Leonardo in a slightly different style. But anyone can edit a photo (in digital
form) in a computer graphics program, which works by manipulating the
numbers that represent the image rather than the image itself.
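As a tiny example of "manipulating the numbers rather than the image," here's how brightening a grayscale photo might look in Python. The pixel grid is invented, and 255 is the maximum value an 8-bit pixel can hold:

```python
# A digital photo is just a grid of numbers, so "editing" is arithmetic.
# This sketch brightens a tiny grayscale image by adding to each pixel value,
# capping at 255 (the maximum for an 8-bit pixel). Pixel values are invented.

image = [
    [10, 50, 90],
    [40, 80, 120],
]

def brighten(img, amount):
    return [[min(pixel + amount, 255) for pixel in row] for row in img]

print(brighten(image, 100))
# -> [[110, 150, 190], [140, 180, 220]]
```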
Which is better, analog or digital?

Photo: An early analog computer from 1949: machines like this represented numbers with analog dials, levers,
belts, and gears rather than (digital) numbers stored in electronic memories. Picture courtesy of NASA on the
Commons.

Just because digital technology has advantages, that doesn't mean it's always
better than analog. An analog watch might be far more accurate than a digital
one if it uses a high-precision movement (gears and springs) to measure time
passing, and if it has a sweeping second hand it will represent the time more
precisely than a digital watch whose display shows only hours and minutes.
Surprisingly, mechanical watches can also keep time better than quartz ones: the
day-to-day variations in a mechanical, analog watch tend to cancel one
another out, while those in an electronic quartz watch tend to compound one
another. Generally, the most expensive watches in the world are
analog ones (of course, that's partly because people prefer the way they look),
though the world's most accurate atomic clocks show time with digital
displays.

One interesting question is whether information stored in digital form will last
as long as analog information. Museums still have paper documents (and
ones written on clay or stone) that are thousands of years old, but no-one has
the first email or cellphone conversation. Open any book on the history
of photography and you'll see reproductions of early photos taken by Niepce,
Daguerre, and Fox-Talbot. But you won't see any pictures of the first digital
photo: even though it was much more recent, probably no-one knows what it
was or who took it! Lots of people own and cherish plastic LP records that are
decades old, but no-one attaches the same importance to
disposable MP3 music files. A lot of information recorded on early computer
memory devices is completely impossible to read with newer computers; even
floppy disks, commonplace as recently as the mid-1990s, are impossible to
read on modern computers that no longer have built-in floppy drives.

That's why, though the future may be digital, analog technology will always
have its place!

What is sampling?
It's easy to convert analog information into digital: you do it every time you
make a digital photo, record sound on your computer, or speak over
a cellphone. The process is called analog-to-digital conversion (ADC) or,
more informally, sampling. Sampling simply means "measuring at regular
intervals"—and it's easiest to understand with an example.

Let's suppose I'm talking to you on my cellphone. The sound of my voice is


really waves of energy that travel through the air to the phone's microphone,
which converts them into electrical signals. The sound waves and the signals
are both continuously varying waveforms—they're analog information—and
they look like the upper graph in the diagram.
Artwork: Top: A crude analog sound wave. Middle: A low sampling rate produces a crude digital approximation
to the original wave. Bottom: Doubling the sampling rate produces a more accurate digital version of the wave,
but generates twice as much digital information (data) that we need to store and transmit.

A cellphone transmits sound in digital form, so those analog waves need to be


converted into numbers. How does that happen? A circuit inside the phone
called an analog to digital converter measures the size of the waves many
times each second and stores each measurement as a number. You can see
in the middle figure that I've turned the first graph into a very approximate bar
chart. If each bar represents one second of time, we can represent this chart
by nine numbers (one number for the height of each bar): 5-7-7-5-1-1-3-3-5.
So by sampling (measuring) the sound wave once per second, we've
successfully turned our analog sound wave into digital information. We could
send those numbers through the air as radio waves to another phone, which
would run the process in reverse and turn the numbers back into sound we
could hear.

But do you see the problem? Some information is going to get lost in the
process of converting the sound to digital and back again, because the
measurement I've made doesn't precisely capture the shape of the original
wave: it's only a crude approximation. What can I do about this? I could
make more measurements, by measuring the sound wave twice as often.
That means doubling what's called the sampling rate. Now, as you can see in
the bottom chart, I get twice as many measurements and my sound wave is
represented by eighteen numbers: 6-7-7-8-8-7-7-5-2-1-1-2-3-3-4-4-4-4. The
more I increase the sampling rate, the more accurate my digital representation
of the sound becomes—but the more digital information I create and the more
space I need to store it.
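To make the idea concrete, here's a minimal Python sketch of sampling. The wave (a once-per-second sine swinging between 0 and 8), the two sampling rates, and the rounding are all invented for illustration; real phones sample thousands of times per second:

```python
import math

def sample(wave, duration, rate):
    """Measure the wave at regular intervals and round each
    measurement to a whole number: analog-to-digital conversion."""
    n = int(duration * rate)
    return [round(wave(i / rate)) for i in range(n)]

# A crude stand-in for a sound wave: one cycle per second, swinging 0-8.
wave = lambda t: 4 + 4 * math.sin(2 * math.pi * t)

low = sample(wave, 2, 4)    # 4 samples per second
high = sample(wave, 2, 8)   # double the sampling rate

print(len(low), len(high))  # prints: 8 16
```

Doubling the rate doubles the number of measurements, which is exactly the storage-versus-accuracy trade-off described above.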

Sampling rate and bit rate

When you download digital music, you might be given the option of
downloading the same track at what are called different bit rates. Broadly
speaking, the bit rate is the amount of information captured each time the
music is sampled. So a higher bit rate means more information is captured
and the analog information is turned into digital information more accurately.
Higher-quality music tracks may have a higher bit rate, but the tracks will take
up far more space on your computer and take longer to download.

Typically, music is digitally converted for CDs and MP3 tracks with a sampling
rate of 44.1kHz (about 44,000 times per second). Why such a high rate? For
technical reasons that I won't go into here, the sampling rate needs to be
about twice the highest frequency of sound in your wave, and since human
hearing is limited to about 20kHz, that suggests we need a sampling rate of at
least 40kHz. The typical bit rate for MP3 tracks is around 128kbps (128,000
binary digits or bits per second), though higher-quality tracks have bit rates
of up to 256kbps (256,000 bits per second).
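The arithmetic behind those numbers is easy to check. Here's a sketch using the standard figures for uncompressed CD audio (16-bit samples, two stereo channels):

```python
# Raw bit rate of uncompressed CD audio:
samples_per_second = 44_100   # the CD sampling rate
bits_per_sample = 16          # CD resolution
channels = 2                  # stereo

cd_bit_rate = samples_per_second * bits_per_sample * channels
print(cd_bit_rate)            # prints: 1411200 (about 1.4 million bits per second)

# Compare with a typical 128kbps MP3:
mp3_bit_rate = 128_000
print(round(cd_bit_rate / mp3_bit_rate, 1))  # prints: 11.0
```

So a 128kbps MP3 squeezes CD-quality audio roughly eleven-fold, which is why compressed tracks download so much faster.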

Digital cameras

by Chris Woodford. Last updated: November 1, 2020.


Digital cameras give a whole new meaning to the idea of painting by
numbers. Unlike old-style film cameras, they capture and record images of the
world around us using digital technology. In other words, they store
photographs not as patterns of darkness and light but as long strings of
numbers. This has many advantages: it gives us instant photographs, allows
us to edit our pictures, and makes it easier for us to share photographs
using cell phones (mobile phones), e-mail, and web sites.
Photo: A typical low-cost digital camera. The circle is the lens; the rectangle above it is a xenon flash lamp. You
can see what this camera looks like inside in the photo lower down this page.

Contents

1. How ordinary film cameras work
2. How digital cameras work
3. How digital cameras use digital technology
4. Why digital cameras compress images
5. Turning ordinary photos into digital photos
6. Inside a digital camera
7. What are "mirrorless" cameras?
8. How do digital cameras compare with smartphone
cameras?
9. A brief history of photography
10. Find out more

How ordinary film cameras work


Photo: An old-style film camera from the late 1980s. The film loads in a spool on the right and winds across to
another spool on the left, passing in front of the lens on the way. When you take a photo, the shutter
lets light enter from the lens and expose the film. It's all very 19th-century compared to digital photography!

If you have an old-style camera, you'll know that it's useless without one vital
piece of equipment: a film. A film is a long spool of flexible plastic coated with
special chemicals (based on compounds of silver) that are sensitive to light.
To stop light spoiling the film, it is wrapped up inside a tough, light-proof
plastic cylinder—the thing you put in your camera.

When you want to take a photograph with a film camera, you have to press a
button. This operates a mechanism called the shutter, which makes a hole
(the aperture) open briefly at the front of the camera, allowing light to enter
through the lens (a thick piece of glass or plastic mounted on the front). The
light causes reactions to take place in the chemicals on the film, thus storing
the picture in front of you.

This isn't quite the end of the process, however. When the film is full, you
have to take it to a drugstore (chemist's) to have it developed. Usually, this
involves placing the film into a huge automated developing machine. The
machine opens up the film container, pulls out the film, and dips it in various
other chemicals to make your photos appear. This process turns the film into
a series of "negative" pictures—ghostly reverse versions of what you actually
saw. In a negative, the black areas look light and vice-versa and all the colors
look weird too because the negative stores them as their opposites. Once the
machine has made the negatives, it uses them to make prints (finished
versions) of your photos.

If you want to take only one or two photographs, all of this can be a bit of a
nuisance. Most people have found themselves wasting photographs simply to
"finish off the film." Often, you have to wait several days for your film to be
developed and your prints (the finished photographs) returned to you. It's no
wonder that digital photography has become very popular—because it solves
all these problems at a stroke.

(Incidentally, if you want to learn more about film cameras and traditional
photography, see our main article on how film cameras work.)

How digital cameras work


Photo: A typical image sensor. The green rectangle in the center (about the size of a fingernail) is the light-
sensitive part; the gold wires coming off it connect it into the camera circuit.

Digital cameras look very much like ordinary film cameras but they work in a
completely different way. When you press the button to take a photograph
with a digital camera, an aperture opens at the front of the camera and light
streams in through the lens. So far, it's just the same as a film camera. From
this point on, however, everything is different. There is no film in a digital
camera. Instead, there is a piece of electronic equipment that captures the
incoming light rays and turns them into electrical signals. This light detector is
one of two types, either a charge-coupled device (CCD) or a CMOS image
sensor.

If you've ever looked at a television screen close up, you will have noticed that
the picture is made up of millions of tiny colored dots or squares called pixels.
Laptop LCD computer screens also make up their images using pixels,
although they are often much too small to see. In a television or computer
screen, electronic equipment switches all these colored pixels on and off very
quickly. Light from the screen travels out to your eyes and your brain is fooled
into seeing a large, moving picture.

In a digital camera, exactly the opposite happens. Light from the thing you are
photographing zooms into the camera lens. This incoming "picture" hits the
image sensor chip, which breaks it up into millions of pixels. The sensor
measures the color and brightness of each pixel and stores it as a number.
Your digital photograph is effectively an enormously long string of numbers
describing the exact details of each pixel it contains. You can read more about
how an image sensor produces a digital picture in our article on webcams.

How digital cameras use digital technology


Once a picture is stored in numeric form, you can do all kinds of things with it.
Plug your digital camera into your computer, and you can download the
images you've taken and load them into programs like PhotoShop to edit them
or jazz them up. Or you can upload them onto websites, email them to friends,
and so on. This is possible because your photographs are stored in digital
format and all kinds of other digital gadgets—everything from MP3-
playing iPods to cellphones and computers to photo printers—use digital
technology too. Digital is a kind of language that all electronic gadgets "speak"
today.

Photo: Digital cameras are much more convenient than film cameras. You can instantly see how the picture will
look from the LCD screen on the back. If your picture doesn't turn out okay, you can simply delete it and try
again. You can't do that with a film camera. Digital cameras mean photographers can be more creative and
experimental.

If you open up a digital photograph in a paint (image editing) program, you
can change it in all kinds of ways. A program like this works by adjusting the
numbers that represent each pixel of the image. So, if you click on a control
that makes the image 20 percent brighter, the program goes through all the
numbers for each pixel in turn and increases them by 20 percent. If you mirror
an image (flip it horizontally), the program reverses the sequence of the
numbers it stores so they run in the opposite direction. What you see on the
screen is the image changing as you edit or manipulate it. But what you don't
see is the paint program changing all the numbers in the background.
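Here's a toy Python version of what such a program does behind the scenes. The tiny 2×3 grid of brightness numbers stands in for a real photo, and the function names are my own, not from any real paint program:

```python
def brighten(pixels, percent):
    """Increase every pixel's number by the given percentage,
    capping at 255 (the maximum value for an 8-bit pixel)."""
    factor = 1 + percent / 100
    return [[min(255, round(p * factor)) for p in row] for row in pixels]

def mirror(pixels):
    """Flip the image horizontally by reversing each row of numbers."""
    return [row[::-1] for row in pixels]

# A tiny 2x3 grayscale "photo" as a grid of brightness numbers:
photo = [[100, 150, 200],
         [ 50,  75, 250]]

print(brighten(photo, 20))  # prints: [[120, 180, 240], [60, 90, 255]]
print(mirror(photo))        # prints: [[200, 150, 100], [250, 75, 50]]
```

Notice that the 250 pixel caps at 255 when brightened: very bright areas can't get any brighter, which is why over-brightening a photo washes out its highlights.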

Some of these image-editing techniques are built into more sophisticated
digital cameras. You might have a camera that has an optical zoom and a
digital zoom. An optical zoom means that the lens moves in and out to make
the incoming image bigger or smaller when it hits the CCD. A digital zoom
means that the microchip inside the camera blows up the incoming image
without actually moving the lens. So, just like moving closer to a TV set, the
image degrades in quality. In short, optical zooms make images bigger and
just as clear, but digital zooms make images bigger and more blurred.
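You can see why digital zoom degrades the picture with a sketch like this one, which assumes a toy 4×4 image and a fixed 2× zoom using simple pixel repetition (real cameras use cleverer interpolation, but the principle is the same: no new detail is created):

```python
def digital_zoom_2x(pixels):
    """Digital 2x zoom sketch: crop the central region of the image,
    then blow it back up to full size by duplicating pixels."""
    h, w = len(pixels), len(pixels[0])
    # Crop the central half of the image...
    crop = [row[w // 4: w // 4 + w // 2]
            for row in pixels[h // 4: h // 4 + h // 2]]
    # ...then enlarge it back to the original size by repeating
    # each pixel twice horizontally and each row twice vertically.
    zoomed = []
    for row in crop:
        stretched = [p for p in row for _ in range(2)]
        zoomed.append(stretched)
        zoomed.append(list(stretched))
    return zoomed

photo = [[i * 4 + j for j in range(4)] for i in range(4)]
big = digital_zoom_2x(photo)
print(len(big), len(big[0]))  # prints: 4 4 (same size as the original)
```

The zoomed image has the same number of pixels as the original, but every block of four identical pixels was really measured only once, hence the blur.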

Why digital cameras compress images


Imagine for a moment that you're a CCD or CMOS image sensing chip. Look
out of a window and try to figure out how you would store details of the view
you can see. First, you'd have to divide the image into a grid of squares. So
you'd need to draw an imaginary grid on top of the window. Next, you'd have
to measure the color and brightness of each pixel in the grid. Finally, you'd
have to write all these measurements down as numbers. If you measured the
color and brightness for six million pixels and wrote both things down as
numbers, you'd end up with a string of millions of numbers—just to store one
photograph! This is why high-quality digital images often make enormous files
on your computer. Each one can be several megabytes (millions of
characters) in size.
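The arithmetic is easy to verify. Assuming 24-bit color (one byte each for red, green, and blue per pixel, a common convention, though formats vary), a six-megapixel photo needs:

```python
pixels = 6_000_000       # a 6-megapixel photo
bytes_per_pixel = 3      # one byte each for red, green, and blue (24-bit color)

uncompressed = pixels * bytes_per_pixel
print(uncompressed / 1_000_000)  # prints: 18.0 (megabytes, before any compression)
```

That's why compression matters: a memory card that holds a handful of raw 18-megabyte images can hold hundreds of compressed JPGs.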

To get around this, digital cameras, computers, and other digital gadgets use
a technique called compression. Compression is a mathematical trick that
involves squeezing digital photos so they can be stored with fewer numbers
and less memory. One popular form of compression is called JPG
(pronounced J-PEG, which stands for Joint Photographic Experts Group, after
the scientists and mathematicians who thought up the idea). JPG is known as
a "lossy" compression because, when photographs are squeezed this way,
some information is lost and can never be restored. High-resolution JPGs use
lots of memory space and look very clear; low resolution JPGs use much less
space and look more blurred. You can find out more about compression in our
article on MP3 players.

Most digital cameras have settings that let you take pictures at higher or lower
resolutions. If you select high-resolution, the camera can store fewer images
on its memory card—but they are much better quality. Opt for low-resolution
and you will get more images, but the quality won't be as good. Low-resolution
images are stored with greater compression.

Turning ordinary photos into digital photos


There is a way to turn photos from an ordinary film camera into digital photos
—by scanning them. A scanner is a piece of computer equipment that looks
like a small photocopier but works like a digital camera. When you put your
photos in a scanner, a light scans across them, turning them into strings of
pixels and thus into digital images you can see on your computer.

Inside a digital camera


Ever wondered what's inside a digital camera? What takes the photo?
Where's it stored? What makes the flash work? And how do all these bits
connect together? When you take electronic gadgets apart, they're much
harder to understand than ordinary machines (things that work through a clear
physical mechanism): you can't always see which part does which job or how.
Even so, it can be quite illuminating to peer into your favorite gadgets to see
what's hiding inside. I don't recommend you try this at home: opening things
up is the quickest way to invalidate your warranty; it's also a good way to
ensure they'll never work again!

The main parts of a digital camera

Photo: The parts in a basic digital camera. Were it not for the LCD screen and batteries (the two biggest
components), you could probably make a camera like this as small as a postage stamp!
I've opened up the camera in our top photo—and these are the parts I've
found inside:

1. Battery compartment: This camera takes two
1.5-volt batteries, so it runs on a total voltage
of 3 volts (3 V).
2. Flash capacitor: The capacitor charges up for
several seconds to store enough energy to fire
the flash.
3. Flash lamp: Operated by the capacitor. It
takes a fair bit of energy to fire a xenon flash
like this, which is why a lot of indoor flash
photography quickly uses up your batteries.
4. LED: A small red LED (light-emitting diode)
indicates when the self-timer is operating, so
you can take photos of yourself more easily.
5. Lens: The lens catches light from the object
you're photographing and focuses it on the
CCD.
6. Focusing mechanism: This camera has a
simple switch-operated focus that toggles the
lens between two positions for taking either
close-ups or distant shots.
7. Image sensor: This is the light-detecting
microchip in a digital camera and it uses either
CCD or CMOS technology. You can't actually
see the chip in this photo, because it's directly
underneath the lens. But you can see what it
looks like in our article on webcams.
8. USB connector: Attach a USB cable here and
connect it to your computer to download the
photos you've taken. To your computer, your
camera looks like just another memory device
(like a hard drive).
9. SD (secure digital) card slot: You can slide
a flash memory card in here for storing more
photos. The camera has a very small internal
memory that will store photos too.
10. Processor chip: The camera's main
digital "brain". This controls all the camera's
functions. It's an example of an integrated
circuit.
11. Wrist connector: The strap that keeps
the camera securely tied to your wrist attaches
here.
12. Top case: Simply screws on top of the
bottom case shown here.

Another important part, not shown here, is the LCD display that shows you the
photos you've taken. It's mounted on the back of the electronic circuit board
so you can't see it in this photo.

What are "mirrorless" cameras?


There are effectively four different kinds of digital cameras. The simplest,
known as point-and-shoot, have a lens to capture light (which may or may
not zoom), an image sensor to turn the pattern of light into digital form, and
an LCD screen round the back for viewing your photos. At the opposite end of
the spectrum, DSLR (Digital Single Lens Reflex) cameras look like traditional,
professional film cameras and have a moving, hinged mirror inside that lets
you view the exact picture you're going to shoot through the lens (for an
explanation of how SLR works, see our article on film cameras). The most
recent innovation, mirrorless digital cameras, are a sort of hybrid of these two
designs: they abandon the hinged mirror system in favor of a higher-resolution
LCD viewfinder mounted nearer to the image sensor, which makes them
smaller, lighter, faster, and quieter. Finally, there are smartphone cameras,
which resemble point-and-shoot models but lack features like an optical zoom.

How do digital cameras compare with smartphone cameras?

From what I've said so far, you can see that digital cameras are great things—
if you're comparing them to old-style film cameras, that is. Thanks to their
superb, cutting-edge image sensors, there's really no good reason (other than
a nostalgic preference for analog technology) to use film. You might be
forgiven for thinking sales of digital cameras would be rocketing as a result,
but you'd be wrong. Over the last few years, digital cameras have seen
double-digit falls in sales in parallel with the massive rise of smartphones and
tablets (which now sell more than 1.5 billion each year). Check out a photo-
sharing site like Flickr and you'll find the most popular "cameras" are actually
phones: in September 2019, at the time I'm updating this article, Flickr's top
five cameras are all iPhones. Is there a good reason to own a standalone
digital camera anymore or can you now do everything with a camera phone?

Photo: The pros and cons of digital cameras and smartphones summarized in three photos. Even point-and-
shoot digital cameras like my old Canon Ixus have bigger, better, telescopic lenses (top) and sensors
compared to the ones in the best smartphone cameras, like my new LG (middle). But smartphones
undoubtedly score on connectivity and they have bigger, better, and clearer screens (bottom). Here you can
see my smartphone's huge screen pictured in a preview photo on the Canon's tiny screen.

Sensors and screens

Step back a decade and there was no comparison at all between the rough
and clunky snapshot cameras on cellphones and even the most mediocre
compact digital cameras. While the digitals were boasting ever-increasing
numbers of megapixels, cellphones took crude snaps little better than the
ones you could get from a basic webcam (1 megapixel or less was common).
Now all that's changed. The 10-year-old Canon Ixus/Powershot digital camera
I use routinely is rated at 7.1 megapixels, which is perfectly fine for almost
anything I ever want to do. My new LG smartphone comes in at 13
megapixels, which (theoretically, at least) sounds like it must be twice as
good.

But wait! "Megapixels" are a misleading marketing ploy: what really matters is
the size and quality of the image sensors themselves. Generally, the bigger
the sensor, the better the pictures. Comparing the raw technical data, the
Canon Ixus claims a 1/2.5" CCD while the LG has a 1/3.06" CMOS (a
newer, somewhat different type of sensor chip). What do those numbers
actually mean? Sensor measurements are based on needlessly confusing
math that I'm not going to explain here, and you'll have to take it on trust that
both of these cameras have tiny sensors, about half the size of a pinkie nail
(measuring less than 5mm in each direction), though the Canon sensor is
significantly bigger. The Digital Ixus, though eight years older than the LG
smartphone, and with apparently half as many "megapixels," has a
significantly bigger sensor chip and one that's likely to outperform the LG,
especially in lower light conditions.
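You can roughly check that claim with some back-of-envelope arithmetic. The sensor dimensions below are my own approximations (manufacturers quote sizes in a confusing inch-fraction notation, and actual active areas vary), so treat the results as illustrative rather than official specs:

```python
import math

def pixel_pitch_um(width_mm, height_mm, megapixels):
    """Approximate width of one pixel, in micrometers, assuming
    square pixels spread evenly over the sensor's active area."""
    area_mm2 = width_mm * height_mm
    pixel_area_mm2 = area_mm2 / (megapixels * 1e6)
    return math.sqrt(pixel_area_mm2) * 1000

# Approximate active-area sizes (illustrative assumptions, not specs):
canon = pixel_pitch_um(5.8, 4.3, 7.1)   # 1/2.5" CCD, 7.1 megapixels
lg = pixel_pitch_um(4.7, 3.5, 13)       # 1/3.06" CMOS, 13 megapixels

print(round(canon, 2), round(lg, 2))    # prints: 1.87 1.12
```

On these figures, each of the Canon's pixels is noticeably bigger than the LG's, so it gathers more light per pixel, which is exactly why the older camera can still win in dim conditions despite having "only" half the megapixels.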

The Canon also scores with a much better, telescopic lens (technically rated
5.8–17.4 mm, which is equivalent to 35–105mm)—better quality and
telescopic to boot—that can take everything from infinity-distance landscapes
to close-up macro shots of spiders and flies. But I have to upload my photos
to a computer to get a sense of how good or bad they are because the Canon
only has a tiny 6cm (2.5-inch) LCD screen. The LG is over twice as good on
the diagonal screen dimension, with a 14cm (5.5 inch) "monitor." Where
Canon estimates that the Ixus screen has 230,000 pixels, the LG boasts quad
HD (2560×1440 pixels), or roughly sixteen times more. I might not be able to
take better photos with the LG, but at least I can instantly assess and
appreciate them on a screen as good as an HD TV (albeit still pocket-sized).

Bear in mind that my Canon is just a point-and-shoot compact, so this is not
really a fair comparison between what you can achieve with a really good
digital camera and a really good smartphone. My LG is right up at the better
end of smartphone cameras, but the Ixus isn't anywhere near as good as the
best digital cameras. A professional DSLR would have a much bigger sensor
than a smartphone—up to 3.6cm × 2.4cm—so it would be able to capture
really fine detail in even the lowest of light levels. It would also have a bigger
and better screen and better (interchangeable) lenses.
Photo: This is a closeup of the camera inside the LG (with its cover popped off). What you're looking at here is
the lens: the image sensor chip is directly underneath it. (In case it's not clear, the red thing is a pen I'm
pointing with.)

Social media

Of course, where smartphone cameras really score is in the "smartphone"
department: they're computers, in essence, that are pop-in-the-pocket
portable and always online. So not only are you more likely to capture chance
photos (because you're always carrying a camera), but you can instantly
upload your snaps to the aptly-named Instagram, Facebook, or Twitter. And
that's the real reason why smartphone cameras have surpassed old-school
digitals: photography itself has changed from the digital-equivalent of the 19th-
century Daguerreotype (itself a throwback to the portrait paintings of old) to
something more off-the-cuff, immediate, and, of course, social. For the
purposes of Facebook or Twitter, often viewed on small-screen mobile
devices, you don't need more than a couple of megapixels, at most. (Prove it
yourself by downloading a hi-res image from Instagram or Flickr, and you'll
find it's seldom more than a couple of hundred kilobytes in size and 1000
pixels or less in each dimension, making a megapixel or less in
total.) Even on better photo-sharing websites like Instagram and Flickr, most
people will never be browsing your photos in multi-megapixel dimensions:
they simply wouldn't fit on the screen. So even if your smartphone doesn't
have masses of megapixels, it doesn't really matter: most people flicking
through your photos on their smartphones won't notice—or care. Social media
means never having to say you're sorry you forgot your DSLR and only had
your iPhone!

Smartphone add-ons

Now it's absolutely the case that photos taken with a top-notch Canon or
Nikon DSLR will beat, hands down, snapshots from even the best
smartphones—but that's often because it's not a like-for-like comparison.
Often, we're comparing good amateur photos taken with smartphones to
brilliant professional photos taken with DSLRs. How much of what we're
seeing is the camera... and how much the eye of the photographer?
Sometimes it's hard to separate the two things.

Professionals can achieve amazing results with smartphones—but so can
amateurs, with a bit of extra help. One of the drawbacks of smartphone
cameras is the lack of manual control (generally even less than with a basic
compact digital camera). You can get around that, to a certain extent, by using
add-on apps that give you much more control over fiddly, old-school settings
like ISO, aperture, shutter speed, and white balance. (Search your favorite
app store for keywords like "professional photography" or "manual
photography".) You can also add snap-on lenses to smartphones to get
around the drawbacks of a fixed-focal-length lens (though there's nothing you
can do about the tiny, poorer-quality image sensor). Once your photos are
safely snapped, there are plenty of photo-editing apps for smartphones as
well, including a slimmed-down, free version of PhotoShop, which can help
you retouch your amateur "sow's ears" into professional "silk purses."

So why still buy digital?

Since many people now own a smartphone, the real question is whether you
need a digital camera as well. It's very hard to see an argument for point-and-
shoot compacts anymore: for social-media snaps, most of us can get by with
our phones. For this website, I take a lot of macro photos—close-ups of
circuits and mechanical parts—with my Ixus that I couldn't possibly capture
with the LG, so I won't be jumping ship anytime soon.

If you want to take professional-quality photos, there's really no comparison
between smartphones and DSLRs. A top-notch DSLR has a better-quality
image sensor (up to 50 times bigger in area than the one in a smartphone)
and a much better lens: these two fundamentally important things make the
"raw" image from a DSLR far better. Add in all those fiddly manual controls
you have on a DSLR and you'll be able to capture a far greater range of
photos across a far wider range of lighting conditions. If you really care about
the quality of your photos, instant-uploading to sharing sites might be a less
important consideration: you'll want to view your photos on a big monitor,
retouch them, and only share them when you're happy. Having said that, you
can now buy hybrid digital cameras with built-in Wi-Fi that offer similar instant-
sharing convenience to smartphones. And, of course, there's nothing to stop
you carrying a smartphone and a DSLR if you really want the best of both
worlds!

A brief history of photography

Artwork: The original digital camera, invented in the 1970s by Steven Sasson, worked a bit like an old-style
camcorder and needed a separate playback monitor. First (top), you took your photos with the camera (blue),
which used a CCD to record them onto a magnetic tape (red). Later (bottom), when you got back home, you
took out the tape, inserted it into a computer (orange), and viewed the pictures you'd taken on a computer
monitor or TV (green). Artwork from US Patent 4,131,919: Electronic still camera by Gareth A. Lloyd, Steven J.
Sasson courtesy of US Patent & Trademark Office.

 4th century BCE: The Chinese invented the
camera obscura (a darkened room with a hole
in the drapes that projects an image of the
outside world onto a distant wall).
 Late 1700s: Thomas Wedgwood (1771-1805)
and Sir Humphry Davy (1778–1829), two
English scientists, carried out early
experiments trying to record images on light-
sensitive paper. Their photos were not
permanent: they turned black unless
permanently stored in a dark place.
 1827: Frenchman Joseph Nicéphore
Niépce (1765–1833) made the world's first
photographs. His method was no good for
taking portraits of people because the camera
shutter had to be left open for eight hours.
 1839: French opera-house scene
painter Louis Daguerre (1787–1851)
announced the invention of photos on silver
plates that became known as daguerreotypes.
 1839: William Henry Fox Talbot (1800–1877)
invented the photographic negative process.
 1851: British artist and
photographer Frederick Scott Archer (1813–
1857) invented a way of taking pin-sharp
photos onto wet glass plates.
 1870s: British physician Dr Richard
Maddox (1816–1902) developed a way of
taking photos using dry plates and gelatin.
 1883: American inventor George
Eastman (1854–1932) invented the modern
photographic film.
 1888: George Eastman launched his easy-to-
use Kodak camera. His slogan was: "You push
the button and we do the rest."
 1947: Edwin Land (1909–1991) invented the
instant Polaroid camera.
 1963: Edwin Land invented the color Polaroid
camera.
 1975: US electrical engineer Steven
Sasson invented the first CCD-based
electronic camera with Gareth Lloyd at
Eastman Kodak.
 1990s: Digital cameras started to become
popular, gradually making film cameras
obsolete.
 2000s: Advanced cellphones with built-in
digital cameras began to make standalone
digital cameras redundant for everyday
snapshot photography.

Digital pens

by Chris Woodford. Last updated: January 8, 2021.

Has there ever been a more amazing invention than the pen—an
incredibly convenient way of recording information that dates back thousands
of years? The only trouble is, pens and paper are not very compatible with the
digital technology that surrounds us in the modern world. It's all very well
scribbling little notes to yourself as you sit on the train, but what if you need to
put that information into your computer when you get home? Until recently,
your only option would have been to read back your notes and type in the
information (that is, write it out all over again)—but now there's a better
solution: the digital pen. Digital pens look like fatter versions of ordinary pens.
Packed with electronic circuits, optical devices, and Bluetooth® gizmos, they
can record the things you write as you write them and transmit them
automatically to your computer using wireless technology. Sounds amazing,
doesn't it? So how exactly does it all work?
Photo: A Nokia SU-27W digital pen. It's about four times fatter than a fountain pen, a little bit longer, but not all
that much heavier. This one is no longer available, but there are plenty of similar ones on the market.

Contents

1. The digital desktop


2. What's different about a digital pen?
3. How a digital pen works
4. What can you use digital pens for?
5. Will digital pens ever catch on?
6. What's inside a digital pen?
7. Find out more

The digital desktop

Photo: An optical computer mouse has much in common with a digital pen. Turn it over and you can see the
light that shines onto your desk and the photocell that picks up its reflection.

Chances are you already own something quite like a digital pen. If you have
an optical mouse (one that works by shining light onto your desk instead of
using a heavy, rolling, rubber ball), you're already using most of the
technology that a digital pen uses. If you lift up an optical mouse, you'll see
there are two optical components underneath: one that shines red light down
onto your desk and another one that detects the light as it bounces back up
again. The light is produced by a light-emitting diode (LED); right next to it,
there's a photoelectric cell—a component that detects the reflected LED light
and turns it back into an electrical signal. As you move your mouse around,
the pattern of red light reflected off the desk changes from one moment to the
next and the circuits inside the mouse use this to figure out exactly how you're
moving your hand.
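The tracking logic is, in essence, just adding up a stream of tiny displacements. Here's a simplified Python sketch (the sensor readings are invented; a real mouse estimates each displacement by comparing successive snapshots of the reflected light pattern):

```python
def track(deltas, start=(0, 0)):
    """Turn a stream of frame-to-frame displacements (dx, dy),
    as estimated from shifts in the reflected light pattern,
    into a path of cursor positions."""
    x, y = start
    path = [start]
    for dx, dy in deltas:
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

# Imagined sensor readings: move right, then diagonally down-right.
print(track([(2, 0), (2, 0), (1, 1), (1, 1)]))
# prints: [(0, 0), (2, 0), (4, 0), (5, 1), (6, 2)]
```

Because only relative movements are measured, the mouse never knows *where* it is on the desk, only how it has moved, which is the key limitation a digital pen overcomes.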
Now, clearly, you could write words with your optical mouse if you wanted to
and they would appear on your computer screen—but they'd appear as big,
fat, smudgy images not as clearly discernible words: your computer would
have no idea what you'd actually written and it would be impossible to import
your scribbles into a word-processor to edit them.

What's different about a digital pen?


If you look inside a digital pen, you'll find most of the same components that
are in an optical mouse. The difference is that they're stacked vertically rather
than horizontally: a digital pen is to an optical mouse what a skyscraper is to a
parking lot. Where an optical mouse tracks your hand movements by
reflecting light off your desktop, a digital pen does the same thing much more
precisely by following an almost invisible pattern of lines or pinpoints
(depending on which system you use) on special paper.

A mouse doesn't keep track of what you do, but a digital pen does: it tracks
its progress across the paper as you move it around and, in this way, captures
what you write. So that you can see exactly what you're doing, a digital pen
also has a conventional refill that leaves an ink trail, just like a normal pen.
The ink trail is purely for your convenience: the computer doesn't "see" it or
use it in any way. Every so often, you need to upload your writing to your
computer. Some digital pens upload when you plug them into a computer with
a USB cable, others upload through a docking station that also charges
the battery in the pen, while the most sophisticated ones can also transmit
words as you write them using a wireless technology such as infrared or
Bluetooth.

How a digital pen works


Move your digital pen across the special paper and this is what happens:
1. The ink refill leaves an ink trail on the page.
You can see this but the pen can't.
2. The infrared LED in the base of the pen
shines onto the page. You can't see it
because your eyes can't detect infrared.
3. The light detector, also in the base of the
pen, picks up the infrared reflected off
recognition marks printed on the special
paper.
4. The microchip in the pen uses the pattern of
reflections to store images of the words you're
writing.
5. The Bluetooth antenna built into the pen
transmits the stored data wirelessly and
invisibly through the air.
6. The wireless receiver in your computer picks
up the Bluetooth signals and stores what
you've written. Software in the PC converts
this data into normal, editable text you can
import into a word-processing program.
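The steps above can be sketched in miniature. In this toy Python model (all names invented for illustration; real pens such as Anoto's use a far denser, cryptic dot pattern decoded in firmware), each cell of the "special paper" carries a unique code, the pen looks up each code it "sees" to recover its position, and the resulting sequence of positions is the captured stroke:

```python
# Toy model of steps 3-5: the pen samples the pattern under its tip,
# decodes each sample to an (x, y) position, and stores the resulting
# stroke for later wireless upload.

# Build a toy "coded paper": every cell of a 4x4 grid gets a unique code.
PAPER = {(x, y): 4 * y + x for y in range(4) for x in range(4)}

# Invert it: seeing a code tells the pen exactly where on the page it is.
CODE_TO_POS = {code: pos for pos, code in PAPER.items()}

def decode_stroke(codes):
    """Turn a sequence of pattern codes (what the light detector 'sees')
    into a list of (x, y) points -- the captured pen stroke."""
    return [CODE_TO_POS[c] for c in codes]

# The pen drags across the top row of the page:
samples = [0, 1, 2, 3]          # codes read by the infrared sensor
stroke = decode_stroke(samples)
print(stroke)                   # [(0, 0), (1, 0), (2, 0), (3, 0)]
```

The key design point is that the position comes entirely from the paper's pattern, not from the ink trail, which is why the computer never needs to "see" what you wrote in ink.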

What can you use digital pens for?


Digital pens aren't all the same. There are three quite different kinds and they
do three quite different jobs:
1. Some are like thin, handheld scanners.
They're designed to turn printed text into
editable text on your computer using OCR
(optical character recognition). IRISPen is a
popular example.
2. Some are designed to "import" ordinary
handwriting into a computer as editable text.
Pens like this come with a PC software
package that imports the data the pen has
stored and decodes it, turning your scribbled
handwriting into editable text as good as you
could have typed from the keyboard.
3. Some work by reading or tracking complex
printed patterns from the paper and are mainly
used for filling in order forms (though they can
also do things like handwriting recognition).

Scanning pens

It's a bit of a misnomer to call these digital "pens," since they're essentially just
text scanners and don't actually write anything; you'll sometimes see them
described as "pen scanners" or "OCR pens." Some are battery powered and
have onboard flash memories to store things you scan as you're out and
about; you simply upload what you've scanned when you get back to your
computer. Others have long USB cables, so they work exactly like
conventional scanners but are a bit more portable if you hook them up to a
laptop.

Handwriting pens

More sophisticated digital pens are designed to capture your handwriting.


Now, if you're a fan of old-style technology, particularly classic technology like
the pen and paper, digital pens might seem completely frivolous—but just
consider for a moment how useful they could be in certain situations. If you're
a student taking notes in classes or lectures, imagine how brilliant it would be
to get back to your room, immediately upload all your notes to your computer
and instantly print them out in neat, typed form. Or, if you're a physician
(doctor), wouldn't it be handy if all the notes you scribbled about a patient
during an examination could be instantly uploaded onto their records as soon
as they left your consulting room?

Paper-tracking pens
Artwork: Some digital pens read writing with help from an almost invisible grid of tracking marks. In practice, the
grid uses a more irregular pattern and is less visible than this.

Digital pens have some pretty cunning new uses as well. The company that
devised much of the technology behind the latest generation of pens, Anoto,
envisages them as a super-convenient way of ordering information from
websites. Their brainwave was not to produce better digital pens but to
reinvent paper so that it's overlaid with an extraordinarily complex, almost
invisible pattern that's easy to vary for different purposes. So, for example, an
election voting paper would have a different pattern from a mail-order catalog
ordering form, and a mail-order form printed by Sears would be different to
one printed by Macy's.

Imagine if you wanted to order a Chinese take-away through a website. It can
be quite irritating to have to switch on your computer, wait for it to boot up, go
online, fill in one of those lengthy forms, enter all your payment details, and
finally wait for your food to arrive. It's so much quicker to do that by phone or
on paper. So Anoto's idea is that takeaways (and other companies using
online ordering) would print their catalogs or menus with their own unique
version of its specially coded paper. People could then tick the things they
wanted with their digital pens. Because of the unique pattern, the pen would
instantly know which company website the form referred to and send the
orders through to the correct place in a fraction of the time.
Photo: Anoto's digital pen (shown here in their original patent illustration) looks very much like the Nokia one
I've taken apart in this article. Using their numbering (but with my colors added for clarity): 11 (gray) is the pen
casing; 12 is the opening at the bottom of the pen through which light fires down at the paper and back up
again; 13 (red) is an LED; 14 (yellow) is a light sensor (either a CCD or CMOS sensor); 16 (green) is the main
circuit board; 20 (brown) is a digital display; 18 (purple) are buttons for switching the pen on and off or
controlling simple menu functions; 15 (orange) is the battery; 19 (blue) is the wireless transmitter. Artwork
from US Patent 6,502,756: Recording of information by Christer Fåhraeus, Anoto Ab, 7 January 2003, courtesy
of US Patent and Trademark Office.

Will digital pens ever catch on?


When I first wrote this article, back in 2008, Anoto was still quite a new
technology, and I commented: "Given that it marries the simplicity and
convenience of pens with the power of computers, it could have a very
promising future." Further back, in 2002, Wired magazine had noted that
"some observers believe digital pens will make traditional writing on paper
obsolete by 2020." Looking back now, it's well over a decade since Anoto was
granted its patents and the system is still relatively uncommon. Smartphones
with intuitively easy-to-use touchscreens that automatically know where they
are, faster mobile networks, and very usable apps have combined to make
online ordering much quicker and simpler than ever before. Most of us own a
smartphone and order from it all the time; how many of us own a digital pen or
have ever seen digital paper? Meanwhile, bigger-screen tablets and phablets
have come along, replacing paper notebooks altogether for many people. If
you're happy writing notes straight on your tablet, why bother with digital
paper at all? Handwriting apps like INKredible can replicate much of the
elegance of real-world handwriting, even when you write with your finger. Did
Anoto, obsessed with finding markets for a very clever digital pen-and-paper
system, simply fail to understand how touchscreens would come to dominate
the world? Was it way ahead of its time? Or a flawed idea whose time would
never come?

The digital pen makers don't give up easily! Livescribe (bought by Anoto in
2015) is also targeting the corporate market, while Leapfrog packages similar
technology in a colorfully chunky pen designed to help children learn to read.
The latest generation of handwriting pens (like the Neo Smartpen, Moleskine
Pen+, Montblanc StarWalker, and Livescribe) are specifically geared for use
with smartphones and come with their own dedicated iOS and Android apps.
All these systems are based on coded paper.

Meanwhile, Microsoft has been taking a slightly different tack,
quietly reinventing digital pens as faster, more intuitive interfaces to its
Surface range of touchscreen laptops and tablets. Instead of worrying about
the nitty gritty of data collection and processing, or figuring out how to get
marks made on paper into digital devices, Microsoft's Surface Pens focus
squarely on maximum usability by replicating the responsiveness and
familiarity of ordinary pens and pencils. So you can tilt them for different
effects and shade with them, just like you can with a pencil, and the "ink"
appears on the screen immediately, just as ink appears from a ballpoint, and
not lagging slightly behind your movements, as it often does with earlier,
clumsier digital pens.

What's inside a digital pen?


Take a Nokia digital pen apart and this is what you'll find:

1. Pen cap: Nothing hi-tech about this. It just
keeps the ink off your clothes and protects the
light detector.
2. Ordinary ink refill: Leaves a trail on the
paper so you know what you've written.
3. Docking connector: When the pen sits in its
docking cradle (not shown), this connector
charges the battery and downloads your
words, via the docking station, to your
computer.
4. Pen optics: This compartment holds the most
important parts of the pen: the LED light that
shines onto the paper and the photocell that
detects the reflected light. Unlike an optical
mouse, a digital pen uses invisible infrared, so
you can't actually see the light it uses.
5. Refill holder: It's as low-tech as it sounds: it's
a simple piece of plastic that holds the ink refill
in place.
6. Indicator lights: These shine up through the
pen case to tell you when the pen needs
recharging, when it's full of words, and so on.
7. Reset button: You can push a tiny little
rubber button on the base of the pen case to
reset it.
8. Rechargeable battery: This should last a few
years at least.
9. Vibrating motor: Have you ever wondered
how cellphones (mobile phones), pagers, and
other mobile devices give you one of those
vibrating alerts? Here's the answer. It's a
tiny electric motor with a wonky bit of metal on
the end. As the motor spins, the wonky metal
wobbles around on the end making the whole
thing shake like a badly loaded washing
machine.
10. Indicator lights on pen top: The lights
on the circuit board shine up through
transparent areas on the plastic pen case.

Photo: Antenna: Look at the back of the circuit board and you'll see the tiny little Bluetooth antenna (aerial) that
transmits your words to your computer. It's a little bit thicker than a pin. There are some more chips round the
back too.

How printing works


by Chris Woodford. Last updated: May 16, 2021.

It's amazing that you're sitting reading these words at your computer; back
in the 15th century, it would have been just as amazing to be reading them in
a book. That was when printing technology hit the big time and the invention
of the modern printing press made it possible for books to be reproduced in
their hundreds and thousands instead of being copied out laboriously, one at
a time, by hand. Although newspapers, books, and all kinds of other printed
materials are now shifting online, printing is just as important today as it's ever
been. Look around your room right now and you'll see all kinds of printed
things, from the stickers on your computer to the T-shirt on your back and the
posters on your wall. So how exactly does printing work? Let's take a closer
look!
Photo: Potato printing: This is printing the way most of us learn it. It's an example of relief printing in which the
ink is applied to a raised surface (the parts of the potato surface that haven't been cut away) before the paper
is pressed onto it.

Contents

1. What is printing?
2. Types of printing
o Relief printing
o Gravure printing
o Offset printing
3. Other types of printing
4. Black and white, grayscale, and color printing
5. Who invented printing?
6. Find out more
What is printing?
Printing means reproducing words or images on paper, card, plastic, fabric, or
another material. It can involve anything from making a single reproduction of
a priceless painting to running off millions of copies of the latest Harry Potter.
Why is it called printing? The word "printing" ultimately comes a Latin
word, premĕre, which means to press; just about every type of printing
involves pressing one thing against another.

Photo: A typical old-fashioned, wooden printing press, as used by none other than Benjamin Franklin around
1730. Photo from Carol M. Highsmith's America Project in the Carol M. Highsmith Archive, courtesy US Library
of Congress.

Although there are many different variations, typically printing involves
converting your original words or artwork into a printable form, called
a printing plate, which is covered in ink and then pressed against pieces of
paper, card, fabric, or whatever so they become faithful reproductions of the
original. Some popular forms of printing, such
as photocopying and inkjet and laser printing, work by transferring ink to paper
using heat or static electricity and we won't discuss them here; the rest of this
article is devoted to traditional printing with presses and ink.

Printing is hard, physical work so it's usually done with the help of
a machine called a printing press. The simplest (and oldest) kind of press is
a large table fitted with an overhead screw and lever mechanism that forces
the printing plate firmly against the paper. Hand-operated presses like this are
still occasionally used to produce small volumes of printed materials. At the
other end of the scale, modern presses used to print books, newspapers, and
magazines use cylinder mechanisms rotating at high speed to produce
thousands of copies an hour.

Animation: How a traditional printing press works. 1) You put the original item you want to print from (typically
metal type, black) face up on a table (light gray) and cover it evenly with ink (blue). You put the paper you want
to print onto in a wooden frame and slide it along the table under the press. 2) The press consists of two blocks
(dark and light gray) held together by a screw mechanism supported by a sturdy wooden frame (brown). 3)
When you turn the lever, the lower block, called the platen (4), screws downward and presses the frame,
containing the paper, tightly and evenly onto the inked type (5). Finally, you loosen the screw and remove the
printed paper from the frame.

Types of printing
The three most common methods of printing are called relief (or letterpress),
gravure (or intaglio), and offset. All three involve transferring ink from a
printing plate to whatever is being printed, but each one works in a slightly
different way. First, we'll compare the three methods with a quick overview
and then we'll look at each one in much more detail.

 Relief is the most familiar kind of printing. If
you've ever made a potato print or used an
old-fashioned typewriter, you've used relief
printing. The basic idea is that you make a
reversed, sticking-up (relief) version of
whatever you want to print on the surface of
the printing plate and simply cover it with ink.
Because the printing surface is above the rest
of the plate, only this part (and not the
background) picks up any ink. Push the inked
plate against the paper (or whatever you're
printing) and a right-way-round printed copy
instantly appears.
 Gravure is the exact opposite of relief printing.
Instead of making a raised printing area on the
plate, you dig or scrape an image into it (a bit
like digging a grave, hence the name gravure).
When you want to print from the plate, you
coat it with ink so the ink fills up the places
you've dug out. Then you wipe the plate clean
so the ink is removed from the surface but left
in the depressions you've carved out. Finally,
you press the plate hard against the paper (or
other material you're printing) so the paper is
pushed into the inky depressions, picking up a
pattern only from those places.
 Offset printing also transfers ink from a
printing plate onto paper (or another material),
but instead of the plate pressing directly
against the paper, there is an extra step
involved. The inked plate presses onto a soft
roller, transferring the printed image onto it,
and then the roller presses against the printing
surface—so instead of the press directly
printing the surface, the printed image is first
offset to the roller and only then transferred
across. Offset printing stops the printing plate
from wearing out through repeated
impressions on the paper, and produces
consistently higher quality prints.
Photo: The three most common types of printing: Left: Relief—Raised parts of the printing block (gray) transfer
the ink (red) to the paper (white rectangle with black outline at the top) when the two are pressed together.
Middle: Gravure—Grooves dug into the printing block transfer the ink to the paper when the paper is pressed
tightly into them. Right: Offset—A rotating cylinder (blue) transfers ink from the printing plate to the paper
without the two ever coming into contact.

Relief printing

For over 500 years now, most high-volume, low-quality printed material has
been produced with letterpress machines, which are more or less
sophisticated versions of the printing press Johannes Gutenberg invented
back in the 15th century. In the simplest kind of letterpress, known as
a flatbed press or platen press, the paper is supported on a flat metal plate
called the platen, which sits underneath a second flat plate holding a relief
version of the item to be printed (the printing plate, in other words). The
printing plate is covered with ink (either by hand, with a brush or by an
automated roller) before the paper is pressed tightly against it and then
released. The process can be repeated any number of times.

Photo: The keys in an old-fashioned typewriter produce images on paper by relief printing. When you press a
key, these metal type letters flip up and press a piece of inked fabric against the paper. The letters are cast in
reverse so, when they hit the paper, the printed impression comes out the right way round. Typewriters like this
are now largely obsolete, but great fun to use—if a little noisy—when you can find them!

Flatbed presses are generally the slowest of all printing methods, because it
takes time to keep lifting and inking the printing plate and loading and
removing sheets of paper. That's why most letterpresses use rotating
cylinders in place of one or both of the flat beds. In one type of machine,
known as a flatbed cylinder press, the printing plate is mounted on a flat bed
that shifts back and forth as a cylinder moves past it, inking it, pressing the
paper against it, and then lifting the printed paper clear again. That speeds up
printing considerably, but loading and removing the paper is still a slow
process. The fastest letterpresses, known as rotary webfed presses, have
curved printing plates wrapped around spinning metal cylinders, which they
press against paper that feeds automatically from huge rolls called webs.
Newspapers are printed on machines like this, which typically print both sides
of the paper at once and can produce thousands of copies per hour.

Gravure printing

The simplest kind of gravure printing is engraving, in which an artist draws a
picture by scratching lightly on the surface of a copper plate that has been
thoroughly coated with an acid-resistant chemical. Lines of shiny copper are
revealed as the artist scrapes away. The plate is then dipped in acid so the
exposed copper lines are etched (eaten much deeper into the metal) by the
acid, while the rest of the plate remains unchanged. The acid-resistant
chemical is then washed off leaving a copper printing plate, from which a
number of copies, called etchings, can be printed. Traditional engraving and
etching is quite a laborious process, so it's used mainly by artists to produce
relatively small volumes of (originally) hand-drawn pictures.

A similar but much quicker and more efficient process called photogravure is
used commercially to produce large volumes of high-quality prints. Instead of
being slowly and painstakingly drawn, the image to be printed is
transferred photographically onto the copper printing plate ("photo") and then
etched into it ("gravure"). Once the plate has been produced, it's used to make
prints on either a flatbed press (fed with single printed sheets) or a rotary web
press known as a rotogravure machine. Glossy magazines and cardboard
packaging containers are often printed this way.

Offset printing
The most common type of printing today uses a method called offset
lithography (typically shortened to "offset litho"), which is a whole lot simpler
than it sounds. As we've already seen, offset simply means that the printing
plate doesn't directly touch the final printed surface (the paper or whatever it
might be); instead, an intermediate roller is used to transfer the printed image
from one to the other. But what about lithography?

Photo: A modern offset printing press used to produce small runs of a weekly newspaper. Note the final printed
copy on the top roller and the offset cylinder in the middle just underneath it. Photo by Senior Airman Dilia
DeGrego courtesy of US Air Force.

Lithography literally means "stone-writing," a method of printing from the
surface of stones that was invented in 1798 by German actor and playwright
Alois Senefelder. He took a large stone and drew a design on it with a wax
crayon. Then he dipped the stone in water so the parts of the stone not
covered in crayon became wet. Next, he dipped the design in ink, so the ink
stuck only to the waxed parts of the stone and not the wet parts. So now he
had an inked "printed plate" (or printing stone, if you prefer) that he could
press against paper to make a copy. Lithography avoids the need to make a
traditional printing plate, as you need for both relief and gravure printing.
Photo: A small offset printing press. Note the paper sheets feeding in from the left and the rollers that transfer
the paper and copy the image. Photo by J. Pond courtesy of Defense Imagery.

Modern offset lithography printing presses use an updated version of the
same basic idea in which the stone is replaced with a thin metal printing plate.
First, the image to be printed is transferred photographically to the plate. The
parts of the plate from which the image is printed are coated with lacquer
(clear varnish), so they attract ink, while the rest of the plate is coated with
gum, so it attracts water. The metal plates are curved around a printing
cylinder and press against a series of rollers, which dampen them with water
and then brush them with ink. Only the lacquered parts of the plate (those that
will print) pick up ink. The inked plate presses against a soft rubber (offset)
cylinder, known as the blanket cylinder, and transfers its image across. The
blanket cylinder then presses against the paper and makes the final print.
High-speed offset lithography presses are web-fed (from paper cylinders) and
can produce something like 20km (~12 miles) of printed material in an hour!

Other types of printing


Relief, gravure, and offset are used to print the overwhelming majority of
books, magazines, posters, headed stationery, and other printed materials
that surround us, but several other methods are used for printing other things.
T-shirt designs, for example, are usually produced with a process called silk-
screen printing (sometimes called serigraphy). This involves covering the
article to be printed (something like a blank cotton shirt) with a mesh-screen
and a stencil, then wiping ink over the mesh with a brush. Ink transfers
through the mesh to the fabric below except where it's blocked from doing so
by the pattern on the stencil. Collotype (also called photographic gelatin) is a
less commonplace technique in which a gelatin-coated printing plate is made
from a high-quality original using a kind of photographic method. It produces
finely detailed reproductions and is still used for making high-quality prints of
paintings.

Black and white, grayscale, and color printing

Photo: Halftones: Here's the same photo up above as a newspaper might print it using different sized areas of
black ink. If you squint, or look from a distance, you can see that it looks like it's been printed with many
different shades of gray, even though it's really using only one color of ink (black). In practice, newspapers use
much smaller dots than this—we've exaggerated greatly so you can see how it works. Photo by Senior Airman
Dilia DeGrego courtesy of US Air Force, with simulated halftone treatment by explainthatstuff.

Traditionally, printing presses used a single color of ink (black) to produce
basic black-and-white text, but printing photographs and artworks was much
more difficult because they really needed to be printed either with many colors
or many shades of gray. That problem was solved when people discovered
how to simulate shades of gray using what's called the halftone method. It's a
simple way of converting photographs and drawings into images made from
tiny black dots of differing sizes to give the impression they're made from
many different shades of gray. In other words, it's a way of making a
convincing gray-scale image using only black ink, and it relies on fooling your
eyes through an optical illusion.
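The halftone idea is easy to sketch in code. In this minimal Python example (the 3×3 "image" and the maximum dot size are invented sample values), each gray level is replaced by a number standing for a dot radius: darker areas get bigger dots, pure white gets no dot at all. A real press, of course, renders dots of actual ink rather than numbers:

```python
# Minimal sketch of the halftone method: map gray levels to dot sizes.

MAX_DOT = 4  # largest dot radius we allow -- an arbitrary choice

def halftone_dot(gray):
    """Map a gray level (0 = black, 255 = white) to a dot radius:
    darker areas get bigger dots, pure white gets no dot at all."""
    darkness = (255 - gray) / 255          # 0.0 (white) .. 1.0 (black)
    return round(darkness * MAX_DOT)

image = [
    [0,   128, 255],   # black, mid-gray, white
    [64,  128, 192],
    [255, 128, 0],
]

# Replace every pixel with the size of the black dot that represents it.
dots = [[halftone_dot(g) for g in row] for row in image]
print(dots)   # [[4, 2, 0], [3, 2, 1], [0, 2, 4]]
```

Squint at a grid of such dots and your eye averages them into smooth shades of gray, exactly the illusion the paragraph above describes.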

To print in full color, you need to use at least four different inks: three primary
ink colors and black. Most people know that you can produce light of any color
by adding together different amounts of red, green, and blue light; that's how
a television or LCD computer screen works. Colored inks work in a different
way by subtracting color: they absorb some of the light that falls on them and
reflect the rest into our eyes—so the color they appear is effectively
subtracted from the original, incoming light. If you have an inkjet printer with
replaceable cartridges, you'll know that you can print any color on white paper
using the three colors cyan (a kind of turquoise blue), magenta (a reddish
purple), and yellow. Theoretically, you can produce black with equal amounts
of cyan, magenta, and yellow, but in practice you need a fourth ink as well to
produce a deep convincing black. That's why full-color printing is often
referred to as the four-color process, sometimes as CMYK printing (Cyan,
Magenta, Yellow, and K meaning "key," a printer's word that usually means
black), and sometimes (since each color has to be printed separately)
as color-separation printing. Just like with black-and-white printing, the
halftone process can also be used to create varying shades of color.
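The subtractive mixing described above can be made concrete with the standard textbook RGB-to-CMYK conversion, where the black (K) ink takes over the gray component that cyan, magenta, and yellow would otherwise have to build up together. This is the naive formula, not a real press's calibrated color profile:

```python
# Naive RGB -> CMYK conversion illustrating subtractive color mixing.

def rgb_to_cmyk(r, g, b):
    """Convert RGB values in 0..1 to (C, M, Y, K), also in 0..1."""
    k = 1.0 - max(r, g, b)        # how dark the color is overall
    if k == 1.0:                  # pure black: only the K ink is needed
        return (0.0, 0.0, 0.0, 1.0)
    # The remaining darkness in each channel is supplied by C, M, and Y,
    # rescaled so the inks don't duplicate what K already covers.
    c = (1 - r - k) / (1 - k)
    m = (1 - g - k) / (1 - k)
    y = (1 - b - k) / (1 - k)
    return (c, m, y, k)

print(rgb_to_cmyk(1, 0, 0))   # pure red -> (0.0, 1.0, 1.0, 0.0)
print(rgb_to_cmyk(0, 0, 0))   # black    -> (0.0, 0.0, 0.0, 1.0)
```

Notice that pure red needs full magenta and full yellow but no cyan: the magenta and yellow inks each subtract one of the other primaries from the white light hitting the page, leaving only red to reflect back.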

Photo: Color printing: With black, magenta, cyan, and yellow ink, you can print any color you like.

Why is color printing more expensive?

Printing in color costs much more than printing in black-and-white, for various
reasons. First, and most obviously, there are four inks involved instead of just
one and each is printed by its own printing plate, so the cost of making the
printing plates alone is several times greater. Second, color printing presses
need to be able to print the four inks on the page one after another, in perfect
alignment, so they need to be considerably more sophisticated and precise.
Third, it takes extra time and effort for the person operating the printer to
check that the colors have been aligned and reproduced successfully, so
there's more human effort involved. Finally, because color printing is often
used for reproducing photographs, heavier, glossier, and more expensive
paper is usually needed to do it justice.

Sometimes designers get around the cost of color printing by using different
colored papers and inks. So, instead of printing black ink on white paper, they
might print black ink on red paper or red ink on yellow paper. That achieves a
colorful effect but keeps the cost down by still using only a single color of ink.
Another option is to use spot-color printing, where a single, specially mixed
color is applied to a black-and-white document—though that is labor intensive
and can work out even more expensive than four-color printing. Another
alternative is to use two- or three-color printing, in which pages are printed
with black and one or two other colors. If you were using just cyan and
magenta inks, for example, you could create a whole range of reds and blues
and print quite colorful pages without the expense of four-color printing.
Another option is to print some pages with the four-color process and other
pages with only black-and-white. Books that contain photographs are often
made this way, with the art pages printed through the four-color process on
glossy paper that's bound inside text pages printed with the black-and-white
process on ordinary paper.

Who invented printing?


If your immediate answer was "Johannes Gutenberg," you're only half-right.
As this whistle-stop tour through printing history will show you, the celebrated
German appeared only halfway through the story—one of many people who
made printing what it is today.
Artwork: A drawing of Ottmar Mergenthaler's revolutionary Linotype typesetting machine, taken from his
original US patent #543,497: Linotype machine, courtesy of US Patent and Trademark Office.

 ~3000-1000BCE: Ancient Babylonia: People
use signet stones (stones with designs cut into
their surface) dipped in pigment (paint) to print
their signatures in an early example of gravure
printing.
 ~30BCE–500CE: Ancient Rome: Slaves
laboriously copy out manuscripts by hand.
 105CE: The Chinese invent the first paper,
based on tree bark.
 ~500CE: The Chinese perfect printing from a
single wooden block into which designs are
slowly and laboriously engraved.
 ~751CE: A book called Mugujeonggwang
Daedaranigyeong (The Great Dharani Sutra)
is printed with wooden blocks. Today, it's
believed to be the oldest surviving printed text
in the world.
 ~900CE: Wooden block printing is further
developed in Goryeo (a kingdom of Korea that
lasted from the 10th-14th centuries).
 ~1040CE (11th century): A Chinese printer
called Bi Sheng invents the idea of printing
with movable type. He makes lots of
small clay blocks, each containing a separate
letter or character in relief, and rearranges
them in a printing frame so he can print many
different things. Unfortunately, because the
Chinese language can use thousands of
different characters, the idea doesn't
immediately catch on; printers prefer to carry
on using carved wooden blocks.
 11th-13th centuries: In Goryeo, scholars
produce the Tripitaka Koreana, a collection of
Buddhist scriptures carved onto some 81,000
wooden printing blocks.
 12th-14th centuries: The technology
of papermaking is transferred from eastern to
western countries.
 Late 1300s: Block printing is first used in
Europe.
 1377: In Goryeo, a two-volume book
called Jikji (an anthology of Zen Buddhist
teachings) is printed with metal type almost 80
years before Gutenberg. Only one copy
survives, currently preserved in the National
Museum of Korea in Seoul.
 1450: Johannes Gutenberg develops the
first modern printing press using movable
metal type (with each small printing letter or
character cast in relief out of metal). Note that
he didn't invent either the printing press or
movable type: his innovation was to bring
these things together in a powerful new way
that caught on and spread rapidly through the
world.
 1456: Gutenberg prints the famous Gutenberg
Bible.
 ~1500: Thousands of printers have adopted
Gutenberg's method and millions of books are
being printed.
 1798: German actor and writer Alois
Senefelder invents lithography, the water-and-
grease printing process used in modern offset
printing.
 1800: English nobleman Charles, Earl of
Stanhope, makes the first iron printing press.
 1803: Brothers Henry and Sealy Fourdrinier
invent the modern papermaking machine,
based on a series of huge rollers arranged in a
row.
 1814: Long before electric power becomes
widely available, Friedrich König invents the
steam-driven printing press to speed up the
laborious printing process. It's a type of flatbed
cylinder press, in which the cylinder is
powered by a steam engine.
 1846: Richard March Hoe develops the rotary
press for newspaper printing and later perfects
it so it can print on both sides of the paper at
up to 20,000 pages per hour.
 1863: William Bullock invents the web-fed
rotary press for printing newspapers from giant
rolls of paper.
 1868: Christopher Latham Sholes develops a
machine for printing personal letters and other
writing—the typewriter with its QWERTY
keyboard.
 1880s: American brothers Max and Louis Levy
develop halftone printing.
 1886: Ottmar Mergenthaler invents
the Linotype machine, an automated way of
creating "hot metal" type in a printing plate by
casting a whole line of a book or magazine at
a time. Typesetting, as this innovation is
known, allows newspapers to be printed more
quickly and efficiently than ever before.
Photo: A vintage Linotype machine viewed from the side. The
person operating the machine enters text on the keyboard (at the
bottom). The machine then creates a mold of each line of text
that's filled with molten metal to form a line of type that can be
used for printing. Photo by Carol M. Highsmith, from Gates
Frontiers Fund Colorado Collection within the Carol M. Highsmith
Archive, Library of Congress, Prints and Photographs Division.
 1887: American inventor Tolbert
Lanston develops a rival typesetting system
called Monotype, which sets each letter or
character as a separate piece of type instead
of a whole line.
 1905: Ira Rubel develops offset printing.
 1912: Walter Hess of Switzerland is one of the
first people to experiment with lenticular
printing (putting lenses over printed paper to
make a 3D effect).
 1938: Chester Carlson invents the basic
principle of the photocopier (a way of
reproducing documents almost instantly using
static electricity), though another decade
passes before the first commercial copier goes
on sale and the invention isn't taken up widely
until marketed by Xerox in the 1960s and
1970s.
 1949: Phototypesetting machines are
invented, which produce type by photographic
methods instead of using "hot metal." First is
the Lumitype, invented at ITT by Frenchmen
René Higonnet and Louis Moyroud, and later
developed by the American Lithomat
corporation, which markets the machine as the
Lumitype Photon.
 1967: Gary Starkweather of Xerox gets the
idea to develop a laser printer, and is finally
granted a patent 10 years later. The machine
he produces is very large and cumbersome by
today's standards.
 1980s: Scott Crump of Stratasys pioneers the
modern approach to 3D printing called fused
deposition modeling (building up a 3D object
from hot plastic, layer by layer).
 1984: Steve Jobs launches the Apple
Macintosh computer (loosely based on the
earlier Xerox Alto), which, partnered with a
compact and (relatively) affordable laser
printer, begins a revolution in desktop
publishing.
 1989: Tim Berners-Lee writes a proposal for
an online publishing system called the World
Wide Web, which makes it possible to publish
documents instantly and view them anywhere
else in the world, seconds later, using
the Internet.

Laser printers


by Chris Woodford. Last updated: December 26, 2020.

Have you ever tried writing with a beam of light? Sounds impossible, doesn't it? But it's

exactly what a laser printer does when it makes a permanent copy of data (information) from
your computer on a piece of paper. Thanks to sci-fi and spy movies, we tend to think of lasers as
incredibly powerful light beams that can slice through chunks of metal or blast enemy spaceships
into smithereens. But tiny lasers are useful too in a much more humdrum way: they read sounds
and video clips off the discs in CD and DVD players and they're vital parts of most office
computer printers. All set? Okay, let's take a closer look at how laser printers work!

Photo: A compact laser printer doesn't look that different to an inkjet printer, but it puts ink on the page in a completely
different way. An inkjet printer uses heat to squirt drops of wet ink from hot, syringe-like tubes, while a laser printer uses
static electricity to transfer a dry ink powder called toner.

Contents

1. Laser printers are similar to photocopiers


2. How a laser printer works
3. Who invented laser printers?
4. The first laser printer
5. Are laser printers bad for you?
6. Find out more

Laser printers are similar to photocopiers

Photo: Ink sticks to a laser printer's drum the way this balloon sticks to my pullover: using static electricity.
Laser printers are a lot like photocopiers and use the same basic technology. Indeed, as we
describe later in this article, the first laser printers were actually built from modified
photocopiers. In a photocopier, a bright light is used to make an exact copy of a printed page.
The light reflects off the page onto a light-sensitive drum; static electricity (the effect that makes
a balloon stick to your clothes if you rub it a few times) makes ink particles stick to the drum;
and the ink is then transferred to paper and "fused" to its surface by hot rollers. A laser printer
works in almost exactly the same way, with one important difference: because there is no
original page to copy, the laser has to write it out from scratch.

Imagine you're a computer packed full of data. The information you store is in electronic format:
each piece of data is stored electronically by a microscopically small switching device called
a transistor. The printer's job is to convert this electronic data back into words and pictures: in
effect, to turn electricity into ink. With an inkjet printer, it's easy to see how that happens: ink
guns, operated electrically, fire precise streams of ink at the page. With a laser printer, things are
slightly more complex. The electronic data from your computer is used to control a laser beam—
and it's the laser that gets the ink on the page, using static electricity in a similar way to a
photocopier.

How a laser printer works


When you print something, your computer sends a vast stream of electronic data (typically a few
megabytes, or millions of characters) to your laser printer. An electronic circuit in the printer figures
out what all this data means and what it needs to look like on the page. It makes a laser beam
scan back and forth across a drum inside the printer, building up a pattern of static electricity.
The static electricity attracts a kind of powdered ink called toner onto the drum, which
transfers it to the paper. Finally, as in a photocopier, a fuser unit bonds the toner to the paper.
1. Millions of bytes (characters) of data stream into the
printer from your computer.
2. An electronic circuit in the printer (effectively, a small
computer in its own right) figures out how to print this
data so it looks correct on the page.
3. The electronic circuit activates the corona wire. This is a
high-voltage wire that gives a static electric charge to
anything nearby.
4. The corona wire charges up the photoreceptor drum so
the drum gains a positive charge spread uniformly across
its surface.
5. At the same time, the circuit activates the laser to make it
draw the image of the page onto the drum. The laser beam
doesn't actually move: it bounces off a moving mirror that
scans it over the drum. Where the laser beam hits the
drum, it erases the positive charge that was there and
creates an area of negative charge instead. Gradually, an
image of the entire page builds up on the drum: where the
page should be white, there are areas with a positive
charge; where the page should be black, there are areas of
negative charge.
6. An ink roller touching the photoreceptor drum coats it
with tiny particles of powdered ink (toner). The toner has
been given a positive electrical charge, so it sticks to the
parts of the photoreceptor drum that have a negative
charge (remember that opposite electrical charges attract
in the same way that opposite poles of a magnet attract).
No ink is attracted to the parts of the drum that have a
positive charge. An inked image of the page builds up on
the drum.
7. A sheet of paper from a hopper on the other side of the
printer feeds up toward the drum. As it moves along, the
paper is given a strong negative electrical charge by
another corona wire.
8. When the paper moves near the drum, its negative charge
attracts the positively charged toner particles away from
the drum. The image is transferred from the drum onto
the paper but, for the moment, the toner particles are just
resting lightly on the paper's surface.
9. The inked paper passes through two hot rollers (the fuser
unit). The heat and pressure from the rollers fuse the
toner particles permanently into the fibers of the paper.
10. The printout emerges from the side of the printer. Thanks
to the fuser unit, the paper is still warm. It's literally hot
off the press!
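The electrostatic logic of steps 3–8 can be sketched as a toy simulation. This is purely illustrative (the function name, the one-line "drum," and the `'#'`/`'.'` pattern are my own inventions, not anything from a real printer), but it captures why toner ends up only where the page should be black:

```python
def print_line(page_pattern):
    """Simulate one scan line. page_pattern is a string of
    '#' (should be black) and '.' (should be white)."""
    # Step 4: the corona wire gives the drum a uniform positive charge.
    drum_charge = ['+'] * len(page_pattern)

    # Step 5: the laser flips the charge to negative wherever
    # the page should be black.
    for i, ch in enumerate(page_pattern):
        if ch == '#':
            drum_charge[i] = '-'

    # Step 6: positively charged toner sticks only to the negative
    # areas (opposite charges attract).
    drum_toner = ['#' if c == '-' else '.' for c in drum_charge]

    # Steps 7-8: the strongly negatively charged paper pulls the
    # positive toner off the drum, transferring the image.
    return ''.join(drum_toner)

print(print_line('##..##..'))  # → ##..##..
```

Note how the paper ends up carrying exactly the pattern the laser drew: the charges act as an invisible intermediate copy of the page.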


Who invented laser printers?


Until the early 1980s, hardly anyone had a personal or office computer; the few people who did
made "hardcopies" (printouts) with dot-matrix printers. These relatively slow machines made a
characteristically horrible screeching noise because they used a grid of tiny metal needles,
pressed against an inked ribbon, to form the shapes of letters, numbers, and symbols on the page.
They printed each character individually, line by line, at a typical speed of about 80 characters
(one line of text) per second, so a page would take about a minute to print. Although that sounds
slow compared to modern laser printers, it was a lot faster than most people could bash out
letters and reports with an old-style typewriter (the mechanical or electric keyboard-operated
printing machines that were used in offices for writing letters before affordable computers made
them obsolete). You still occasionally see bills and address labels printed by dot-matrix; you can
always tell because the print is relatively crude and made up of very visible dots. In the mid-
1980s, as computers became more popular with small businesses, people wanted machines that
could produce letters and reports as quickly as dot-matrix printers but with the same kind of print
quality they could get from old-fashioned typewriters. The door was open for laser printers!
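The "about a minute to print" figure follows from simple arithmetic, assuming a typical page of roughly 60 lines of 80 characters (both figures are assumptions for illustration):

```python
chars_per_second = 80   # typical dot-matrix speed quoted above
chars_per_line = 80
lines_per_page = 60     # assumed for a full page of text

page_chars = lines_per_page * chars_per_line      # 4800 characters
seconds_per_page = page_chars / chars_per_second
print(seconds_per_page)  # → 60.0 (about a minute)
```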

Fortunately, laser-printing technology was already on the way. The first laser printers had been
developed in the late 1960s by Gary Starkweather of Xerox, who based his work on the
photocopiers that had made Xerox such a successful corporation. By the mid-1970s, Xerox was
producing a commercial laser printer—a modified photocopier with images drawn by a laser—
called the Dover, which could knock off about 60 pages a minute (one per second) and sold for
the stupendous sum of $300,000. By the late 1970s, big computer companies, including IBM,
Hewlett-Packard, and Canon, were competing to develop affordable laser printers, though the
machines they came up with were roughly 2–3 times bigger than modern ones—about the same
size as very large photocopiers.

Two machines were responsible for making laser printers into mass-market items. One was
the LaserJet, released by Hewlett-Packard (HP) in 1984 at a relatively affordable $3495. The
other, Apple's LaserWriter, originally cost almost twice as much ($6995) when it was launched
the following year to accompany the Apple Macintosh computer. Even so, it had a huge impact:
the Macintosh was very easy to use and, with relatively inexpensive desktop-publishing software
and a laser printer, it meant almost anyone could turn out books, magazines, and anything and
everything else you could print onto paper. Xerox might have developed the technology, but it
was HP and Apple who sold it to the world!

The first laser printer


Dipping into the archives of the US Patent and Trademark Office, I've found one of Gary
Starkweather's original laser-printer designs, patented on June 7, 1977. To make it easier to
follow, I've colored it in and annotated it more simply than the technical drawing in the original
patent (if you wish, you can find the full details filed under US Patent 4027961: Copier/Raster
Scan Apparatus).

What we have is essentially a laser scanning unit (colored blue) sitting on top of a fairly
conventional, large office photocopier (colored red). In Starkweather's design, the laser scanner
slides on and off the glass window of the photocopier (the place where you would normally put
your documents, face down), so the same machine can be used as either a laser printer or a copier
—anticipating all-in-one office machines by about 20–25 years.
Artwork: Gary Starkweather's original laser printer design from US Patent 4027961: Copier/Raster Scan Apparatus,
courtesy of US Patent and Trademark Office.

How does it work?

1. The laser scanner creates the image.


2. The image is beamed through the glass copier window
into the copier mechanism underneath.
3. The image is reflected by a mirror.
4. A lens focuses the image.
5. A second mirror reflects the image again.
6. The image is transferred onto the photocopier belt.
7. A developer unit converts the image into printable form.
8. The printable image is transferred to the paper.
9. The fuser permanently seals the image onto the page,
which emerges into the collecting rack at top of the
machine.

Are laser printers bad for you?


I used to share an office with someone who refused to share our office with a
laser printer; we had to move our machine into a closet and keep the door
shut tight. This kind of worry is far from rare, but is it simply superstition? As
we saw up above, laser printers use a type of solid ink called toner, which can
be a source of dusty, fine particulates (remember that sooty particulates,
released by such things as car tailpipes, are one of the more worrying
ingredients in urban air pollution). One recent study found some printers emit
nearly 10 billion particles per printed page (although it's important to note that
the type and quantity of particle emissions vary widely from model to model).
They also produce volatile organic compounds (VOCs) and a gas called
ozone (a very reactive type of oxygen with the chemical formula O₃), which is
toxic and, at high enough concentrations, produces a variety of health
impacts. Thankfully, ozone is transformed into ordinary oxygen (O₂) relatively
quickly inside buildings.

Do printers and copiers present any risk to our health? A few scientific studies
have been done; although the results are mixed, they do seem to suggest it's
well worth taking precautions, such as placing your printer well away from
your workstation, if you use it a great deal, and ensuring good ventilation. You
should also take great care when changing toner cartridges or handling empty
ones. You'll find a list of recent studies in the further reading below.

Electronic books

by Chris Woodford. Last updated: July 21, 2020.


Back in the 19th century, English author Martin Tupper wrote: "A good

book is the best of friends, the same today and for ever." It's true: books are
friendly, familiar, and loveable and that probably explains why it's taking us so
long to get used to the idea of portable electronic books. But with the arrival of
a new generation of electronic book readers, notably the Amazon Kindle,
many people started to wonder if the days of the printed word just might be
numbered. Let's take a closer look at electronic books (ebooks) and find out
how they work!
Photo: The easiest way to read books electronically is to buy them through your smartphone and read them on-
screen with an app. Here I'm about to make a start on Vladimir Nabokov's Pale Fire. The small page size is a
big drawback, and you might not enjoy reading off an LCD screen for long periods. On the other hand, if your
phone goes with you everywhere, this is a great way to carry books around without the extra weight and bulk.

Contents

1. Two in one: books... and the information they contain


2. How do you store a book in electronic form?
3. How do you read an electronic book file?
4. How does E Ink® work?
5. How does electronic ink and paper work?
6. Which electronic book reader should you buy?
7. Who invented electronic books?
8. Find out more

Two in one: books... and the information they contain


Think of a book and you think of a single object, but the books we read are
actually two things in one: there's the information (the words and pictures and
their meaning) and there's the physical object (the paper, cardboard, and ink)
that contains them. Sometimes the physical part of a book is as important as
the information it carries: it's really true that we judge books by their covers—
at least when we're standing in shops deciding which ones to buy—and that's
why publishers devote so much attention to making their books look attractive.
But, a lot of the time, the information is much more important to us and we
don't really care how it's delivered. That's why many of us now turn to
the Web when we want to find things out instead of visiting the local library.

In short, we've learned to split off the information we need from the way it's
delivered. Ebooks take this idea a step further. When we talk about an ebook,
we really mean a digital version of a printed text that we can read on a
handheld electronic device like a miniature laptop computer— two quite
separate things, once again.

Photos: 1) Amazon Kindle Paperwhite electronic book reader. 2) The Kindle's now obsolete rival, the Sony
Reader PRS-350, was considerably smaller and designed to carry around in your pocket. Although it's no
longer made, it's a good, everyday little reader and relatively easy to pick up on auction sites like eBay. Both
are smaller than the first generation of Kindles, because their touch-sensitive screens do away with the need
for a separate keyboard.


How do you store a book in electronic form?


An ebook is really just a computer file full of words (and sometimes images).
In theory, you could make an ebook just by typing information into a word
processor. The file you save has all the elements of an electronic book: you
can read the information on a computer, search it for keywords, or share it
easily with someone else.

The first attempt to create a worldwide library of ebooks was called Project
Gutenberg and it's still running today. Long before the World Wide Web came
along, a bunch of dedicated Gutenberg volunteers took printed books and
scanned or typed them into their computers to make electronic files they could
share. For legal reasons, these books were (and still are) mostly classic old
volumes that had fallen out of copyright. The electronic versions of these
printed books are very basic, text-only computer files stored in a format
called ASCII (American Standard Code for Information Interchange)—a way
of representing letters, numbers, and symbols with the numbers 0–127 that
virtually every computer can understand.
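What ASCII storage means in practice is that each character of the book becomes one small number, and decoding simply reverses the mapping. A quick sketch:

```python
text = "A good book"

# Encoding: each character maps to one ASCII code number.
codes = [ord(c) for c in text]
print(codes)  # → [65, 32, 103, 111, 111, 100, 32, 98, 111, 111, 107]

# Decoding reverses the process, recovering the original text.
print(''.join(chr(n) for n in codes))  # → A good book
```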

Photo: The Amazon Kindle Paperwhite electronic book reader (left) alongside the rival Sony eReader (right).
This Kindle has a fairly unobtrusive set of LED lights built around the screen to make reading easier in the dim
evening light. Although it's hard to see in this photo, the Paperwhite does have a much whiter screen than the
Sony. Even so, I find the text much sharper on the Sony. It's also worth pointing out that I've owned two of
these Sonys and the screen on one was noticeably better than the other. In other words, the quality of ebook
screens definitely does vary.

The problem with ASCII is that the text contains very little formatting
information: you can't distinguish headings from text, there's only one basic
font, and there's no bold or italics. That's why people developed much more
sophisticated electronic files like PDF (Portable Document Format). The basic
idea of PDF was to store an almost exact replica of a printed document in an
electronic file that people could easily read on screens or print out, if they
preferred. The HTML files people use to create web pages are another kind of
electronic information. Every HTML page on a website is a bit like a separate
page in a book, but the links on web pages mean you can easily hop around
until you find exactly the information you want. The links on websites give you
powerfully interconnected information that is often much quicker to use than a
library of printed books.

The greatest strength of ASCII, PDF, and HTML files (you can read them on
any computer) is also their greatest weakness: who wants to sit staring at a
computer screen, reading thousands of words? Most screens are much less
sharp than the type in a printed book and it quickly tires your eyes reading in
this way. Even if you can store lots of books on your computer, you can't really
take it to bed with you or read it on the beach or in the bath-tub! Now, there's
nothing to stop you downloading simple text files onto something like an iPod
or a cellphone and reading them, very slowly and painfully, from the
small LCD display—but it's not most people's idea of curling up with a good
book. Cellphones have very bright displays that can interfere with your sleep if
you use them late in the evenings (but the stuff you may have read about blue
light interfering with your sleep isn't as clearcut as it sounds.) What we really
need is something with the power of a computer, the portability of a cellphone,
and the friendliness and readability of a printed book. And that's exactly where
electronic book readers come in.

How do you read an electronic book file?


An electronic book reader is a small, portable computer designed for reading
books stored in a digital format such as ASCII, PDF, HTML, RTF, or another
similar format. (Currently the two most popular ebook formats are EPUB, a
worldwide, open standard that evolved from an earlier standard called OEB
(Open eBook) and is widely used by Sony Readers and most other ebook
readers, and AZW, a proprietary format developed by Amazon and currently
readable only on its Kindle reader. There are a few other formats including
MOBI and LRF, but you don't hear about them so much.) However you go
about it, books take up very little space when you store them in electronic
format: you could easily fit 10,000 electronic copies of the Bible onto a
single DVD. Most ebook readers can store hundreds or even thousands of
titles at a time and most now have Wi-Fi Internet connections so you can
download more books whenever you wish.

Photo: You can read electronic books right now, even if you don't have a handheld ebook reader. There are
lots of ebook reader apps and there's free electronic book software available for all the popular PC operating
systems. You can also download versions of the Amazon Kindle that work on a PC, Mac, iPod/iPad, iPhone, or
Android smartphone. Here's the calibre electronic book reader running on a normal computer screen, showing
the first page of F. Scott Fitzgerald's The Beautiful and Damned.

The most important part of an ebook reader is the screen. The first ebooks
used small versions of LCD laptop screens which have a resolution
(sharpness) of about 35 pixels per cm (90 pixels per inch). You could easily
see the dots making up the letters and it was quite tiring to read for more than
a few minutes at a time. The latest ebooks use an entirely different technology
called electronic ink. Instead of using LCD displays, they show words and
letters using tiny, black and white plastic granules that move about inside
microscopic, spherical capsules under precise electronic control. Displays like
this have about twice the resolution of ordinary computer screens, are clearly
visible in sunlight, and use much less power. In fact, they're almost as sharp
and easy to read as printed paper. We'll see how these screens work in a
moment.
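Those two resolution figures are consistent, as a quick unit check shows (1 inch = 2.54 cm; the "twice the resolution" multiplier is taken from the text above):

```python
lcd_px_per_cm = 35                 # early LCD ebook screens, as quoted above
lcd_ppi = lcd_px_per_cm * 2.54     # convert to pixels per inch: ≈ 89, i.e. "about 90"
eink_ppi = lcd_ppi * 2             # electronic ink at roughly twice the resolution

print(round(lcd_ppi), round(eink_ppi))  # → 89 178
```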

The lack of books in electronic format was one of the things that used to put
people off using ebook readers—and that's what made Amazon.com's Kindle
reader such an instant success. Amazon already worked with virtually all the
world's publishers as a bookseller, so it was able to make huge numbers of
titles available for Kindle in electronic format—over 88,000 books were
available on the launch date. Today, most books are available in ebook format
as well as print, and many old, long-out-of-print titles have also been
resurrected in ebook form. When I first got an ebook reader, about a decade
ago, it was quite hard to find most books in electronic form; today, the position
has completely reversed and you can find almost everything in ebook form
without much effort.

How does E Ink® work?


Photo: Computer screens as we knew them in the late 1970s and early 1980s. At that time, the best screens
could display no more than about 64,000 pixels and often just uppercase text or very crude "pixelated" (square
block) graphics. Computer games like Space Invaders, shown here, were very primitive—but still highly
addictive!

Since electronic ink has been crucial to the success of ebooks, let's now take
a detailed look at how it works.

You're probably reading these words in the same way that I am—by staring at
a flat, LCD computer or smartphone screen. For people over the age of about
35, who grew up with computers that used blocky green and black screens
with just 40 characters across and 25 down, modern screens are wonderful
and amazing. But they still have their drawbacks. Look closely, and you can
see jagged edges to the letters. Try to read an LCD screen in direct sunlight
and (unless the screen has a very bright backlight), you'll really struggle. But
the worst thing is that LCD screens lack the lightness, portability, and sheer
user-friendliness of ink-printed paper: you can happily read a book for hours,
but try the same trick with a computer screen and your eyes will quickly tire.

Photo: LCD versus E Ink®: The E Ink display on a Sony Reader (bottom) is much sharper and easier to read
than a typical LCD screen (top). Magnifying by about 8–10 times and zooming in on a single word, you can see
why. The E Ink display makes sharper letters with a uniformly white background. The LCD display blurs its
letters with anti-aliasing to make them less jagged, though that makes them harder to read close up. The red,
blue, and green colored pixels used to make up the LCD's "white" background are also much more noticeable.
Unlike the E Ink display, an LCD does not use a true white background: it relies on your eye and brain to fuse
colored pixels instead. The resolution of E Ink is also far greater: typical LCD displays use around 90 pixels per
inch, whereas E Ink displays use at least twice as many pixels.

Back in the early 1970s, the Xerox Corporation that had


pioneered photocopiers a decade earlier became concerned about the threat
that computers might pose to its core ink-and-paper business: if everyone
started using computers, and offices became paperless, what would happen
to a company so utterly dependent on paper technology? It was for that
reason that Xerox pumped huge amounts of money into PARC™ (Palo Alto
Research Center), the now-legendary campus where modern, user-friendly
personal computing was pioneered. Among the many innovations developed
there were personal computers that used a graphical user interface (the
"desktop" screen featuring icons, later copied by the Apple Macintosh® and
Microsoft Windows®), Ethernet networking, laser printers... and electronic
paper, which was invented by PARC researcher Nick Sheridon.

The basic idea of electronic ink and paper was (and remains) very simple: to
produce an electronic display with all the control and convenience of a
computer screen but the readability, portability, and user-friendliness of paper.

How does electronic ink and paper work?


Most electronic ink and paper screens use a technology
called electrophoresis, which sounds complex but simply means
using electricity to move tiny particles (in this case ink) through a fluid (in this
case a liquid or gel). Other uses of electrophoresis include DNA testing, where
electricity is used to separate the parts of a DNA sample by making them
move across a gel, which enables them to be compared with other samples
and identified.

In one of the best-known electronic ink products, called E Ink® and used in
ebooks such as the Amazon Kindle, there are millions of microcapsules,
roughly the same diameter as a human hair, each of which is the equivalent of
a single pixel (one of the tiny squares or rectangles from which the picture on
a computer or TV screen is built up). Each capsule is filled with a clear fluid
and contains two kinds of tiny ink granules: white ones (which are positively
charged) and black ones (which are negatively charged). The capsules are
suspended between electrodes switched on and off by an electronic circuit,
and each one can be controlled individually. By changing the electric field
between the electrodes, it's possible to make the white or black granules
move to the top of a capsule (the part closest to the reader's eye) so it
appears like a white or black pixel. By controlling large numbers of pixels in
this way, it's possible to display text or pictures.
Animation: Electronic ink works through electrophoresis. Each pixel (microcapsule) in the display (the gray
circle) contains black (negatively charged) and white (positively charged) ink granules. When a positive field
(shown in blue) is applied to the top electrode, the black granules migrate to the top, making the pixel look
black when seen from above; switching the field over makes the granules change position so the pixel appears
white.
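The behavior of a single microcapsule can be modeled in a few lines. This toy sketch (the function name and the `'+'`/`'-'` encoding are my own, purely for illustration; a real display drives millions of capsules) just applies the opposite-charges-attract rule described above:

```python
def pixel_color(top_electrode):
    """Return the color a reader sees for one E Ink microcapsule.
    top_electrode is '+' or '-', the field applied at the viewing side."""
    if top_electrode == '+':
        # A positive field at the top attracts the negatively
        # charged black granules, so the pixel looks black.
        return 'black'
    else:
        # A negative field at the top attracts the positively
        # charged white granules, so the pixel looks white.
        return 'white'

print(pixel_color('+'))  # → black
print(pixel_color('-'))  # → white
```

Because the granules stay put once moved, no further current is needed to hold the image, which is why the displays use so little power (see below).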

Advantages and disadvantages of electronic ink

If you've tried reading an electronic book, you'll know that electronic ink and
paper is much easier to read from for long periods than an LCD computer
screen. Since the microcapsules stay in position indefinitely, with little or no
electric current, electronic ink displays have extremely low power
consumption. A typical ebook reader with an E Ink display can be used for
something like 2–4 weeks of average everyday reading on a single charge—
which is much less power than a laptop, tablet, or smartphone. Low power
consumption means low energy use and that translates into an environmental
benefit; in other words, electronic ink and paper is environmentally friendly.
What about the energy needed to manufacture your reader in the first place?
According to British environmental auditor Nicola Terry, who's done the math,
you've only to read 20–70 ebooks to offset that energy; the Cleantech group
has estimated that a Kindle makes savings in carbon emissions after just one
year's use. That sounds great, but do bear in mind that emissions from
reading and producing books are a trivial part of most people's total emissions
(the majority of which stem from transport, consumed goods, home heating,
and so on).

The disadvantages are less obvious until you start using electronic ink in
earnest. First, although the displays work excellently in bright indoor light and
daylight, including direct sunlight, they have no light of their own (unlike LCD
displays, which have backlights shining through them from back to front so
you can see them). That makes them hard to use in poor indoor light,
especially in the evenings, which is why many early ebook readers were sold
with clumsy addon lamps. Fortunately, firms like Amazon now offer readers
with built-in lights (like the Kindle Paperwhite in our top photo), so the problem
of straining over ebooks in the dark has now largely disappeared. But I find
readers like this can be as tiring to look at for long periods as a smartphone,
so I prefer to adjust the room lighting instead.

Photo: Night and day, are you the one? Here I've propped a Sony Reader against the screen of a conventional
laptop and photographed it in different light conditions. Left: In bright light or daytime outdoors, electronic ink
displays are much easier to read than backlit LCD displays, which become virtually invisible. Right: In dark
indoor light in the evenings, things are reversed: LCDs are much easier to read and electronic ink displays are
a struggle to decipher unless you sit in strong light (or use a clip-on light attachment).

Electronic ink also takes much longer to build up the image of a page than an
LCD screen, which means it's unsuitable for everyday computer displays
using any kind of moving image (and completely unsuitable for fast-moving
images such as computer games and videos). Sometimes parts of a previous
page linger on as "ghosts" until you've turned another page or two. You've
probably noticed that, when you "turn the page" of an electronic book, the
entire screen momentarily flashes black before the new page is displayed?
That's a rather clumsy compromise to prevent ghosts, in which the screen
tries to erase the previous page before displaying a new one (a bit like Etch-a-
Sketch®!).

Another major disadvantage is that most electronic ink displays are currently
black and white. Crude color displays do exist (E Ink has produced one called
Triton since 2010, in which a layer of red, green, and blue color filters is
mounted over the usual black-and-white microcapsules) and better ones are
in development, but they're much more expensive than their black-and-white
electronic paper (or LCD) equivalents and only display a relatively small
number of colors (Triton can manage 4000, compared to about 17 million on a
decent LCD). In time, we're bound to have color electronic books and
magazines, but don't hold your breath. Amazon's Jeff Bezos, speaking in mid-
2009, said that a color Kindle ebook reader was "multiple years" away: "I've
seen the color displays in the laboratory and I can assure you they're not
ready for prime time." That was why, when Amazon first shipped its color
Kindle Fire™ product in September 2011, it had an LCD display. In May 2016,
E Ink announced a color replacement for Triton called Advanced Color ePaper
(ACeP), but it's still some way from making it into Kindles and other ebook
readers.
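The color counts quoted above follow directly from how many brightness levels each red, green, and blue sub-pixel can show: the total palette is the per-channel level count cubed. Here's a quick Python sketch; the 16- and 256-level figures are my assumptions (a 4-bit-per-channel e-paper display versus a standard 8-bit LCD), since the article quotes only the rounded totals:

```python
def total_colors(levels_per_channel):
    """Each pixel mixes red, green, and blue sub-pixels,
    so the palette size is the per-channel level count cubed."""
    return levels_per_channel ** 3

# A 4-bit-per-channel display (16 levels) manages about 4,000 colors...
print(total_colors(16))    # 4096
# ...while an 8-bit-per-channel LCD (256 levels) shows about 17 million.
print(total_colors(256))   # 16777216
```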

Which electronic book reader should you buy?


First things first

Before you buy a dedicated reader, try experimenting with your smartphone or
laptop first with something like the Amazon Kindle app or Google Books
(which lets you buy ebooks from the Google Play store). Don't forget to check
out your local library: see whether they offer free support for OverDrive,
cloudLibrary, or something similar. Also remember that you can often
download books from publishers' websites, which is a good way to put more
money in the hands of people who produce the books you love. If you'd rather
get an ebook reader, you have a choice to make...

Choosing your reader

Kindle, Sony Reader, Nook, Elonex—which one should you buy? The
decision is a little easier than it used to be now Sony has (regrettably) stopped
making ebook readers, though I'd still recommend looking out for cheap
secondhand Sonys on auction sites. (My first Sony Reader cost a little under
US$200; when I broke it, five years later, I picked up a mint-condition
replacement on eBay for about US$25!)

They're all broadly similar: they're all light, portable, and handheld and they all
have large internal flash memories that hold hundreds or thousands of books.
Some have touchscreens; others (like the older and cheaper Kindles) have
miniature keyboards. Some have wireless connections for downloading more
books; others (such as the Sony Readers) have to be connected to a
computer with a USB cable. If you connect with USB, running an ebook
reader is rather like running an iPod or MP3 player: typically you maintain a
library on your PC with a piece of software similar to iTunes, to which you add
and remove books and other documents. When you plug in your reader, it
"syncs" (synchronizes) its internal memory with the library on your PC, adding
any new books and deleting any unwanted ones. If you have a wireless
reader, you maintain your library on the reader itself or in the cloud (stored on
a remote computer somewhere and accessed online), adding and removing
books directly. So... wireless or cable? It's not a big issue, I don't think, though
elderly people who have little experience of using a computer may find buying
books easier with something like a wireless Kindle, with its built-in, easy-to-use
bookstore.

Photo: Horses for courses: Portable versions of the Sony Reader have a much smaller page size than a typical
hardback book. That's great if you want to carry your reader in your jacket pocket or your handbag so you can
read while you're travelling. It's much less attractive if you do most of your reading at home: the smaller the
screen, the more often you'll need to turn the pages. This is one example of why it pays to think about how
you're going to use an ebook reader before you buy it.

Displays and batteries

The best and most expensive readers use extremely high-resolution E Ink
screens that work better in daylight than at night (you'll need good indoor
lighting or a clip-on light if you're planning to do most of your e-reading in the
evenings); LCD-screen readers (such as the Elonex) have backlit screens that
favor indoor use and (like computer screens) can be tricky to read in bright
sunlight. Amazon's current state-of-the-art reader, the Kindle Paperwhite, has
discreet little LED lights built around the edge of the screen that make a
noticeable difference when you're reading indoors in the dim evening light.

E Ink apparently uses energy only when you turn the pages, so the Sony
Reader can happily survive for about two weeks of very heavy use on a single
charge of the batteries, while the Kindle Paperwhite claims up to eight weeks
of battery time. That means it's also very environmentally friendly to read
books or documents from a handheld ebook reader compared to reading them
on a computer screen.

Some ebook readers can cope with ebooks in all kinds of different formats.
The Sony Reader, for example, lets you read Microsoft Word and PDF files,
as well as standard formats such as EPUB. The PDF viewer is really neat,
allowing you to rotate the screen or scroll documents column-by-column for
easy reading. The Amazon Kindle doesn't currently support the EPUB format,
but it does allow you to view other file formats such as PDF. You can also mail
documents to your Kindle, which is something you can't do on a Sony.

Photo: You can use the Sony Reader in "landscape" orientation if you find that easier, though you have to
switch it over manually from the keyboard (unlike with a smartphone, the display doesn't rotate itself). Here I'm
reading a PDF file of Sustainable Energy—Without the hot air by physicist David MacKay. If environmental
issues matter to you, reading documents on an ebook reader like this might appeal, because it uses a fraction
as much energy as a laptop. The text is much more legible than it appears in this photograph.

Finding ebooks

Most books currently produced by publishers are copyrighted, which means
you can (and should) expect to pay a fair price if you want to use them. A
decade ago, when I first wrote this article, relatively few publishers had
embraced ebooks. Today, most publishers make most new books available in
at least one electronic format, and many sell direct to readers from their own
websites, but they're taking their time making backlist and out-of-print titles
available this way. Generally, it's relatively easy to find new mass-market
bestsellers in ebook format but harder to find more specialized books and
quality, literary fiction. Public domain classics are the easiest books to find in
ebook format, largely thanks to the sterling and visionary work of Project
Gutenberg (and, more recently, the Open Library, which currently promises
over a million free ebook titles, though in my experience their scans are of
much poorer quality and their EPUB files are littered with errors to the point of
often being unreadable). If you enjoy reading classic novels, buying an ebook
reader is probably a no-brainer; if you're more a fan of modern literary fiction,
you might have a harder time finding what you want in digital form.

Unfortunately, the standard of production for ebooks is noticeably lower and
sloppier than it is for print books: expect to find scanning and formatting
errors, missing endnotes, redacted photos (because of copyright issues),
artworks that are blurred or don't display properly, tables that don't fit on the
screen, and worse. It's quite obvious that publishers don't apply the same high
editorial and proofreading standards to ebooks as they do to printed books. Take a
moment to send complaints direct to a publisher whenever you find a really
sloppy ebook, copy the author in if you can find contact details for them—and
be sure to ask for a refund.

If you buy copyrighted ebooks from Amazon or any other outlet, you'll find
they're protected by what's called DRM (digital rights management)—
effectively a kind of encryption that prevents people from distributing pirate
copies of books illegally. Amazon uses its own DRM system, while Sony (and
others) use a system developed by Adobe called Adobe Digital
Editions (ADE), which requires you to register your reader the first time you
use it. DRM protection restricts what you can do with books you've bought, but
it's not necessarily the drawback it seems. First, it's very much a necessity
from a publisher's point of view: it's only because ebook readers like the
Kindle have DRM protection built in that publishers have taken what they see
as a major risk in making their books available in digital formats. Another
advantage of DRM is that it allows libraries to lend people ebooks for limited
periods of time (using systems like OverDrive® and cloudLibrary™). I'm
delighted to find I can log in to my local library and download, for free, for
periods of up to 14 days, a fair selection of a few hundred popular ebooks.
Once the borrowing time has expired, the books delete themselves
automatically from my reader. Be warned that some libraries allow lending
only in EPUB and PDF format, and you might not be able to borrow books on
a Kindle. Others don't let you download books to a reader at all: all you can do
is borrow files and read them for a couple of weeks using a dedicated
smartphone app. Currently, OverDrive seems to offer the most flexibility,
allowing download of EPUB and AZW Kindle formats, with full support for
smartphone and PC apps as well.

Who invented electronic books?


 ~3000 BCE: Ancient Egyptians make the
first paper from the stem of the papyrus plant.
 105 CE: Chinese court official Ts'ai Lun develops
modern paper from hemp fiber.
 ~1450: German Johannes Gutenberg invents
the modern process of printing with movable
metal type, which leads to a vast increase in
the popularity of books.
 1945: In a famous article in Atlantic Monthly
called As We May Think, US government
scientist Vannevar Bush proposes a kind of
desk-sized memory store called Memex, which
has some of the features later incorporated
into electronic books and the World Wide Web
(WWW).
 1968: Computer scientist Alan Kay imagines a
portable computerized book, which he
nicknames the Dynabook.
 1971: Michael Hart launches Project
Gutenberg at the University of Illinois: an
electronic repository for classic, out-of-
copyright books.
 1990: Sony launches its Data Discman, a
portable electronic reader costing $550 that
stores and reads books from compact discs
(CD-ROMs). It is a commercial flop.
 1990s: Encyclopedia publishers such
as Britannica and Dorling Kindersley (DK)
experiment with making their books available
on interactive CD-ROMs. DK wins many
awards for its CD-ROMs, but closes its
multimedia business in the late 1990s as
competition mounts from the Internet.
 Late 1990s: Several new handheld, electronic
book readers are launched, including the
SoftBook, RocketBook, and Everybook—but
fail to make much impact on the marketplace.
 2000: Best-selling horror author Stephen
King launches a short novel called Riding the
Bullet in electronic format and sells over half a
million copies.
 2001: Larry Sanger and Jimmy Wales give the
world Wikipedia—an electronic encyclopedia
anyone can contribute to.
 2007: Amazon.com launches its wireless
Kindle reader with thousands of electronic
books available in electronic format, along with
newspapers, RSS feeds, and other forms of
"digital content."
 2010: Amazon Kindle becomes Amazon's
number one bestselling product, confirming
that electronic books (and readers) really have
arrived!
 2010: E Ink announces Triton, a colored
version of its ebook screen technology.
 2011: Project Gutenberg celebrates 40 years
of producing and distributing electronic books.
 2014: A report by PricewaterhouseCoopers
predicts ebooks will outsell printed books by
2018, but UK bookstore founder Tim
Waterstone argues the market will go into
decline.
 2014: Sony stops selling its Readers after
the growing popularity of smartphones and
tablets causes a major fall in sales.
 2015: The Association of American Publishers
reports a dramatic reversal of fortune, with a
10 percent fall in sales of ebooks (which still
account for only a fifth of the market).
 2016: E Ink announces Advanced Color
ePaper (ACeP), an improved color screen.
 2018: Walmart announces it will challenge
Amazon's dominance of the ebook market in a
bold partnership with Japanese firm Rakuten.
 2018: Hachette boss Arnaud Nourry attacks
ebooks for their stupidity and lack of creativity.

LCDs (liquid crystal displays)


by Chris Woodford. Last updated: July 8, 2020.

Televisions used to be hot, heavy, power-hungry beasts that sat in the
corner of your living room. Not any more! Now they're slim enough to hang on
the wall and they use a fraction as much energy as they used to. Like
laptop computers, most new televisions have flat screens with LCDs (liquid-
crystal displays)—the same technology we've been using for years in things
like calculators, cellphones, and digital watches. What are they and how do
they work? Let's take a closer look!
Photo: Small LCDs like this one have been widely used in calculators and digital watches since the 1970s, but
they were relatively expensive in those days and produced only black-and-white (actually, dark-blueish and
white) images. During the 1980s and 1990s, manufacturers figured out how to make larger color screens at
relatively affordable prices. That was when the market for LCD TVs and color laptop computers really took off.

Contents

1. What's different about LCDs?
2. What are liquid crystals?
3. What is polarized light?
4. How LCDs use liquid crystals and polarized light
5. How colored pixels in LCDs work
6. What's the difference between LCD and plasma?
7. A brief history of LCDs
8. Find out more

What's different about LCDs?

Photo: This iPod screen is another example of LCD technology. Its pixels are colored black and they're either
on or off, so the display is black-and-white. In an LCD TV screen, much smaller pixels colored red, blue, or
green make a brightly colored moving picture.

For many people, the most attractive thing about LCD TVs is not the way they
make a picture but their flat, compact screen. Unlike an old-style TV, an LCD
screen is flat enough to hang on your wall. That's because it generates its
picture in an entirely different way.

You probably know that an old-style cathode-ray tube (CRT) television makes
a picture using three electron guns. Think of them as three very fast, very
precise paintbrushes that dance back and forth, painting a moving image on
the back of the screen that you can watch when you sit in front of it.

Flatscreen LCD and plasma screens work in a completely different way. If you
sit up close to a flatscreen TV, you'll notice that the picture is made from
millions of tiny blocks called pixels (picture elements). Each one of these is
effectively a separate red, blue, or green light that can be switched on or off
very rapidly to make the moving color picture. The pixels are controlled in
completely different ways in plasma and LCD screens. In a plasma screen,
each pixel is a tiny fluorescent lamp switched on or off electronically. In an
LCD television, the pixels are switched on or off electronically using liquid
crystals to rotate polarized light. That's not as complex as it sounds! To
understand what's going on, first we need to understand what liquid crystals
are; then we need to look more closely at light and how it travels.
What are liquid crystals?

Photo: Liquid crystals dried and viewed through polarized light. You can see they have a much more regular
structure than an ordinary liquid. Photo from research by David Weitz courtesy of NASA Marshall Space Flight
Center (NASA-MSFC).

We're used to the idea that a given substance can be in one of three states:
solid, liquid, or gas—we call them states of matter—and up until the late 19th
century, scientists thought that was the end of the story. Then, in 1888, an
Austrian botanist named Friedrich Reinitzer (1857–1927) discovered liquid
crystals, which are another state entirely, somewhere in between liquids and
solids. Liquid crystals might have lingered in obscurity but for the fact that they
turned out to have some very useful properties.

Solids are frozen lumps of matter that stay put all by themselves, often with
their atoms packed in a neat, regular arrangement called a crystal (or
crystalline lattice). Liquids lack the order of solids and, though they stay put if
you keep them in a container, they flow relatively easily when you pour them
out. Now imagine a substance with some of the order of a solid and some of
the fluidity of a liquid. What you have is a liquid crystal—a kind of halfway
house in between. At any given moment, liquid crystals can be in one of
several possible "substates" (phases) somewhere in a limbo-land between
solid and liquid. The two most important liquid crystal phases are called
nematic and smectic:
 When they're in the nematic phase, liquid
crystals are a bit like a liquid: their molecules
can move around and shuffle past one
another, but they all point in broadly the same
direction. They're a bit like matches in a
matchbox: you can shake them and move
them about but they all keep pointing the same
way.
 If you cool liquid crystals, they shift over to
the smectic phase. Now the molecules form
into layers that can slide past one another
relatively easily. The molecules in a given
layer can move about within it, but they can't
and don't move into the other layers (a bit like
people working for different companies on
particular floors of an office block). There are
actually several different smectic "subphases,"
but we won't go into them in any more detail
here.

Find out more

Want to know more about liquid crystals? There's a great page called History
and Properties of Liquid Crystals, archived from the Nobel Prize website.


What is polarized light?


Nematic liquid crystals have a really neat party trick. They can adopt a
twisted-up structure and, when you apply electricity to them, they straighten
out again. That may not sound much of a trick, but it's the key to how LCD
displays turn pixels on and off. To understand how liquid crystals can control
pixels, we need to know about polarized light.

Light is a mysterious thing. Sometimes it behaves like a stream of particles—
like a constant barrage of microscopic cannonballs carrying energy we can
see, through the air, at extremely high speed. Other times, light behaves more
like waves on the sea. Instead of water moving up and down, light is a wave
pattern of electrical and magnetic energy vibrating through space.

Photo: A trick of the polarized light: rotate one pair of polarizing sunglasses past another and you can block out
virtually all the light that normally passes through.

When sunlight streams down from the sky, the light waves are all mixed up
and vibrating in every possible direction. But if we put a filter in the way, with a
grid of lines arranged vertically like the openings in prison bars (only much
closer together), we can block out all the light waves except the ones vibrating
vertically (the only light waves that can get through vertical bars). Since we
block off much of the original sunlight, our filter effectively dims the light. This
is how polarizing sunglasses work: they cut out all but the sunlight vibrating in
one direction or plane. Light filtered in this way is called polarized or plane-
polarized light (because it can travel in only one plane).

Photo: A less well known trick of polarized light: it makes crystals gleam with amazing spectral colors due to a
phenomenon called pleochroism. Photo of protein and virus crystals, many of which were grown in space.
Credit: Dr. Alex McPherson, University of California, Irvine. Photo courtesy of NASA Marshall Space Flight
Center (NASA-MSFC).

If you have two pairs of polarizing sunglasses (and it won't work with ordinary
sunglasses), you can do a clever trick. If you put one pair directly in front of
the other, you should still be able to see through. But if you slowly rotate one
pair, and keep the other pair in the same place, you will see the light coming
through gradually getting darker. When the two pairs of sunglasses are at 90
degrees to each other, you won't be able to see through them at all. The first
pair of sunglasses blocks off all the light waves except ones vibrating
vertically. The second pair of sunglasses works in exactly the same way as
the first pair. If both pairs of glasses are pointing in the same direction, that's
fine—light waves vibrating vertically can still get through both. But if we turn
the second pair of glasses through 90 degrees, the light waves that made it
through the first pair of glasses can no longer make it through the second pair.
No light at all can get through two polarizing filters that are at 90 degrees to
one another.
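The gradual dimming you see as you rotate one pair of sunglasses against the other follows a simple relationship known as Malus's law (not named in the article): the transmitted intensity is proportional to the square of the cosine of the angle between the two filters. A minimal Python sketch:

```python
import math

def transmitted_intensity(i0, angle_degrees):
    """Malus's law: intensity of polarized light passing through a
    second polarizer set at the given angle to the first."""
    theta = math.radians(angle_degrees)
    return i0 * math.cos(theta) ** 2

# Aligned filters pass everything that made it through the first...
print(transmitted_intensity(100, 0))             # 100.0
# ...at 45 degrees, half the light gets through...
print(round(transmitted_intensity(100, 45), 1))  # 50.0
# ...and crossed filters (90 degrees) block it all.
print(round(transmitted_intensity(100, 90), 6))  # 0.0
```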

How LCDs use liquid crystals and polarized light


Photo: Prove to yourself that an LCD display uses polarized light. Simply put on a pair of polarizing sunglasses
and rotate your head (or the display). You'll see the display at its brightest at one angle and at its darkest at
exactly 90 degrees to that angle.

An LCD TV screen uses the sunglasses trick to switch its colored pixels on or
off. At the back of the screen, there's a large bright light that shines out toward
the viewer. In front of this, there are the millions of pixels, each one made up
of smaller areas called sub-pixels that are colored red, blue, or green. Each
pixel has a polarizing glass filter behind it and another one in front of it at 90
degrees. That means the pixel normally looks dark. In between the two
polarizing filters there's a tiny twisted, nematic liquid crystal that can be
switched on or off (twisted or untwisted) electronically. When it's switched off,
it rotates the light passing through it through 90 degrees, effectively allowing
light to flow through the two polarizing filters and making the pixel look bright.
When it's switched on, it doesn't rotate the light, which is blocked by one of
the polarizers, and the pixel looks dark. Each pixel is controlled by a
separate transistor (a tiny electronic component) that can switch it on or off
many times each second.
Photo: How liquid crystals switch light on and off. In one orientation, polarized light cannot pass through the
crystals so they appear dark (left side photo). In a different orientation, polarized light passes through okay so
the crystals appear bright (right side photo). We can make the crystals change orientation—and switch their
pixels on and off—simply by applying an electric field. Photo from liquid crystal research by David Weitz
courtesy of NASA Marshall Space Flight Center (NASA-MSFC).

How colored pixels in LCDs work


There's a bright light at the back of your TV; there are lots of colored squares
flickering on and off at the front. What goes on in between? Here's how each
colored pixel is switched on or off:
How pixels are switched off

1. Light travels from the back of the TV toward
the front from a large bright light.
2. A horizontal polarizing filter in front of the light
blocks out all light waves except those
vibrating horizontally.
3. Only light waves vibrating horizontally can get
through.
4. A transistor switches off this pixel by
switching on the electricity flowing through its
liquid crystals. That makes the crystals
straighten out (so they're completely
untwisted), and the light travels straight
through them unchanged.
5. Light waves emerge from the liquid crystals
still vibrating horizontally.
6. A vertical polarizing filter in front of the liquid
crystals blocks out all light waves except those
vibrating vertically. The horizontally vibrating
light that travelled through the liquid crystals
cannot get through the vertical filter.
7. No light reaches the screen at this point. In
other words, this pixel is dark.

How pixels are switched on


1. The bright light at the back of the screen
shines as before.
2. The horizontal polarizing filter in front of the
light blocks out all light waves except those
vibrating horizontally.
3. Only light waves vibrating horizontally can get
through.
4. A transistor switches on this pixel by
switching off the electricity flowing through its
liquid crystals. That makes the crystals twist.
The twisted crystals rotate light waves by 90°
as they travel through.
5. Light waves that entered the liquid crystals
vibrating horizontally emerge from them
vibrating vertically.
6. The vertical polarizing filter in front of the liquid
crystals blocks out all light waves except those
vibrating vertically. The vertically vibrating light
that emerged from the liquid crystals can now
get through the vertical filter.
7. The pixel is lit up. A red, blue, or green filter
gives the pixel its color.
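The two numbered sequences above reduce to one rule: the pixel is bright exactly when the liquid crystal rotates the light's plane of vibration by 90 degrees, which happens when the voltage is off. Here's a toy Python model of that logic (the function name and labels are mine, not from any real display driver):

```python
def pixel_is_bright(voltage_on):
    """Toy model of one twisted-nematic pixel between crossed polarizers."""
    # Light leaves the first (horizontal) polarizer vibrating horizontally.
    polarization = "horizontal"
    # With no voltage, the crystals stay twisted and rotate the light
    # through 90 degrees; with voltage applied, they straighten out
    # and pass the light through unchanged.
    if not voltage_on:
        polarization = "vertical"
    # The second (vertical) polarizer only passes vertically vibrating light.
    return polarization == "vertical"

print(pixel_is_bright(voltage_on=False))  # True  -> pixel lit
print(pixel_is_bright(voltage_on=True))   # False -> pixel dark
```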

What's the difference between LCD and plasma?


A plasma screen looks similar to an LCD, but works in a completely different
way: each pixel is effectively a microscopic fluorescent lamp glowing with
plasma. A plasma is a very hot form of gas in which the atoms have blown
apart to make negatively charged electrons and positively charged ions
(atoms minus their electrons). These move about freely, producing a fuzzy
glow of light whenever they collide. Plasma screens can be made much
bigger than ordinary cathode-ray tube televisions, but they are also much
more expensive.

A brief history of LCDs

Artwork: Richard Williams set out the principle of LCD displays in US Patent 3,322,485. A layer of liquid
crystals (yellow) between two transparent plates (red) switches the display on and off when a voltage (blue) is
applied. Artwork courtesy of US Patent and Trademark Office.

 1888: Friedrich Reinitzer, an Austrian plant


scientist, discovers liquid crystals while
studying a chemical called cholesteryl
benzoate. It seems to have two distinct crystal
forms, one solid and one liquid, each with its
own melting point.
 1889: Building on Reinitzer's work, German
chemist and physicist Otto Lehmann coins the
term "liquid crystals" (originally, "flowing
crystals" or "fliessende Krystalle" in German)
and carries out more detailed research using
polarized light. Although his work is nominated
for a Nobel Prize, he never actually wins one.
 1962: RCA's Richard Williams begins to
research the optical properties of nematic
liquid crystals. He files his groundbreaking
LCD patent (US Patent 3,322,485) on
November 9, 1962 and it's finally granted
almost five years later on May 30, 1967.
 1960s: RCA engineers like George
Heilmeier build on this theoretical research to
produce the very first practical electronic
displays, hoping to create LCD televisions.
 1968: RCA publicly unveils LCD technology at
a press conference, prompting The New York
Times to anticipate products like "A thin
television screen that can be hung on the
living-room wall like a painting."
 1968: French scientist Pierre-Gilles de
Gennes carries out groundbreaking research
into phase transitions involving liquid crystals,
for which he wins the Nobel Prize in Physics in
1991.
 1969: RCA's Wolfgang Helfrich develops
twisted nematic LCDs based on polarized
light, but the company is skeptical and shies
away from developing them. At Kent State
University, James Fergason develops and
patents an alternative version of the same
idea. Today, Helfrich, his collaborator Martin
Schadt, and Fergason are jointly credited with
inventing the modern LCD.
 1970: Having failed to commercialize the LCD,
RCA sells its technology to Timex, which
popularizes LCDs in the first digital
wristwatches.
 1973: Sharp unveils the world's first LCD
pocket calculator (the EL-805).
 1980s: STN (super twisted nematic) displays
appear, with far more pixels offering higher
resolution images.
 1988: 100 years after the discovery of liquid
crystals, Sharp sounds the death knell for
cathode-ray tubes when it produces the first
14-inch color TV with a TFT (thin-film
transistor) LCD display.
 2010s: Optical scientists experiment with more
efficient, richer-colored LCDs based
on quantum dots.

OLEDs (Organic LEDs) and LEPs (light-


emitting polymers)

by Chris Woodford. Last updated: May 17, 2021.

Do you remember old-style TVs powered by cathode-ray tubes
(CRTs)? The biggest ones were about 30–60cm (1–2ft) deep and almost too
heavy to lift by yourself. If you think that's bad, you should have seen what
TVs were like in the 1940s. The CRTs inside were so long that they had to
stand upright firing their picture toward the ceiling, with a little mirror at the top
to bend it sideways into the room. Watching TV in those days was a bit like
staring down the periscope of a submarine! Thank goodness for progress.
Now most of us have computers and TVs with LCD screens, which are thin
enough to mount on a wall, and displays light enough to build into portable
gadgets like cellphones. But displays made with OLED (organic light-
emitting diode) technology are even better. They're super-light, almost
paper-thin, theoretically flexible enough to print onto clothing, and they
produce a brighter and more colorful picture. What are they and how do they
work? Let's take a closer look!
Photo: OLED technology promises thinner, brighter, more colorful TV sets—even with curved screens. Photo of
a curved, Samsung UHD OLED TV by courtesy of Kārlis Dambrāns published on Flickr under a Creative
Commons (CC BY 2.0) Licence.

Contents

1. What is an LED?
2. How does an ordinary LED work?
3. How does an OLED work?
4. How an OLED emits light
5. Types of OLEDs
6. Advantages and disadvantages of OLEDs
7. What are OLEDs used for?
8. Who invented OLEDs?
9. Find out more

What is an LED?

Photo: LEDs on an electronic instrument panel. They make light by the controlled movement of electrons, not
by heating up a wire filament. That's why LEDs use much less energy than conventional lamps.
LEDs (light-emitting diodes) are the tiny, colored, indicator lights you see on
electronic instrument panels. They're much smaller, more energy-efficient,
and more reliable than old-style incandescent lamps. Instead of
making light by heating a wire filament till it glows white hot (which is how a
normal lamp works), they give off light when electrons zap through the
specially treated ("doped") solid materials from which they're made.

An OLED is simply an LED where the light is produced ("emitted") by organic
molecules. When people talk about organic things these days, they're usually
referring to food and clothing produced in an environmentally friendly way
without the use of pesticides. But when it comes to the chemistry of how
molecules are made, the word has a completely different meaning. Organic
molecules are simply ones based around lines or rings of carbon atoms,
including such common things as sugar, gasoline, alcohol, wood, and plastics.


How does an ordinary LED work?


Before you can understand an OLED, it helps if you understand how a
conventional LED works—so here's a quick recap. Take two slabs of
semiconductor material (something like silicon or germanium), one slightly rich
in electrons (called n-type) and one slightly poor in electrons (if you prefer,
that's the same as saying it's rich in "holes" where electrons should be, which
is called p-type). Join the n-type and p-type slabs together and, where they
meet, you get a kind of neutral, no-man's land forming at the junction where
surplus electrons and holes cross over and cancel one another out. Now
connect electrical contacts to the two slabs and switch on the power. If you
wire the contacts one way, electrons flow across the junction from the rich
side to the poor, while holes flow the other way, and a current flows across the
junction and through your circuit. Wire the contacts the other way and the
electrons and holes won't cross over; no current flows at all. What you've
made here is called a junction diode: an electronic one-way street that
allows current to flow in one direction only. We explain all this more clearly
and in much more detail in our main article on diodes.
Artwork: A junction diode allows current to flow when electrons (black dots) and holes (white dots) move across
the boundary between n-type (red) and p-type (blue) semiconductor material.

An LED is a junction diode with an added feature: it makes light. Every time
electrons cross the junction, they nip into holes on the other side, release
surplus energy, and give off a quick flash of light. All those flashes produce
the bright, continuous glow for which LEDs are famous.
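To put rough numbers on those flashes, here's a short Python sketch that converts the energy a recombining electron releases into the wavelength (and so the color) of the light it gives off, using the standard relation λ = hc/E. The energy values are illustrative round figures, not data for any particular LED material:

```python
# Convert the photon energy released at the junction into a wavelength.
# The energies below are illustrative round values, not figures for any
# specific semiconductor.
H = 6.626e-34   # Planck's constant (joule-seconds)
C = 2.998e8     # speed of light (meters per second)
EV = 1.602e-19  # one electronvolt in joules

def wavelength_nm(photon_energy_ev):
    """Wavelength in nanometers of a photon with the given energy in eV."""
    return H * C / (photon_energy_ev * EV) * 1e9

for color, energy in [("red", 1.9), ("green", 2.3), ("blue", 2.8)]:
    print(f"{color}: ~{wavelength_nm(energy):.0f} nm")
```

Notice how bigger energy jumps give shorter wavelengths, which is why blue LEDs need wider-bandgap materials than red ones.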

How does an OLED work?

Artwork: The arrangement of layers in a simple OLED.

OLEDs work in a similar way to conventional diodes and LEDs, but instead of
using layers of n-type and p-type semiconductors, they use organic molecules
to produce their electrons and holes. A simple OLED is made up of six
different layers. On the top and bottom there are layers of
protective glass or plastic. The top layer is called the seal and the bottom
layer the substrate. In between those layers, there's a negative
terminal (sometimes called the cathode) and a positive terminal (called the
anode). Finally, in between the anode and cathode are two layers made from
organic molecules called the emissive layer (where the light is produced,
which is next to the cathode) and the conductive layer (next to the anode).
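If it helps to see that sandwich written out, here's a tiny Python sketch listing the six layers in order from bottom to top, exactly as described above (layer names only; real devices vary in thickness and materials):

```python
# The six layers of a simple OLED, listed from the bottom (substrate)
# to the top (seal), matching the description in the text.
OLED_LAYERS = [
    "substrate (protective glass or plastic)",
    "anode (positive terminal)",
    "conductive layer (organic molecules)",
    "emissive layer (organic molecules, where light is made)",
    "cathode (negative terminal)",
    "seal (protective glass or plastic)",
]

for position, layer in enumerate(OLED_LAYERS, start=1):
    print(position, layer)
```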

How an OLED emits light


How does this sandwich of layers make light?

1. To make an OLED light up, we simply attach a
voltage (potential difference) across the anode
and cathode.
2. As the electricity starts to flow, the cathode
receives electrons from the power source and
the anode loses them (or it "receives holes," if
you prefer to look at it that way).
3. Now we have a situation where the added
electrons are making the emissive layer
negatively charged (similar to the n-type layer
in a junction diode), while the conductive layer
is becoming positively charged (similar to p-
type material).
4. Positive holes are much more mobile than
negative electrons so they jump across the
boundary from the conductive layer to the
emissive layer. When a hole (the lack of an
electron) meets an electron, the two things
cancel out and release a brief burst of energy
in the form of a particle of light—a photon, in
other words. This process is
called recombination, and because it's
happening many times a second the OLED
produces continuous light for as long as the
current keeps flowing.
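Since each electron crossing the device can recombine with at most one hole, and each recombination releases at most one photon, you can sketch the photon output in a few lines of Python. The current and efficiency figures here are illustrative assumptions, not measurements of a real display:

```python
# Rough photon-emission rate for an OLED: electrons arriving per second,
# times the fraction of recombinations that actually yield a photon.
# Both input figures below are illustrative assumptions.
E_CHARGE = 1.602e-19  # charge on one electron (coulombs)

def photons_per_second(current_amps, quantum_efficiency):
    """Estimate photons emitted per second at a given drive current."""
    electrons_per_second = current_amps / E_CHARGE
    return electrons_per_second * quantum_efficiency

# A 10 mA drive current, assuming 20% of recombinations produce a photon:
rate = photons_per_second(0.010, 0.20)
print(f"~{rate:.2e} photons per second")
```

Even at these modest figures the answer comes out in the tens of quadrillions of photons per second, which is why the flicker of individual recombinations blurs into continuous light.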

We can make an OLED produce colored light by adding a colored filter into
our plastic sandwich just beneath the glass or plastic top or bottom layer. If we
put thousands of red, green, and blue OLEDs next to one another and switch
them on and off independently, they work like the pixels in a conventional LCD
screen, so we can produce complex, high-resolution colored pictures.
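The mixing itself is purely additive: vary the brightness of the red, green, and blue subpixels and your eye blends them into a single color. This little Python sketch shows the idea using the usual 8-bit (0–255) brightness convention; the function name is just for illustration:

```python
# Additive mixing of red, green, and blue subpixel brightnesses (0-255,
# the usual 8-bit convention) into a single hex color value.
def pixel_color(red, green, blue):
    for value in (red, green, blue):
        if not 0 <= value <= 255:
            raise ValueError("brightness must be between 0 and 255")
    return f"#{red:02x}{green:02x}{blue:02x}"

print(pixel_color(255, 0, 0))      # only the red subpixel lit
print(pixel_color(255, 255, 0))    # red + green blend to yellow
print(pixel_color(255, 255, 255))  # all three full on: white
```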

Types of OLEDs
There are two different types of OLED. Traditional OLEDs use small organic
molecules deposited on glass to produce light. The other type of OLED uses
large plastic molecules called polymers. Those OLEDs are called light-
emitting polymers (LEPs) or, sometimes, polymer LEDs (PLEDs). Since
they're printed onto plastic (often using a modified, high-precision version of
an inkjet printer) rather than on glass, they are thinner and more flexible.

Photo: In OLEDs, thin polymers turn electricity into light. Polymers can also work in the opposite way to convert
light into electricity, as in polymer solar cells like these. Photo by Jack Dempsey courtesy of US
DOE/NREL (US Department of Energy/National Renewable Energy Laboratory).

OLED displays can be built in various different ways. In some designs, light is
designed to emerge from the glass seal at the top; others send their light
through the substrate at the bottom. Large displays also differ in the way
pixels are built up from individual OLED elements. In some, the red, green,
and blue pixels are arranged side by side; in others, the pixels are stacked on
top of one another so you get more pixels packed into each square
centimeter/inch of display and higher resolution (though the display is
correspondingly thicker).

Advantages and disadvantages of OLEDs

Photo: TVs, computer monitors, and mobile devices (laptops and tablets) are gradually becoming thinner
thanks to OLED technology. Photo courtesy of LG Electronics published on Flickr in 2009 under a Creative
Commons Licence.

OLEDs are superior to LCDs in many ways. Their biggest advantage is that
they're much thinner (around 0.2–0.3mm or about 8 thousandths of an inch,
compared to LCDs, which are typically at least 10 times thicker) and
consequently lighter and much more flexible. They're brighter and need no
backlight, so they consume much less energy than LCDs (that translates into
longer battery life in portable devices such as cellphones and MP3 players).
Where LCDs are relatively slow to refresh (often a problem when it comes to
fast-moving pictures such as sports on TV or computer games), OLEDs
respond up to 200 times faster. They produce truer colors (and a true black)
through a much bigger viewing angle (unlike LCDs, where the colors darken
and disappear if you look to one side). Being much simpler, OLEDs should
eventually be cheaper to make than LCDs (though, being newer and less widely
adopted, the technology is currently much more expensive).

As for drawbacks, one widely cited problem is that OLED displays don't last as
long: degradation of the organic molecules meant that early versions of
OLEDs tended to wear out around four times faster than conventional LCDs
or LED displays. Manufacturers have been working hard to address this and
it's much less of a problem than it used to be. Another difficulty is that organic
molecules in OLEDs are very sensitive to water. Though that shouldn't be a
problem for domestic products such as TV sets and home computers, it might
present more of a challenge in portable products such as cellphones.

What are OLEDs used for?

Photo: TVs and phones are still the most familiar application of OLEDs—but expect many more things to follow
as prices become increasingly competitive with older technologies such as LCD. Photo of a curved LG OLED
TV by courtesy of Kārlis Dambrāns published on Flickr under a Creative Commons (CC BY 2.0) Licence.

OLED technology is still relatively new compared to similar, long-established
technologies such as LCD. Broadly speaking, you can use OLED displays
wherever you can use LCDs, in such things as TV and computer screens and
MP3 and cellphone displays. Their thinness, greater brightness, and better
color reproduction suggest they'll find many other exciting applications in
the future. They might be used to make inexpensive, animated billboards, for
example. Or super-thin pages for electronic books and magazines. How about
paintings on your wall you can update from your computer? Tablet computers
with folding displays that neatly transform into pocket-sized smartphones? Or
even clothes with constantly changing colors and patterns wired to visualizer
software running from your iPod!

Samsung started using OLED technology in its TVs back in 2013, and in its
Galaxy smartphones the following year. Apple, originally dominant in the
smartphone market, lagged badly behind in OLED technology until quite
recently. In 2015, after months of rumors, the hotly anticipated Apple Watch
was released with an OLED display. Since it was bonded to high-strength
glass, Apple was presumably less interested in the fact that OLEDs are
flexible than that they're thinner (allowing room for other components) and
consume less power than LCDs, offering significantly longer battery life. In
2017, the iPhone X became the first Apple smartphone with an OLED display.

Despite the hype, consumers were originally less enthusiastic about mobiles
and TVs with OLED screens, largely because LCDs were much cheaper and
a tried and trusted technology. That's no longer really true, certainly not of
TVs: prices of OLED kit have fallen dramatically, with some OLED TVs on
sale in 2020/2021 going for about half the price that they were just a year or
two earlier. Where phones are concerned, the advantages of OLEDs—
(arguably) better display quality, improved battery life, lighter weight, and
thinness/flexibility—often outweigh any simple cost difference. In a telling
2020 analysis, Ross Young of Display Supply Chain Consultants noted a
steady shift from LCD as Asian manufacturers switch production to OLEDs
and new technologies such as 5G wireless become increasingly important.
Young forecasts that OLEDs will account for just over a half (54.5 percent) of
the smartphone display market by 2025, compared to just under a quarter
(23.9 percent) in 2016.

Who invented OLEDs?


Organic semiconductors were discovered in the mid-1970s by Alan Heeger,
Alan MacDiarmid, and Hideki Shirakawa, who shared the Nobel Prize in
Chemistry in 2000 for their work. The first efficient OLED—described as "a
novel electroluminescent device... constructed using organic materials as the
emitting elements"—was developed by Ching Tang and Steven VanSlyke,
then working in the research labs at Eastman Kodak, in 1987. Their work,
though novel, built on earlier research into electroluminescence, which was
first reported in organic molecules by a French physicist named André
Bernanose in the 1950s. He and his colleagues applied high-voltage AC
(alternating current) electric fields to thin films of cellulose and cellophane
"doped" with acridine orange (a fluorescent, organic dye). By 1970, Digby
Williams and Martin Schadt had managed to create what they called "a simple
organic electroluminescent diode" using anthracene, but it wasn't until Tang
and VanSlyke's work, in the 1980s, that OLED technology became truly
practical.

Milestones in the development of OLEDs since then have included the first
commercial OLED (Pioneer, 1997), the first full-sized OLED display (Sony,
2001), the first OLED mobile phone display (Samsung, 2007), commercial
OLED lighting systems (Lumiotec, 2013), and large-screen commercial OLED
TVs (by Samsung, LG, Panasonic, Sony, and others in 2012 and 2013).[1]
