

Submitted by Marianne on October 1, 2015

This article is part of our Information about information project, run in collaboration
with FQXi. Click here to read other articles on quantum computing.

Quantum computing often grabs the headlines. The word "quantum" itself is
intriguing enough, and, combined with the promise of computational power that
surpasses anything we have seen so far, it becomes irresistible. But what exactly is
quantum computing?

Zeroes, ones, and both


To get to grips with quantum computing, first remember that an ordinary computer
works on 0s and 1s. Whatever task you want it to perform, whether it's calculating a
sum or booking a holiday, the underlying process is always the same: an instance of
the task is translated into a string of 0s and 1s (the input), which is then processed
by an algorithm. A new string of 0s and 1s pops out at the end (the output), which
encodes the result. However clever an algorithm might appear, all it ever does is
manipulate strings of bits — where each bit is either a 0 or a 1. On the machine
level, this either/or dichotomy is represented using electrical circuits which can
either be closed, in which case a current flows, or open, in which case there isn't a
current.
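The 0s-and-1s picture can be made concrete with a short sketch (my own illustration, not anything from the article): here one such task, adding two numbers, is carried out purely by manipulating strings of bits with either/or logic.

```python
# Illustrative sketch: a classical computation maps an input bit string
# to an output bit string. The "algorithm" here is a ripple-carry adder
# built only from per-bit AND/XOR/OR decisions.

def add_bitstrings(a: str, b: str) -> str:
    """Add two binary strings using only bit-level logic."""
    width = max(len(a), len(b))
    a, b = a.zfill(width), b.zfill(width)
    carry, out = 0, []
    for x, y in zip(reversed(a), reversed(b)):
        x, y = int(x), int(y)
        out.append(str(x ^ y ^ carry))       # sum bit for this position
        carry = (x & y) | (carry & (x ^ y))  # carry into the next position
    if carry:
        out.append("1")
    return "".join(reversed(out))

print(add_bitstrings("0110", "0111"))  # 6 + 7 = 13 -> "1101"
```

However the task is dressed up, the machine only ever sees the bit strings going in and coming out.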

The idea of superposition led the physicist Erwin Schrödinger to speculate that a cat in a box could be both dead and alive as long as you don't look at it. (This cat is definitely alive.)

Quantum computing is based on the fact that, in the microscopic world, things don't
have to be as clear-cut as we'd expect from our macroscopic experience. Tiny
particles, such as electrons or photons, can simultaneously take on states that we
would normally deem mutually exclusive. They can be in several places at once, for
example, and in the case of photons simultaneously exhibit two kinds of polarisation.
We never see this superposition of different states in ordinary life because it
somehow disappears once a system is observed: when you measure the location of
an electron or the polarisation of a photon, all but one of the possible alternatives
are eliminated and you will see just one. Nobody knows how that happens, but it
does. (You can find out more in Schrödinger's equation — what is it?)
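As a toy illustration of this (my own sketch, not a model described in the article), a qubit's two alternatives can be written as a pair of amplitudes, with observation randomly eliminating all but one outcome:

```python
import random

# Toy model: a qubit is a pair of amplitudes (a, b) for the states 0 and 1,
# with |a|^2 + |b|^2 = 1. Measuring picks one outcome at random, with
# probability given by the squared amplitude, and the superposition is gone.

def measure(state):
    a, b = state
    if random.random() < abs(a) ** 2:
        return 0, (1.0, 0.0)   # observed 0; the state collapses to 0
    return 1, (0.0, 1.0)       # observed 1; the state collapses to 1

equal_superposition = (2 ** -0.5, 2 ** -0.5)  # 0 and 1, 50% each
outcome, collapsed = measure(equal_superposition)
print(outcome, collapsed)
```

Run it several times: the outcome varies, but the collapsed state always shows just one of the two alternatives.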

Superposition frees us from binary constraints. A quantum computer works with
particles that can be in superposition. Rather than representing bits, such particles
represent qubits, which can take on the value 0, or 1, or both simultaneously.
"If you do something to [such a quantum system], it's as though you are doing it
simultaneously to 0 and to 1," explains Richard Jozsa, a pioneer of quantum
computing at the University of Cambridge.
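Jozsa's point can be sketched in the same toy amplitude model (again my own illustration, not the article's): a single operation, here the quantum NOT gate, acts on the 0-branch and the 1-branch of a superposition at once.

```python
# Sketch: one operation touches both amplitudes simultaneously. The
# quantum NOT gate swaps the amplitudes of 0 and 1, so a single
# application flips both branches of the superposition together.

def quantum_not(state):
    a, b = state       # amplitudes of 0 and 1
    return (b, a)      # the 0-branch and 1-branch are exchanged in one step

lopsided = (0.8, 0.6)  # 64% chance of seeing 0, 36% chance of seeing 1
flipped = quantum_not(lopsided)
print(flipped)         # (0.6, 0.8): now 36% chance of 0, 64% of 1
```

A classical machine would have to flip a definite 0 or a definite 1; here one step transforms both possibilities at the same time.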

What is Quantum Computing?

A conventional computer uses the ‘bit’, a basic unit of information that holds
a single binary value of ‘0’ or ‘1’. In contrast, quantum computers use qubits, which
can hold either value, or both values at the same time. This is due to the quantum
physics phenomenon known as ‘superposition’. Superposition occurs when two or
more quantum states of a subatomic particle, such as the spin states ‘spin-up’ and
‘spin-down’, are held at the same time. For the purposes of quantum computing,
superposition can be interpreted as a qubit being both a ‘0’ and a ‘1’.

The use of subatomic particles that can exist in multiple states allows computing
to be done much faster, and with less energy, than on conventional computers. For
instance, NASA scientists created a quantum computer in 2016 that was 100
million times faster than a conventional computer.

If Google and the team at the University of California Santa Barbara are on track to
create a 50-qubit chip, quantum supremacy could be just months away.


“The computational power of a quantum computer grows exponentially with the
number of quantum bits manipulated”, says Alexey Kavokin of the Russian Quantum
Centre (RQC).
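Kavokin's remark can be made concrete with a quick calculation (my own illustration): describing n qubits classically means tracking one amplitude for every n-bit string, i.e. 2^n numbers, so the state space doubles with every qubit added.

```python
# The number of amplitudes needed to describe n qubits is 2**n:
# each added qubit doubles the size of the state space.

for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
# 50 qubits already require 1,125,899,906,842,624 amplitudes.
```

This doubling is why even a ~50-qubit device strains classical simulation.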

Benefits of Quantum Computing

The performance benefits of this type of computing power come from its ability to
quickly analyze and test extremely large datasets. Below are some examples of how
quantum computing will impact our world.
