Seminar 2004
Quantum computers
The acceleration of computing power never seems to slow. With each blink of an eye there's a new, faster processor or a hard drive that stores more data. But for all their computational might, computers as we know them will eventually bump up against the laws of physics. Technology marches on resolutely, shrinking electronic components and cramming more circuitry onto smaller and smaller wafers of silicon. If the current rate of miniaturization continues, computer experts predict that within a decade or two transistors will dwindle to the size of an atom. But at those dimensions, well-behaved, predictable classical behavior goes out the window, and the slippery, counterintuitive nature of quantum mechanics takes over. In the quantum world, rather than being entities with sharply defined positions and motions, particles are described by spread-out wavefunctions, seemingly existing in many places at once.

So it might seem that the power of computers is destined to reach a limit. But scientists rarely take such pronouncements at face value; in this case, they have long been aware of a way around this apparent constraint. Within the shadowy quantum world there is more potential computing power than the speediest processor could ever dream of. That power stems from quantum particles' ability to exist in more than one state at once, and from their ability to become inextricably linked to each other through a phenomenon known as entanglement.

Traditional computers perform calculations, however quickly, in a basically sequential
manner. Their limitations surface in the simple, yet striking, example of factoring a large
number. The time a computer spends searching for a number's factors increases astronomically
with the size of the number. To factor a 400-digit number, for example, would take a modern
computer billions of years.
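To make the classical scaling concrete, here is a minimal sketch of factoring by trial division (an illustrative naive algorithm, not what real factoring software uses). Its running time grows rapidly with the size of the number, which is why huge numbers are out of reach:

```python
import time

def trial_division(n):
    """Factor n by naive trial division; work grows with sqrt(n)."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is prime
    return factors

# Each extra digit multiplies the work; a 400-digit number is hopeless here.
for n in (15, 8051, 104729 * 1299709):
    start = time.perf_counter()
    print(n, trial_division(n), f"{time.perf_counter() - start:.4f}s")
```

Even this toy example shows the trend: the search space roughly multiplies with every few added digits, so the time "increases astronomically with the size of the number."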

On the other hand, a computer made of quantum particles has a built-in parallelism
because quantum calculations can be performed on the particles' coexisting states
simultaneously. A quantum computer, then, might factor that 400-digit number in minutes. Such a completely different approach to computing, it seems, truly earns the designation "paradigm shift."
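The built-in parallelism can be sketched numerically: an n-qubit register is described by 2**n complex amplitudes, and a single gate application updates all of them at once. Below is a minimal NumPy simulation (a classical sketch for intuition, not a real quantum device) that puts three qubits into an equal superposition of all eight basis states:

```python
import numpy as np

n = 3
state = np.zeros(2**n)
state[0] = 1.0  # start in |000>

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

# Applying H to every qubit is the n-fold tensor (Kronecker) product.
U = H
for _ in range(n - 1):
    U = np.kron(U, H)

state = U @ state
print(state)  # amplitude 1/sqrt(8) on each of the 8 basis states
```

One further gate applied to this register acts on all eight coexisting states simultaneously, which is the sense in which quantum calculations are performed "in parallel."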

In 1965 Intel co-founder Gordon Moore noted that the processing power (number of transistors and speed) of computer chips was doubling every 18 months or so. This trend has continued for nearly four decades. The basic processing unit in a computer chip is the transistor, which acts like a small switch. The binary digits 0 and 1 are represented by the transistor being turned off or on.

Currently thousands of electrons are used to drive each transistor. As the processing power increases, the size of each transistor shrinks. If Moore's law continues unabated, each transistor is predicted to be as small as a hydrogen atom by about 2030. At that size the quantum nature of electrons in the atoms becomes significant, generating errors in the computation.
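The 2030 figure can be checked with back-of-the-envelope arithmetic. If transistor count doubles every 18 months, the linear feature size shrinks by a factor of sqrt(2) per period (since the area per transistor halves). The starting feature size below is an assumed illustrative value, not fabrication data:

```python
import math

start_year = 2004
size_nm = 90.0      # assumed ~90 nm feature size around 2004 (illustrative)
hydrogen_nm = 0.1   # hydrogen atom diameter, roughly one angstrom

# Shrink the linear dimension by sqrt(2) every 18 months.
years = 0.0
while size_nm > hydrogen_nm:
    size_nm /= math.sqrt(2)
    years += 1.5

print(f"Atomic scale reached around {start_year + years:.0f}")
```

With these assumed numbers the extrapolation lands in the early 2030s, consistent with the rough prediction above.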

However, rather than being a hindrance, this quantum physics can be exploited as a new way to do computation. And this new approach opens up fantastic computational power based on the wave nature of quantum particles.

We normally think of electrons, atoms and molecules as particles. But each of these objects can also behave as a wave. This dual particle-wave behaviour was first suggested in the 1920s by Louis de Broglie.

The concept emerged as follows. Thomas Young's double-slit experiments in the early 1800s showed that light behaves as if it is a wave. But, strikingly, Einstein's explanation of the photoelectric effect in 1905 showed that light consists of particles. In 1923 de Broglie suggested this dual particle-wave property might apply to all particles, including electrons. Then in 1926 Davisson and Germer found that electrons scattered off a crystal of nickel behaved as if they were waves. Since then neutrons, atoms and even molecules have been shown to behave as waves. The wave tells us where the particle is likely to be found.

This dual particle-wave property is exploited in quantum computing in the following way. A wave is spread out in space. In particular, a wave can spread out over two different places at once. This means that a particle can also exist in two places at once. This concept is called the superposition principle: the particle can be in a superposition of two places.
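The superposition principle can be illustrated with a toy calculation. A state assigns a complex amplitude to each place, probabilities are the squared magnitudes of those amplitudes, and when the two paths recombine it is the amplitudes, not the probabilities, that add, which is what produces interference (a minimal NumPy sketch for intuition):

```python
import numpy as np

# Equal superposition of "particle at slit A" and "particle at slit B".
amp_A = 1 / np.sqrt(2)
amp_B = 1 / np.sqrt(2)

# Probabilities are |amplitude|**2 and must sum to 1.
print(abs(amp_A)**2, abs(amp_B)**2)  # 0.5 0.5

# Recombining the paths with a relative phase: amplitudes add, so the
# detection probability oscillates between 0 and 1 -- interference.
for phase in (0.0, np.pi / 2, np.pi):
    total = (amp_A + amp_B * np.exp(1j * phase)) / np.sqrt(2)
    print(f"phase {phase:.2f}: detection probability {abs(total)**2:.2f}")
```

At zero phase difference the paths reinforce (probability 1); at a phase of pi they cancel completely (probability 0), exactly the bright and dark fringes of Young's experiment.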
