Jawaharlal Nehru Technological University Hyderabad
Sulthanpur(v),pulkal(M),Medak-502293 Telangana
CERTIFICATE
This is to certify that the technical seminar on            is a
bonafide work carried out by            in partial
fulfillment of the requirements for the degree of BACHELOR OF
TECHNOLOGY IN ELECTRONICS AND COMMUNICATION
ENGINEERING by the Jawaharlal Nehru Technological University, Hyderabad
during the academic year 2019-2020.
Head Of Department
ACKNOWLEDGEMENT
By
MD.RAYAAN (16SS1A0429)
ABSTRACT
TABLE OF CONTENTS
CONTENTS Page No.
Certificate 1
Acknowledgement 2
Abstract 3
Chapter -1 Introduction 6
Chapter-2 History 9
6.1 Applications 19
6.4 Conclusion 24
References 25
CHAPTER 1
INTRODUCTION
A quantum computer is a device that can arbitrarily manipulate the quantum state
of a part of itself. The field of quantum computation is largely a body of
theoretical promises for some impressively fast algorithms which could be
executed on quantum computers. However, since the first significant algorithm
was proposed in 1994, experimental progress has been rapid, with several schemes
yielding two- and three-quantum-bit manipulations.
A schematic model of a quantum computer is described, as well as some of the
subtleties in its programming. Shor's algorithm for efficiently factoring
numbers on a quantum computer is presented in two parts: the quantum
procedure within the algorithm and the classical algorithm that calls the quantum
procedure. The mathematical structure within the factoring problem is discussed,
making it clear what contribution the quantum computer makes to the
calculation. The complexity of Shor's algorithm is compared to that of
factoring on conventional machines, and its relevance to public key cryptography
is noted. In addition, we discuss the experimental status of the field and also
quantum error correction, which may in the long run help solve some of the most
pressing difficulties. We conclude with an outlook on the feasibility and prospects
for quantum computation in the coming years.
A qubit can exist not only in the states corresponding to the logical values 0 or
1, as in the case of a classical bit, but also in a superposition state.
A qubit is a bit of information that can be both zero and one simultaneously
(a superposition state). Thus, a computer working on qubits rather than
standard bits can make calculations using both values simultaneously. A qubyte is
made up of eight qubits and can have all values from zero to 255 simultaneously.
"Multi-qubyte systems have a power beyond anything possible with classical
computers." (Quantum Computers & Moore's Law, p. 1).
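As a rough illustration of this claim, the following Python sketch (using NumPy; an illustration added here, not part of the seminar material) builds the equal superposition of eight qubits and shows that it contains one amplitude for each of the 256 values:

```python
import numpy as np

# A "qubyte" is 8 qubits, described by a vector of 2**8 = 256
# complex amplitudes, one per classical value 0..255.
n_qubits = 8
dim = 2 ** n_qubits  # 256 basis states |00000000> .. |11111111>

# Put each qubit into the equal superposition (|0> + |1>)/sqrt(2);
# the joint state is the tensor product of the single-qubit states.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
state = plus
for _ in range(n_qubits - 1):
    state = np.kron(state, plus)

print(state.shape)                            # (256,): one amplitude per value
print(np.allclose(state, 1 / np.sqrt(dim)))   # all 256 values equally weighted
```

The state does not "store" 256 numbers in a readable way; a measurement yields only one of them, which is why quantum algorithms must be designed to exploit interference rather than simple parallel readout.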
Forty qubits could have the same power as modern supercomputers. According
to Chuang, a supercomputer needs about a month to find a phone number in a
database consisting of the world's phone books, whereas a quantum computer is
able to solve this task in 27 minutes.
CHAPTER 2
History
In 1982 Richard Feynman theorized that classical computation could be
dramatically improved by quantum effects. Building on this, David Deutsch
developed the basis for quantum computing between 1984 and 1985. The next
major breakthrough came in 1994, when Peter Shor described a method to factor
large numbers in quantum polynomial time (which breaks RSA encryption). This became
known as Shor's algorithm. At around the same time the quantum complexity
classes were developed and the quantum Turing machine was described.
Then in 1996 Lov Grover developed a fast database search algorithm (known as
Grover's algorithm). The first prototypes of quantum computers were also built in
1996. In 1997 quantum error correction techniques were developed at Bell Labs
and IBM. Physical implementations of quantum computers improved, with a three-qubit
machine in 1999 and a seven-qubit machine in 2000.
CHAPTER 3
THEORY OF QUANTUM MECHANICS
In 1909, a few years after demonstrating the photoelectric effect, Einstein used
his photon hypothesis to obtain a simple derivation of Planck's black body
distribution. Planck himself had not gone as far as Einstein: he had indeed
assumed that the transfer of energy between matter (the oscillators in the
walls of the cavity) and radiation was quantized (i.e. the energy transferred
to/from an oscillator occurred in "grains" of h times the frequency of the
oscillator). But Planck had assumed the energy in the electromagnetic field, in
the cavity, was continuously distributed, as in classical theory. By contrast, it
was Einstein's hypothesis that the energy in the field itself was quantized: that
for certain purposes, the field behaved like an ideal gas, not of molecules, but
of photons, each with energy h times frequency, with the number of photons
being proportional to the intensity. The clue to this was Einstein's observation
that the high frequency part of Planck's distribution for black body
radiation (described by Wien's law) could be derived by assuming a gas of
photons and applying statistical mechanics to it. This was in contrast to the low
frequency part (described by the Rayleigh-Jeans law), which could be
successfully obtained using classical electromagnetic theory, i.e. assuming
waves. So you had both particles and waves playing a part. Furthermore,
Einstein looked at fluctuations of the energy about its average value, and
observed that the formula obtained had two forms: one which you would get if
light was made up of waves, and the other if it was made up of particles. Hence
we have wave-particle duality.
In 1924, Louis de Broglie (1892-1987) extended the particle duality for light to all
matter. He stated: the motion of a particle of any sort is associated with the
propagation of a wave. This is the idea of a pilot wave which guides a free
particle's motion. De Broglie then suggested the idea of electron waves be
extended to bound particles in atoms, meaning electrons move around the
nucleus guided by pilot waves. So, again a duality: de Broglie waves and Bohr's
particles. De Broglie was able to show that Bohr's orbital radii could be obtained
by fitting a whole number of waves around the nucleus. This gave an explanation
of Bohr's angular momentum quantum condition (see above).

The new quantum theory was developed between June 1925 and June 1926. Werner Heisenberg
(1901-1976), using a totally different and simpler atomic model (one that did
not use orbits), worked out a code to connect quantum numbers and spectra. He
also discovered that quantum mechanics does not follow the commutative law of
multiplication, i.e. pq ≠ qp. When Max Born (1882-1970) saw this he suggested
that Heisenberg use matrices. This became matrix mechanics, and eventually all the
spectral lines and quantum numbers were deduced for hydrogen. The first
complete version of quantum mechanics was born. It is interesting to note that it
was not observation or visualization that was used to deduce the theory, but pure
mathematics. Later we will see matrices cropping up in quantum computing.

At around the same time Erwin Schrödinger (1887-1961) built on de Broglie's work
on matter waves. He developed a wave equation (for which the wave function ψ is the solution) for
the core of bound electrons, as in the hydrogen atom. It turns out that the results
derived from this equation agree with the Bohr model. He then showed that
Heisenberg's matrix mechanics and his wave mechanics were equivalent. Max
Born proposed that ψ, the solution to Schrödinger's equation, can be interpreted
as a probability amplitude, not a real, physical value. The probability amplitude is
a function of the electron's position (x, y, z) and, when squared, gives the
probability of finding the electron in a unit volume at the point (x, y, z). This gives
us a new, probabilistic atomic model, in which there is a high probability that the
electron will be found in a particular orbital shell. A representation of the ground
state of hydrogen is shown in figure 4.16, and at the places where the density of
points is high there is a high probability of finding the particle. The linear nature
of the wave equation means that if ψ1 and ψ2 are two solutions then so is ψ1 +
ψ2, a superposition state (we'll look at superposition soon). This probabilistic
interpretation of quantum mechanics implies the system is in both states until
measured.

Now back to Niels Bohr. In 1927 Bohr described the concept of
complementarity: whether the system behaves like a particle or a wave depends
on what type of measurement operations you are using to look at it. He
then put together various aspects of the work by Heisenberg, Schrödinger, and
Born and concluded that the properties of a system (such as position and
momentum) are undefined, having only potential values with certain probabilities
of being measured. This became known as the Copenhagen interpretation of
quantum mechanics. Einstein did not like the Copenhagen interpretation and, for
a good deal of time, kept trying to refute it by thought experiment, but
Bohr always had an answer. But in 1935 Einstein raised an issue that was later to
have profound implications for quantum computation and lead to the
phenomenon we now call entanglement, a concept we'll look at in a few pages.
CHAPTER 4
Linear Algebra
Quantum mechanics leans heavily on linear algebra. Some of the concepts of
quantum mechanics come from the mathematical formalism, not thought
experiments; that is what can give rise to counter-intuitive conclusions.
Superposition
Superposition means a system can be in two or more of its states
simultaneously. For example a single particle can be traveling along two
different paths at once. This implies that the particle has wave-like properties,
which can mean that the waves from the different paths can interfere with
each other. Interference can cause the particle to act in ways that are
impossible to explain without these wave-like properties.
The ability for the particle to be in a superposition is where we get the parallel
nature of quantum computing: If each of the states corresponds to a different
value then, if we have a superposition of such states and act on the system, we
effectively act on all the states simultaneously.
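The parallel action described above can be illustrated with a small NumPy sketch (an added illustration, with example amplitudes chosen here): a single application of the quantum NOT gate X acts on both components of a superposition at once.

```python
import numpy as np

# The quantum NOT gate X swaps the |0> and |1> components.
X = np.array([[0.0, 1.0],
              [1.0, 0.0]])

alpha, beta = 0.6, 0.8          # example amplitudes (0.36 + 0.64 = 1)
psi = np.array([alpha, beta])   # the superposition 0.6|0> + 0.8|1>

psi_out = X @ psi               # one matrix product flips both components
print(psi_out)                  # -> [0.8 0.6], i.e. 0.8|0> + 0.6|1>
```

One operation transformed both basis states in a single step; for n qubits, one gate application acts on all 2^n components of the state vector simultaneously.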
Dirac Notation
A measurement of vertical spin projects an electron into one of two states: the up
electron is in the state |↑⟩ and the down electron is in the state |↓⟩. But if we take
the up electron and pass it through a horizontal field, it comes out on one side
50% of the time and on the other side 50% of the time. If we represent these two
horizontal states as |→⟩ and |←⟩, we can say that the up-spin electron was in a
superposition of the two states |→⟩ and |←⟩, such that when we make a
measurement with the field horizontal we project the electron into one or the
other of the two states, with equal probabilities 1/2
given by the square of the amplitudes.
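The amplitude arithmetic in this spin example can be sketched as follows (the particular basis vectors used are one conventional assumption, not taken from the report): projecting the up state onto each of the two horizontal states gives amplitudes that square to 1/2.

```python
import numpy as np

up = np.array([1.0, 0.0])                  # |up> in the vertical basis
right = np.array([1.0, 1.0]) / np.sqrt(2)  # horizontal basis states
left = np.array([1.0, -1.0]) / np.sqrt(2)  # (a conventional choice)

amp_right = right @ up   # projection amplitude onto |right>
amp_left = left @ up     # projection amplitude onto |left>

# Each amplitude is 1/sqrt(2), so each squared probability is 1/2.
print(amp_right ** 2, amp_left ** 2)
```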
Representing Information
Quantum mechanical information can be physically realized in many ways. To
have something analogous to a classical bit we need a quantum mechanical
system with two states only, when measured. We have just seen two examples:
electron spin and photon direction. Two more methods for representing binary
information in a way that is capable of exhibiting quantum effects (e.g.
entanglement and superposition) are: polarization of photons and nuclear spins.
Uncertainty
The quantum world is irreducibly small, so it is impossible to measure a quantum
system without having an effect on that system, as our measurement devices are also
quantum mechanical. As a result there is no way of accurately predicting all of
the properties of a particle. There is a trade-off: the properties occur in
complementary pairs (like position and momentum, or vertical spin and
horizontal spin), and if we know one property with a high degree of certainty then
we know almost nothing about the other property. That unknown
property's behaviour is essentially random. An example of this is a particle's
position and velocity: if we know exactly where it is, then we know nothing about
how fast it is going. It has been postulated (and is currently accepted) that particles
in fact DO NOT have defined values for unknown properties until they are
measured. This is like saying that something does not exist until it is looked at.
Entanglement
In 1935 Einstein (along with colleagues Podolsky and Rosen) demonstrated a
paradox (named EPR after them) in an attempt to refute the undefined nature of
quantum systems. The results of their thought experiment seemed to show that
quantum systems were defined, having a local state before measurement.
Although the original hypothesis was later proven wrong (i.e. it was proven that
quantum systems do not have a local state before measurement), the effect they
demonstrated was still important, and later became known as entanglement.
Entanglement is the ability of pairs of particles to interact over any distance
instantaneously. The particles don't exactly communicate, but there is a statistical
correlation between the results of measurements on each particle that is hard to
understand using classical physics. To become entangled, two particles are
allowed to interact; they then separate and, on measuring, say, the velocity of
one of them (regardless of the distance between them), we can be sure of the
value of the velocity of the other one (before it is measured). The reason we say that
they communicate instantaneously is that they store no local state [Rae, A.
1996] and only have a well defined state once they are measured. Because of this
limitation, entangled particles can't be used to transmit classical messages faster than the
speed of light, as we only know the states upon measurement. Entanglement has
applications in a wide variety of quantum algorithms and machinery, some of
which we'll look at later. As stated before, it has been proven that entangled
particles have no local state.
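A simple classical simulation (an added illustration, using an assumed Bell state) shows the statistical correlation described above: both particles always yield the same bit, even though each bit on its own is random.

```python
import numpy as np

rng = np.random.default_rng(0)

# The Bell state (|00> + |11>)/sqrt(2): amplitudes for |00>,|01>,|10>,|11>.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2          # measurement probabilities [0.5, 0, 0, 0.5]

# Simulate 1000 joint measurements of the pair.
outcomes = rng.choice(4, size=1000, p=probs)
first = outcomes // 2              # bit measured on particle 1
second = outcomes % 2              # bit measured on particle 2

print(np.all(first == second))     # True: the two results always agree
```

Each particle's own result is a fair coin flip, so no message can be sent this way; only comparing the two records afterwards reveals the correlation.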
The evolution of a closed system is a unitary transform. That is, while the
system is evolving under its own steam (no measurement), the state |ψ'⟩ at some
stage is related to the state |ψ⟩ at some previous stage (or time) by a unitary
transform: |ψ'⟩ = U|ψ⟩. This means that we can totally describe the behaviour of a system
by using unitary matrices.
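A minimal sketch of this idea (added here for illustration), using the Hadamard gate as an example unitary: we check that U†U = I and that evolution preserves the norm, i.e. total probability stays 1.

```python
import numpy as np

# The Hadamard gate, a standard example of a unitary matrix.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# Unitarity check: U†U = I.
print(np.allclose(H.conj().T @ H, np.eye(2)))   # True

psi = np.array([1.0, 0.0])        # start in |0>
psi2 = H @ psi                    # evolve one step: (|0> + |1>)/sqrt(2)

print(np.linalg.norm(psi2))       # 1.0: probability is conserved
```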
CHAPTER 5
Bits and Qubits
This section is about the "nuts and bolts" of quantum computing. It describes
qubits, gates, and circuits.
Quantum computers perform operations on qubits which are analogous to
conventional bits (see below) but they have an additional property in that they
can be in a superposition. A quantum register with 3 qubits can store 8 numbers
in superposition simultaneously and a 250 qubit register holds more numbers
(superposed) than there are atoms in the universe!
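The register sizes quoted above follow from the fact that an n-qubit state is described by 2^n amplitudes; a couple of lines of Python (added for illustration) make the growth concrete.

```python
# An n-qubit register is described by 2**n amplitudes,
# so each added qubit doubles the count.
print(2 ** 3)     # 8 numbers for a 3-qubit register, as the text says
print(2 ** 250)   # the count for a 250-qubit register (about 1.8e75)
```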
Single Qubits:
Classical computers use two discrete states (e.g. the charge states of a capacitor)
to represent a unit of information; this state is called a binary digit (or bit for
short). A bit has two values, 0 and 1. There is no intermediate state between
them, i.e. the value of the bit cannot be in a superposition.
Quantum bits, or qubits, can on the other hand be in a state "between" 0 and 1,
but only during the computational phase of a quantum operation. When
measured, a qubit collapses to either |0⟩ or |1⟩.
The symbolic notation |·⟩ is part of the Dirac notation. In terms of the above, |0⟩
and |1⟩ essentially mean the same thing as 0 and 1 for a classical bit. Generally, a
qubit's state during the computational phase is represented by a linear
combination of states, otherwise called a superposition state: |ψ⟩ = α|0⟩ + β|1⟩.
Here α and β are the probability amplitudes. They can be used to calculate the
probabilities of the system jumping into |0⟩ or |1⟩ following a measurement or
readout operation. There may be, say, a 25% chance a 0 is measured and a 75%
chance a 1 is measured. The percentages must add to 100%. In terms of their
representation, qubits must satisfy |α|² + |β|² = 1, which is the same as saying the
probabilities add to 100%. Once the qubit is measured it will remain in that state
if the same measurement is repeated, provided the system remains closed
between measurements. The probability that the qubit's state, when in a
superposition, will collapse to |0⟩ or |1⟩ is |α|² or |β|² respectively. The states |0⟩
and |1⟩ are actually vectors, called the computational basis states, which form an
orthonormal basis for the vector space C².
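The 25%/75% example above can be sketched in Python (the amplitudes a = 1/2 and b = √3/2 are an assumption chosen to match those percentages):

```python
import numpy as np

rng = np.random.default_rng(1)

# A qubit a|0> + b|1> must satisfy |a|**2 + |b|**2 = 1.
a, b = 0.5, np.sqrt(3) / 2
print(np.isclose(a**2 + b**2, 1.0))   # True: probabilities sum to 100%

# Simulate repeated measurement of freshly prepared, identical qubits.
samples = rng.choice([0, 1], size=100_000, p=[a**2, b**2])
print(samples.mean())                 # close to 0.75: a 1 appears ~75% of the time
```

Note that each individual qubit yields only a single 0 or 1; the 25/75 split emerges only over many repeated preparations and measurements.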
CHAPTER 6
ADVANTAGES AND DISADVANTAGES
ADVANTAGE
The main advantage of quantum computing is that it can execute certain tasks
much faster than a classical computer; state changes that are already very fast in
traditional computing happen even faster in quantum computing. But not all
tasks can be done better by quantum computing when compared to a traditional
computer.
DISADVANTAGE
The main disadvantage of quantum computing is that the technology required to
implement a quantum computer is not available at present. The reason for this
is that the coherent quantum state is destroyed as soon as it is affected by its
environment, and that coherence is essential for the functioning of
quantum computers.
Research into this problem is still continuing, but the effort applied to
identify a solution has made little positive progress so far.
6.1 APPLICATIONS
One of the areas that I have been researching is what applications can best make
use of the power of quantum computing. Although this is a work in progress, I
am providing a preliminary assessment for my readers based upon discussions
with various experts and other research I have done so far. The list below is
shown in a priority order based upon the combination of three factors that I have
reviewed: Progress-to-Date, Difficulty, and Payoff. One thing to note is that the
successful implementations for most, if not all, of these application areas will
probably be based upon a hybrid platform that combines classical and quantum
computing in a cloud environment to achieve the best of both worlds. So here’s
the list.
Machine Learning
Machine learning is a hot area right now because we are seeing significant
deployments of many different platforms at the consumer level. We see
aspects of this every day in voice, image, and handwriting recognition, to
name just a few examples. But it is also a difficult and computationally expensive
task, particularly if you want to achieve good accuracy. Because of the potential
payoff, there is a lot of ongoing research based upon sampling of Boltzmann
distributions.
Computational Chemistry
There are many problems in materials science where we could achieve a huge payoff if
we could just find the right catalyst or process to develop a new material, or produce
an existing material more efficiently. There is already a significant effort in using
classical computers to simulate chemical interactions, but in many cases the problems
become intractable to solve classically. So the original idea presented by
Richard Feynman is: why not use a quantum computer to simulate the quantum
mechanical processes that occur? Here are just a few examples of significant
problems that could see large payoffs if we can solve them.
Replace the Haber process to produce ammonia for use in fertilizers
Develop a new battery chemistry that can significantly improve the performance
over today’s lithium-ion batteries
Drug Design
Although drug design is really a problem in computational chemistry, I put it
into its own classification because it is used by the pharmaceutical industry.
Many of the drugs being developed are still found through the trial and error
method. This is very expensive, and more effective ways of simulating how a
drug will react would save a great deal of money and time.
Cyber Security
Cyber security is becoming a larger issue every day, as threats around the world
are increasing their capabilities and we become more vulnerable as we increase
our dependence upon digital systems. Various techniques to combat cyber
security threats can be developed using some of the quantum machine learning
approaches mentioned above to recognize the threats earlier and mitigate the
damage that they may do.
Codebreaking
You may wonder why I have put codebreaking so far down the list, given all the
attention given to Shor’s algorithm and its ability to factor large numbers and
break RSA encryption. The reason is that I believe that this will just be a
temporary application until the world converts to a class of “post-quantum”
cryptographic techniques that will not be vulnerable to breaking by a quantum
computer. There is an increasing amount of research in post-quantum
cryptography. So although we probably will have quantum computers able to
factor very large numbers 10 years from now, it is not clear if we will have a use
for it at that time.
The payoff here can be very high, because some of these systems are used where lives or millions
of dollars might depend on their being error-free. By using quantum
computing to help in these simulations, one can potentially provide much better
coverage in the simulations with a greatly improved time to do so.
6.3 FUTURE SCOPE
Due to its precise and accurate calculations, the demand for quantum computers
has progressively increased. Many enterprises have started investing in quantum
computers instead of conventional supercomputers. One such company that
has been dedicated to working on the concept of quantum computing is D-Wave.
They have successfully developed a 512-qubit chipset. Dimensionally, the
chipset is as big as a bedroom; this is required in order to maintain the necessary
superconducting conditions. It is believed this chipset will shrink to the size of a
desktop CPU in the near future.
The major reason cited for the success of quantum computers over supercomputers is
the strong security provided by quantum computers. Many governments
have already installed quantum computers for military and medical security.
PROCESSING
Conventional computers store information in the form of bits and bytes, and the
scope is limited to 0s and 1s; thus flexibility in conventional computers is
restrained. Quantum computers, on the other hand, use a totally different form of
data storage called qubits. Qubits are not restrained to values such as 0
and 1; in fact they also allow the information to be intermediate (a superposition)
between 0 and 1, making them more flexible and reliable
for massive calculations. For a better understanding of qubits: these are
distinct but stable energy levels of sub-atomic particles or photons in three-dimensional
space. Each sub-atomic particle in a collection of qubits can be
used to represent a variety of information or data.
This flexibility can be explained by the concept of superposition of various energy
states. Qubits also increase the processing speed of computers, displaying
results on the order of nanoseconds on average; sometimes it is even possible to
obtain the results faster, i.e. at picoseconds. The laws of quantum
physics, which govern the microscopic world, allow bits of matter to be in two
states simultaneously. All modern-day computing relies on the ultrafast
manipulation of billions of bits of information. Quantum computing combines
these two concepts, allowing computers to put bits of information into their 0
and 1 states simultaneously (superposition), thus making quantum
computation powerful and fast.
6.4 CONCLUSION
It is important to note that practical quantum computing is still far in the future.
The programming style for a quantum computer will also be quite different.
Development of a quantum computer needs a lot of money, and even the best
scientists can't answer a lot of questions about quantum physics. The quantum
computer is based on theoretical physics, and some experiments have
already been made. Building a practical quantum computer is just a matter of
time.
Quantum computers could solve problems that can't be handled with the help of
today's computers. This will be one of the biggest steps in science and will
undoubtedly revolutionize the practical computing world.
REFERENCES
www.ibm.com/blogs/research/2018/10/quantum.
www.sciencedaily.com/releases/2018/10/
www.wired.co.uk/article/quantum-computing-explained
www.dwavesys.com
www.technologyreview.com/s/612844/what-is.