
Quantum Computing

BJ Jackson

Saint Leo University



Quantum Computing

The Revolutionary Problem

One humid spring day in 1965, in a dingy office scattered with newsprint magazines, one of the boldest statements about the future of computing was made. It was here, in the small Fairchild Semiconductor office, that future Intel co-founder Gordon Moore made a statement that amounts to the closest thing to prophecy modern science can offer. He foresaw that computing power would essentially double every two years in the age of microprocessor devices. It is one of the wonders of the modern computing industry that this prediction still holds true to this day. His statement became the gold standard for R&D at most IT companies today, and its veracity has even earned it a name: Moore's Law. In the field of science, laws are regarded as the most permanent things in systematic thought, infallible and unbreakable.

However, today Moore's Law is in trouble. The problem with Moore's statement is that his observations were made with regard to the increase in transistor density that can be achieved on a silicon wafer microprocessor. Transistor density has improved so much in the decades since Moore's Law was formulated that chip manufacturers are now running into the limits of atomic physics itself. Transistors can only be made so small before the gap the electron channel must pass through becomes shorter than the distance at which atomic interference mars the readings. This means that the famed transistor, the marvel of engineering that drove much of the progress of the 20th century, will hit its limit. The unpleasant truth is that this day is fast approaching. In fact, Intel and MIT have recently posited that this trend can only continue for another five years (Simonite, 2017). After that, barring a major breakthrough, the computing power revolution will grind to a standstill. What, then, is the solution? The major players in today's IT market, including IBM, Intel, and Google, believe this future lies in the realm of quantum computing. With billions of dollars already being invested in

this emerging technology, it is safe to say the next big breakthrough will arise from the research

that is currently ongoing in this field.

The question of what this new "quantum" world is has been the subject of much of the newfound popular scientific media, with images of a dead-and-alive cat in a box, artistic depictions of subatomic particles, and long, complex formulas scrawled in chalk on green chalkboards. The truth of quantum mechanics, which underlies quantum computing, is actually much simpler than it appears in popular media. In its most basic sense it is an explanation of the "spooky action at a distance" (Einstein, 1930), or the seemingly odd behavior that particles exhibit when interacting with each other. It seems to transcend the conventional

notion of logic itself. This intuition-defying behavior is what makes quantum behavior so difficult for most modern scholars to understand and study. As a result, the most in-depth

research on the subject has been done by pure physicists as they have the best toolset to deal with

the theory and mathematics behind it. Other disciplines have stayed away from delving into the

quantum world until very recently. According to a recent analysis of barriers to entering quantum computing, the problem is one of institutionalized education curricula (Singh, 2007). Because it is an emerging discipline, the concepts that need to be learned are interdisciplinary, and there is a lack of emphasis on certain concepts that are important in the realm of quantum computing but secondary at best in the mainstream quantum fields. As a result, even though quantum theory is not a new pursuit, it has been advanced by physicists alone and only applied to other pursuits, such as computing, by eclectic physicists who dabble in other fields of study (Merali, 2015). This is, in my opinion, why emerging applications of quantum

theory are slow to develop. It is only now that dedicated computer scientists and logicians are

delving into the quantum field.

Basic Theory

The complexity of the quantum world, and our inexperience with it, limits the research of quantum applications in computing to two basic principles of quantum theory: superposition and quantum entanglement. To understand the concept of superposition we must first understand its opposite: discrete states. Take a transistor, for example: it has two discrete states, off and on. If the transistor is off, it is off when we measure it and when we don't. An off transistor is off no matter how many times we measure. If we measure something else, like its temperature, that measurement will never turn the transistor on. Likewise for the "on" transistor. This is an example of a discrete state, something that is always one way or the other. This is how we humans measure things naturally; it is intuitive and easy to understand. The light in my bedroom is either on or off. My grades are either passing or failing. For our purposes we can think of a bit as a discrete piece of information, either on or off, 1 or 0. We can do operations knowing that this

will be true. However, the quantum world gives us another element: superposition. A quantum bit would have discrete states as well, a definite 1 or 0; however, it can also be both 1 and 0 at the same time. It is not in any definite state, and this is not because of any uncertainty on our part. It is not a logical construct but a physical one: there definitely is a state, and that state is both 1 and 0. The proper way to say this is that the quantum bit, or qubit, is in a superposition of 1 and 0. This is highly counterintuitive, as there is no analogue in our everyday world where this occurs. Nevertheless, it has been scientifically demonstrated to be true, even for macroscopic objects like a kilogram-scale mirror (Abbott et al., 2009).

Superposition features prominently in the next concept important to quantum computing: entanglement. With normal bits, each bit in a

byte, for example, is completely independent of another. The state of one bit, whether 1 or 0, is

the same regardless of the other bits. The state of any bit in the byte will also not change since it

is discrete, even if another bit is changed. In this way each bit is like its own object: you can take

it away from the byte group and it will still retain its identity. With qubits, this is not the case. Qubits can become entangled with one another, meaning a single qubit's "value" is dependent on one or more other qubits in the group. An entangled qubit cannot be taken away from the group without destroying its "value," since that value lives in the relationships among the qubits rather than on the qubit alone. The basic theory behind all quantum computing is built on these simple rules, but the implications are profound (Research, 2017).

The major benefit of this is that information in a qubit group is no longer stored on each bit itself but rather in the relationships between the bits. A good analogy would be that a group of classical bits reads like a book. As you read through the pages, the letters tell you a story; the more pages, the bigger the story. A quantum book would be an interesting read. As you open the book you find that you cannot read anything, as the figures on the page are unintelligible. To read each character you must reference other characters on any number of pages. In fact, a truly quantum book would have every character entangled with every other character. You would have to read the book all at once to understand even the first character, since the information (or concepts) is contained in the relationships between the characters and not in the characters themselves. It may sound futile, but 300 characters in a "normal" book might be able to describe an introductory paragraph, or 2400 bits of information. Transform those normal characters into qubits, however, and we would need more classical bits than the total number of atoms in the known universe to describe their information capacity (Preskill, 2017).
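To put rough numbers on that claim: treating each of those 300 characters as a single qubit, and using the commonly cited estimate of about $10^{80}$ atoms in the observable universe, the comparison works out as

\[
2^{300} \approx 2 \times 10^{90} \gg 10^{80},
\]

so describing the quantum book classically would take more than ten billion numbers for every atom in the observable universe.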

Application

With these theoretical generalizations in mind, it is now possible to examine in more depth why the two concepts of superposition and entanglement are revolutionary in application. In the transition from theory to practice, the theory also needs to be converted into real numbers. To do this effectively, Microsoft has popularized the vector method of expressing

quantum computing information (Helwer, 2018). A vector is a way of representing a set of data

in an easily readable format. We would represent a classical 0 bit, for example, in this way: $\begin{pmatrix}1\\0\end{pmatrix}$. Conversely, a classical 1 bit would be written as $\begin{pmatrix}0\\1\end{pmatrix}$. The reason this is done is so they can

be operated on mathematically. For example, the number 3 in binary would be “11”.

Mathematically we combine bits like this by multiplying in a special way, called the tensor

product. The tensor product is a mathematical operator that combines the terms of two vectors into a larger matrix. For the example above, the process would look like this: $\begin{pmatrix}0\\1\end{pmatrix} \otimes \begin{pmatrix}0\\1\end{pmatrix} = \begin{pmatrix}0\\0\\0\\1\end{pmatrix}$. The 1 is in the last row, which corresponds to 3 (the zero place is counted). The reason why this is done is that each product matrix has one and only one factor for each result. In other words, each vector product can be mapped in a list to see what bit value it corresponds to.

With a little math one can see that, by representing classical bits this way, there will always be only one row of the product matrix with a 1. This is an important way to show why classical bits are discrete, always having a definite value, and the vector technique demonstrates it mathematically. While it may seem esoteric, this method is essential for modeling logic gates, which are in turn essential for modern computing.
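As a concrete illustration of this bookkeeping, here is a minimal sketch in Python with NumPy (my own hypothetical example, not code from the cited sources). The tensor product described above is exactly what numpy.kron computes, and the row index of the lone 1 recovers the encoded number:

    import numpy as np

    # Column vectors for the classical bit values 0 and 1.
    ZERO = np.array([1, 0])
    ONE = np.array([0, 1])

    # Tensor (Kronecker) product of the two bits of binary "11".
    product = np.kron(ONE, ONE)
    print(product)             # [0 0 0 1]
    print(np.argmax(product))  # 3 -- the row of the single 1, counting from 0

Because exactly one row holds a 1, the mapping between bit strings and product vectors is one-to-one, which is the discreteness property described above.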

Logic gates are basic operators that govern every single operation inside a computer, both

classical and quantum. The vector notation outlined above can model these mathematically. Take

the NOT gate, for example. The NOT gate simply flips the bit value of the target. So if there is a $\begin{pmatrix}0\\1\end{pmatrix}$ going into the NOT gate, the value coming out would be a $\begin{pmatrix}1\\0\end{pmatrix}$. As a matrix, the NOT gate looks like this: $\begin{pmatrix}0 & 1\\1 & 0\end{pmatrix}$. There are

many other gates, and they all can be modeled as matrices applied by multiplication. The problem lies in the fact that complex problems require many logic gates to solve procedurally, and the number of bits exponentially increases the processing time (Yanofsky & Mannucci, 2013). This means problems that might seem to have a linear difficulty, such as factoring large numbers into primes, have an exponential classical computing workload. This is called the problem of scale.
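Continuing the same hypothetical NumPy sketch, a gate is applied by ordinary matrix-vector multiplication; the NOT gate's matrix from above flips a classical bit:

    import numpy as np

    # The NOT (bit-flip) gate as a 2x2 matrix.
    NOT = np.array([[0, 1],
                    [1, 0]])

    one = np.array([0, 1])   # the classical bit 1
    print(NOT @ one)         # [1 0] -- the bit has been flipped to 0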

This problem of scale is why the application of superposition is so powerful. The qubit

operates along the same principles as discussed above, except that instead of the discrete 0 and 1 vectors, the vector entries can be any numbers, even complex ones (Helwer, 2018). This is how superposition is modeled. For example, common qubits can be modeled like normal bits, such as $\begin{pmatrix}1\\0\end{pmatrix}$, or in a state of superposition, such as $\begin{pmatrix}1/\sqrt{2}\\1/\sqrt{2}\end{pmatrix}$ or $\begin{pmatrix}1/\sqrt{2}\\-1/\sqrt{2}\end{pmatrix}$. These have no binary analogue, and without this notation they would be extremely difficult to express fully in conventional computing theory. Additionally, once these qubits are measured they collapse into the conventional states of $\begin{pmatrix}1\\0\end{pmatrix}$ and $\begin{pmatrix}0\\1\end{pmatrix}$. The probability of the state a qubit resolves into is the square of each term. So for a qubit $\begin{pmatrix}1/\sqrt{2}\\1/\sqrt{2}\end{pmatrix}$, the probability of it collapsing into $\begin{pmatrix}1\\0\end{pmatrix}$ is $(1/\sqrt{2})^2 = 1/2$, and the probability of $\begin{pmatrix}0\\1\end{pmatrix}$ is likewise $1/2$ (Yanofsky & Mannucci, 2013). Thus this qubit would have a 50% probability of being a binary 0 and a 50% probability of being a binary 1. This allows qubits to be manipulated using conventional logic gates, such as the bit-flip gate, and to exhibit quantum behavior with the addition of special gates.
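A short sketch of this measurement rule, using the 50/50 qubit from the text (again hypothetical NumPy code rather than anything from the cited sources):

    import numpy as np

    # A qubit in equal superposition: amplitude 1/sqrt(2) on each state.
    qubit = np.array([1, 1]) / np.sqrt(2)

    # The probability of each outcome is the square of its amplitude
    # (the squared magnitude, if the amplitudes are complex).
    probs = np.abs(qubit) ** 2
    print(probs)  # [0.5 0.5]

    # Simulated measurements: the qubit collapses to 0 or 1 with those odds.
    print(np.random.choice([0, 1], size=10, p=probs))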

These special gates are what give quantum computing its edge over conventional computers. Perhaps the most integral of these is the Hadamard gate, or H gate (Research, 2017). This gate puts bits in a classical state into equal superposition, and vice versa. The bit-flip and H gate operations alone have been shown to solve certain n-bit problems in a single pass, where classical computing would take on the order of $2^n$ times as long (Helwer, 2018). This potentially solves the problem of scale for problems such as factoring the products of large primes, the cornerstone of advanced encryption security such as RSA.
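A minimal sketch of the H gate in the same hypothetical NumPy notation shows both directions of that claim, classical state to equal superposition and back:

    import numpy as np

    # The Hadamard (H) gate.
    H = np.array([[1,  1],
                  [1, -1]]) / np.sqrt(2)

    zero = np.array([1, 0])
    plus = H @ zero
    print(plus)      # [0.707 0.707] -- equal superposition of 0 and 1
    print(H @ plus)  # [1. 0.] -- applying H again restores the classical 0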

The final application to the theory discussed previously is quantum entanglement. When

the product vector cannot be factored into two smaller vectors, the two qubits are said to be in an entangled state. An example would be $\begin{pmatrix}1/\sqrt{2}\\0\\0\\1/\sqrt{2}\end{pmatrix}$. There is no way to factor this into the tensor product of two single-qubit vectors. Put another way, there is no way to separate the two qubits: the information of each qubit is meaningless; only together do they hold a value (Yanofsky & Mannucci, 2013). By doing this we

can store information in the relationships between data as well as in the data itself. The more dependencies there are, the more potential information-carrying capacity the system has. Entanglement is created by applying an H gate and another special gate, the CNOT (controlled-NOT) gate. Just two operations can redefine an entire qubit group, increasing the efficiency of information.
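To make this concrete, here is a sketch of that H-then-CNOT recipe in the same hypothetical NumPy style; starting from two 0 qubits, it produces exactly the unfactorable vector shown above:

    import numpy as np

    H = np.array([[1,  1],
                  [1, -1]]) / np.sqrt(2)
    I = np.eye(2)

    # CNOT flips the second qubit whenever the first qubit is 1.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    # Apply H to the first qubit only, then entangle with CNOT.
    state = np.kron([1, 0], [1, 0])        # both qubits start at 0
    state = CNOT @ (np.kron(H, I) @ state)
    print(state)  # [0.707 0 0 0.707] -- the entangled (Bell) state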



Conclusion

With just these two basic concepts of superposition and quantum entanglement, quantum

computing would completely revolutionize the modern world, solving issues that non-quantum

digital computers today find impossible. Prime factorization, problems of scale, and long-distance information transmission will all become as easy for computers to solve as 1 + 1 is today. Great strides are already being made in transposing modern logic gates to their quantum counterparts. The next step will be to create uniquely quantum logical operands and algorithms. It will take some

time to be sure, and we are not anywhere near a robust enough hardware architecture to model

even a 100-qubit microprocessor, but strides are being made every year. Soon Moore’s Law will

be a thing of the past, and the future will be even brighter.



References

Abbott, B., et al. (2009). Observation of a kilogram-scale oscillator near its quantum ground state. New Journal of Physics, 11(7), 073032.

Helwer, A. (2018). Retrieved from https://www.microsoft.com/en-us/research/uploads/prod/2018/05/40655.compressed.pdf

Merali, Z. (2015). Quantum physics: What is really real? Nature, 521(7552), 278-280.

"Quantum Computing and the Entanglement Frontier" John Preskill, CalTech. (2017). .

Research, I. (2017). IBM Q experience [online]. Quantumexperience.ng.bluemix.net. Available at: https://quantumexperience.ng.bluemix.net/qx/tutorial?sectionId=full-user-guide&page=introduction [Accessed 20 Nov. 2017].

Simonite, T. (2017). The foundation of the computing industry's innovation is faltering. What can replace it? [online] MIT Technology Review. Available at: https://www.technologyreview.com/s/601441/moores-law-is-dead-now-what/ [Accessed 19 Nov. 2017].

Singh, C. (2007). Helping students learn quantum mechanics for quantum computing. AIP Conference Proceedings.

Yanofsky, N., & Mannucci, M. (2013). Quantum computing for computer scientists. New York:

Cambridge University Press.
