
Running Head: QUANTUM COMPUTING AND CRYPTOGRAPHY TODAY 1

Quantum Computing and Cryptography Today:

Preparing for a Breakdown

Travis L. Swaim

University of Maryland University College



Abstract

With the advent of quantum computing on the horizon, computer security organizations are stepping up research and development to defend against a new breed of computing power.

Quantum computers pose a very real threat to the global information technology infrastructure of

today. Many security implementations in use are based on the difficulty for modern-day

computers to perform large integer factorization. Utilizing a specialized algorithm such as

mathematician Peter Shor’s, a quantum computer can compute large integer factoring in

polynomial time versus classical computing’s sub-exponential time. This theoretical exponential

increase in computing speed has prompted computer security experts around the world to begin

preparing by devising new and improved cryptography methods. If the proper measures are not

in place by the time full-scale quantum computers are being produced, the world’s governments

and major enterprises could suffer from security breaches and the loss of massive amounts of

encrypted data. Cryptographers are discussing alternatives to today’s methods and have agreed

that there are four major candidate approaches believed to resist attack by a quantum computer. The four possible replacement methods are code-based (error-correcting code) cryptography, hash-based signatures, lattice-based cryptography, and multivariate public-key cryptosystems.



Quantum Computing and Cryptography Today: Preparing for a Breakdown

Introduction

Computer security and the protection of valuable information have long been delicate subjects in

the world of information technology. With the development of the Internet, companies had to

ensure that customer data as well as their internal private data was protected from outside

intrusions; the Secure Sockets Layer (SSL) protocol was the first step toward allowing for the

secure transmission of information from client to server and vice versa. Data encryption became

a requirement for day-to-day operations of any organization connected to the Internet and thus

the world of big-business cryptography exploded.

Cryptography, while not a new practice, grew exponentially in popularity as more

computers came “online” and companies began to realize competing in the global market of the

Internet was becoming a necessity. Encryption of data for many IT systems today relies on

public-key cryptography. The concept of public-key cryptography was introduced by Whitfield

Diffie and Martin Hellman in 1976 (Diffie & Hellman, 1976). This new method of encryption

had two main purposes, encryption and digital signatures. Under this scheme, each person (or communicating system) gets a pair of keys: one is called the public key and the other the private key. The public key may be shared openly and is used by others to encrypt messages intended for its owner (or to verify the owner’s digital signatures), while the private key remains secret and is never transmitted. Only a receiver that possesses the matching private key is able to decrypt a message encrypted with the corresponding public key. Unfortunately, the private key, while

kept secret from prying eyes, is linked to the public key through a mathematical relationship, which presents a potential weakness in the system. This “weakness” became less and less of an issue because recovering the private key from the public key is computationally infeasible for conventional computer

systems. Even when utilizing a “brute force” technique (i.e., systematically trying every combination of letters, numbers, and symbols), a well-implemented public-key encryption system with a sufficiently large key remains untouchable by even today’s most powerful computer systems. The difficulty arises from the inability of today’s processors to factor increasingly large integers, the basis for the strongest forms of encryption to date. To put this

into a clearer context, researchers successfully factored a 768-bit RSA modulus using the number field sieve factoring method, a computation that would have taken a single-core 2.2 GHz

AMD Opteron processor with 2 GB of RAM over 1500 years to process (Kleinjung, et al., 2010).

It is this length of time that makes breaking today’s most heavily encrypted data such a daunting

and near impossible endeavor.
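To make the link between public-key encryption and integer factoring concrete, the following is a minimal "textbook RSA" sketch in Python. The tiny primes, the fixed exponent, and the absence of padding are illustrative assumptions only; real deployments use moduli of 2048 bits or more together with padding schemes such as OAEP.

from math import gcd

p, q = 61, 53                      # secret primes (unrealistically small, for demonstration)
n = p * q                          # public modulus: 3233
phi = (p - 1) * (q - 1)            # Euler's totient, computable only if n can be factored
e = 17                             # public exponent
assert gcd(e, phi) == 1
d = pow(e, -1, phi)                # private exponent: e * d = 1 (mod phi)

message = 65
ciphertext = pow(message, e, n)    # anyone holding the public key (e, n) can encrypt
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message

# Why factoring breaks the scheme: an attacker who factors n = p * q
# recovers phi and can then recompute the private exponent d.
attacker_d = pow(e, -1, (p - 1) * (q - 1))
assert attacker_d == d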

Quantum computing presents the first serious risk associated with actually providing a

means to break the most sophisticated encryption systems in use today. By using individual quantum systems, such as atoms, as carriers of digital information, processing can be achieved in a far more powerful fashion than is currently possible. Quantum bits, or “qubits”, can take on the value 1 or 0 as a traditional digital bit does; the complexity arises because a qubit can also exist in a superposition, a weighted combination of 0 and 1 at the same time. A register of n qubits can therefore represent all 2^n possible bit strings simultaneously, permitting computations to be carried out, in a sense, on every one of these values in parallel.
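As a rough numerical picture of superposition (a sketch using plain NumPy arrays, not a claim about any particular hardware), a single qubit can be written as two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1, and a register of n qubits requires 2^n amplitudes:

import numpy as np

ket0 = np.array([1.0, 0.0])                      # the state |0>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

superposed = hadamard @ ket0                     # equal superposition of 0 and 1
probabilities = np.abs(superposed) ** 2          # [0.5, 0.5]: either outcome on measurement

n = 3
register = np.ones(2 ** n) / np.sqrt(2 ** n)     # uniform superposition over all 2**n bit strings
print(probabilities, register.shape)             # a 3-qubit state already needs 8 amplitudes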

In this paper, we will explore the history of quantum computing theory and the

development of the varying techniques researchers are using in their attempts to build a fully

functional system today. We’ll also explore the practical applications for quantum computers

and how they could affect current cryptographic systems as well as how they are shaping the

development of new systems. Cryptography will remain the central focus as we delve

into varying theories and methods cryptography organizations are testing and implementing to

defend against massive computing power at an atomic scale. Current methods of cryptography

are in danger due to this unique type of threat and security organizations are realizing that they

can no longer sit idle waiting for the first quantum computer to be produced before acting.

Preparation must begin now if the world is to have any hope of converting over from the

encryption techniques that we rely on every day to new and improved quantum-resistant

methods of tomorrow.

The History of Quantum Computing

The idea of a quantum computer emerged in the early 1980s, conceived by Paul

Benioff, Charles Bennett, David Deutsch, Richard Feynman, and Yuri Manin (Bacon & Leung,

2007). The original idea of such a system was purely theoretical and scientists were basing these

ideas on years of research into quantum theory and information science. Scientists speculated

that if technology continued to follow Moore’s Law (the observation that steady technological improvements in miniaturization lead to a doubling of the density of transistors on new integrated circuits roughly every 18 months (Moore's law)), the circuitry on chips would eventually be reduced to a size of no more than a few atoms. At this size, the workings of an

integrated circuit would be governed by the laws of quantum mechanics and thus the researchers

began to question if a new kind of computer could be created based around the study of quantum

physics.

In 1982, Richard Feynman constructed an abstract model showing how a quantum system

would function and be used for doing computational work. Basically, a classical bit in

computing is used to represent two different states of an information processing machine. One

could think of a bit as a type of light switch; that is, a bit can be either “on” or “off”. When a bit

is “off” it is said to have the value of 0 whereas when it is “on” it is said to have the value of 1.

Quantum computer bits, or “qubits”, can be a 0 or a 1 at the same time. A particle found to be in

this quantum state is said to be in “superposition”; however, the particle collapses to a single definite state once someone or something observes it. The speed advantages of quantum

computers over classical computers were not realized until the early 1990s when David Deutsch

and Richard Jozsa demonstrated that, given a function f : {0,1}^n → {0,1} promised to be either constant (returning the same value, 0 or 1, for every input) or balanced (returning 1 for exactly half of the inputs and 0 for the other half), a quantum computer can determine which case holds with a single evaluation of the function (Deutsch & Jozsa, 1992). Deutsch and

Jozsa’s algorithm provided Peter Shor with the basis for constructing one of the

most well-known and important quantum algorithms the computing world had ever seen.
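To make the promise problem concrete, the following NumPy sketch simulates the Deutsch-Jozsa test in its phase-oracle form (the helper name and the choice of n = 3 are assumptions made for illustration): after a single application of the oracle, the all-zeros outcome is certain when f is constant and impossible when f is balanced.

import numpy as np
from functools import reduce

def deutsch_jozsa_is_constant(f, n):
    # f maps integers 0 .. 2**n - 1 (bit strings) to 0 or 1, promised constant or balanced.
    dim = 2 ** n
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    Hn = reduce(np.kron, [H] * n)                                 # Hadamard on every qubit
    state = Hn @ np.eye(dim)[0]                                   # uniform superposition from |00...0>
    state = state * np.array([(-1) ** f(x) for x in range(dim)])  # one oracle "query"
    state = Hn @ state
    # Measuring |00...0> has probability 1 if f is constant, 0 if f is balanced.
    return bool(np.isclose(abs(state[0]) ** 2, 1.0))

constant = lambda x: 1
balanced = lambda x: bin(x).count("1") % 2                        # parity: 1 on exactly half the inputs
print(deutsch_jozsa_is_constant(constant, 3),                     # True
      deutsch_jozsa_is_constant(balanced, 3))                     # False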

In 1994, mathematician Peter Shor formulated a quantum algorithm designed to be used

on a quantum computer to process integer factorization computations. Integer factorization at its

basis can be taken as “Given an integer N find its prime factors”. Shor realized that utilizing his

theory a quantum computer could “efficiently factor and compute” the solution to large integer

factoring problems (Bacon & Leung, 2007). Shor’s quantum algorithm runs in polynomial time, unlike the best classical factoring algorithm, the general number field sieve, which factors an L-bit number N in time O(exp(c L^{1/3} (log L)^{2/3})). The algorithm works by determining the period of the function f(x) = a^x mod N, where a is a number chosen at random by the quantum computer with no factors in common with N. After obtaining this period, number-theoretic techniques can be used to factor N with a high probability of success (Shor, 1997). Factoring this way takes roughly O((log N)^3) time, an exponential improvement over the general number field sieve. These extreme differences in time

between classical computers and Shor’s algorithm on a quantum computer can be seen in Figure

1. The NFS curve on the left is data gathered from a previous world record, factoring a 530-bit

number in one month on 104 PCs and workstations in 2003 (Van Meter, Itoh, & Ladd, 2005).

The right curve is speculative, based on 1,000 times the computing power of these classical computers, which works out to around 100 PCs in 2018 according to Moore’s law. We can see

how Shor’s algorithm is much more efficient in factoring than anything that’s currently possible

or will be possible in the next decade.



Figure 1. Scaling of number field sieve (NFS) on classical computers and Shor’s algorithm for factoring on a
quantum computer, using Beckman-Chari-Devabhaktuni-Preskill modular exponentiation with various clock rates.
Both horizontal and vertical axes are log scale. The horizontal axis is the size of the number being factored (Van
Meter, Itoh, & Ladd, 2005).
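In Shor's algorithm, the quantum computer's only job is the period-finding step; the rest is classical number theory. The outline below is a sketch rather than a real implementation: a brute-force classical search stands in for the quantum period-finding subroutine, to show how the period r of f(x) = a^x mod N yields the factors of N.

from math import gcd
from random import randrange

def find_period(a, N):
    # Stand-in for the quantum step: smallest r > 0 with a**r = 1 (mod N).
    x, r = a % N, 1
    while x != 1:
        x, r = (x * a) % N, r + 1
    return r

def shor_factor(N):
    while True:
        a = randrange(2, N)
        if gcd(a, N) > 1:                        # lucky guess already shares a factor with N
            return gcd(a, N), N // gcd(a, N)
        r = find_period(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            p = gcd(pow(a, r // 2, N) - 1, N)    # number-theoretic post-processing
            if 1 < p < N:
                return p, N // p

print(shor_factor(15))                           # e.g. (3, 5), the result of IBM's 2001 experiment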

In 2001, Shor’s algorithm was put to the test by IBM researchers using room temperature

liquid-state nuclear magnetic resonance techniques to manipulate nuclei in a molecule as

quantum bits (Vandersypen, Steffen, Breyta, Yannoni, Sherwood, & Chuang, 2001). As

insignificant as it sounds, the researchers, utilizing a very primitive quantum computer, were able

to apply Shor’s algorithm to successfully factor 15, obtaining the factors 3 and 5. They noted that

their experiment could be scaled to a much larger system with more than the 7 qubits they

utilized but that it was intended to simply demonstrate the techniques for the control and

modeling of quantum computers for the future.

Quantum cryptography has been one of the driving motivations behind quantum information research since the early 1980s. Because qubits are disturbed when observed, they opened up

the possibility of creating a new form of quantum communication between two parties. Where

before, transmission of messages relied on the receiver having an encryption key to decode an

encoded message, researchers were able to utilize photons to send a message and detect whether

the message had been viewed along the way. While this method does not prevent an

eavesdropper from reading the message, it created a way for both the sender and receiver to

know if the message had been intercepted.
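This detect-the-interception property is what quantum key distribution protocols such as BB84 are built on. The toy simulation below models only the protocol's statistics classically (the photon count and basis labels are arbitrary choices for the example); it shows why an intercept-and-resend eavesdropper pushes the error rate on the bits the sender and receiver compare from roughly 0% to roughly 25%.

import random

def bb84_error_rate(n_photons=2000, eavesdropper=False):
    alice_bits  = [random.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [random.choice("+x") for _ in range(n_photons)]   # encoding basis per photon
    bob_bases   = [random.choice("+x") for _ in range(n_photons)]

    errors = kept = 0
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        photon_bit, photon_basis = bit, a_basis
        if eavesdropper:                          # Eve measures in a random basis and resends
            eve_basis = random.choice("+x")
            if eve_basis != photon_basis:         # wrong basis: her outcome is random
                photon_bit = random.randint(0, 1)
            photon_basis = eve_basis
        if b_basis == a_basis:                    # Alice and Bob keep only matching-basis rounds
            measured = photon_bit if b_basis == photon_basis else random.randint(0, 1)
            kept += 1
            errors += (measured != bit)
    return errors / kept

print(bb84_error_rate(eavesdropper=False))        # ~0.0
print(bb84_error_rate(eavesdropper=True))         # ~0.25, revealing the interception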

A cryptographic application of a quantum system was one of the earliest ideas involved

with quantum computation and can be credited to Stephen Wiesner in the 1960s. Wiesner

developed a theory that was meant to prevent counterfeiting of money using the laws of physics

as a basis for protection (Bacon & Leung, 2007). His method relied on information that is

encoded in quantum states thereby being able to prevent any outside party from accessing said

information without disturbing the state. This property of quantum information has given birth

to a new method of information exchange, and companies are investing in it to develop new products that give users the security of knowing whether their critical data has been intercepted or viewed by an unintended audience outside the exchange.

In 2002 and 2003, a Swiss company called id Quantique and an American company

called MagiQ Technologies, both developed commercial communication products leveraging

this technology for message transmission and receipt (Bacon & Leung, 2007). These two

companies are noted as marketing the very first quantum key distribution systems. This could be

the preferred method for secure communication in the future, instead of relying on a receiver-held private key as used in systems based on the famous RSA crypto architecture, for example.

Larger organizations, such as Hewlett-Packard, Microsoft, IBM, Lucent, Toshiba, and NEC, are also starting to invest in quantum technologies; each has an active research program

exploring how quantum cryptography can be leveraged into their future business models (Bacon

& Leung, 2007).

Regarding the aforementioned past research on quantum systems, “the short-range

business concerns of these developments remains unclear at the moment, but experience has

shown that the industry needs many years to replace legacy systems – you cannot easily change

ATMs and mainframe applications,” says Andrea Simmons of Computer Weekly (Simmons,

2009). At this point, the question is not really if quantum computers come to be but when they do, will we be prepared? Research is happening right now and scientists are getting closer to

understanding the hardest questions plaguing quantum computers; in the next section, we’ll take

a look at this research and some of the concerns and issues scientists are dealing with today.

Quantum Computing Today

Many advances in research being conducted on the creation of quantum computers have

led industry and educational-sector experts to believe that the technology could be just around

the corner. “Quantum computational devices with calculating power greater than any of today’s

conventional computers could be just a decade away, says Bristol University physicist and

electrical engineer Mark Thompson,” (Docksai, 2011). Thompson aided in the development of

two quantum photonic computer chips, which use photons to accomplish what IBM achieved in 2001 by factoring the integer 15. This shows how technology advancements over the past ten years have allowed past research findings to be miniaturized into a form factor far more practical for consumers.

Organizations delving into quantum computer development are faced with a number of

different methods for developing quantum systems. These varying methods each have their

advantages and disadvantages regarding factors such as scalability, longevity, and accuracy.

Current forerunners in the area include ion-trap quantum computers and NMR (nuclear magnetic

resonance) quantum computers. First, we’ll look at the ion-trap method and the current research

being done to overcome the complexities of scaling this method.

Ion-trap quantum computation is currently being researched at the National Institute of

Standards and Technology. An ion-trap can be described as using a line of N trapped ions where

each ion has two stable or metastable states; there are also N laser beam pairs each interacting

with one of the ions (a qubit) (Steane, 1996). It is these laser beams that essentially program the

qubits by providing a pulsing form of a quantum logic gate. This is very similar to how a

transistor works by switching a classical bit from 1 to 0 and vice versa. Scientists are looking

into a “quantum charge-coupled device” (QCCD) architecture which is essentially a large



collection of these ion-traps. The QCCD method was proposed as a possible solution to the

limitations researchers faced on scaling a single ion-trap to be able to confine a large number of

ions. Incredible complexities arise technologically and “scaling arguments suggest that this

scheme is limited to computations on tens of ions” (Kielpinski, Monroe, & Wineland, 2002). The

advantage of the QCCD method is that by altering voltages of the ion-traps researchers can

confine a set number of ions in each trap or even transport ions to another trap (Kielpinski,

Monroe, & Wineland, 2002). This allows for the scaling of a multiple ion-trap quantum

computer to be achieved much more easily. QCCD ion-trap quantum computers still have a long way

to go to be a feasible and scalable method for developing these machines. If we look at the size

of the very first computers taking up multiple rooms and compare this with where these quantum

computers are, we can see that in a few decades these machines will follow Moore’s law and

decrease substantially in size while increasing in power. “Build the first one and in 25 years, they

will be 25% of the size. I bet that, after the first quantum computer, the cost of one 10 years later

will be significantly reduced,” (Docksai, 2011).

The United States Government is funding the research at NIST on QCCD ion-trap

quantum computers because they realize the implications that could arise should another country gain

quantum computational power before them. The country that first develops a full-scale quantum

computer will have the power to crumble current encryption methods and expose an

unbelievable amount of data in a very short amount of time. Achieving the technology to build a

fully functional quantum computer “is among the great technological races of the 21st century, a

race whose results may profoundly alter the manner in which we compute” (Bacon & Leung,

2007). In the book, The Quest for the Quantum Computer by Julian Brown and David Deutsch,

the authors note that “if anyone could build a full-scale quantum computer, it’s possible that he

or she would be able to access everything from your bank account to the Pentagon’s most secret

files. It’s no surprise, then, that significant funds backing this line of research have come from

such organizations as the U.S. Department of Defense, the National Security Agency, NATO,

and the European Union,” (Brown & Deutsch, 2000).

Another method being explored is based on nuclear magnetic resonance (NMR)

spectroscopy. The aforementioned research done at IBM factoring 15 was done using a NMR

based quantum computer. NMR computers rely on the two spin states of a spin-1/2 atomic

nucleus in a magnetic field; different atoms in a molecule can be singled out and thus a molecule

can actually be used as a quantum computer (Jones, 1998). Each spin-1/2 nucleus provides a

single qubit that is manipulated by multiple logic gates provided by radio frequency fields. To

increase the capabilities of this quantum computer the scientists need only make these logic gates

more complex. As their complexity increases, the qubits can affect other qubits in the computer

and thus work together as a single computational system. The advantages of this

method arise when looking at the coherence of the qubits in the system. Coherence describes the

ability for the qubits to retain their states despite “noise” from the external environment. “Slow

decoherence is one of the primary attractions of an NMR quantum computer,” (Gershenfeld &

Chuang, 1997).

There is a plethora of methods, other than the ones mentioned here, that

scientists are working on to bring quantum computing to the next level. As research continues in

these areas, we inch closer to the day that a fully functional quantum processing machine is in

our grasp. The latest research has yielded better error correction with the qubit states, faster

quantum algorithms, and a deeper understanding of decoherence and entanglement in quantum computing.

Computer Security Organizations and Quantum Computing

Cryptography specialists are beginning to take notice that quantum computing may not be

too far away. Many of the IT industry’s top cryptography experts have predicted that a full-scale

quantum computer could manifest in as little as 10 years (Heger, 2009). The PQCrypto (Post-

Quantum Cryptography) conference was created in 2006 to address and discuss the dangers the

world of computer security may face with the successful creation of quantum computers.

Researchers from all over the world flocked to meet in Leuven, Belgium to begin discussions on

possible alternatives to the most widely used encryption systems. The two most popular

encryption methods in use today are RSA and elliptic-curve cryptography (ECC) (Heger, 2009).

These two methods both rely on public-key architecture and digital signatures to provide a means

to communicate securely between two parties in digital communication. As we’ve covered

earlier in this paper, the entire public-key architecture is at risk of being obliterated by the

computing power of a single, albeit low-powered, quantum computer. To gain a better

understanding of where experts see cryptography technology heading, let’s take a look at a few

of the quantum-proof methods that have been discussed at past PQCrypto conferences.

The idea of Lamport signatures was first mentioned in a paper by Leslie Lamport from

SRI International on October 18, 1979 (Lamport, 1979) where he discusses the use of one-way

hash functions to generate signatures. The idea behind this type of scheme is that a function

can be created to be irreversible in nature. The Lamport crypto-scheme is a one-time signature

scheme, therefore each time a signed message is generated a new signature must also be created.

Utilizing a hash function to implement this type of crypto system, the text of a message is

efficiently reduced to a much shorter string of bits, the message digest, which is then signed (Heger, 2009).

Researchers Ray Perlner and David Cooper (2009) from the National Institute of Standards and

Technology explain the basics of the Lamport signature in their paper on quantum-resistant

public-key cryptography:

In the simplest variant of Lamport signatures, the signer generates two high-entropy secrets, S_{0,k} and S_{1,k}, for each bit location, k, in the message digest that will be used for signatures. These secrets (2n secrets are required if the digest is n bits long) comprise the private key. The public key consists of the images of the secrets under f, i.e., f(S_{0,k}) and f(S_{1,k}), concatenated together in a prescribed order (lexicographically by subscript for example). In order to sign a message, the signer reveals half of the secrets, chosen as follows: if bit k is a zero, the secret S_{0,k} is revealed, and if it is one, S_{1,k} is revealed. The revealed secrets, concatenated together, comprise the signature. (Perlner & Cooper, 2009)

The Lamport signature is a one-time scheme because the signing method ends up

actually revealing a small amount of information about the private key. This leak of information

is not, however, enough for an attacker to build and sign a forged message but subsequent

messages must be accompanied by a newly generated key to remain secure. The performance of

a system like the Lamport signature is purely dependent on the one-way function the signer

chooses to implement. The original construction was later greatly improved; Merkle, for example, showed how to tie many one-time key pairs to a single public key using a hash tree, so that a fresh public key does not have to be distributed for every message (Merkle, 1988).
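A minimal sketch of the scheme described above, using SHA-256 as the one-way function f; the 256-bit digest length and the example messages are assumptions made purely for illustration.

import hashlib, os

def f(data):
    return hashlib.sha256(data).digest()           # the one-way function

def keygen(n=256):
    # 2n high-entropy secrets S[b][k]: one for each bit value b at each digest position k.
    private = [[os.urandom(32) for _ in range(n)] for _ in range(2)]
    public = [[f(s) for s in row] for row in private]          # images of the secrets under f
    return private, public

def digest_bits(message):
    digest = f(message)
    return [(digest[k // 8] >> (7 - k % 8)) & 1 for k in range(256)]

def sign(message, private):
    return [private[b][k] for k, b in enumerate(digest_bits(message))]   # reveal one secret per bit

def verify(message, signature, public):
    return all(f(s) == public[b][k]
               for k, (b, s) in enumerate(zip(digest_bits(message), signature)))

private, public = keygen()
signature = sign(b"transfer 5 dollars", private)
print(verify(b"transfer 5 dollars", signature, public))     # True
print(verify(b"transfer 500 dollars", signature, public))   # False; and the key must not be reused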

Another possible candidate to replace public-key systems comes in the form of

multivariate public-key cryptosystems (MPKCs) and bases its strength on multivariable

nonlinear equations (Heger, 2009). Quantum computers share a weakness with their classical

brethren when trying to solve problems said to be “NP-complete”. Jintai Ding (2008), a top

researcher in MPKCs, notes that “It’s difficult to explain what NP-complete means, but it just

means very, very difficult. It is exponential, meaning that as the size of a problem increases, the

time to solve it increases exponentially. And quantum computers have not yet been able to

defeat NP-complete types of problems,” (IEEE Spectrum, 2008). A computing machine is only

as powerful as the mathematics behind it; a problem that is at least as hard as every problem in NP, and for which no efficient algorithm is known, is said to be NP-hard. Whereas

traditional RSA type cryptosystems rely on mathematics developed in the 17th and 18th centuries

(number theory), MPKCs use 20th century algebraic geometry for their basis (Ding & Schmidt,

2006). Ding and Schmidt (2006) state of MPKCs that “the method relies on the proven theorem

that solving a set of multivariable polynomial equations over a finite field is in general an NP-

hard problem,” (Ding & Schmidt, 2006).
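The underlying problem can be illustrated with a toy example: evaluating a random system of multivariate quadratic polynomials over GF(2) is cheap (this corresponds to the public-key operation), while inverting it without a trapdoor forces a search that grows exponentially in the number of variables. The sketch deliberately omits the hidden trapdoor structure a real MPKC embeds, and every name in it is illustrative.

import itertools, random

def random_mq_system(n_vars, n_polys, seed=1):
    # Each polynomial: a set of quadratic terms x_i*x_j, a set of linear terms, and a constant.
    rng = random.Random(seed)
    system = []
    for _ in range(n_polys):
        quad = [(i, j) for i in range(n_vars) for j in range(i, n_vars) if rng.random() < 0.5]
        lin = [i for i in range(n_vars) if rng.random() < 0.5]
        system.append((quad, lin, rng.randint(0, 1)))
    return system

def evaluate(system, x):
    # Evaluating the public polynomials over GF(2) is fast.
    out = []
    for quad, lin, const in system:
        value = const
        value ^= sum(x[i] & x[j] for i, j in quad) & 1
        value ^= sum(x[i] for i in lin) & 1
        out.append(value)
    return out

n = 12
system = random_mq_system(n, n)
secret = [random.getrandbits(1) for _ in range(n)]
target = evaluate(system, secret)

# Inverting without a trapdoor: nothing better here than trying all 2**n assignments.
solutions = [x for x in itertools.product([0, 1], repeat=n)
             if evaluate(system, x) == target]
print(len(solutions), "solution(s) found after searching", 2 ** n, "candidates")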

PQCrypto attendees also discussed lattice-based cryptography systems, which researchers believe can be implemented in a way that makes breaking them as hard as an NP-hard lattice

problem. “An n-dimensional lattice is the set of vectors that can be expressed as the sum of

integer multiples of a specific set of n vectors, collectively called the basis of the lattice,”

(Perlner & Cooper, 2009). The difficulty of cracking this type of cryptography grows as the dimension of the lattice increases, and rests on the shortest vector problem (Ajtai, 1998) as well as the closest vector problem (van Emde Boas, 1981): the former asks for the shortest nonzero vector in the lattice, while the latter asks for the lattice point closest to a given target vector that does not itself lie on the lattice.
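A small numerical illustration of the definition quoted above and of the closest vector problem (the basis vectors, target point, and coefficient window are arbitrary choices for the example): brute-force enumeration is easy in two dimensions, but the search space explodes in the hundreds of dimensions used for cryptography.

import itertools
import numpy as np

# A 2-dimensional lattice: every integer combination of the two basis vectors.
basis = np.array([[201.0, 37.0],
                  [1648.0, 297.0]])
target = np.array([4387.0, 781.0])         # a point that does not lie on the lattice

best_coeffs, best_point, best_dist = None, None, float("inf")
for coeffs in itertools.product(range(-30, 31), repeat=2):     # brute-force CVP search
    point = np.array(coeffs) @ basis
    dist = np.linalg.norm(point - target)
    if dist < best_dist:
        best_coeffs, best_point, best_dist = coeffs, point, dist

print(best_coeffs, best_point, round(best_dist, 2))
# With the dimension in the hundreds, the number of candidate combinations is astronomical,
# and no efficient general algorithm for the closest vector problem is known.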

The fourth and final candidate being researched is encryption schemes based on the use

of error-correcting codes. Basically, the idea behind this type of encryption is that the sender of

the message encrypts the message with noise, or random additional information, therefore

obfuscating the original message. Only the receiver has the ability to “sift” through the

information to deduce the true content of the message. The first error-correcting encryption

scheme was devised by Robert J. McEliece about one year after the RSA encryption technique

was proposed (Joye, 2009). Many refer to error-correcting encryption schemes as “code-based

cryptography” and researchers Bernstein, Lange, and Peters (2011) state that “code-based

cryptography has lately received a lot of attention because it is a good candidate for public-key

cryptography that remains secure against attacks by a quantum computer,” (Bernstein, Lange, &

Peters, 2011). A major drawback in utilizing this scheme is that the public key is very large, which makes communication inefficient; however, research continues in an attempt to minimize the information required for public-key creation and make this technique a viable solution.
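The core idea can be sketched with a trivial repetition code standing in for the Goppa codes McEliece actually uses, so this is only an analogy for the roles of deliberate noise and decoding, not a secure scheme: the sender injects a few bit errors, and only a party able to decode the underlying code strips them back out. (A real scheme also scrambles the code so its structure is hidden, which is one reason the public keys grow so large.)

import random

R = 7                                          # repetition factor: the stand-in error-correcting code

def encode_with_noise(bits, flips_per_block=2):
    # Sender: repeat each bit R times, then deliberately flip a few positions per block.
    blocks = []
    for b in bits:
        block = [b] * R
        for pos in random.sample(range(R), flips_per_block):
            block[pos] ^= 1                    # injected "noise"
        blocks.append(block)
    return blocks

def decode(blocks):
    # Receiver: majority vote per block removes the deliberate errors.
    return [1 if sum(block) > R // 2 else 0 for block in blocks]

message = [1, 0, 1, 1, 0, 0, 1]
ciphertext = encode_with_noise(message)
print(decode(ciphertext) == message)           # True: up to 3 errors per 7-bit block are corrected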

Of these four completely different techniques, is there a single encryption scheme that

will provide security where the aging encryption schemes fail? Jintai Ding (2008) states on the

matter, “no, I cannot really specify one area. These four systems are all very different and each

has its own advantages and disadvantages,” (IEEE Spectrum, 2008). Researchers from all over

the world continue to meet at the PQCrypto conference, held every few years, to discuss the

latest findings on each of the aforementioned methods as well as new potential candidates.

While they may disagree on when quantum computing is coming, they all seem to agree that

there is a sense of urgency to develop encryption schemes that will be ready for when it does:

Quantum cryptography may be five years ahead; quantum computing may be 15 years

away. Progress in building quantum computers tends to develop in steps as researchers

find new methods, so it is slow, but the theory is solid. We should be starting now to

evaluate the impact. (Simmons, 2009)



Conclusion

With the widespread use of RSA and DES cryptography systems across the world, the

fear of a single computer being able to knock them out in minutes is growing. Regardless of whether

quantum computing is a few years or decades away, researchers caution that we need to be

prepared. The PQCrypto conference was formed due to the growing need to address this very

issue. The sheer potential power that a single quantum computer could possess in the size of a

molecule is staggering, to say the least. With computing moving into the realm of quantum

physics, a new breed of computer scientist is being born: individuals who must fully

understand the difference between a classical binary bit and the all-encompassing qubit. The

algorithms being created to run on quantum computers blend the worlds of quantum mechanics, quantum theory, mathematics, and computer science into one far-reaching area of study.

Computer security of the future may very well rely on these scientists and researchers.

Quantum computing is edging closer every day with many countries participating in the

“race” for the ultimate computing power. Governments of the world realize that it is not a

technology to take lightly and they cannot afford to fall behind on their research into this area.

As we discussed earlier, the U.S. Government has invested in the development of quantum

computing and hopes that the researchers are close to accomplishing their goals. Meanwhile,

consumers and organizations sit idly by, unknowingly perpetuating the problem by implementing and using 30-to-40-year-old encryption techniques based on severely outdated mathematical algorithms. Large-scale IT projects are both time consuming and costly for any profit-seeking business, and expecting these behemoths to switch over to a new standard the same day quantum computers begin to hit the market would be disastrous. These massive entities need

to at least start having discussions or following the research into quantum-proof cryptography so

when the time comes they are not caught off-guard. Michele Mosca (2010), deputy director of

the Institute for Quantum Computing at the University of Waterloo, comments on when quantum

computing is coming and the risks involved, "We don't know. Is that an acceptable risk? I don't

think so. So we need to start figuring out what alternatives to deploy since it takes many years to

change the infrastructure," (Wood, 2010).

As this paper has outlined, cryptography organizations are and will continue to explore

alternatives that address the risks posed by quantum computers. Because of the uncertainty of the timeline

cryptographers are working with, they realize that it’s not an acceptable risk to continue ignoring

the possibilities. Even if quantum computers are 20 years off into the future, that is a relatively

short amount of time when working with such complex systems that so much of our world’s

infrastructure relies upon. Cryptography is one area that must remain ahead of technology,

which is why it is such a difficult field of study; nonetheless, it is here to stay and personal,

corporate, and national secrets depend on its success.



References

Ajtai, M. (1998). The shortest vector problem in L2 is NP-hard for randomized reductions. 30th
ACM Symposium on Theory of Computing (pp. 10-19). New York: ACM.

Bacon, D., & Leung, D. (2007, September). Toward a World with Quantum Computers.
Communications of the ACM, 50(9), pp. 55-59.

Bernstein, D. J., Lange, T., & Peters, C. (2011). Wild McEliece. Lecture Notes in Computer
Science, 6544/2011, 143-158.

Brown, J. R., & Deutsch, D. (2000). The quest for the quantum computer. New York, NY:
Touchstone (Simon & Schuster, Inc.).

Deutsch, D., & Jozsa, R. (1992). Rapid Solution of Problems by Quantum Computation.
Proceedings of the Royal Society of London Series A - Mathematical Physical and
Engineering Sciences (pp. 553-558). London: Royal Society of London.

Diffie, W., & Hellman, M. E. (1976). New directions in cryptography. IEEE Transactions on
Information Theory, 22(6), 644-654.

Ding, J., & Schmidt, D. (2006). Multivariable public key cryptosystems. Contemporary
Mathematics(419), 79-94.

Docksai, R. (2011). Computers making the Quantum Leap. Futurist, 45(3), pp. 10-11.

Gershenfeld, N. A., & Chuang, I. L. (1997, January 17). Bulk Spin-Resonance Quantum
Computation. Science, 275(5298), 350-356.

Heger, M. (2009, January). Cryptographers Take On Quantum Computers. Retrieved July 24,
2011, from IEEE Spectrum: http://spectrum.ieee.org/computing/software/cryptographers-
take-on-quantum-computers

IEEE Spectrum. (2008, November). Q&A with post-quantum computing cryptography
researcher Jintai Ding. Retrieved August 8, 2011, from IEEE Spectrum:
http://spectrum.ieee.org/computing/networks/qa-with-postquantum-computing-
cryptography-researcher-jintai-ding/0

Jones, J. (1998). Fast searches with nuclear magnetic resonance computers. Science, 280(5361),
229.

Joye, M. (2009). Identity-based cryptography. Amsterdam: IOS Press.



Kielpinski, D., Monroe, C., & Wineland, D. J. (2002, June 13). Architecture for a large-scale
ion-trap quantum computer. Nature, 417(6890), 709-711.

Kleinjung, T., Aoki, K., Franke, J., Lenstra, A. K., Thomé, E., Bos, J. W., et al. (2010, August).
Factorization of a 768-bit RSA modulus. CRYPTO'10 Proceedings of the 30th annual
conference on Advances in cryptology (pp. 333-350). Berlin, Heidelberg: Springer-
Verlag.

Lamport, L. (1979, October 18). Constructing digital signatures from a one-way function. In
Technical Report CSL-98. Menlo Park, CA: SRI International.

Merkle, R. C. (1988). A digital signature based on a conventional encryption function. CRYPTO
'87: A Conference on the Theory and Applications of Cryptographic Techniques on
Advances in Cryptology (pp. 369-378). London, UK: Springer-Verlag.

Moore's law. (n.d.). The American Heritage® Science Dictionary. Retrieved August 5, 2011,
from Dictionary.com: http://dictionary.reference.com/browse/moore%27s%20law

Perlner, R. A., & Cooper, D. A. (2009). Quantum resistant public key cryptography: a survey.
IDtrust '09 Proceedings of the 8th Symposium on Identity and Trust on the Internet (pp.
85-93). New York, NY: Association for Computing Machinery.

Shor, P. (1997). Polynomial-time algorithms for prime factorization and discrete logarithms on a
quantum computer. SIAM Journal on Computing, 26, 1484-1509.

Simmons, A. (2009, May 19). Quantum implications for IT security. Computer Weekly, pp. 14-
15.

Steane, A. (1996). The ion trap quantum information processor. Applied Physics B: Lasers and
Optics, 64(6), 623-643.

van Emde Boas, P. (1981). Another NP-complete problem and the complexity of computing short
vectors in a lattice. Netherlands: University of Amsterdam, Department of Mathematics.

Van Meter, R., Itoh, K. M., & Ladd, T. D. (2005). Architecture-Dependent Execution Time of
Shor's Algorithm. Retrieved from EBSCOhost.

Vandersypen, L., Steffen, M., Breyta, G., Yannoni, C., Sherwood, M., & Chuang, I. (2001,
December). Experimental realization of Shor's quantum factoring algorithm using nuclear
magnetic resonance. Nature, 414(6866), 883-887.

Wood, L. (2010, December 17). The clock is ticking for encryption. Retrieved August 8, 2011,
from Computerworld:
http://www.computerworld.com/s/article/9201281/The_clock_is_ticking_on_encryption
