
Quantum computing is a relatively recent field of study that investigates the ideas and applications of quantum physics in the world of computation. Physicists such as Max Planck, Albert Einstein, Niels Bohr, Werner Heisenberg, and Erwin Schrödinger laid the groundwork for quantum mechanics in the early 20th century, but the idea of quantum computing did not emerge until the early 1980s.

The history of quantum computing is as follows:

1. Origins of Quantum Mechanics: In the early 1900s, researchers developed quantum mechanics to describe how atomic and subatomic particles behave. Its key ideas included superposition, in which particles can exist in multiple states at once, and entanglement, in which the states of particles become linked.

2. The Development of Quantum Computing: In 1981, physicist Richard Feynman put forward the notion of simulating quantum processes using quantum systems. He believed that, by exploiting quantum physics, such computers could accomplish calculations that are impossible for classical computers.

3. Quantum Bits (Qubits): In 1985, British physicist David Deutsch proposed the qubit as the basic unit of information in quantum computing. Unlike a conventional bit, which represents either 0 or 1, a qubit can exist in a superposition of both states simultaneously. This is what gives quantum computers their potential for parallelism and, for certain problems, an exponential advantage in computing capacity.
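
As a rough illustration of what superposition means computationally, here is a minimal sketch in Python with NumPy (the equal-superposition amplitudes and the library choice are illustrative assumptions, not part of the history above): a qubit is modelled as a normalised pair of complex amplitudes, and measurement outcomes follow the Born rule.

```python
import numpy as np

# A qubit state is a normalised vector of two complex amplitudes:
# |psi> = alpha|0> + beta|1>,  with |alpha|^2 + |beta|^2 = 1.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # equal superposition (illustrative choice)
state = np.array([alpha, beta], dtype=complex)

# Measurement probabilities follow the Born rule: p(k) = |amplitude_k|^2.
probs = np.abs(state) ** 2
print("p(0) =", probs[0], " p(1) =", probs[1])

# Simulate repeated measurements: each run collapses the qubit to 0 or 1.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=1000, p=probs)
print("observed frequency of 1:", samples.mean())
```

Note that the amplitudes themselves are never read out directly; an algorithm only ever sees measurement statistics, which is why quantum algorithms are designed so that unwanted answers interfere destructively.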

4. Quantum Algorithms: These started to appear in the 1990s. The most notable was Peter Shor's 1994 algorithm, which factors large numbers exponentially faster than known classical algorithms. By demonstrating that quantum computing could be used to break widely used encryption systems, it sparked a surge of interest in the field.
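
To make the factoring claim concrete: the key fact Shor's algorithm exploits is that factoring N reduces to finding the period r of f(x) = a^x mod N. The sketch below is an illustrative Python snippet, not Shor's quantum routine; the function name and the brute-force period search are my own. It performs the classical reduction, with the period found by trial exponentiation, which is exactly the step a quantum computer accelerates.

```python
from math import gcd

def factor_via_period(N, a):
    """Illustrative classical reduction used by Shor's algorithm:
    if r is the period of a^x mod N, r is even, and a^(r/2) != -1 (mod N),
    then gcd(a^(r/2) +/- 1, N) yields non-trivial factors of N."""
    if gcd(a, N) != 1:
        return gcd(a, N), N // gcd(a, N)       # lucky: a already shares a factor with N
    # Brute-force period finding; this is the step a quantum computer does
    # exponentially faster via the quantum Fourier transform.
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                            # this choice of a fails; pick another
    p = gcd(pow(a, r // 2) - 1, N)
    q = gcd(pow(a, r // 2) + 1, N)
    return (p, q) if 1 < p < N else None

print(factor_via_period(15, 7))   # e.g. (3, 5)
```

On a quantum computer, the period-finding loop is replaced by preparing a superposition over x, computing a^x mod N, and applying the quantum Fourier transform, which is where the exponential speed-up comes from.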

5. Experimental Milestones: The experimental realisation of quantum computing systems made considerable strides in the late 1990s and early 2000s. Researchers began building simple qubit systems from a variety of physical building blocks, such as atoms, ions, and superconducting circuits.

6. Quantum Error Correction: Quantum systems are inherently fragile and therefore very susceptible to noise and errors. In 1995, Shor and Andrew Steane independently put forward quantum error correction, which enables errors in quantum calculations to be detected and corrected. This area of research became crucial to the development of reliable quantum computing.
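
As a concrete, heavily simplified illustration of the idea, the sketch below simulates the three-qubit bit-flip code in plain Python. This toy code is an assumption chosen for brevity; Shor's nine-qubit and Steane's seven-qubit codes also handle phase errors. A logical qubit alpha|0> + beta|1> is stored as alpha|000> + beta|111>, parity checks locate a single flipped qubit without revealing alpha or beta, and the flip is undone.

```python
def encode(alpha, beta):
    """Encode the logical qubit alpha|0> + beta|1> as alpha|000> + beta|111>."""
    return {"000": alpha, "111": beta}

def bit_flip(state, i):
    """Apply an X (bit-flip) error to physical qubit i."""
    return {key[:i] + ("1" if key[i] == "0" else "0") + key[i + 1:]: amp
            for key, amp in state.items()}

def syndrome(state):
    """Parity checks on qubit pairs (0,1) and (1,2); for a single flip these
    identify the affected qubit without measuring alpha or beta."""
    key = next(iter(state))          # every basis string present gives the same parities
    return (int(key[0]) ^ int(key[1]), int(key[1]) ^ int(key[2]))

def correct(state):
    which = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(state))
    return bit_flip(state, which) if which is not None else state

alpha, beta = 0.6, 0.8                       # illustrative amplitudes, 0.36 + 0.64 = 1
noisy = bit_flip(encode(alpha, beta), 1)     # an error flips the middle qubit
print(syndrome(noisy))                       # (1, 1): the checks point at qubit 1
print(correct(noisy))                        # {'000': 0.6, '111': 0.8} -- state restored
```

Real codes must also detect phase flips, and must extract the parities with genuinely quantum measurements, but the principle shown here, redundant encoding plus syndrome measurement and correction, is the same.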

7. Development of More Advanced and Scalable Qubit Platforms: Throughout the 2000s and 2010s, researchers made tremendous progress in creating more sophisticated and scalable qubit platforms. Trapped-ion and superconducting qubits emerged as promising technologies.

8. Quantum Supremacy: In 2019, Google's research team announced that they had achieved quantum supremacy, the point at which a quantum computer outperforms any classical computer at a given task. They showed that their quantum processor could solve, in just 200 seconds, a problem that they estimated would take even the most powerful classical supercomputers thousands of years.

9. Continued Research and Applications: Work continues on building fault-tolerant, error-corrected quantum computing systems and on increasing qubit counts and coherence times. Practical applications are being investigated in numerous fields, including cryptography, optimisation, drug discovery, materials science, and machine learning.

Although the science of quantum computing is still in its infancy, the technology
has immense potential to revolutionise processing power and solve challenging
issues that are beyond the capacity of traditional computers.
In the coming years, maximising the promise of quantum computing will depend
heavily on ongoing research and developments in quantum hardware and algorithms.
