Quantum Computing: The Next Big Thing?

Prologue:
Quantum mechanics is the physics of the very small. It explains and predicts the behavior of atoms and molecules in a way that redefines our understanding of nature. It is the most precise description that we have of the world, and yet, it predicts surprising, often counter-intuitive behaviors.

Quantum computing is a rapidly emerging technology that harnesses the laws of quantum mechanics to solve problems too complex for classical computers.
Quantum computers are also expected to challenge current cryptography methods and introduce new possibilities for completely private communication.

Quantum computing could enable businesses to better optimize investment strategies, improve encryption, accelerate product discovery, and much more.

The ultimate hope is that operations such as factoring gargantuan numbers, which today would take a computer billions of years, would take a quantum computer only days.


The History of Quantum Computing:
The prehistory of quantum computing began early in the 20th century when physicists began to sense they had lost their grip on reality.

Quantum Leaps
1980
Physicist Paul Benioff suggests quantum mechanics could be used for computation.

1981
Nobel-winning physicist Richard Feynman, at Caltech, coins the term quantum computer.

1985
Physicist David Deutsch, at Oxford, maps out how a quantum computer would operate, a blueprint that underpins the nascent industry of today.

1994
Mathematician Peter Shor, at Bell Labs, writes an algorithm that could tap a quantum computer’s power to break widely used forms of encryption.

2004
Barbara Terhal and David DiVincenzo, two physicists working at IBM, develop theoretical proofs showing that quantum computers can solve certain math puzzles faster than classical computers.

2014
Google starts a quantum hardware lab and hires the professor behind some of the best quantum computer hardware yet to lead the effort.

2016
IBM puts some of its prototype quantum processors on the internet for anyone to experiment with, saying programmers need to get ready to write quantum code.

2019
Google’s quantum computer beats a classical supercomputer at a commercially useless task based on Terhal and DiVincenzo’s 2004 proofs, in a feat many call “quantum advantage.”

2020
The University of New South Wales in Australia offers the first undergraduate degree in quantum engineering to train a workforce for the budding industry.

Do they exist?
This technology reached a milestone in 2019, when a quantum computer completed a specific calculation in a sliver of the time a classical supercomputer would have needed to solve the same problem. The feat is considered a proof of principle; the use of this type of quantum computer to solve practical problems is expected to be years away.

A different approach to quantum computing, called quantum annealing, is further along in development but limited to a specific kind of calculation.
In this approach, a quantum computer housed in a cryogenic refrigerator uses thousands of qubits to quickly approximate the best solutions to complex problems.
The approach is limited to mathematical problems called binary optimization problems, which have many variables and possible solutions.
Some companies and agencies have purchased this type of computer or rent time on new models to address problems related to scheduling, design, logistics, and materials discovery.
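
To make "binary optimization" concrete, here is a minimal sketch of a QUBO (quadratic unconstrained binary optimization) problem, the kind of objective an annealer approximates at much larger scale. This is a brute-force toy, not annealing hardware; the matrix Q and the function names are made-up illustrative choices.

```python
import itertools

import numpy as np

# Toy QUBO instance: minimize x^T Q x over binary vectors x.
# Q is an arbitrary example, not a real-world problem.
Q = np.array([
    [-3.0,  2.0,  0.0],
    [ 2.0, -1.0,  1.0],
    [ 0.0,  1.0, -2.0],
])

def qubo_energy(x, Q):
    """Objective value ("energy") of a binary assignment x."""
    x = np.asarray(x, dtype=float)
    return float(x @ Q @ x)

# Brute force: enumerate all 2^n assignments. An annealer instead
# explores this energy landscape physically to approximate the minimum.
n = Q.shape[0]
best = min(itertools.product([0, 1], repeat=n),
           key=lambda x: qubo_energy(x, Q))
print("best assignment:", best, "energy:", qubo_energy(best, Q))
```

Brute force only works for a handful of variables, since the search space doubles with each one added, which is exactly why hardware that can explore the landscape in parallel is attractive.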

When will quantum computers be available?
Almost 20 years have passed since the proof-of-principle demonstration of Shor’s algorithm, and scientists continue to face myriad challenges in developing large-scale quantum computers. Skeptics argue that it is too early to get excited—or anxious, depending on your point of view—about quantum computing’s real-world applications.
It’s instructive to recall that the transistor was invented in 1947, yet the first 4-bit processor was not introduced for another 25 years, and it was another 25 years after that before Intel introduced the Pentium Pro chip with millions of transistors.
Hard tech takes time, and quantum is no exception.

Theorists and experimentalists are developing strategies to reduce errors, lengthen the time that qubits can stay in quantum states, and increase fault tolerance, so that a system stays accurate even when individual components misbehave.
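
The core idea behind many of these strategies is redundancy. As a rough classical analogy (not actual quantum error correction, which must protect superpositions without directly measuring them), here is a sketch of a three-copy repetition code with majority-vote decoding; the error probability and function names are illustrative choices of mine.

```python
import random

def noisy_copy(bit, p):
    """Flip the bit with probability p (a simple bit-flip error model)."""
    return bit ^ (random.random() < p)

def send_with_repetition(bit, p, trials=100_000):
    """Encode bit as 3 copies, corrupt each, decode by majority vote."""
    errors = 0
    for _ in range(trials):
        copies = [noisy_copy(bit, p) for _ in range(3)]
        decoded = int(sum(copies) >= 2)  # majority vote
        errors += decoded != bit
    return errors / trials

p = 0.05
print("raw error rate:    ", p)
print("decoded error rate:", send_with_repetition(1, p))
# The decoded rate is roughly 3p^2 - 2p^3, about 0.007 here:
# redundancy suppresses errors as long as p is small enough.
```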

Terms:
Qubit:
A qubit, or quantum bit, is the basic unit of information in quantum computing.
It is the quantum equivalent of the classical bit, which can be either 0 or 1. However, a qubit can be in a superposition of both states at the same time, so its state carries richer information than a classical bit, even though a measurement still returns only a 0 or a 1.
Qubits can be made by manipulating atoms, electrically charged atoms called ions, or electrons, or by nanoengineering so-called artificial atoms, such as superconducting circuits fabricated with a printing method called lithography.

[Image: a circuit design for IBM’s five-qubit superconducting quantum computer]
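
To make the definition concrete, here is a minimal sketch, using plain NumPy rather than any quantum SDK, of a qubit as a normalized two-component complex vector. The amplitudes are arbitrary example values.

```python
import numpy as np

# Basis states of one qubit as vectors.
ket0 = np.array([1, 0], dtype=complex)  # |0>
ket1 = np.array([0, 1], dtype=complex)  # |1>

# An example qubit state: psi = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
a, b = 0.6, 0.8j                      # arbitrary normalized amplitudes
psi = a * ket0 + b * ket1

assert np.isclose(np.linalg.norm(psi), 1.0)  # normalization check

# Measuring in the 0/1 basis yields each outcome with probability
# equal to the squared magnitude of the corresponding amplitude.
probs = np.abs(psi) ** 2
print("P(0) =", probs[0], " P(1) =", probs[1])  # 0.36 and 0.64
```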

Superposition:
A superposition is a weighted mathematical combination of the 0 and 1 states; when measured, the qubit yields one of the two outcomes, with probabilities set by those weights.
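
For instance, here is a sketch (same NumPy convention as above) of the Hadamard gate turning |0⟩ into an equal superposition, then sampling simulated measurements:

```python
import numpy as np

# The Hadamard gate turns |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

ket0 = np.array([1, 0], dtype=complex)
psi = H @ ket0                      # (|0> + |1>) / sqrt(2)

# Each measurement outcome is equally likely: 50% for 0, 50% for 1.
print(np.abs(psi) ** 2)             # [0.5 0.5]

# Simulate repeated measurements of freshly prepared superpositions.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10, p=np.abs(psi) ** 2)
print(samples)
```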

Entanglement:
When two qubits in a superposition are entangled, operations on one have immediate effects on the state of the other, and their measurement outcomes become correlated in ways classical bits cannot reproduce, a property that helps make quantum algorithms more powerful than conventional ones.
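
Here is a small sketch of how that correlation shows up numerically. The standard textbook recipe of a Hadamard followed by a CNOT prepares the Bell state (|00⟩ + |11⟩)/√2; sampled measurements of the two qubits always agree, even though each bit on its own looks random.

```python
import numpy as np

# Two-qubit gates as 4x4 matrices acting on a state vector whose
# entries are ordered |00>, |01>, |10>, |11> (first qubit on the left).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, put the first qubit in superposition, then entangle.
psi = np.zeros(4, dtype=complex)
psi[0] = 1.0                                  # |00>
psi = CNOT @ (np.kron(H, I) @ psi)            # (|00> + |11>) / sqrt(2)

probs = np.abs(psi) ** 2
print(probs)  # [0.5 0. 0. 0.5] -> only 00 and 11 ever occur

# Sample joint measurements: the two bits are perfectly correlated.
rng = np.random.default_rng(1)
outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)
```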

Researchers are inventing new designs for qubits and quantum computers and enhancing existing technology. Established and newer strategies will take time to scale up, increase in reliability, and demonstrate their potential.

We may be able to better fight global warming if quantum simulations can tackle materials-science problems, such as finding compounds for more efficient batteries.

Status:
Universities are exposing students sooner to once-feared quantum mechanics courses. Students are also learning through YouTube channels or online courses, and seeking out open-source communities to begin their quantum journeys. And the demand is skyrocketing for quantum-savvy scientists, software developers and even business majors to fill a pipeline of scientific talent.
We can’t keep waiting six or more years for every one of those students to receive a Ph.D., which is the norm in the field right now.

In recent years, Wisconsin and the University of California, Los Angeles, have welcomed inaugural classes of quantum information master's degree students into intensive year-long programs. U.C.L.A. ended up bringing in a much larger cohort than the university anticipated, demonstrating student demand.
The University of Pittsburgh has taken a different approach, launching a new undergraduate major combining physics and traditional computer science, answering the need for a four-year program that prepares students for either employment or more education.
In addition, Ohio recently became the first state to add quantum training to its K-12 science curricula.

But is it a good idea to train a new generation of students in a technology that is not yet fully realized? And what can be gained by teaching quantum physics to young students?

These are reasonable questions, but consider:
Quantum is more than just a technology; it’s a field of study that undergirds chemistry, biology, engineering, and more; quantum education is valuable beyond just computing. And if quantum computing does pan out—which I think it will—then we’ll be far better off if more people understand it.

I hope this post was informative and helpful.
If you have any questions, please feel free to leave a comment below.

Happy Coding 👍🏻!
Thank You
