Ryan Thelin for Educative

Originally published at educative.io

Bytesize: Quantum computing and the future of programming

Bytesize is our 5-minute infusion of developer history, speculation, or otherwise fun knowledge that you can enjoy during a work or study break.

When you set off to become a developer, you probably dreamed of working on self-driving cars, artificial intelligence, world-changing apps, or any number of other amazing products. However, there’s a lot of learning and beginner developer work you need to do before you can get there. This can be draining and can make you lose sight of what got you excited to be a developer in the first place.

When you're knee-deep in a tough concept or a product feature that's fighting back, Bytesize is here every week to give you 5 minutes to shake it off and let your imagination fly once more.


What is Quantum Computing?

Quantum computing is a novel computer architecture that uses qubits instead of the traditional binary bits found in modern transistors. Qubits are unique because they can be set to 0 or 1, or suspended in a quantum superposition between 0 and 1. A superposed qubit isn't necessarily at a halfway point; rather, it's unknown whether the particle is clear or set until it's measured.

The superposition state gives qubits a third possible setting beyond the traditional 0 or 1. This changes the foundational structure of programs from a series of "yes" or "no" answers to one that also offers an "I don't know" option. Qubits therefore allow computers to solve certain problems staggeringly faster than traditional computers, especially when analyzing large data sets or completing complex mathematical operations.
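The "I don't know" idea can be sketched in ordinary code. The toy below (a classical simulation for illustration only, not real quantum behavior) models a qubit as a pair of amplitudes: a definite 0, a definite 1, or an equal superposition whose value is only decided when it's measured.

```python
import random

# Toy qubit: state = (amplitude_0, amplitude_1).
# The probability of measuring each value is the squared amplitude.
# This is a classical illustration of the idea, not real quantum hardware.

def measure(state, rng=random.random):
    """Collapse the qubit: return 0 or 1 with probability amplitude**2."""
    a0, a1 = state
    return 0 if rng() < a0 ** 2 else 1

zero = (1.0, 0.0)                          # definitely 0
one = (0.0, 1.0)                           # definitely 1
superposition = (0.5 ** 0.5, 0.5 ** 0.5)   # the "I don't know" state

print(measure(zero))           # always 0
print(measure(one))            # always 1
print(measure(superposition))  # 0 or 1, each with 50% probability
```

Until `measure` is called, the superposed state genuinely carries both possibilities at once; that is the property quantum algorithms exploit.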

While the quantum computer excels at certain tasks, it's not a strict upgrade over standard hardware. In fact, researchers have found that quantum computers are no faster, and often slower, at completing many day-to-day tasks we'd consider basic. Considering the high cost of building these machines and the energy required to keep a qubit in superposition, it's unlikely we'll soon be walking around with quantum iPhones or laptops for everyday use.

Instead, the relationship between current computers and quantum computers will likely resemble the relationship between the GPU and the CPU: each excels at specific tasks, and modern machines call on each when its ideal use case arises.

The two current ideas to allow for everyday use of quantum computing are:

  • Quantum Processing Unit (QPU): Like a CPU, the QPU would be a modularized piece of hardware that you could install in a desktop or laptop computer. The operating system would then delegate high-volume analytical tasks to the QPU, ensuring it's always used where it excels. This idea remains a distant goal until scientists discover a superconductor efficient enough to keep qubits in superposition whenever needed.
  • Quantum Cloud Systems: A centralized, off-site quantum computer that other computers can use via the cloud. The system would theoretically work like current cloud storage systems, where a third party owns the computer and users pay based on how much they use it. Quantum cloud systems seem to be the more feasible solution for the technology's current state and could start seeing use as early as the next 5 years.

What fields will quantum computing affect most?

So quantum computing is on the horizon, but how will it affect programmers when it becomes a reality?

Cybersecurity

Cybersecurity will need to be heavily revamped to maintain effectiveness. Many modern encryption systems use RSA encryption, which relies on current computers being unable to factor a number with 500+ digits into its prime factors.

While it would take several years for current computer systems to crack encryption like this, quantum computers could do it in less than an hour. That means the cybersecurity community will need to come up with new encryption algorithms that are uncrackable by both current and quantum computers.
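The factoring problem RSA leans on can be shown with a toy. The sketch below (illustrative only, not real RSA) recovers two small prime factors by trial division; the running time grows so fast with the size of the number that the same approach is hopeless for a 500+ digit modulus on classical hardware, which is exactly the gap a quantum factoring algorithm would close.

```python
# Toy illustration of the RSA hardness assumption: given n = p * q for
# large primes p and q, classical computers can't recover p and q in
# any reasonable time. At toy sizes, trial division works instantly.

def factor(n):
    """Naive trial division -- fine for tiny n, hopeless for 500+ digits."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

p, q = 2579, 3557          # tiny stand-ins for RSA's large secret primes
n = p * q                  # the public modulus an attacker would see
print(factor(n))           # recovered instantly -- only because n is tiny
```

Doubling the number of digits in `n` roughly squares the work this loop does, which is why classical factoring of real RSA moduli is considered infeasible.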

On the other hand, quantum computers allow for new cryptographic tools like quantum cryptography. This theoretical system stores an encryption key in a quantum particle held in a quantum state, where it cannot be accessed for a period of 30-60 years until the particle returns to a normal state. Any attempt to view the key would change its state due to wave function collapse, the property that quantum particles in superposition must change state when observed.
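The security property here is that observation itself leaves evidence. The sketch below (a heavily simplified, purely illustrative model; the function names are made up for this example) captures the core idea: a key bit in superposition has no definite value, and any peek, legitimate or not, collapses it in a detectable way.

```python
import random

# Illustrative sketch of "observation disturbs the state," the idea
# behind quantum key distribution. Not a real quantum protocol.

def make_key_qubit():
    """A key bit held in superposition: undetermined until observed."""
    return {"collapsed": False, "value": None}

def observe(qubit):
    """Any observation collapses the superposition to a definite value."""
    if not qubit["collapsed"]:
        qubit["collapsed"] = True
        qubit["value"] = random.randint(0, 1)
    return qubit["value"]

key_bit = make_key_qubit()
print(key_bit["collapsed"])   # False: still in superposition
observe(key_bit)              # an eavesdropper peeks...
print(key_bit["collapsed"])   # True: the tampering is now detectable
```

Because the collapse flag can never be unset, the legitimate parties can tell whether anyone looked at the key before they did.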

In short, while cybersecurity experts will lose a familiar tool, they will gain a new powerful form of cryptography that is theoretically impenetrable.

Machine learning and AI

The other major change will be in big data analysis. Machine learning is used in myriad fields, from financial prediction to robotics to weather forecasting. Each of these fields uses machine learning algorithms to analyze data as it becomes available and make predictions as close to real time as possible.

However, current hardware bottlenecks these technologies. Current computers are relatively slow at analyzing large sets of mixed data, which puts a floor on how close to live updates these systems can get.

Quantum computing excels at handling high volumes of data with staggering efficiency, meaning it could open the door to near-instant data analysis.

Financial predictions could react to market changes more quickly and process more variables. AI and robots could immediately identify, evaluate, and respond to new stimuli such as changes in the environment. Weather forecasts could evaluate hundreds of live feeds of meteorological data, synthesizing readings like air pressure, humidity, and wind strength to create more accurate and reactive predictions.

Overall, quantum computing will affect anything that relies on big data analysis.

