
Learning quantum computing is totally doable, even without most of the maths that people think is necessary. I've been recording myself while learning here

Hope it's helpful ;)


LOL thanks for this! I didn't know what it was, and now I'm kinda getting it.


I'm currently reading Quantum Computing Since Democritus by Scott Aaronson. I really like his writing style and his ability to not dumb down quantum computing, but also make it accessible.


The best I understand it, quantum computing relies on the weird ability of electrons to be "neither here nor there" to create a tri-state bit: 0, 1, or 2. By working in ternary instead of binary, data can be processed at much higher speeds (and stored in less memory space).

The hardware difficulty with quantum computing is that you have to keep things at darn near 0 Kelvin, so the electron's "neither here nor there" state can actually stay locked in place.

At least, that's the "explain like I'm 5" explanation.


Although this "ternary" explanation is simple and seems to provide some intuition about why a quantum computer is faster, it's incorrect. Quantum bits are fundamentally different from binary and ternary bits.

Ternary bits are actually possible in classical computers; they're (almost) never used because reliable binary bit hardware was developed first.

Some people have built ternary computers, which are super cool though!

Here's a simple explanation of quantum bits that is totally true, but unfortunately doesn't provide much intuition, or help you understand why quantum is faster than classical: A quantum bit is a pair of real numbers (let's call them a and b) such that a² + b² = 1. That's it. The simple and somewhat unsatisfying fact is that normal intuition doesn't really apply to quantum computing.
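To make that pair-of-numbers definition concrete, here's a tiny Python sketch (the function names are mine, just for illustration): a qubit is a pair (a, b) with a² + b² = 1, and measuring it gives 0 with probability a² and 1 with probability b².

```python
import math
import random

def make_qubit(a, b):
    """A qubit as a pair of real numbers (a, b), normalized so a^2 + b^2 = 1."""
    norm = math.sqrt(a * a + b * b)
    return (a / norm, b / norm)

def measure(qubit):
    """Collapse: return 0 with probability a^2, else 1 (probability b^2)."""
    a, _b = qubit
    return 0 if random.random() < a * a else 1

q = make_qubit(3, 4)                 # normalizes to (0.6, 0.8)
print(q)                             # P(0) = 0.36, P(1) = 0.64
ones = sum(measure(q) for _ in range(10_000))
print(ones / 10_000)                 # roughly 0.64
```

Running the measurement many times shows the probabilities emerging from the amplitudes, which is about as far as the "pair of numbers" picture takes you on its own.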

Here's a fantastic video that explains how a quantum bit leads to quantum speedup, entanglement, and quantum teleportation using the math.

I know less about quantum computer hardware, but your hardware difficulty explanation seems on point.


I certainly had picked up some incorrect information, then. Thanks for the insight!

No worries, and thanks for your original comment. I think it's great to talk about common misunderstandings, that way everyone comes out with a better understanding.


Suppose that there is very very big box of candies that all taste different. Someone asks you to choose your favorite flavor. If you work like a regular computer then you need to try all the flavors one by one before knowing.

But if you work like a quantum computer, it's really weird but really convenient: you try to put all the candies in your mouth at once, but when you do, you realize there is only one candy in your mouth, and it's the best one. All of that in one go.


Also, this video is just a bit math heavy, but I managed to get through it (even though I didn't get all the details):


If you learn best by doing, IBM has an interface to an actual real-life quantum computer --

Once you've made an account you can design your circuit and queue it up to simulate, or run on a real one.
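Roughly speaking, what a one-qubit simulator does under the hood is just small matrix maths. Here's a hand-rolled Python sketch (not IBM's actual API, just an illustration): apply a Hadamard gate to |0⟩ and sample measurements, and you get a fair coin flip.

```python
import math
import random

# Hadamard gate: puts |0> into an equal superposition of |0> and |1>.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Multiply a 2x2 gate matrix by a 2-amplitude state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

def measure(state):
    """Sample 0 with probability amp0^2, else 1."""
    return 0 if random.random() < state[0] ** 2 else 1

state = apply(H, [1.0, 0.0])            # start in |0>, apply H
print(state)                            # both amplitudes ~0.7071
samples = [measure(state) for _ in range(10_000)]
print(samples.count(0) / len(samples))  # roughly 0.5
```

Real circuits chain many such gates over many qubits before measuring, but the "gates are matrices, states are vectors" idea is the same.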

Yep, we're in the future 🤖


At the quantum scale, things are a bit different. At that scale, we say that an object does not have a definite position and velocity. Instead, it has a probability space, sometimes called a wavefunction. When an observation is made, the wavefunction collapses, which means the object ends up in a definite state. What counts as an observation depends on the situation.

Anyway, that means the state of a bit can be represented by a probability space, from 0 to 1. That's why the simpler sources say a qubit can be 0, 1, or anything in between.

So how does that help us in computing? Technically, stepping into the quantum realm is just a side effect of building smaller and smaller integrated circuits. Right now we are at ~10 nm (nanometers). The smaller you go, the more definitely you step into the quantum realm, because electricity doesn't behave the way it does at a grand scale. So you go quantum. You have to, anyway.

The power of quantum computing comes from the continuous probability space. Normal bits have a probability space of 2 outcomes: 1 and 0. A qubit has... infinitely many. So, if you want to build a computer with normal bits, you need millions of transistors. If you want to build it with qubits, you need only a handful.
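One way to see why "a handful" goes so far: describing the joint state of n qubits takes 2^n amplitudes, while n classical bits are in exactly one of their 2^n states at a time. A quick Python illustration of that growth:

```python
# A classical n-bit register is in exactly one of 2**n states.
# An n-qubit state vector holds an amplitude for *every* one of them.
for n in (1, 2, 10, 50):
    print(f"{n:>2} qubits -> state vector of {2 ** n} amplitudes")
# 50 qubits already need about 10**15 amplitudes, which is far too many
# to store or simulate exactly on an ordinary computer.
```

That exponential bookkeeping cost for classical simulation is the usual back-of-the-envelope argument for where quantum hardware could pull ahead.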

The rest of the mathematics is way above my level, but in essence the goal is to create a Turing machine. However, wave collapse is erratic and not that simple, albeit almost instantaneous.


Many people can and have written papers, blogs, and even books on the topic of Quantum Computing. The level of background knowledge required to understand Quantum Computing is high, but not unachievable. The topic is so vast that no comment could truly explain it.

Now I'm no authority on Quantum Computing, but the TL;DR is that Quantum Computing is a new paradigm of computing, based on the mathematics of nature and very probabilistic in character. In essence it allows the use of algorithms that Classical Computing finds hard or very resource intensive. That said, Quantum Computing will not fully replace Classical Computing, but will run alongside it.

If you want to learn more about Quantum Computing

  • Learn to research if you don't already know how, then do a lot of research. Be careful, though: don't believe everything you read, and verify the information you're reading.
  • If you live near a university, see if they have evening lectures open to the public, as many do; they may just have a few on Quantum Computing.
  • If you're up to it, engage with the online community learning about Quantum Computing.
  • Practice and play with Quantum Computing; it's normally free, and IBM provides free access to their cloud quantum computers.

This video from Microsoft research goes over the actual fundamental math that explains qubits and quantum computing:

If you have an understanding of linear algebra, it isn't too hard to follow along.


Probably the shortest and easiest definition is the following:

  • in classical computing we use electricity to model bits (0s and 1s) and then perform computations

  • in quantum computing we use quantum elements (there are different implementations possible: photons, superconducting stuff, etc.) and their quantum mechanical phenomena (entanglement, superposition, etc.) to model something called qubits (complex linear combinations of 0s and 1s) and then perform computations
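The "complex linear combinations" bit can be made concrete with Python's built-in complex numbers (a rough sketch, not any particular library's API): a qubit is α|0⟩ + β|1⟩ with |α|² + |β|² = 1, and the measurement probabilities are the absolute squares of the amplitudes.

```python
import math

# A qubit as complex amplitudes: alpha|0> + beta|1>.
alpha = 1 / math.sqrt(2)
beta = 1j / math.sqrt(2)   # same magnitude as alpha, but a different *phase*

p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
print(p0, p1)              # both ~0.5: the probabilities ignore the phase...
# ...but phases matter when gates make amplitudes interfere, which is where
# quantum algorithms differ from plain probabilistic bits.
```

The phase is what separates qubits from mere "probabilistic bits": two states with identical measurement probabilities can still behave completely differently once you apply more gates.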

Basically it is not better or faster or whatever; it's simply a different computational model. This is not black magic. There have been lots of buzzwords recently, but really this is just another model of computation, which is pretty damn interesting to learn and quite mind-bending :)


Cool thread!
You can also check out my post taking a look at the available QC Python packages:
