About Bit, Bytes and Qubits

For some time now I have been interested in knowing a little more about how computers and the Internet work. One of the things I didn't understand was the difference between bits and bytes. The story behind all this is very interesting!

The bit (binary digit) is a digit of the binary number system and represents the minimum unit of information: a bit can be either 0 or 1. The storage capacity of digital memory is also measured in bits.

Gottfried Wilhelm Leibniz invented the binary system at the end of the 17th century. The idea? To convert certain linguistic concepts into logic, that is, to interpret them as “true” or “false”.

This is how a bit can represent two values such as true/false, on/off, etc. Following this logic, 2 bits can have up to 4 different combinations: 0-0, 0-1, 1-0, 1-1. 8 bits make up one octet and can represent 256 (2^8) different values.
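A quick way to convince yourself of these counts is to enumerate the combinations in code. Here is a minimal Python sketch (the names are my own, just for illustration):

```python
from itertools import product

# Every combination of n bits: there are 2**n of them.
for n in (1, 2, 8):
    combos = list(product([0, 1], repeat=n))
    print(f"{n} bit(s): {2**n} possible values")
    if n <= 2:
        print("  ", combos)
```

For 2 bits this prints the four combinations listed above; for 8 bits it confirms the 256 values of an octet.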

Does 8 bits sound familiar to you?

Currently we say that 1 byte (Binary Tuple) is equivalent to 8 bits, but a byte and an octet are not necessarily the same thing. While an octet always has exactly 8 bits, a byte is simply a fixed number of bits, and historically that number didn't have to be 8: older computers used bytes made up of different numbers of bits.

Earlier computers processed information in ‘words’. Each word had a specific number of bits, often 10, since decimal precision was prioritized in academia.

But carrying 10 digits around was a waste of resources. The German-born engineer Werner Buchholz concluded that each character should be individually addressable, with a fixed unit of 8 adjacent bits.

Therefore, 1 byte can represent a number, a letter, or a symbol: one character of the character code that the computer uses, in virtually all computing applications.

You can check this byte counter:
Atatus

And this online string-to-binary converter tool, where you can count the bytes as well:
codebeautify
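If you'd rather do the same thing in code, here is a small Python sketch that encodes a string and prints each byte as 8 bits, mimicking what those tools do (it assumes UTF-8, where every ASCII character takes exactly one byte):

```python
# Convert a string to binary and count its bytes (assuming UTF-8).
text = "Hi!"
data = text.encode("utf-8")                    # the raw bytes of the string
binary = " ".join(f"{b:08b}" for b in data)    # each byte as 8 binary digits

print(f"{text!r} is {len(data)} bytes: {binary}")
# 'Hi!' is 3 bytes: 01001000 01101001 00100001
```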

But everything is constantly changing and evolving, which brings me to the next question.
Is there anything that could replace the bit in the future?

This is when we reach a more complex world, the quantum world.

A qubit (quantum bit) is based on quantum theory, which is a very interesting topic. As a brief introduction: in the quantum world, a single system can be in different states at the same time. A qubit relies on two basic principles of quantum physics: superposition and entanglement.

A qubit is the bit of quantum computing: it also has two base states, 0 and 1. The big difference is that while a bit holds either 0 or 1, a qubit can be in a superposition of both values at the same time, which makes certain parallel calculations much faster. The surprising thing is that the qubit is not as new as it seems: the first concepts were introduced around 1968 by Stephen Wiesner with his invention of ‘conjugate coding’.
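Superposition is easier to picture with a tiny simulation. The sketch below (plain NumPy, my own variable names) represents a qubit as a pair of complex amplitudes and applies a Hadamard gate to put the state 0 into an equal superposition of 0 and 1:

```python
import numpy as np

# A qubit is a vector of two complex amplitudes (alpha, beta) with
# |alpha|**2 + |beta|**2 == 1. A classical bit would be exactly
# [1, 0] (state 0) or [0, 1] (state 1).
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate turns state 0 into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
qubit = H @ ket0

# Measurement collapses the superposition: the probability of each
# outcome is the squared magnitude of its amplitude.
probs = np.abs(qubit) ** 2
print(probs)  # [0.5 0.5] -> 50% chance of reading 0, 50% of reading 1
```

Until it is measured, the qubit carries both amplitudes at once, and that is the property quantum algorithms exploit.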
