Walter Nascimento
Binary: The Language Your Computer Actually Speaks

Everything in software engineering eventually collapses into something extremely simple:

0 and 1

Not strings.
Not objects.
Not JSON.
Not even machine instructions in the way we imagine them.

Just electrical states.

If you truly want to understand how computers work, you need to understand binary and how it eventually becomes text through an encoding like ASCII.

Let’s break it down from the absolute beginning.


What Is Binary?

Binary is a base-2 number system.

While humans use base-10 (0–9), computers use:

0
1

Why?

Because hardware operates on physical states:

  • Off / On
  • Low voltage / High voltage
  • False / True

Transistors, the building blocks of CPUs, can only represent two stable states. That maps perfectly onto binary digits.

That’s why computers don’t "prefer" binary.

They are physically built around it.


What Is a Bit?

A bit (binary digit) is the smallest unit of information.

It can only be:

0
or
1

Alone, a bit isn’t very useful.

But when you combine them, something powerful happens.


What Is a Byte?

A byte is a group of 8 bits.

Example:

00000000

With 8 bits, we can represent:

2^8 = 256 possible values

That means:

0 to 255

This is the first important turning point.

Because now we can represent numbers.

And once we can represent numbers, we can represent anything.
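A quick sanity check of the byte math above (Python is used here purely for illustration):

```python
# With n bits you can represent 2**n distinct values.
bits = 8
values = 2 ** bits
print(values)                # 256
print(0, "to", values - 1)   # 0 to 255
```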


How Binary Represents Numbers

Binary works exactly like decimal, except each position represents a power of 2 instead of a power of 10.

Example:

1010₂

Each position means:

1×2³ + 0×2² + 1×2¹ + 0×2⁰

Which equals:

8 + 0 + 2 + 0 = 10

So:

1010₂ = 10₁₀

It’s not magic.

It’s just positional notation.
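The same positional expansion can be done by hand in a few lines of Python (shown only as a sketch; Python's built-in `int` already does this conversion):

```python
# Expand "1010" (base 2) position by position, mirroring the math above:
# 1*2**3 + 0*2**2 + 1*2**1 + 0*2**0
digits = "1010"
total = sum(int(d) * 2 ** i for i, d in enumerate(reversed(digits)))
print(total)           # 10

# Python's built-in int() performs the same conversion:
print(int("1010", 2))  # 10
```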


So How Do Numbers Become Letters?

Here’s where things get interesting.

Computers don’t understand letters.

They understand numbers.

So we need a mapping:

Number → Character

That mapping is called ASCII.


What Is ASCII?

ASCII stands for:

American Standard Code for Information Interchange

It’s simply a table that assigns numbers to characters.

For example:

| Decimal | Binary   | Character |
| ------- | -------- | --------- |
| 65      | 01000001 | A         |
| 66      | 01000010 | B         |
| 97      | 01100001 | a         |

So when you type:

A

The computer stores:

65

Which in binary is:

01000001

That’s it.

There is no “A” inside the computer.

Only numbers.
Only bits.
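You can see the number-to-character mapping directly with Python's `ord` and `chr`:

```python
print(ord("A"))                  # 65 — the number stored for "A"
print(chr(65))                   # A  — the character that 65 maps to
print(format(ord("A"), "08b"))   # 01000001 — that number as 8 bits
```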


Let’s Decode a Real Example

Take this binary sequence:

01001000 01101001

Break it into bytes:

01001000 = 72
01101001 = 105

Now check ASCII:

72  → H
105 → i

The result?

Hi

That’s how text exists inside memory.
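The decoding steps above can be automated in a few lines of Python (a minimal sketch, assuming the input is space-separated 8-bit ASCII):

```python
# Split into bytes, convert each byte from base 2, map the number to a character.
bits = "01001000 01101001"
text = "".join(chr(int(byte, 2)) for byte in bits.split())
print(text)  # Hi
```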


The Full Journey: From Electricity to Text

Let’s connect everything:

  1. Electricity changes state (on/off).
  2. That becomes 0 or 1.
  3. Bits form bytes (8 bits).
  4. Bytes represent numbers (0–255).
  5. Numbers map to characters using ASCII.
  6. The operating system renders the character on your screen.

Every string in your code follows this pipeline.

Every log.
Every API response.
Every database record.

Still just bits.


But ASCII Isn’t Enough

The original ASCII uses 7 bits (128 characters).

That works for:

  • Basic English letters
  • Numbers
  • Symbols

But it doesn’t support:

  • Accents (é, ç, ñ)
  • Asian languages
  • Emojis

That’s why modern systems use:

  • Unicode
  • UTF-8

But even UTF-8?

Still binary underneath.

Always binary.


Why This Matters for Engineers

Understanding binary is not about memorizing conversions.

It’s about understanding abstraction layers.

When something breaks due to:

  • Encoding issues
  • Corrupted data
  • Serialization bugs
  • “Invalid byte sequence” errors

You are no longer debugging strings.

You are debugging binary representations.
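As an illustration of that kind of failure, here is a minimal Python sketch that triggers an "invalid byte sequence" style error by decoding bytes that are not valid UTF-8:

```python
# 0xFF can never appear in valid UTF-8, so decoding it fails.
raw = b"\xff\xfe"
try:
    raw.decode("utf-8")
except UnicodeDecodeError as e:
    print(e)  # 'utf-8' codec can't decode byte 0xff in position 0: ...
```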

The engineer who understands the foundation solves problems faster.

The one who ignores it stays confused at the surface.


Final Thought

Computers don’t understand:

  • Code
  • APIs
  • JSON
  • Databases
  • Frameworks

They understand:

0 and 1

Everything else is abstraction.

And the better you understand the bottom layer,
the stronger your engineering intuition becomes.


Thanks for reading!

If you have any questions, complaints, or tips, you can leave them here in the comments. I will be happy to answer!
😊😊 See you! 😊😊


Support Me

Youtube - WalterNascimentoBarroso
Github - WalterNascimentoBarroso
Codepen - WalterNascimentoBarroso
