
Pulkit (Ryo)


The Restart of Time

Prerequisites:

  • A basic knowledge of bits, bytes, and words

Refresher:

  1. Intro to Binary Numbers | CSAnim
  2. Intro to Binary Integers | Aidi Rivera

Have you ever asked yourself how time is represented in a computer? How about whether we're on the verge of running out of bits to store said time?

If yes, fantastic. Let's get to the bottom of it. If not, I hope your interest is piqued. Keep reading and we can learn something together!


The Representation of Time

We'll start with the simpler question, the representation of time in bits.

In computer science, we generally record & store time as the number of milliseconds elapsed since January 1st, 1970 (UTC), a reference point known as the Unix epoch.
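For a concrete feel, here's a minimal sketch (class name is my own) using the JDK's System API referenced at the end of this post, which hands back exactly this millisecond count:

```java
import java.time.Instant;

public class EpochMillisDemo {
    public static void main(String[] args) {
        // Milliseconds elapsed since 1970-01-01T00:00:00Z (the Unix epoch)
        long now = System.currentTimeMillis();
        System.out.println("ms since the epoch: " + now);

        // Interpreting that same count back as a point in time
        System.out.println("which is: " + Instant.ofEpochMilli(now));
    }
}
```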

Knowing this, you may find yourself wondering:

Well, there are 1,000 ms in EVERY second, and that has to add up fast. We're already 50 years past the epoch, and that's a lot of milliseconds! Won't we run out of bits soon, say in a few decades?

The good news is that we have little to worry about, and even better, I'll show you why.


The Magnitude of Time in Millis

Let's get a feel for the scale of milliseconds.

  • 1 sec = 1,000 ms
  • 1 min = 60 sec = 60,000 ms
  • 1 day = 1,440 min = 86,400,000 ms
  • 1 year = 365 days = 31,536,000,000 ms

Hmmmmm... so roughly 31.5 billion milliseconds go by each year.

Seems like a rather large value for a single year, doesn't it?
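If you'd like to double-check that arithmetic, here's a quick sketch (constant names are mine) that derives the same figures:

```java
public class MillisScale {
    public static void main(String[] args) {
        long msPerSecond = 1_000L;
        long msPerMinute = 60 * msPerSecond;        //         60,000
        long msPerDay    = 24 * 60 * msPerMinute;   //     86,400,000
        long msPerYear   = 365 * msPerDay;          // 31,536,000,000

        System.out.println("ms per minute: " + msPerMinute);
        System.out.println("ms per day:    " + msPerDay);
        System.out.println("ms per year:   " + msPerYear);
    }
}
```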


The Magnitude of the Container

Now we'll get a feel for the scale of the data types that will hold the milliseconds elapsed since a given date.

Whether the word size is 32 or 64 bits, we forgo one bit for the sign in order to represent negative values, i.e. time before the Unix epoch.

Signed Max Values

  • 32-bit word: 2^31 − 1 = 2,147,483,647 ≈ 2.15E9
  • 64-bit word: 2^63 − 1 = 9,223,372,036,854,775,807 ≈ 9.2E18

Since a single year is 3.15E10 milliseconds, we've already surpassed what a signed 32-bit word can hold. But there's still hope!
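You can see the overflow for yourself with Java's built-in Integer.MAX_VALUE and Long.MAX_VALUE constants; a minimal sketch:

```java
public class SignedMaxValues {
    public static void main(String[] args) {
        long msPerYear = 365L * 24 * 60 * 60 * 1_000; // 31,536,000,000

        // One bit of each word is reserved for the sign
        System.out.println("signed 32-bit max: " + Integer.MAX_VALUE); // 2,147,483,647 (2^31 - 1)
        System.out.println("signed 64-bit max: " + Long.MAX_VALUE);    // 9,223,372,036,854,775,807 (2^63 - 1)

        // A single year of milliseconds already exceeds the signed 32-bit maximum
        System.out.println("year of ms fits in an int? " + (msPerYear <= Integer.MAX_VALUE)); // false
    }
}
```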


A 64-bit word can represent a value several orders of magnitude larger than a year's worth of milliseconds, i.e. 10^18 vs 10^10. But just how many years?

9,223,372,036,854,775,807 ms ÷ 31,536,000,000 ms/yr ≈ 292,471,208.68 years

A 64-bit word can therefore represent a colossal 2.9E8 years of time on either side of a given date, i.e. ~290 million years into the past or future.
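And here's that back-of-the-envelope division as a runnable sketch (integer division, so the fractional part is dropped):

```java
public class EpochHorizon {
    public static void main(String[] args) {
        long msPerYear = 365L * 24 * 60 * 60 * 1_000; // 31,536,000,000

        // How many whole years fit into a signed 64-bit millisecond counter?
        long years = Long.MAX_VALUE / msPerYear;
        System.out.println("years until a 64-bit ms counter overflows: " + years); // ~292,471,208
    }
}
```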

So maybe if humans are still around by 268,435,456 AD (2^28), we'll need a v2 of the Unix epoch, the restart of time.

Perhaps we'll decide to reset at a happier number, e.g. 250,000,000 AD.

I imagine by then our processors will support much larger word sizes, e.g. 256-bit, 2048-bit, and beyond. I personally can't fathom practical uses for word sizes over 128 bits aside from cryptography. However, intergalactic conquest & expansion may prove this point moot.

In any case, we can put our minds at ease. The restart of time won't happen on our watch. 🥁

Until Next Time,
Pulkit


References:

  1. System API | JDK 16 Docs
  2. Unix Time | Narrative.io
  3. Why is 1/1/1970 the “epoch time”? | StackOverflow
  4. Why are dates calculated from January 1st, 1970? | StackOverflow

Further Reading:

  1. Basics of Bit Manipulation | HackerEarth

Published on Medium

Top comments (1)

Jean-Michel 🕵🏻‍♂️ Fayard

Hello Pulkit, thanks for your story, I found it very enjoyable.