Sven Schuchardt

Posted on • Originally published at biztechbridge.com

Unix Timestamps Explained: What Every Developer Should Know

You're tailing a log file and you see this:

[1712700000] ERROR: connection timeout

What is 1712700000? Is it a bug? A timestamp? A version number? If you've ever stared at a number like that and felt unsure, this article is for you.

By the end you'll know exactly what Unix timestamps are, why every serious API uses them, and how to convert them instantly without memorising any formula.


What Is a Unix Timestamp?

A Unix timestamp (also called an epoch timestamp) is simply the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC — a moment arbitrarily chosen as the starting point of computer time, known as the Unix epoch.

That's it. No timezones, no daylight saving adjustments, no locale quirks. Just a single integer that means the same thing on every machine on the planet.

1712700000 translates to April 9, 2024, 22:00:00 UTC.

Why 1970?

The Unix operating system was developed in the early 1970s. The designers needed a fixed reference point that was recent enough to keep numbers small but old enough to cover any historical dates they cared about. January 1, 1970 was a clean, round choice that stuck — and 50+ years later the entire industry still uses it.

Seconds vs Milliseconds — the most common gotcha

Two variants exist and they will burn you if you mix them up:

Format              | Example       | Used by
--------------------|---------------|---------------------------------------------
Seconds (Unix time) | 1712700000    | Most Unix APIs, databases, server logs
Milliseconds        | 1712700000000 | JavaScript's Date.now(), Java, many web APIs

A 13-digit number is almost always milliseconds. A 10-digit number is almost always seconds. When in doubt, check the API docs.
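To see why the mix-up stings, here's a minimal sketch you can paste into any browser console or Node REPL. Passing a seconds value where milliseconds are expected doesn't throw; it silently lands you in January 1970:

```javascript
// A seconds timestamp interpreted as milliseconds → January 1970
const wrong = new Date(1712700000);
console.log(wrong.toISOString());
// → "1970-01-20T19:45:00.000Z"

// Multiply seconds by 1000 to get the intended date
const right = new Date(1712700000 * 1000);
console.log(right.toISOString());
// → "2024-04-09T22:00:00.000Z"
```

If a "recent" timestamp renders as a date in early 1970, you almost certainly fed seconds into a milliseconds API.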

The 2038 problem (a quick aside)

32-bit systems store Unix timestamps as a signed 32-bit integer, which maxes out at 2,147,483,647 seconds, a moment that arrives on January 19, 2038. Most modern systems use 64-bit integers (which won't overflow until roughly the year 292 billion), but if you're working with embedded systems or legacy C code, it's worth knowing.
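You can compute the exact rollover moment yourself; nothing beyond the built-in Date is needed:

```javascript
// A signed 32-bit time_t tops out at 2^31 - 1 seconds after the epoch
const max32Bit = 2 ** 31 - 1; // 2147483647
console.log(new Date(max32Bit * 1000).toISOString());
// → "2038-01-19T03:14:07.000Z"
```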


Converting Epoch to a Human-Readable Date in JavaScript

No library needed. JavaScript's built-in Date handles it in one line:

// If your timestamp is in seconds, multiply by 1000 first
const ts = 1712700000;
const date = new Date(ts * 1000);

console.log(date.toISOString());
// → "2024-04-09T22:00:00.000Z"

console.log(date.toLocaleString('en-US', { timeZone: 'America/New_York' }));
// → "4/9/2024, 6:00:00 PM"

Going the other way — current time as epoch — is even simpler:

const nowInSeconds = Math.floor(Date.now() / 1000);
console.log(nowInSeconds); // → 1712700000 (approximately)

A debug helper worth bookmarking

When you're debugging logs, paste this into your browser console:

const fromEpoch = (ts) => {
  // Handle both seconds and milliseconds automatically
  const ms = ts > 1e12 ? ts : ts * 1000;
  return new Date(ms).toISOString();
};

fromEpoch(1712700000);    // → "2024-04-09T22:00:00.000Z"
fromEpoch(1712700000000); // → "2024-04-09T22:00:00.000Z"

Try It Without Writing Any Code

If you just need a quick conversion — during debugging, code review, or reading API docs — paste any timestamp into our free epoch converter tool. It handles both seconds and milliseconds, supports timezone selection, and works entirely in your browser. No data is ever sent to a server.


Why APIs Use Unix Time (and Not ISO Strings)

You'll notice that Stripe, GitHub, Slack, and virtually every major API returns timestamps as integers, not formatted date strings. There are good reasons:

  • No timezone ambiguity — 1712700000 is the same moment everywhere; "2024-04-09 22:00:00" is meaningless without a timezone
  • Easy arithmetic — want to check if something happened in the last 24 hours? now - ts < 86400. Try doing that with ISO strings.
  • Compact — 10 digits vs 24 characters for ISO 8601
  • No parsing edge cases — no locale formats, no AM/PM, no separator variations
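The arithmetic point deserves a concrete sketch. Assuming timestamps in seconds (isRecent is a hypothetical helper for illustration, not part of any API):

```javascript
const SECONDS_PER_DAY = 86400;
const nowSec = Math.floor(Date.now() / 1000);

// Did the event happen within the last 24 hours?
// Plain integer comparison — no parsing, no timezone handling.
const isRecent = (ts) => nowSec - ts < SECONDS_PER_DAY;

console.log(isRecent(nowSec - 3600));   // one hour ago → true
console.log(isRecent(nowSec - 172800)); // two days ago → false
```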

The tradeoff is readability — which is exactly why tools and debug helpers exist.


Key Takeaways

  • A Unix timestamp is seconds since January 1, 1970 UTC
  • 10 digits = seconds, 13 digits = milliseconds — multiply seconds by 1000 before passing to new Date()
  • APIs use Unix time because it's unambiguous, compact, and arithmetically convenient
  • Convert any timestamp instantly with our epoch converter tool
