Glossary

What Is a Unix Timestamp?

A Unix timestamp (also called POSIX time or Epoch time) is the number of seconds that have elapsed since the Unix epoch: Thursday, January 1, 1970 00:00:00 UTC. It is a platform-independent, timezone-agnostic way to represent a specific moment in time — widely used in programming, databases, log files, and network protocols.
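
The definition can be seen directly with Python's standard library (a minimal sketch; the variable names are illustrative): the epoch itself is timestamp 0, and any later moment is simply its distance from the epoch in seconds.

```python
from datetime import datetime, timezone

# The epoch itself maps to timestamp 0.
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
epoch_ts = int(epoch.timestamp())          # 0

# Any later UTC moment is its distance from the epoch in whole seconds.
new_year_2026 = datetime(2026, 1, 1, tzinfo=timezone.utc)
ts_2026 = int(new_year_2026.timestamp())   # 1767225600

print(epoch_ts, ts_2026)
```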

How Unix Timestamps Work

The Unix epoch (time zero) is 1970-01-01 00:00:00 UTC. Each passing second increments the timestamp by 1. As of early 2026, the current Unix timestamp is approximately 1,771,000,000. Unix timestamps ignore leap seconds: every day is counted as exactly 86,400 seconds, even though UTC occasionally inserts a leap second to stay aligned with solar time.
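
A short sketch of both directions, using the approximate 2026 value from the text: converting a timestamp back to a UTC date-time, and confirming that one elapsed second increments the count by exactly 1.

```python
from datetime import datetime, timezone

ts = 1_771_000_000  # an "early 2026" timestamp, as mentioned above
moment = datetime.fromtimestamp(ts, tz=timezone.utc)
print(moment)  # 2026-02-13 16:26:40+00:00

# Incrementing the timestamp by 1 moves the moment forward by one second.
later = datetime.fromtimestamp(ts + 1, tz=timezone.utc)
print((later - moment).total_seconds())  # 1.0
```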

Milliseconds and Microseconds

Many modern systems use millisecond-precision timestamps (Unix timestamp × 1000). JavaScript's Date.now() returns milliseconds since the epoch. Some systems use microseconds (× 10^6) or nanoseconds (× 10^9). Always verify the unit when working with numeric timestamps from different systems: 1,771,000,000 read as seconds is a date in early 2026, but read as milliseconds it is only about 20 days after the epoch, in January 1970.
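
The unit mix-up is easy to demonstrate (a sketch; `raw` is an assumed input value whose unit is unknown): the same number interpreted as seconds and as milliseconds yields dates decades apart.

```python
from datetime import datetime, timezone

raw = 1_771_000_000  # unit unknown: seconds? milliseconds?

# Interpreted as seconds: a date in early 2026.
as_seconds = datetime.fromtimestamp(raw, tz=timezone.utc)
print(as_seconds.date())  # 2026-02-13

# Interpreted as milliseconds: barely three weeks after the epoch.
as_millis = datetime.fromtimestamp(raw / 1000, tz=timezone.utc)
print(as_millis.date())   # 1970-01-21
```

A common sanity check is magnitude: present-day values near 10^9 are seconds, near 10^12 milliseconds, near 10^15 microseconds.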

Timezones and Unix Timestamps

Unix timestamps are always UTC — they have no timezone. Converting a timestamp to a local date-time is a display concern, not a storage concern. When comparing two timestamps, no timezone conversion is needed. Always store timestamps in UTC and convert to local time only at display time. This eliminates daylight saving time bugs and makes log correlation across servers trivial.
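
A sketch of the store-in-UTC, convert-at-display pattern. The fixed UTC−5 offset below is an assumption standing in for a real timezone-database lookup (in production you would resolve a named zone such as "America/New_York" instead):

```python
from datetime import datetime, timezone, timedelta

ts = 1_771_000_000  # stored once; no timezone attached

# Display concern only: render the same instant in two zones.
utc_view = datetime.fromtimestamp(ts, tz=timezone.utc)
eastern = timezone(timedelta(hours=-5), "EST")  # assumed fixed offset
local_view = datetime.fromtimestamp(ts, tz=eastern)

print(utc_view.isoformat())    # 2026-02-13T16:26:40+00:00
print(local_view.isoformat())  # 2026-02-13T11:26:40-05:00
print(utc_view == local_view)  # True: same instant, different display
```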

The Year 2038 Problem

Legacy 32-bit signed integer timestamps reach their maximum value (2^31 − 1 seconds after the epoch) at 03:14:07 UTC on January 19, 2038, and wrap to a large negative number one second later. Modern 64-bit systems extend this to approximately the year 292,277,026,596. However, some embedded systems, old databases, and network protocols still use 32-bit time_t and will need remediation before 2038.
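
The wraparound can be simulated with a signed 32-bit integer (a sketch using Python's ctypes to mimic 32-bit time_t arithmetic):

```python
import ctypes
from datetime import datetime, timezone, timedelta

MAX_INT32 = 2**31 - 1  # 2147483647

# The last second a signed 32-bit counter can represent:
limit = datetime.fromtimestamp(MAX_INT32, tz=timezone.utc)
print(limit)  # 2038-01-19 03:14:07+00:00

# One second past the limit, the counter wraps negative:
wrapped = ctypes.c_int32(MAX_INT32 + 1).value
print(wrapped)  # -2147483648

# Interpreted as a timestamp, that lands decades before the epoch:
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
print(epoch + timedelta(seconds=wrapped))  # 1901-12-13 20:45:52+00:00
```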