Unix Timestamp Converter

Convert between Unix timestamps and human-readable dates. All calculations happen locally.

About Unix Timestamps

A Unix timestamp is the number of seconds elapsed since 1 January 1970 00:00:00 UTC (the Unix epoch). It is timezone-independent and used throughout computing, databases, and APIs. See the MDN Date reference; read more in Timestamp Conversion Methods Across Programming Languages.

JavaScript's Date.now() returns milliseconds — divide by 1000 to get seconds.
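
For example, a minimal sketch of the conversion in both directions:

```js
// Current time in milliseconds, JavaScript's native unit
const ms = Date.now();

// Unix timestamp in seconds: divide by 1000 and drop the fraction
const seconds = Math.floor(ms / 1000);

// Going the other way: new Date() expects milliseconds
const date = new Date(seconds * 1000);

console.log(seconds, date.toISOString());
```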

Common Use Cases

  • Database timestamps — storing event times
  • JWT expiry — the exp claim is a Unix timestamp. Decode it with the JWT Debugger.
  • Log files — precise event ordering
  • Cache control — expiration times
  • Scheduling — future event planning

Key Facts

  • Always in UTC — convert to local time for display (see the sketch after this list)
  • 32-bit max is 2147483647 (Jan 2038)
  • JavaScript uses milliseconds internally
  • Negative values represent dates before 1970
  • Leap seconds are not counted
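
A short sketch tying these facts together: one timestamp rendered in UTC and in the local timezone, plus a negative value for a pre-1970 instant:

```js
const ts = 1000000000; // a plain Unix timestamp in seconds

const d = new Date(ts * 1000); // JavaScript counts in milliseconds internally
console.log(d.toISOString());    // "2001-09-09T01:46:40.000Z", always UTC
console.log(d.toLocaleString()); // the same instant, rendered in the local timezone

// Negative values are instants before the epoch
console.log(new Date(-86400 * 1000).toISOString()); // "1969-12-31T00:00:00.000Z"
```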

Frequently Asked Questions

Seconds vs milliseconds?

Unix timestamps are conventionally in seconds. JavaScript's Date.now() returns milliseconds. Divide by 1000 to get seconds; multiply by 1000 to feed into new Date().

What is the Y2K38 problem?

32-bit signed integers overflow on 19 January 2038 at 03:14:07 UTC. Modern systems store time in 64-bit integers and are unaffected.
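
To make the boundary concrete (JavaScript numbers are 64-bit floats, so the language itself is fine past 2038; the limit bites systems that store the value in a 32-bit signed integer):

```js
const INT32_MAX = 2 ** 31 - 1; // 2147483647

// The last second representable as a 32-bit signed value
console.log(new Date(INT32_MAX * 1000).toISOString());
// "2038-01-19T03:14:07.000Z"

// One second later overflows a 32-bit field but is an ordinary 64-bit value
console.log(new Date((INT32_MAX + 1) * 1000).toISOString());
// "2038-01-19T03:14:08.000Z"
```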

Are timestamps timezone-aware?

No. A Unix timestamp always represents a moment in UTC. Timezone conversion is done at display time.
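
As an illustration, the same instant can be formatted for several zones at display time; a sketch using Intl.DateTimeFormat with standard IANA zone names (exact output strings vary by engine and locale):

```js
const d = new Date(1700000000 * 1000); // one fixed instant: 2023-11-14T22:13:20Z

// The formatter, not the timestamp, decides the timezone
const inZone = (tz) =>
  new Intl.DateTimeFormat('en-GB', {
    timeZone: tz,
    dateStyle: 'short',
    timeStyle: 'medium',
  }).format(d);

console.log(inZone('UTC'));              // e.g. "14/11/2023, 22:13:20"
console.log(inZone('America/New_York')); // e.g. "14/11/2023, 17:13:20"
console.log(inZone('Asia/Tokyo'));       // e.g. "15/11/2023, 07:13:20"
```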

Why are some APIs in milliseconds?

JavaScript popularised millisecond precision. Always check the API docs. As a rough check for present-day values, count the digits: 10 digits means seconds, 13 digits means milliseconds.
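
The same idea can be expressed as a magnitude check instead of digit counting. A sketch with a hypothetical toSeconds helper (the threshold is an assumption: values at or above 10^12 are implausible as seconds, since that would mean roughly the year 33,600):

```js
// Hypothetical normaliser: treat implausibly large values as milliseconds.
// Note: this heuristic only makes sense for present-era, non-negative dates.
function toSeconds(ts) {
  const MS_THRESHOLD = 1e12; // 1e12 ms is 2001; 1e12 s is roughly the year 33,600
  return ts >= MS_THRESHOLD ? Math.floor(ts / 1000) : ts;
}

console.log(toSeconds(1700000000));    // 1700000000 (already seconds)
console.log(toSeconds(1700000000000)); // 1700000000 (milliseconds, converted)
```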