Unix Timestamp Converter

Convert any Unix timestamp to a readable date, or any date to its Unix timestamp. Includes key timestamp references, Year 2038 explanation, and code examples.


Tips & Notes

  • Get the current Unix timestamp in code: Python: int(time.time()) | JavaScript: Math.floor(Date.now()/1000) | PHP: time() | MySQL: UNIX_TIMESTAMP() | Bash: date +%s.
  • Unix timestamps are always in UTC — they contain no time zone information. Convert to local time at the display layer only; always store and transmit in UTC.
  • JavaScript Date.now() returns milliseconds (13 digits), not seconds (10 digits). Divide by 1000 before comparing with second-based APIs or storing in databases.
  • Use BIGINT (64-bit integer) for timestamp columns in databases, not INT (32-bit) — 32-bit columns overflow on January 19, 2038.
  • To calculate elapsed seconds between two events: simply subtract the earlier timestamp from the later one. No date parsing required — this is the primary advantage of Unix timestamps.
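The elapsed-time tip above can be sketched in Python; the event timestamps are illustrative:

```python
# Two hypothetical event timestamps, in seconds since the Unix epoch.
login_ts = 1700000000    # Nov 14, 2023, 22:13:20 UTC
logout_ts = 1700003600   # one hour later

# Elapsed time is plain integer subtraction -- no date parsing needed.
elapsed_seconds = logout_ts - login_ts
print(elapsed_seconds)        # 3600
print(elapsed_seconds // 60)  # 60 minutes
```

This only works when both values are in the same unit; see the mistakes below.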

Common Mistakes

  • Confusing second timestamps (10 digits) with millisecond timestamps (13 digits) — a millisecond timestamp interpreted as seconds gives a date tens of thousands of years in the future.
  • Adding a time zone offset to a Unix timestamp — timestamps are already UTC. Apply the zone offset at display time only; modifying the stored timestamp breaks its universal meaning.
  • Storing timestamps in 32-bit INT database columns — these overflow on January 19, 2038. Always use BIGINT for new systems.
  • Generating timestamps with Date.now() and comparing directly to a seconds-based API — Date.now() returns milliseconds; divide by 1000 before any comparison.
  • Assuming two timestamps are directly subtractable when one is in seconds and the other in milliseconds — always normalize to the same unit before arithmetic.
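A defensive way to avoid the unit-mixing mistakes above is to normalize before arithmetic. Here `to_seconds` is a hypothetical helper, and the digit-count heuristic is a sketch, not a substitute for knowing the unit up front:

```python
def to_seconds(ts: int) -> int:
    """Normalize a timestamp to seconds, guessing the unit by digit count.

    Heuristic: values of 13+ digits are treated as milliseconds,
    10-digit values as seconds. Real code should know its unit explicitly.
    """
    if ts >= 10**12:       # 13 or more digits -> milliseconds
        return ts // 1000
    return ts              # assume seconds

# Both values now refer to the same instant and subtract safely.
a = to_seconds(1700000000000)  # milliseconds
b = to_seconds(1700000000)     # seconds
print(a - b)                   # 0
```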

Unix Timestamp Converter Overview

A Unix timestamp is the number of seconds elapsed since January 1, 1970, 00:00:00 UTC — the universal reference point used by virtually every computer system, database, and API in the world. This single integer converts to any human-readable date and time instantly, making it the most portable and unambiguous format for storing and transmitting timestamps across systems and time zones.

Converting a Unix timestamp to a human date:

Human Date = Unix Epoch (Jan 1, 1970 UTC) + Timestamp Seconds
Example: Timestamp 1700000000 → add 1.7 billion seconds to Jan 1, 1970 → November 14, 2023, 22:13:20 UTC
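
The formula can be reproduced directly in Python, using the 1700000000 example above:

```python
from datetime import datetime, timedelta, timezone

# Human Date = Unix Epoch + Timestamp Seconds
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
ts = 1700000000
human = epoch + timedelta(seconds=ts)
print(human.isoformat())  # 2023-11-14T22:13:20+00:00
```
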
Key Unix timestamps — reference points:

Unix Timestamp    Human Date (UTC)               Significance
0                 January 1, 1970, 00:00:00      The Epoch (origin of Unix time)
1,000,000,000     September 9, 2001, 01:46:40    1 billion seconds since the Epoch
1,234,567,890     February 13, 2009, 23:31:30    Memorable timestamp celebrated by programmers
1,700,000,000     November 14, 2023, 22:13:20    Recent milestone
2,147,483,647     January 19, 2038, 03:14:07     Maximum 32-bit signed integer (Year 2038 Problem)
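
The reference rows above can be cross-checked against Python's own converter (a quick verification sketch):

```python
from datetime import datetime, timezone

# Each key timestamp and the UTC date it should convert to.
refs = {
    0: "1970-01-01 00:00:00",
    1_000_000_000: "2001-09-09 01:46:40",
    1_234_567_890: "2009-02-13 23:31:30",
    2_147_483_647: "2038-01-19 03:14:07",
}
for ts, expected in refs.items():
    got = datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
    assert got == expected, (ts, got)
print("all reference timestamps verified")
```
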
Common timestamp intervals in seconds:

Duration    Seconds        Common Use
1 minute    60             Session timeout
1 hour      3,600          Token expiry
1 day       86,400         Cache duration
1 week      604,800        Cookie expiry
30 days     2,592,000      Subscription period
1 year      31,536,000     Certificate validity

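The intervals above are all multiples of a few base constants, which makes expiry arithmetic trivial. A sketch of computing an expiry time; the cookie example is illustrative:

```python
import time

DAY = 86_400
WEEK = 7 * DAY

now = int(time.time())
cookie_expires = now + WEEK        # cookie valid for one week from now
print(cookie_expires - now)        # 604800

# The table's larger intervals derive from DAY directly.
print(30 * DAY)                    # 2592000  (30-day subscription period)
print(365 * DAY)                   # 31536000 (non-leap year)
```
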
The Year 2038 Problem is the timestamp equivalent of Y2K: 32-bit systems store Unix timestamps as signed integers, which overflow at 2,147,483,647 on January 19, 2038. Systems still using 32-bit time representation will misinterpret subsequent dates as December 13, 1901. Modern 64-bit systems extend the valid range to approximately year 292 billion — far beyond any practical concern for current software development.
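
The overflow can be simulated in Python. Here `wrap_int32` is a helper defined only for this sketch, mimicking what a signed 32-bit column would store; note that `fromtimestamp` with negative values can fail on some platforms (e.g., Windows):

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1              # 2,147,483,647

def wrap_int32(ts: int) -> int:
    """Simulate signed 32-bit wraparound (what an INT column would store)."""
    return (ts + 2**31) % 2**32 - 2**31

last_ok = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(last_ok)                     # 2038-01-19 03:14:07+00:00

overflowed = wrap_int32(INT32_MAX + 1)
print(overflowed)                  # -2147483648
# May raise on platforms without negative-timestamp support:
print(datetime.fromtimestamp(overflowed, tz=timezone.utc))  # 1901-12-13 20:45:52+00:00
```
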

Frequently Asked Questions

What is a Unix timestamp?

A Unix timestamp is the number of seconds elapsed since January 1, 1970, 00:00:00 UTC — the Unix epoch. For example, timestamp 1700000000 represents November 14, 2023, 22:13:20 UTC. It is used universally in computing because a single integer unambiguously represents any moment in time regardless of time zones, daylight saving rules, or calendar conventions. Every mainstream computer system, database, and programming language supports Unix timestamps natively.

How do I convert a Unix timestamp to a date?

Check the digit count first: 10-digit timestamps are in seconds (e.g., 1700000000); 13-digit timestamps are in milliseconds (e.g., 1700000000000). Divide milliseconds by 1000 to get seconds. Then: seconds since epoch ÷ 86,400 = days since January 1, 1970. Add those days to the epoch date, accounting for leap years. In code: Python uses datetime.fromtimestamp(ts, tz=timezone.utc) (the older utcfromtimestamp is deprecated); JavaScript uses new Date(ts * 1000).toISOString().
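The divide-by-86,400 step can be seen directly with integer arithmetic. The remainder yields the time of day only because the epoch begins exactly at midnight UTC:

```python
ts = 1700000000
days, rem = divmod(ts, 86_400)     # full days since Jan 1, 1970
hours, rem = divmod(rem, 3_600)
minutes, seconds = divmod(rem, 60)
print(days)                        # 19675
print(hours, minutes, seconds)     # 22 13 20  -> 22:13:20 UTC
```
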

What is the difference between second and millisecond timestamps?

Unix timestamps in seconds have 10 digits (e.g., 1700000000). Millisecond timestamps have 13 digits (e.g., 1700000000000). JavaScript uses milliseconds by default; most other languages use seconds. Treating a millisecond timestamp as seconds gives a date roughly 54,000 years in the future (year 2023 becomes roughly year 55,800). Always identify the unit by digit count before converting. Divide by 1000 to convert milliseconds to seconds.

Why does Unix time start on January 1, 1970?

January 1, 1970 was chosen because Unix was developed in the late 1960s and early 1970s, and developers needed a recent, round reference point that predated their work. It was a practical choice, not mathematically special. Other systems use different epochs: Windows FILETIME counts from January 1, 1601; GPS time counts from January 6, 1980; Apple Core Data uses January 1, 2001. Always verify the epoch when working across different systems.

What is the Year 2038 problem?

32-bit signed Unix timestamps can store values up to 2,147,483,647 seconds, which corresponds to January 19, 2038, 03:14:07 UTC. One second later, the value overflows to a large negative number, representing December 13, 1901. Systems using 32-bit timestamps for dates will malfunction on that date — a problem similar to Y2K. All modern systems should use 64-bit timestamps, which extend the usable range to approximately year 292 billion.

How do I get the current Unix timestamp?

In Python: import time; int(time.time()). In JavaScript: Math.floor(Date.now() / 1000) for seconds, or Date.now() for milliseconds. In a Unix/Linux terminal: date +%s. In MySQL: SELECT UNIX_TIMESTAMP(). In PostgreSQL: SELECT EXTRACT(EPOCH FROM NOW()). In PHP: time(). All return the current time as seconds since the Unix epoch. The value increments by 1 every second and is identical across all correctly synchronized systems worldwide.
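
As a sanity check, the two common Python approaches above refer to the same clock (a minimal sketch):

```python
import time
from datetime import datetime, timezone

ts = int(time.time())
# The same instant obtained via datetime:
ts2 = int(datetime.now(timezone.utc).timestamp())
# Equal up to the boundary between the two calls.
print(abs(ts - ts2) <= 1)  # True
```
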