Unix Timestamp Converter
What is a Unix Timestamp?
A Unix timestamp (also called Epoch time or POSIX time) is the number of seconds that have elapsed since Thursday, 1 January 1970, 00:00:00 UTC. It is widely used in programming to represent dates and times as a single integer.
What is Unix Time?
Unix time, also called the Unix epoch, POSIX time, or epoch time, is a system for representing points in time as a single integer: the number of seconds that have elapsed since midnight UTC on January 1, 1970. That specific moment is called the Unix epoch or "epoch zero." At epoch zero, the Unix timestamp is exactly 0. Every second that passes, the number increments by one, giving any moment in history a unique, sortable, and timezone-independent numeric identifier.
It is worth noting that Unix time counts seconds, not milliseconds. This trips up many developers because JavaScript's Date.now() returns milliseconds since the epoch, a value roughly 1,000 times larger than the Unix timestamp shown on this page. To convert a JavaScript timestamp to Unix time, divide by 1,000 and floor the result. This distinction matters whenever you store times in a database or pass them between a frontend and a backend, because a millisecond value stored in a Unix timestamp field will be interpreted as a date more than 50,000 years in the future.
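To make the distinction concrete, here is a minimal Python sketch of the conversion, with JavaScript's Date.now() simulated by multiplying the whole-second timestamp by 1,000:

```python
import time

# Standard Unix time: whole seconds since the epoch
seconds = int(time.time())

# Millisecond Unix time, as JavaScript's Date.now() would return it
millis = seconds * 1000

# Converting back: divide by 1,000 and floor (// is floor division in Python)
print(millis // 1000 == seconds)  # True
```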
Why 1970? The Unix operating system was being actively developed at Bell Labs in the late 1960s. When engineers needed a reference date for the time system they were designing, they chose January 1, 1970 as a round, recent, and computationally convenient starting point. The choice was somewhat arbitrary; other systems have used different epochs (Apple's Classic Mac OS used January 1, 1904; Windows FILETIME uses January 1, 1601), but Unix's influence on computing made its epoch the de facto standard for most modern software. Need to convert between time zones for a timestamp? Use the Time Zone Converter.
Unix Timestamp Milestones
Notable timestamps throughout the history and future of Unix time.
| Unix Timestamp | Date (UTC) | Significance |
|---|---|---|
| 0 | 1970-01-01 00:00:00 | The Unix epoch, the origin of Unix time |
| 1,000,000,000 | 2001-09-09 01:46:40 | The "Unix Billennium": one billion seconds since the epoch, celebrated by developers worldwide |
| 1,234,567,890 | 2009-02-13 23:31:30 | A notable sequential timestamp: 1,234,567,890 seconds since the epoch |
| 1,500,000,000 | 2017-07-14 02:40:00 | 1.5 billion seconds since the epoch |
| 2,000,000,000 | 2033-05-18 03:33:20 | Two billion seconds since the epoch, still in the safe range for 32-bit systems |
| 2,147,483,647 | 2038-01-19 03:14:07 | Maximum value of a signed 32-bit integer, the "Year 2038 Problem" overflow point |
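The table can be reproduced in a few lines of Python; this sketch simply prints each milestone timestamp alongside its UTC date:

```python
from datetime import datetime, timezone

# Map each milestone timestamp to its UTC date, matching the table above
for ts in (0, 1_000_000_000, 1_234_567_890, 1_500_000_000,
           2_000_000_000, 2_147_483_647):
    print(f"{ts:>13,}  {datetime.fromtimestamp(ts, tz=timezone.utc)}")
```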
The Year 2038 Problem
Many older systems and embedded devices store Unix timestamps as a 32-bit signed integer, which can hold values from -2,147,483,648 to +2,147,483,647. Unix time will reach exactly 2,147,483,647 at 03:14:07 UTC on January 19, 2038. One second later, the value will overflow, wrapping from the maximum positive value to the minimum negative value, which corresponds to December 13, 1901. Systems that have not been updated to use 64-bit timestamps may interpret any date past that moment as a date more than 136 years in the past, causing catastrophic errors in logging, scheduling, financial calculations, and security certificate validation.
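The wraparound is easy to simulate. In this Python sketch, wrap_int32 is an illustrative helper (not a library function) that folds an integer into signed 32-bit range, showing where one second past the limit lands:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # 2,147,483,647

def wrap_int32(ts: int) -> int:
    """Illustrative helper: fold an integer into signed 32-bit range."""
    return (ts + 2**31) % 2**32 - 2**31

# The last representable second, and the second after it overflows
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00
print(datetime.fromtimestamp(wrap_int32(INT32_MAX + 1), tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```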
The good news is that modern operating systems and programming languages have largely addressed this by moving to 64-bit timestamps, which can represent values up to approximately 9.2 quintillion seconds. That ceiling corresponds to a date roughly 292 billion years from now, safely beyond any practical concern. The risk remains in legacy embedded systems, older databases with 32-bit timestamp columns, and some network protocols that encode time in fixed-width 32-bit fields. The problem is analogous to the Y2K bug, but the consequences of ignoring it are significantly more severe for any system still relying on 32-bit time representations in 2038. Use the Date Calculator to find the exact number of days until January 19, 2038, or set a countdown to that date.
Unix Time in Different Languages
Every major programming language and database system provides a built-in way to get the current Unix timestamp. Here is the canonical one-liner in each.
| Language | One-liner | Notes |
|---|---|---|
| Python | `import time; time.time()` | Returns a float (seconds with a fractional part) |
| JavaScript | `Math.floor(Date.now() / 1000)` | `Date.now()` returns milliseconds; divide by 1,000 |
| PHP | `time()` | Returns an integer number of seconds |
| MySQL | `SELECT UNIX_TIMESTAMP();` | Works in SELECT, INSERT, and WHERE clauses |
| Shell (Linux/macOS) | `date +%s` | Uses the system `date` command with the epoch format specifier |
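Going the other direction, turning a timestamp back into a human-readable date, is just as short. A minimal Python sketch using only the standard library:

```python
from datetime import datetime, timezone

ts = 1_500_000_000  # any Unix timestamp
print(datetime.fromtimestamp(ts, tz=timezone.utc))
# 2017-07-14 02:40:00+00:00
```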
Frequently Asked Questions
What is the difference between Unix time and a Unix timestamp?
The terms are used interchangeably in most contexts. Technically, "Unix time" refers to the scale itself, the number of seconds since the epoch, while "Unix timestamp" refers to a specific value on that scale representing a particular moment. In practice, developers use both phrases to mean the same thing: the integer count of seconds since January 1, 1970. Use the Date Calculator to find the number of days between any two dates on this scale.
Why doesn't Unix time account for leap seconds?
Leap seconds are added to UTC periodically to account for irregularities in the Earth's rotation. Unix time, by definition, treats every day as exactly 86,400 seconds and ignores leap seconds entirely: when a leap second occurs, the Unix clock effectively repeats a second. As a result, Unix time is not a strict count of elapsed SI seconds; the true number of seconds since the epoch exceeds the timestamp by the number of leap seconds that have been added (27 as of 2024). For most applications this is irrelevant, but for precision scientific or telecommunications systems, it matters.
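A quick Python check makes the 86,400-second rule visible: a leap second was inserted at the end of 2016-12-31, yet on the Unix scale that day is still exactly 86,400 seconds long.

```python
from datetime import datetime, timezone

# Unix time treats every day as exactly 86,400 seconds, so the leap second
# inserted at 2016-12-31 23:59:60 UTC does not exist on this scale
before = datetime(2016, 12, 31, tzinfo=timezone.utc).timestamp()
after = datetime(2017, 1, 1, tzinfo=timezone.utc).timestamp()
print(after - before)  # 86400.0
```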
What is millisecond Unix time?
Millisecond Unix time is the number of milliseconds since the Unix epoch, rather than seconds. It is the format returned by JavaScript's Date.now() and is widely used in APIs and event logging where second-level precision is insufficient. To convert millisecond Unix time to standard Unix time, divide by 1,000 and take the floor. To go the other way, multiply Unix time by 1,000.
How do I convert Unix time in Excel?
Excel stores dates as the number of days since January 0, 1900. To convert a Unix timestamp in cell A1 to an Excel date, use the formula =(A1/86400)+DATE(1970,1,1), then format the result cell as a date. If the value includes time, format it as "yyyy-mm-dd hh:mm:ss". For example, with 1,700,000,000 in A1 the formula yields 2023-11-14 22:13:20. Note that Excel incorrectly treats 1900 as a leap year (a bug inherited from Lotus 1-2-3), so dates before March 1, 1900 may be off by one day; Unix timestamps, all of which fall after 1970, are unaffected.