Use this Unix timestamp converter to convert epoch timestamps into human-readable dates. You can also convert dates back into Unix time in seconds or milliseconds. The tool shows UTC, local time, and ISO 8601 formats.
How to use the Unix timestamp converter
Enter a Unix timestamp in seconds or milliseconds and click Convert timestamp. The tool will display the corresponding date in UTC, local time, and ISO 8601 format. You can also enter a readable date and convert it back into a Unix timestamp.
What is a Unix timestamp?
A Unix timestamp, also called epoch time, represents the number of seconds that have passed since January 1, 1970 at 00:00:00 UTC. It is widely used in operating systems, programming languages, databases, and APIs to represent time in a standardized numeric format.
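As a quick sketch, the current Unix timestamp can be read in Python with the standard library (the integer form is the familiar 10-digit number):

```python
import time

# Current Unix timestamp: seconds since 1970-01-01 00:00:00 UTC.
# time.time() returns a float; truncate to int for the usual 10-digit form.
now_seconds = int(time.time())
print(now_seconds)
```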
Examples
Example Unix timestamps:
1700000000 → Tue, 14 Nov 2023 22:13:20 GMT
1710000000 → Sat, 09 Mar 2024 16:00:00 GMT
Frequently asked questions
A 13-digit Unix timestamp is measured in milliseconds. The standard Unix timestamp counts seconds since January 1, 1970 at 00:00:00 UTC and produces a 10-digit number.
Multiplying by 1,000 converts seconds to milliseconds, adding three more digits. Many modern environments use milliseconds by default:
- JavaScript — Date.now() returns milliseconds
- Java — System.currentTimeMillis() returns milliseconds
- Many REST APIs — return timestamps in milliseconds
Linux, Python, and most Unix tools use seconds by default. This tool detects both formats automatically.
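A digit-count heuristic like the one below is one plausible way such detection can work; this is an illustrative sketch, not the tool's actual code:

```python
def parse_timestamp(value: int) -> float:
    """Return the timestamp in seconds, guessing the unit by magnitude.

    Heuristic (an assumption for illustration): values of 13 or more
    digits are treated as milliseconds, shorter values as seconds.
    """
    if abs(value) >= 10**12:  # 13+ digits: assume milliseconds
        return value / 1000
    return float(value)

print(parse_timestamp(1710000000))     # seconds, unchanged
print(parse_timestamp(1710000000000))  # milliseconds, scaled down
```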
The difference is precision and scale:
- Seconds (10 digits) — example: 1710000000
- Milliseconds (13 digits) — example: 1710000000000
To convert between them:
- Seconds → Milliseconds: multiply by 1000
- Milliseconds → Seconds: divide by 1000
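The two conversions above are plain arithmetic, for example:

```python
seconds = 1710000000
millis = seconds * 1000   # seconds → milliseconds (adds three digits)
back = millis // 1000     # milliseconds → seconds (integer division)
print(millis, back)
```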
Use seconds when working with Linux, shell scripts, Python (time.time()), or databases. Use milliseconds when working with JavaScript, browser APIs, or high-precision timing.
Enter the Unix timestamp in the Unix timestamp field above and click Convert timestamp. The tool displays the result in three formats:
- UTC — Coordinated Universal Time, timezone-independent
- Local — your browser’s local timezone
- ISO 8601 — standard machine-readable format used in APIs and logs
Both 10-digit (seconds) and 13-digit (milliseconds) timestamps are detected automatically.
Use the Readable date and time field above. Select or type a date and time, then click Convert date. The tool returns the corresponding Unix timestamp in both seconds and milliseconds.
The date input uses your local browser time. Click Now to instantly convert the current moment to a Unix timestamp.
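The reverse conversion, from a readable date to a Unix timestamp in both units, looks like this in Python (using a UTC date for a reproducible result):

```python
from datetime import datetime, timezone

# A readable date and time, pinned to UTC for illustration
dt = datetime(2024, 3, 9, 16, 0, 0, tzinfo=timezone.utc)

seconds = int(dt.timestamp())  # Unix timestamp in seconds
millis = seconds * 1000        # and in milliseconds
print(seconds, millis)         # 1710000000 1710000000000
```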
The Unix epoch is the reference point for all Unix timestamps: January 1, 1970 at 00:00:00 UTC.
A Unix timestamp simply counts how many seconds (or milliseconds) have passed since that moment. This makes timestamps timezone-independent and easy to compare, sort, and store in databases.
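By definition, the epoch itself converts to timestamp zero, which a one-line check confirms:

```python
from datetime import datetime, timezone

# The Unix epoch: January 1, 1970 at 00:00:00 UTC
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
print(epoch.timestamp())  # 0.0
```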
The epoch was chosen when Unix was developed in the early 1970s and is now the universal standard across operating systems, programming languages, and network protocols.
ISO 8601 is an international standard for representing dates and times as strings. Example: 2024-03-09T16:00:00Z
Format breakdown: YYYY-MM-DDTHH:MM:SSZ
- T separates the date from the time
- Z means UTC (zero offset)
ISO 8601 is widely used in APIs, log files, and databases because it is unambiguous, sortable, and machine-readable.
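An ISO 8601 string maps back to a Unix timestamp with the standard library; note that datetime.fromisoformat only accepts a trailing "Z" on Python 3.11+, so older versions need it replaced with an explicit offset first:

```python
from datetime import datetime

iso = "2024-03-09T16:00:00Z"
# Replace "Z" with "+00:00" for compatibility with Python < 3.11
dt = datetime.fromisoformat(iso.replace("Z", "+00:00"))
print(int(dt.timestamp()))  # 1710000000
```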