Timestamp Converter — Unix Epoch to Date Online Tool

Convert Unix timestamps to human-readable dates and vice versa. Supports UTC and local time. Free, secure, and runs entirely in your browser.

How to use Timestamp Converter

To convert a Unix timestamp to a human-readable date, paste the epoch value (in seconds or milliseconds) into the Timestamp field and click Convert. The tool automatically detects whether the value is in seconds (10 digits) or milliseconds (13 digits) and displays the equivalent date and time in both UTC and your local timezone. Common sources of Unix timestamps include server logs, database records, API responses, JWT tokens, and JavaScript's Date.now().
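
If you prefer to do the same conversion in code, here is a minimal TypeScript sketch of the timestamp-to-date direction. The helper name epochToDate and the threshold heuristic are illustrative assumptions, not this tool's actual implementation:

```ts
// Convert a Unix timestamp (seconds or milliseconds) to a Date.
// Heuristic: values of 1e12 or more (13+ digits) are treated as milliseconds,
// smaller ones as seconds.
function epochToDate(value: number): Date {
  const millis = Math.abs(value) >= 1e12 ? value : value * 1000;
  return new Date(millis);
}

const d = epochToDate(1708300800); // 10 digits, detected as seconds
console.log(d.toISOString());      // 2024-02-19T00:00:00.000Z (UTC)
console.log(d.toString());         // the same instant in your local timezone
```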

To convert a date back to a Unix timestamp, switch to Date to Timestamp mode, enter or pick the date and time using the date picker, choose UTC or local time, and click Convert. The corresponding Unix timestamp in both seconds and milliseconds is displayed below. This is useful for constructing API queries with time filters, setting expiry values, or debugging time-based conditions in code.
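
For the reverse direction, a short sketch of the underlying JavaScript date math (example values only, not the tool's own code):

```ts
// Build a timestamp from calendar fields interpreted as UTC.
// Date.UTC returns milliseconds since the epoch; months are 0-based (1 = February).
const millis = Date.UTC(2024, 1, 19, 0, 0, 0);
const seconds = Math.floor(millis / 1000);

console.log(seconds); // 1708300800
console.log(millis);  // 1708300800000
```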

Why use our Timestamp Converter?

  • Auto-detects seconds vs. milliseconds — no need to manually specify the unit
  • Shows both UTC and your local timezone simultaneously, so no mental timezone arithmetic is needed
  • Bidirectional — convert timestamp to date and date to timestamp in the same tool
  • Displays milliseconds and seconds output side-by-side
  • 100% browser-based — values are never sent to any server

Frequently Asked Questions

What is a Unix timestamp?

A Unix timestamp (also called epoch time or POSIX time) is the number of seconds that have elapsed since 00:00:00 UTC on January 1, 1970, known as the Unix epoch. It is the universal standard for representing moments in time in computer systems, because it is timezone-agnostic (always UTC), mathematically simple to compare and subtract, and compact enough to store as a single integer. Modern systems commonly use millisecond-precision timestamps (multiply seconds by 1000) for finer granularity.
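
A quick illustration of both properties, using JavaScript's millisecond-based Date (the values below are examples):

```ts
// The epoch itself is timestamp 0.
console.log(new Date(0).toISOString()); // 1970-01-01T00:00:00.000Z

// Timestamps are plain integers, so comparison and subtraction are simple arithmetic.
const start = 1708300800; // 2024-02-19T00:00:00Z, in seconds
const end = 1708304400;   // exactly one hour later
console.log(end - start); // 3600 seconds
```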

How do I know if my timestamp is in seconds or milliseconds?

A 10-digit number (like 1708300800) is almost certainly in seconds and represents a date between 2001 and 2286. A 13-digit number (like 1708300800000) is almost certainly in milliseconds and covers the same date range. A seconds value shorter than 10 digits represents a date before September 2001, and numbers longer than 13 digits are likely microseconds or nanoseconds. This tool auto-detects seconds vs. milliseconds based on the digit count, but you can override the detection manually if needed.
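
The same digit-count heuristic can be sketched in a few lines of TypeScript. The function name and labels are illustrative, not part of this tool's API:

```ts
// Guess the unit of an epoch value from its digit count.
function detectUnit(value: number): "seconds" | "milliseconds" | "unknown" {
  const digits = Math.trunc(Math.abs(value)).toString().length;
  if (digits === 10) return "seconds";
  if (digits === 13) return "milliseconds";
  return "unknown"; // shorter: early dates; longer: micro- or nanoseconds
}

console.log(detectUnit(1708300800));    // "seconds"
console.log(detectUnit(1708300800000)); // "milliseconds"
```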

What is the Unix timestamp for right now?

You can get the current Unix timestamp in any programming environment:

  • JavaScript: Date.now() (milliseconds) or Math.floor(Date.now() / 1000) (seconds)
  • Python: import time; time.time() (fractional seconds) or int(time.time()) (whole seconds)
  • Linux/macOS terminal: date +%s
  • SQL: UNIX_TIMESTAMP() (MySQL) or EXTRACT(EPOCH FROM NOW()) (PostgreSQL)

Because Unix time simply counts elapsed seconds since the epoch, the current timestamp keeps increasing: a later moment always has a larger value.
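
In a browser console or Node.js, the JavaScript variant looks like this (a minimal sketch, not the tool's own code):

```ts
// Current Unix timestamp in both units.
const nowMs = Date.now();                // 13-digit milliseconds, e.g. 1708300800123
const nowSec = Math.floor(nowMs / 1000); // 10-digit seconds, e.g. 1708300800
console.log(nowSec, nowMs);
```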

What is the difference between UTC and local time?

UTC (Coordinated Universal Time) is the global time standard — it has no timezone offset and does not observe daylight saving time. All Unix timestamps are stored in UTC by definition. Local time is UTC adjusted by your timezone's offset. For example, UTC+7 (Thailand Standard Time) is 7 hours ahead of UTC, so midnight UTC is 7:00 AM local time. When converting timestamps for display in an application, always store internally as UTC and convert to local time only at the display layer.
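
A sketch of that store-as-UTC, display-as-local pattern in TypeScript. The "Asia/Bangkok" timezone and "en-GB" locale are just examples, and the exact formatting of toLocaleString varies by runtime:

```ts
// One instant, rendered in UTC and in a specific timezone.
const instant = new Date(1708300800 * 1000); // 2024-02-19T00:00:00Z

console.log(instant.toISOString()); // 2024-02-19T00:00:00.000Z (UTC)
console.log(instant.toLocaleString("en-GB", { timeZone: "Asia/Bangkok" }));
// e.g. 19/02/2024, 07:00:00 (UTC+7, so 7 hours ahead of UTC)
```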