
Timestamp Converter

Unix timestamp ↔ human-readable date/time


About this tool

A Unix timestamp is the number of seconds (or milliseconds) elapsed since 1970-01-01 00:00:00 UTC. It's the lingua franca of timestamps in databases, log files, APIs, and programming languages — easy to compare, easy to sort, time-zone agnostic.

This tool converts Unix timestamps to human-readable date/time and back, in both seconds and milliseconds. All conversion happens in your browser using the native Date API.
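As a minimal sketch of that API (not this tool's actual source), reading the current timestamp in both units takes one call:

```javascript
// Current Unix timestamp in both units, using only the built-in Date API.
const ms = Date.now();           // milliseconds since 1970-01-01T00:00:00Z
const s = Math.floor(ms / 1000); // truncate to whole seconds

console.log(s, ms);
```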

How to use

Timestamp → date

  1. Paste a numeric timestamp.
  2. Pick Seconds or Milliseconds depending on the source (10 digits ≈ seconds, 13 digits ≈ milliseconds).
  3. Read the local time, UTC time, and ISO 8601 representations side-by-side.
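The three representations in step 3 map directly onto Date methods; a sketch using the same `1700000000` value as the examples below:

```javascript
const d = new Date(1700000000 * 1000); // seconds in, Date wants milliseconds

console.log(d.toString());    // local time, in your browser's time zone
console.log(d.toUTCString()); // "Tue, 14 Nov 2023 22:13:20 GMT"
console.log(d.toISOString()); // "2023-11-14T22:13:20.000Z"
```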

Date → timestamp

  1. Pick a date and time, or click Now to use the current moment.
  2. Both seconds and milliseconds outputs are shown — copy whichever your system expects.
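In code, the reverse direction is a single method call; a sketch assuming the input is an ISO 8601 string:

```javascript
const when = new Date("2023-11-14T22:13:20Z"); // trailing Z means parse as UTC

console.log(when.getTime());                    // 1700000000000 (milliseconds)
console.log(Math.floor(when.getTime() / 1000)); // 1700000000 (seconds)
```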

Examples

Seconds → date

Input
1700000000
Output
Local: 2023-11-15 06:13:20 (UTC+8 in this example)
UTC:   2023-11-14 22:13:20
ISO:   2023-11-14T22:13:20.000Z

Milliseconds → date

Input
1700000000000
Output
Local: 2023-11-15 06:13:20.000 (UTC+8 in this example)
UTC:   2023-11-14 22:13:20.000
ISO:   2023-11-14T22:13:20.000Z

Frequently asked questions

Seconds or milliseconds — how do I tell?

By length. For any date between September 2001 and the year 2286, the timestamp has 10 digits in seconds and 13 digits in milliseconds, so a 10-digit number is almost certainly seconds and a 13-digit one milliseconds. JavaScript's Date uses milliseconds; most Unix CLIs and Linux APIs use seconds.
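That digit-count rule is easy to automate. `toMillis` below is a hypothetical helper, not part of this tool:

```javascript
// Guess the unit from the digit count: ~13 digits means already milliseconds,
// otherwise treat the input as seconds and scale up.
function toMillis(raw) {
  const digits = String(raw).replace("-", "").length;
  return digits >= 13 ? Number(raw) : Number(raw) * 1000;
}

console.log(new Date(toMillis("1700000000")).toISOString());    // seconds input
console.log(new Date(toMillis("1700000000000")).toISOString()); // milliseconds input
// both print "2023-11-14T22:13:20.000Z"
```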

What's the Y2K38 problem?

On 2038-01-19 03:14:07 UTC, a 32-bit signed integer used to store seconds-since-epoch overflows. Modern 64-bit systems and JavaScript (which uses double-precision floats) aren't affected, but legacy embedded systems and old database columns may need migration.
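The boundary itself is easy to check, since JavaScript numbers go far beyond 32 bits:

```javascript
const max32 = 2 ** 31 - 1; // largest 32-bit signed integer: 2147483647

console.log(new Date(max32 * 1000).toISOString());
// "2038-01-19T03:14:07.000Z", the last second a 32-bit time_t can hold
```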

How are time zones handled?

Unix timestamps are always UTC by definition — no time zone is encoded. The 'Local' column above is rendered using your browser's time zone for convenience. The actual instant is the same everywhere.
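You can see this with `toLocaleString` and an explicit time zone: one instant, two wall clocks. The example outputs above happen to match UTC+8, e.g. Asia/Shanghai:

```javascript
const d = new Date(1700000000 * 1000);

// Same instant, rendered for two different zones.
console.log(d.toLocaleString("en-GB", { timeZone: "UTC" }));
console.log(d.toLocaleString("en-GB", { timeZone: "Asia/Shanghai" }));
```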

Does it support negative timestamps?

Yes — values before 1970 are negative. -86400 means one day before the epoch (1969-12-31 UTC). Useful for historical date math.
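A sketch: the Date API accepts negative milliseconds directly.

```javascript
// One day (86400 s) before the epoch.
console.log(new Date(-86400 * 1000).toISOString()); // "1969-12-31T00:00:00.000Z"
```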

Why does the local time look wrong?

Almost always a time-zone misunderstanding. Check your system clock and time zone setting. The ISO and UTC columns are the source of truth — Local is just your browser's interpretation.