URL Validation Report

What is the URL Validator?

Paste a URL and get a structured report back: is it well-formed, what does each component look like, and what should you watch out for. The validator runs the URL through the browser's native URL constructor — which implements the WHATWG URL Standard — and then layers per-component checks on top.

A "valid URL" isn't just one that parses. A URL like http://10.0.0.5/admin?token=abc123 parses fine, but it has three things you probably want flagged: http:// instead of https://, an IP literal as the host, and a token in the query string. The validator surfaces all three as warnings, separate from the pass/fail check. The underlying syntax rules come from RFC 3986; the host-naming concerns come from RFC 1034 and operational experience.
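The layering described above can be sketched in a few lines: parse with the native URL constructor, then run advisory checks on the components. This is an illustrative sketch, not the tool's actual code, and the warning strings are made up — the report's real wording may differ.

```javascript
// Sketch: parse first, then layer advisory warnings on top of the parse result.
function validateUrl(input) {
  let url;
  try {
    url = new URL(input); // WHATWG URL parser; throws on malformed input
  } catch {
    return { isValid: false, warnings: [] };
  }
  const warnings = [];
  if (url.protocol === "http:") warnings.push("insecure protocol (http)");
  if (/^\d{1,3}(\.\d{1,3}){3}$/.test(url.hostname))
    warnings.push("IP literal as host");
  if ([...url.searchParams.keys()].some((k) => /token|secret|key/i.test(k)))
    warnings.push("possible credential in query string");
  return { isValid: true, warnings };
}

// The example from the text parses fine but collects all three warnings:
console.log(validateUrl("http://10.0.0.5/admin?token=abc123"));
```

The key point is the separation: a thrown constructor error means "not a URL at all," while everything after the try/catch is advisory.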

Output is JSON, so you can pipe it into a CI script or a debug log. Everything runs in your browser — the URL never leaves your machine. If you want the URL broken into components without the validation layer, use the URL Parser instead. Internationalised domain names are handled per IDNA rules.

How to Use the URL Validator

Three steps. Each one matches a button on this page.

1

Paste a URL or Load the Sample

Drop a URL into the left panel. Click Sample to load a clean, well-formed URL with percent-encoded query parameters:

https://api.shop.example.com/v1/orders?customer=Ava%20Chen&status=active

The validator updates as you type. Try edge cases: http:// URLs, IP-literal hosts, URLs with credentials, single-label hosts (no TLD), Punycode domains. Each surfaces a different warning.
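The sample's percent-encoded parameters decode through the same constructor the validator uses; a quick sketch of what the components look like once parsed:

```javascript
// The sample URL from above, run through the native URL constructor.
const sample = new URL(
  "https://api.shop.example.com/v1/orders?customer=Ava%20Chen&status=active"
);

console.log(sample.searchParams.get("customer")); // "Ava Chen" — %20 decodes to a space
console.log(sample.searchParams.get("status"));   // "active"
```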

2

Read the Report

The right panel shows a JSON report with three top-level fields: isValid (the URL parsed at all), checks (per-component status — protocol, hostname, port, pathname), and warnings (advisory issues that aren't syntax errors but you probably care about).
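The three top-level field names come from the description above; the check entries and values in this sketch are illustrative, so the tool's actual report may use different statuses or wording.

```javascript
// Illustrative shape of the report: isValid, checks, warnings.
const report = {
  isValid: true,
  checks: {
    protocol: "ok", // e.g. https:
    hostname: "ok",
    port: "ok",
    pathname: "ok",
  },
  warnings: [], // advisory issues, empty for a clean URL
};

console.log(JSON.stringify(report, null, 2));
```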

3

Copy or Download

Click Copy to send the JSON report to your clipboard, or Download to save it as .json. Minify compacts the report onto one line if you need it for a log entry.
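If you are post-processing the report in your own scripts, the Minify behaviour is equivalent to serialising without an indent argument — a small sketch, assuming a report object like the one the tool produces:

```javascript
const report = { isValid: true, checks: { protocol: "ok" }, warnings: [] };

const pretty = JSON.stringify(report, null, 2); // multi-line, for reading
const minified = JSON.stringify(report);        // one line, for a log entry

console.log(minified); // {"isValid":true,"checks":{"protocol":"ok"},"warnings":[]}
```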

When You'd Actually Use This

Auditing a config file before deploy

Your service config has 40 URLs in it — webhook endpoints, OAuth callbacks, third-party integrations. One has an embedded password from a forgotten test, two are still pointing at http:// staging hosts. Pasting them through the validator one at a time catches all three before they ship. URL-formatting requirements for redirect URIs also appear in the OAuth 2.0 spec.

Reviewing user-submitted URLs in a form

A user submits a "website" field that turns out to be example — no protocol, no TLD, just a word. Or https://192.168.1.5 — looks valid, parses valid, but you almost certainly don't want to render that as a profile link. The validator surfaces both: missing-TLD warning on the first, IP-literal-host warning on the second.
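Both heuristics from this scenario are a few lines once the input has a protocol (a bare word like example won't parse without one, so this sketch assumes the caller prepends https:// first). The warning strings are illustrative:

```javascript
// Two advisory host checks: single-label hostname (no TLD) and IPv4 literal.
function hostWarnings(input) {
  const url = new URL(input);
  const warnings = [];
  if (!url.hostname.includes(".")) warnings.push("single-label host (no TLD)");
  if (/^\d{1,3}(\.\d{1,3}){3}$/.test(url.hostname))
    warnings.push("IP literal as host");
  return warnings;
}

console.log(hostWarnings("https://example"));     // ["single-label host (no TLD)"]
console.log(hostWarnings("https://192.168.1.5")); // ["IP literal as host"]
console.log(hostWarnings("https://example.com")); // []
```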

Diagnosing why a redirect is failing

Your OAuth callback returns 400 with "invalid redirect_uri." The URL looks fine in your browser. Paste it into the validator and you spot it: the path has a literal space in it, and your auth provider compares strings byte-for-byte after canonicalisation. The warning ("path contains unencoded space") was the answer.
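You can see the mismatch directly: the WHATWG constructor percent-encodes the space when it serialises the path, so a byte-for-byte comparison against the raw registered string fails. The URL below is a hypothetical stand-in for the misconfigured callback:

```javascript
// A space typo'd into a registered redirect URI (illustrative URL).
const registered = "https://app.example.com/oauth/call back";

const url = new URL(registered);
console.log(url.pathname); // "/oauth/call%20back" — space encoded on serialisation

// A provider comparing canonicalised strings byte-for-byte sees a mismatch:
console.log(url.href !== registered); // true
```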

Spotting a Punycode-vs-Unicode mismatch

You expected to see münchen.example.com in the report and instead saw xn--mnchen-3ya.example.com. That's the Punycode form — what gets sent over the wire — and the validator flags it so you know the original input had non-ASCII characters. Useful when a user copy-pastes a URL from an IDN domain into a bug report.
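Detecting the Punycode form is a per-label check: any hostname label beginning with xn-- is the ASCII-compatible (ACE) encoding of a Unicode label. A minimal sketch:

```javascript
// A label starting with "xn--" is the Punycode (ACE) form of an IDN label.
function hasPunycodeLabel(hostname) {
  return hostname.split(".").some((label) => label.startsWith("xn--"));
}

// The hostname from the scenario above:
console.log(hasPunycodeLabel("xn--mnchen-3ya.example.com")); // true
console.log(hasPunycodeLabel("api.example.com"));            // false
```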

Common Questions

What does "valid" actually mean here?

Two layers. isValid: true means the browser's URL constructor accepts the input — i.e. the syntax is well-formed per the WHATWG URL Standard. warnings are separate: things that are syntactically valid but probably not what you want (insecure protocol, IP literal, embedded credentials, missing TLD, etc.). A URL can be valid and still have warnings.

Does it check whether the URL actually resolves to something?

No — that would require a network request, and this tool runs entirely in your browser with no outbound calls. The validator checks syntax and surface heuristics only. For reachability checks, use curl -I or a dedicated uptime tool.

Why is http://example.com flagged as a warning?

Because in 2026 a plaintext URL is almost always a mistake — modern browsers warn users before submitting forms over http://, Google's "why HTTPS matters" covers the long version, and HSTS-preloaded domains will refuse to load over HTTP at all. The warning is advisory; if you really mean http:// (legacy intranet, local dev), ignore it.

What about relative URLs like /api/orders?

The URL constructor needs an absolute URL — without a base, it can't determine the protocol or host. The validator returns isValid: false with a clear error in that case. To validate a relative URL, prepend a base like https://example.com first.
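The behaviour is easy to demonstrate with the constructor itself; the base URL below is a placeholder for your real origin:

```javascript
// Without a base, a relative path is rejected outright.
let valid = true;
try {
  new URL("/api/orders"); // throws: no protocol or host to resolve against
} catch {
  valid = false;
}
console.log(valid); // false

// With a base (placeholder origin), the same path resolves fine:
console.log(new URL("/api/orders", "https://example.com").href);
// "https://example.com/api/orders"
```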

Are credentials in URLs always wrong?

Almost. RFC 3986 §3.2.1 notes credentials in URLs are deprecated for security reasons. They end up in browser history, server access logs, and proxy caches. Modern browsers strip them silently from clipboard pastes. The validator flags them so you have an explicit record before they leak somewhere they shouldn't.
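The constructor parses embedded credentials into their own fields, which is what makes them straightforward to flag. The URL and credentials below are illustrative:

```javascript
// Credentials land in dedicated fields after parsing (illustrative values).
const url = new URL("https://alice:hunter2@api.example.com/v1");

console.log(url.username); // "alice"
console.log(url.password); // "hunter2"

// The serialised href preserves them, so logging the full URL leaks the secret:
console.log(url.href.includes("hunter2")); // true
```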

Does it care about IDN domains?

Yes — the validator notes whether the hostname is in Punycode form (xn--...) or pure ASCII. Browsers may display the Unicode form to users while transmitting the Punycode form on the wire, which is the source of IDN homograph attacks. Surfacing the Punycode is a small but useful signal.

Other URL & JSON Tools

Validation is one operation. Here's what else pairs naturally with it: