What is the Avro Validator?

Pushing a broken .avsc to a Schema Registry usually fails with a terse error like "schema is invalid" — and you are left squinting at a four-level nested record trying to figure out which field caused it. This validator gives you the path. Paste an Avro schema on the left and the right panel tells you exactly which record is missing a name, which enum has an empty symbols array, or which union branch references a type that was never declared.

It checks the structural rules from the Apache Avro 1.11 specification: every record must have type, name, and a fields array; primitives must be one of the eight canonical names (null, boolean, int, long, float, double, bytes, string); enums need a non-empty symbols list; arrays need items; maps need values; fixed needs a numeric size; and named-type references inside unions must resolve to something declared earlier. The whole file is plain JSON underneath (per RFC 8259), so JSON syntax errors are surfaced first using the browser's native JSON.parse().
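A minimal sketch of those per-node checks (hypothetical helper names, not the tool's actual code) looks like this:

```typescript
// The eight canonical Avro primitive type names.
const PRIMITIVES = new Set([
  "null", "boolean", "int", "long", "float", "double", "bytes", "string",
]);

// Hypothetical structural check for a single schema node; returns
// human-readable error messages (the real tool also tracks JSON paths).
function checkNode(node: unknown): string[] {
  const errors: string[] = [];
  if (typeof node === "string") {
    if (!PRIMITIVES.has(node)) errors.push(`unknown type "${node}"`);
    return errors;
  }
  if (typeof node !== "object" || node === null) {
    return ["schema node must be a string or an object"];
  }
  const s = node as Record<string, unknown>;
  switch (s.type) {
    case "record":
      if (typeof s.name !== "string") errors.push("record is missing a name");
      if (!Array.isArray(s.fields)) errors.push("record needs a fields array");
      break;
    case "enum":
      if (!Array.isArray(s.symbols) || s.symbols.length === 0)
        errors.push("enum needs a non-empty symbols list");
      break;
    case "array":
      if (s.items === undefined) errors.push("array needs items");
      break;
    case "map":
      if (s.values === undefined) errors.push("map needs values");
      break;
    case "fixed":
      if (typeof s.size !== "number") errors.push("fixed needs a numeric size");
      break;
  }
  return errors;
}
```

Union handling and named-type resolution layer on top of this, but the rule table itself is just a case per complex type.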

What It Catches

Real-time Validation

As you type or paste, the validator runs in the background with a 500ms debounce. No "Validate" button to click — results appear as soon as you stop typing.
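The debounce is the standard reset-the-timer pattern; a generic sketch (illustrative, not the page's exact code):

```typescript
// Hypothetical 500ms debounce around a validate() call: every keystroke
// resets the timer, so validation fires only once typing pauses.
function debounce<T extends unknown[]>(
  fn: (...args: T) => void,
  delayMs: number,
): (...args: T) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: T) => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Usage sketch: editor.on("change", debounce(runValidation, 500));
```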

JSON-Path Errors

Every error tells you exactly where in the schema it is, using a JSON-path-style location like $.fields[3].type or $.items.symbols. No more counting braces.
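Those paths fall out of threading a location string through the recursive walk. A simplified sketch (names are illustrative):

```typescript
type SchemaError = { path: string; message: string };

// Carry the JSON path down the recursion so every error can say where
// it was found; only records and arrays are shown here for brevity.
function walk(node: unknown, path: string, errors: SchemaError[]): void {
  if (typeof node !== "object" || node === null) return;
  const s = node as Record<string, unknown>;
  if (s.type === "record") {
    if (typeof s.name !== "string") {
      errors.push({ path, message: "record is missing a name" });
    }
    const fields = Array.isArray(s.fields) ? s.fields : [];
    fields.forEach((f, i) =>
      walk((f as Record<string, unknown>).type, `${path}.fields[${i}].type`, errors),
    );
  } else if (s.type === "array") {
    walk(s.items, `${path}.items`, errors);
  }
}
```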

Upload .avsc Files

Drag in an .avsc or .json file straight from your repo. Useful when reviewing a schema PR locally before it goes upstream.

Sample & Counter-example

A valid Order schema with nested OrderItem and a union-typed shippingAddress, plus a deliberately broken counter-example to show what bad schemas look like.
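The sample is along these lines (field names here are illustrative, not the tool's exact sample):

```typescript
// An Order-style schema with a nested OrderItem record and a
// union-typed shippingAddress (null or an inline Address record).
const orderSchema = {
  type: "record",
  name: "Order",
  fields: [
    { name: "id", type: "string" },
    {
      name: "items",
      type: {
        type: "array",
        items: {
          type: "record",
          name: "OrderItem",
          fields: [
            { name: "sku", type: "string" },
            { name: "quantity", type: "int" },
          ],
        },
      },
    },
    {
      name: "shippingAddress",
      type: [
        "null",
        {
          type: "record",
          name: "Address",
          fields: [{ name: "line1", type: "string" }],
        },
      ],
    },
  ],
};
```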

Syntax Highlighting

Ace editor with JSON syntax highlighting on the left so you can spot unbalanced braces and stray commas before the structural pass even runs.

Stays in Your Browser

Nothing is uploaded. Validation runs entirely in the page using JSON.parse() and a recursive walker. Safe for proprietary schemas.

How to Use It

1

Paste or Upload Your Schema

Drop your .avsc into the left panel. The Sample button loads a working Order schema with records, enums, and a union; the Invalid Sample shows what a broken schema looks like (missing fields, undeclared type references, empty enum symbols).

2

Read the Results

The right panel shows a green "Valid Avro schema" banner if everything checks out. Otherwise you get a red banner with the count and a numbered list of problems, each with the exact JSON path where the issue lives.

3

Fix and Re-validate

Edit the schema in the left panel and the validator re-runs automatically. Once you see green, copy the schema back out and ship it.

When You Would Actually Use This

Pre-PR Schema Lint

Before opening a PR that adds a new event schema, paste the .avsc here. Catching a missing record name now saves a back-and-forth review cycle. Same idea as running a linter before pushing.

Schema Registry Failures

A Kafka deployment with Confluent Schema Registry rejected your schema with "Invalid schema". Drop the body in here and the validator points at the exact field. Usually it is a typo in a primitive name (integer instead of int) or a forward reference to a record that has not been declared yet.
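The primitive-name check behind that error is just set membership against the eight canonical names:

```typescript
const AVRO_PRIMITIVES = new Set([
  "null", "boolean", "int", "long", "float", "double", "bytes", "string",
]);

// "integer" is the classic typo; the canonical Avro name is "int".
const typoIsValid = AVRO_PRIMITIVES.has("integer"); // false
const fixIsValid = AVRO_PRIMITIVES.has("int");      // true
```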

Hand-written Schemas

Writing Avro by hand for a quick prototype? It is easy to forget that fixed needs a size, or that union branches are an array of types, not a single object. The validator catches all of these in one pass so you can iterate fast.

Reviewing AI-generated Schemas

Asked an LLM to draft a schema for an Order event? It will look right at a glance, but the usual slips are empty enum symbols arrays, or "type": "string[]" where an array type is needed. Validate before trusting.
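The "string[]" slip in particular has a one-line fix, since Avro spells a list of strings as an array complex type:

```typescript
// "string[]" is not a type name Avro recognizes; it will be flagged
// as an unknown type.
const wrong = { name: "tags", type: "string[]" };

// The correct spelling: an array complex type with items.
const right = { name: "tags", type: { type: "array", items: "string" } };
```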

Common Questions

Does this validate Avro data, or just schemas?

Just schemas. The tool checks the structural rules of an .avsc file — that records have type/name/fields, primitives are canonical, enums have symbols, and so on. Validating data against a schema is a separate problem and needs the actual Avro library because it involves binary decoding and type resolution.

Will it catch every Avro spec violation?

It catches the structural ones — the kind that would crash a parser. Things like enum-symbol naming rules (must match [A-Za-z_][A-Za-z0-9_]*), reserved word collisions, and full namespace resolution across nested records are not 100% covered. For an authoritative pass, run the Apache Avro library or the Schema Registry compatibility check.
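If you do want the symbol-name rule checked, the spec's pattern is small enough to apply directly (this is a separate check, not something the tool runs):

```typescript
// Avro name rule for enum symbols: [A-Za-z_][A-Za-z0-9_]*
const SYMBOL_RE = /^[A-Za-z_][A-Za-z0-9_]*$/;

const okSymbol = SYMBOL_RE.test("SHIPPED");  // true
const badSymbol = SYMBOL_RE.test("2ND_DAY"); // false: starts with a digit
```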

What does "unknown type" mean in the error list?

Avro lets a record reference a previously declared named type (record, enum, fixed) by its name. If the validator sees a string type that is not one of the eight primitives and was not declared earlier in the schema, it flags it as unknown. That usually means a typo, or a forward reference whose declaration needs to be inlined or moved earlier.
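The resolution logic reduces to tracking declared names during the walk; a sketch (helper names are hypothetical):

```typescript
const PRIMITIVE_NAMES = new Set([
  "null", "boolean", "int", "long", "float", "double", "bytes", "string",
]);

// As the walker visits records, enums, and fixed types, it adds their
// names to `declared`; a string type is "known" if it is a primitive
// or a name declared earlier in the document.
function isKnownType(name: string, declared: Set<string>): boolean {
  return PRIMITIVE_NAMES.has(name) || declared.has(name);
}
```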

Are my schemas sent anywhere?

No. Validation runs entirely in your browser. The walker is plain TypeScript that visits each node of the parsed JSON object. Nothing leaves the page, no logging, no analytics on the schema content.

How is this different from the Avro Viewer?

The viewer pretty-prints. The validator checks structure. The viewer will happily render a schema with a missing record name; the validator tells you exactly which record is missing its name.

Can it validate a schema that uses a logical type like decimal?

Yes. Logical types are written as a primitive type plus a logicalType annotation, e.g. {"type":"long","logicalType":"timestamp-millis"}. The validator walks the structure normally and treats logicalType as an extra annotation it does not need to understand. The base type still has to be valid.
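For example, a timestamp field rides on a long base type, and only the base type matters to the structural pass:

```typescript
// logicalType is an annotation on top of a valid base type; the
// validator only requires the base type ("long") to be canonical.
const timestampField = {
  name: "createdAt",
  type: { type: "long", logicalType: "timestamp-millis" },
};
```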

Other Avro Tools

Validating is one piece of working with Avro. These tools cover the rest: