Avro Schema Validator
Deep validation of .avsc files against the Apache Avro 1.11 spec — every rule, every path
What is the Avro Schema Validator?
A regular JSON validator tells you if the file parses. An Avro schema validator tells you if the parsed file is actually a legal Avro schema. Two very different things. This page does the second one. Paste a .avsc on the left and the right pane lists every rule the schema breaks, each tagged with the JSON path where the problem lives — $.fields[2].type, $.fields[5].type[1], that kind of thing. The rules come straight from the Apache Avro 1.11 specification.
It catches the things that bite during a Schema Registry upload: a record without a name, an enum with an empty symbols array, a fixed with no size, a union that contains two "int" branches, a namespace like 1bad.name that does not parse as a dotted identifier, a default value that does not match the first branch of its union (the Avro union default rule trips up almost everyone the first time). It also surfaces logical type annotations on incompatible underlying types as warnings rather than errors. The whole file is plain JSON underneath (per RFC 8259) and parsing uses the browser native JSON.parse(), so JSON syntax errors come up first.
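The union rules in particular fit in a few lines. A hedged TypeScript sketch of the duplicate-branch check (hypothetical names; real Avro branch identity also distinguishes records, arrays, and maps by more than the type keyword):

```typescript
// Sketch of the union-no-duplicates rule. Branches are either a
// primitive/named-type string or an object with a "type" keyword.
type AvroType = string | { type: string; [k: string]: unknown };

function branchKey(b: AvroType): string {
  // Avro keys complex branches on their container kind, not full shape.
  return typeof b === "string" ? b : b.type;
}

function findDuplicateBranches(union: AvroType[]): string[] {
  const seen = new Set<string>();
  const dups: string[] = [];
  for (const b of union) {
    const key = branchKey(b);
    if (seen.has(key)) dups.push(key);
    else seen.add(key);
  }
  return dups;
}

console.log(findDuplicateBranches(["null", "int", "int"])); // [ 'int' ]
```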
What It Catches
Real-time Validation
Edit the schema and the validator re-runs after a 300ms debounce. No Validate button — results stream as you type.
Path-based Errors
Each problem comes with a JSON path like $.fields[3].type[1] so you can jump straight to the offending node, even in a 200-line nested record.
Upload .avsc Files
Pull an .avsc straight off your filesystem. Useful when reviewing a schema PR locally before it heads to a Schema Registry.
Sample & Counter-example
A working Order schema with nested OrderItem and a union-typed shippingAddress, plus a deliberately broken counter-example to demonstrate the error reporting.
Syntax Highlighting
Ace editor with JSON syntax highlighting on the left so you spot stray commas and unbalanced braces before the structural pass even runs.
Stays in Your Browser
No upload, no server. Validation runs locally with JSON.parse() and a recursive walker. Safe for proprietary or unreleased schemas.
How to Use It
Paste, Upload, or Load a Sample
Drop your .avsc into the left panel. Sample loads a working Order schema with records, an enum, and a union; Invalid Sample loads a broken counter-example with a missing field name, an unknown primitive, and a duplicate union branch — so you can see how the errors look before pasting your own.
Read Errors and Warnings
The right panel shows a green "Valid Avro schema" banner if everything passes. Otherwise you get the count of errors and warnings, followed by a list. Errors block — a parser or registry will reject the schema outright. Warnings are heads-ups — your schema will probably load but might surprise you (e.g. a logicalType on an int that the spec says should be on a long).
Fix and Re-validate
Edit the schema in the left panel; the validator re-runs automatically. Once you see green, copy the schema back out and ship it through your Schema Registry.
When You Would Actually Use This
Pre-PR Schema Lint
Before opening a PR that adds a new Avro event schema, paste it here. Catching a missing record name or an enum with empty symbols now saves a back-and-forth review cycle. Same idea as running ESLint on the way to commit.
Schema Registry Failures
Your Kafka deployment rejected the schema with a terse "schema is invalid". Drop the body in here and the validator points at the exact field. Usually it is a typo in a primitive name (integer instead of int) or a forward reference to a record that has not been declared yet.
Hand-written Schemas
Generating Avro by hand for a quick prototype? Easy to forget that fixed needs a numeric size, that union branches must be a JSON array (not a single object), or that the default value must match the FIRST branch of a union, not any branch. The validator surfaces all of these in one pass.
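The union-default rule is the one worth seeing in a sketch. Assuming a simplified type check (hypothetical helper; only a few primitive branch kinds modeled, the real rule covers every Avro type):

```typescript
// Sketch: an Avro union default must match the FIRST branch, not any branch.
function defaultMatchesFirstBranch(union: string[], dflt: unknown): boolean {
  const first = union[0];
  if (first === "null") return dflt === null;
  if (first === "string") return typeof dflt === "string";
  if (first === "int" || first === "long") return typeof dflt === "number";
  return true; // other branch kinds not modeled in this sketch
}

// ["null", "string"] with default null is legal; default "N/A" is not,
// even though "N/A" matches the second branch.
console.log(defaultMatchesFirstBranch(["null", "string"], null));  // true
console.log(defaultMatchesFirstBranch(["null", "string"], "N/A")); // false
```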
Reviewing AI-generated Schemas
Asked an LLM to draft an Order event schema? It will look right at a glance but often leaves a symbols array empty, writes "type": "string[]" instead of an array type, or duplicates "int" in a union. Run it through this validator before trusting the output.
Common Questions
How is this different from /avro-validator?
The original /avro-validator does the basics — type/name/fields are present, primitives are canonical, named-type references resolve. This page goes further: it enforces the union-no-duplicates and union-no-direct-nesting rules, namespace identifier syntax (com.example.foo good, 1bad.name bad), and it warns on default values that do not match the first branch of a union (per the Avro 1.11 spec). Use whichever fits — they overlap on the basics.
Why is "default does not match first union branch" only a warning, not an error?
The Avro 1.11 spec says the default applies to the first branch, but real-world libraries vary in how strict they are. The Apache Avro reference implementations enforce it; some third-party libraries silently coerce. We surface it as a warning so you see it and can decide.
Will it catch every Avro spec violation?
It catches the structural ones — the kind that crash a parser. Coverage of full namespace resolution across deeply nested records, alias collision detection, and exhaustive enum-symbol naming rules is partial. For an authoritative pass, run the Apache Avro reference library or your Schema Registry compatibility check.
What does "unknown type" mean?
Avro lets a record reference a previously declared named type (record, enum, fixed) by its name. If the validator sees a string type that is not one of the eight primitives (null, boolean, int, long, float, double, bytes, string) and was not declared earlier in the schema, it flags it as unknown. Usually a typo or a forward reference that needs to be inlined.
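The primitive check itself is a set lookup. A sketch, assuming `declared` holds the named types already seen earlier in the walk (hypothetical names, not the page's actual implementation):

```typescript
// The eight Avro primitive type names from the 1.11 spec.
const AVRO_PRIMITIVES = new Set([
  "null", "boolean", "int", "long", "float", "double", "bytes", "string",
]);

// Classify a string type reference against primitives and prior declarations.
function classifyTypeRef(
  t: string,
  declared: Set<string>,
): "primitive" | "named" | "unknown" {
  if (AVRO_PRIMITIVES.has(t)) return "primitive";
  if (declared.has(t)) return "named";
  return "unknown";
}

console.log(classifyTypeRef("integer", new Set<string>())); // "unknown" — typo for "int"
console.log(classifyTypeRef("OrderItem", new Set(["OrderItem"]))); // "named"
```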
Are my schemas sent anywhere?
No. Validation runs entirely in your browser. The walker is plain TypeScript that visits each node of the parsed JSON object. Nothing leaves the page, no logging, no analytics on the schema content.
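The recursive walk can be sketched like this (hypothetical helper names; the real walker visits every node kind, not just record fields, but the path-threading idea is the same):

```typescript
// Sketch: each recursive call carries the JSON path of the node it is
// visiting, so every issue is reported with an exact location.
interface Issue { path: string; message: string; }

function walkFields(fields: unknown, path: string, issues: Issue[]): void {
  if (!Array.isArray(fields)) {
    issues.push({ path, message: "fields must be a JSON array" });
    return;
  }
  fields.forEach((f, i) => {
    const fieldPath = `${path}[${i}]`;
    if (typeof f !== "object" || f === null || !("name" in f)) {
      issues.push({ path: fieldPath, message: "field is missing a name" });
    }
  });
}

const issues: Issue[] = [];
walkFields([{ name: "id", type: "long" }, { type: "string" }], "$.fields", issues);
console.log(issues); // [ { path: '$.fields[1]', message: 'field is missing a name' } ]
```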
Can it validate a schema with logical types?
Yes. Logical types are written as a primitive plus a logicalType annotation, e.g. {"type":"long","logicalType":"timestamp-millis"}. The validator walks the structure normally. If the underlying type does not match what the spec recommends for that logical type (for example, logicalType: "decimal" on a string), you get a warning so you can decide.
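A sketch of how such a warning could be derived, assuming a small subset of the Avro 1.11 logical-type table (hypothetical helper names; the real table also covers local-timestamp and duration):

```typescript
// Expected underlying types for a subset of Avro 1.11 logical types.
const EXPECTED_UNDERLYING: Record<string, string[]> = {
  "date": ["int"],
  "time-millis": ["int"],
  "timestamp-millis": ["long"],
  "timestamp-micros": ["long"],
  "uuid": ["string"],
  "decimal": ["bytes", "fixed"],
};

// Returns a warning message, or null when the pairing is fine.
function logicalTypeWarning(underlying: string, logical: string): string | null {
  const expected = EXPECTED_UNDERLYING[logical];
  if (!expected || expected.includes(underlying)) return null;
  return `logicalType "${logical}" expects ${expected.join(" or ")}, found "${underlying}"`;
}

console.log(logicalTypeWarning("long", "timestamp-millis")); // null — fine
console.log(logicalTypeWarning("string", "decimal"));        // warning text
```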
Other Avro Tools
Validating is one piece of working with Avro. These tools cover the rest: