Protobuf to JSON Converter
Paste a .proto schema. Get a JSON sample that matches it, with proto3 defaults filled in.
Input (.proto schema)
Output (JSON sample)
What this tool does
You have a Protocol Buffers schema and you need a JSON example of what one of those messages looks like — for a test fixture, an OpenAPI example, a stubbed gRPC response, whatever. Hand-typing a JSON shape from a long .proto file is tedious and error-prone. This converter reads the schema, picks the last message declared (the usual "outer" type), and emits a JSON object that matches it field-for-field.
The output uses the official proto3 default values: empty string for string and bytes, 0 for numeric types, false for bool, {} for map, a single-element array for repeated fields (so the element shape stays visible), and the zero-valued enum constant for enum fields. Nested messages are expanded recursively. 64-bit integers come out as JSON strings to match the proto3 JSON mapping spec — you cannot put an int64 into a JS Number without losing precision.
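As a sketch, the scalar defaults boil down to a lookup table. This is hypothetical code, not the tool's actual source — the names `PROTO3_SCALAR_DEFAULTS` and `scalarDefault` are illustrative:

```javascript
// proto3 scalar defaults, with 64-bit integer types emitted as strings
// per the proto3 JSON mapping (names here are assumptions, not the tool's API).
const PROTO3_SCALAR_DEFAULTS = {
  string: "", bytes: "",
  int32: 0, uint32: 0, sint32: 0, fixed32: 0, sfixed32: 0,
  float: 0, double: 0,
  int64: "0", uint64: "0", sint64: "0", fixed64: "0", sfixed64: "0",
  bool: false,
};

function scalarDefault(type) {
  return PROTO3_SCALAR_DEFAULTS[type]; // undefined for message/enum types
}
```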
Everything happens locally in your browser — no .proto upload, no schema sent to a server, no AI inference call. Just a hand-rolled parser that handles syntax/package/import directives, comments (line and block), nested message and enum blocks, oneof, map<K, V>, repeated, optional, and field options (which are read but ignored). If you need the full protobufjs-style runtime, that is a different problem — but for "give me a JSON shape from this schema", this is faster.
How to use it
Three steps. The output JSON is ready to paste into a fixture, an example block, or a request body straight away.
Paste your .proto schema
Drop the schema into the left editor. syntax = "proto3"; at the top is fine but optional — the parser does not care. Comments, package declarations, and import statements are all skipped cleanly. If you have multiple messages, the parser uses the last top-level message (the typical outer/composite type) as the root.
Want to convert a different message instead? Move the message you care about to the bottom of the file. The schema can include nested types, enums, and oneof blocks — they all resolve.
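For example, given a schema with two top-level messages, the converter roots at the last one (`User` here) and expands its nested `Address` field inline:

```proto
syntax = "proto3";

message Address {
  string city = 1;
}

message User {          // last top-level message -> used as the root
  string name = 1;
  int64 id = 2;
  Address address = 3;
}
```

which, per the defaults described above, should produce:

```json
{
  "name": "",
  "id": "0",
  "address": {
    "city": ""
  }
}
```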
Hit Convert
Click the green Convert button. The parser tokenizes the schema, builds a message tree, and walks the root message emitting proto3 defaults for every field. Nested messages are expanded inline. repeated fields produce a one-element array as a hint about the element shape — empty arrays would not show the structure.
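The walk itself can be sketched as a small recursion. The parsed-message shape below (`fields`, `type`, `repeated`) is an assumption for illustration, not the tool's internal representation, and the scalar table is abridged:

```javascript
// Hypothetical sketch of the emit step: walk the root message and build
// the JSON sample, recursing into nested message types.
const SCALARS = { string: "", bool: false, int32: 0, int64: "0" }; // abridged

function emitSample(message, types) {
  const out = {};
  for (const field of message.fields) {
    const value = emitValue(field, types);
    out[field.name] = field.repeated ? [value] : value; // one element as a shape hint
  }
  return out;
}

function emitValue(field, types) {
  if (field.type in SCALARS) return SCALARS[field.type];
  const nested = types[field.type];
  return nested ? emitSample(nested, types) : null; // unresolved (imported) type -> null
}
```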
Use the JSON
Copy the result into your test fixture, your OpenAPI example block, your gRPC-Web mock, or wherever you need a request/response shape. The keys match the .proto field names exactly — switch to camelCase later if your codegen does that.
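If your codegen expects proto3 JSON field names (lowerCamelCase), a short helper can rename the keys recursively. A minimal sketch — `camelKeys` is a name chosen here, not part of the tool:

```javascript
// Convert snake_case keys to lowerCamelCase, recursing through
// nested objects and arrays.
const toCamel = (s) => s.replace(/_([a-z])/g, (_, c) => c.toUpperCase());

function camelKeys(value) {
  if (Array.isArray(value)) return value.map(camelKeys);
  if (value !== null && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value).map(([k, v]) => [toCamel(k), camelKeys(v)])
    );
  }
  return value;
}
```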
When this actually saves time
Stubbing gRPC responses for tests
Your service handler returns a Protobuf response. The unit test needs a JSON fixture that matches the message shape. Paste the .proto, grab the JSON, drop it into your fixtures folder. No more hand-typing 30 fields and missing one.
OpenAPI examples for a gRPC-gateway
Running grpc-gateway or similar to expose Protobuf services as REST? Each operation needs a JSON example. Convert each .proto message to a JSON skeleton and paste it under the example key in your spec.
Bootstrapping JSON Schema
You want to validate JSON requests that match a .proto contract. Convert to a JSON sample first, hand it to a JSON-Schema-from-sample tool, and you have a starting schema in seconds.
Filling out request bodies in API clients
Testing an HTTP-transcoded gRPC API in Postman or curl? Paste the .proto, copy the JSON skeleton into the request body, fill in the values you actually want to send.
Common questions
Is my .proto schema sent anywhere?
No. Parsing happens entirely in your browser via JavaScript. Nothing about your schema — message names, field names, package paths — leaves your machine. Open DevTools and check the Network tab while you click Convert. Zero requests.
Does it handle proto2 as well as proto3?
Mostly. The parser handles proto2 syntax tokens like required and optional, but the output values use proto3 defaults (which is what the proto3 JSON mapping specifies). If you have a proto2 file with explicit defaults set via [default = ...], those defaults are not applied to the output.
How does it pick which message to convert?
It uses the last top-level message declared in the file. In real schemas the outer/composite type is usually declared after its dependencies, so this lines up with what you want. If it picks the wrong one, reorder the file so the message you want is last.
Why are int64 values strings in the output?
Because JavaScript (and most JSON consumers) parse JSON numbers as IEEE-754 doubles, which lose integer precision above 2^53. The official proto3 JSON mapping calls for int64, uint64, fixed64, sfixed64, and sint64 to be encoded as JSON strings. We follow that convention.
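You can see the corruption directly in a JS console:

```javascript
// 2^53 is the last integer a double can count past one-by-one.
// 2^53 + 1 = 9007199254740993 is not representable:
const asNumber = Number("9007199254740993");
console.log(asNumber);                   // 9007199254740992 — silently off by one
console.log(BigInt("9007199254740993")); // 9007199254740993n — exact
// Keeping int64 as a JSON string sidesteps the problem entirely.
```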
What about oneof, map, and repeated?
All three work. oneof fields parse as regular fields (the JSON object will include all of them rather than picking one — you typically delete the ones you do not want). map<K, V> emits an empty {} object. repeated emits a single-element array showing the element shape — you can duplicate or delete to match your real data.
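Put together, and assuming the behaviour described above, a message exercising all three:

```proto
message Event {
  oneof payload {
    string text = 1;
    bytes blob = 2;
  }
  map<string, int32> counts = 3;
  repeated string tags = 4;
}
```

should come out as:

```json
{
  "text": "",
  "blob": "",
  "counts": {},
  "tags": [""]
}
```

Note both oneof arms are present — delete the one you do not want before using the fixture.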
Does it follow imports?
No. import statements are recognised and skipped. Cross-file message types resolve to null in the output. If you need cross-file resolution, paste the relevant messages from the imported files into the same input.
How big a schema can it handle?
Tens of thousands of lines, no problem. Everything is local, so there is no upload, no rate limit, no network latency.
Related tools
If you are wrangling Protobuf, JSON, and schemas, these pair well: