JSON to Protobuf Converter
Paste a JSON object. Get a proto3 schema with types inferred and nested messages laid out for you.
Input (JSON)
Output (.proto schema)
What this tool does
Got a real JSON payload — a sample API response, a webhook body, a row from a NoSQL store — and want to model it as a Protocol Buffers message? Hand-typing the schema is slow and easy to get wrong, especially when there are nested objects and arrays. This converter walks the JSON, infers Protobuf types for every field, and emits a clean proto3 schema you can drop into your project.
Type inference follows what you would write yourself: string for strings, bool for booleans, int32 for integers that fit in 32-bit, int64 for the rest, double for non-integer numbers, repeated <T> for arrays of a uniform element type (one nested message reused for arrays of objects), and nested message blocks for nested objects. JSON cannot tell a struct from a map, so all objects render as nested messages — swap one for map<K, V> by hand if your data is genuinely a map.
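In Python terms, the scalar half of that rule set sketches out like this (illustrative only; the tool itself runs as JavaScript in your browser, and these names are not its source):

```python
def infer_scalar_type(value):
    """Map a JSON scalar to a proto3 scalar type, per the rules above."""
    if isinstance(value, bool):   # check bool before int: bool is a subclass of int in Python
        return "bool"
    if isinstance(value, int):
        # int32 if it fits in a signed 32-bit range, int64 otherwise
        return "int32" if -2**31 <= value <= 2**31 - 1 else "int64"
    if isinstance(value, float):
        return "double"           # non-integer numbers become double
    if isinstance(value, str):
        return "string"
    return "string"               # null and other degenerate samples default to string
```

Arrays and objects layer on top of this: a uniform array wraps the element type in repeated, and an object recurses into a nested message.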
Field names are converted from camelCase or kebab-case to the Protobuf-conventional snake_case. Field numbers are assigned 1, 2, 3, … in declaration order. The output is valid proto3 — paste it into a .proto file, run protoc or buf, and you have generated code in your language of choice. Conversion runs entirely in your browser — no JSON, no field names, no values are sent anywhere.
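The name normalization is the usual camelCase/kebab-case to snake_case rewrite; a minimal Python sketch of the idea (again illustrative, not the tool's source):

```python
import re

def to_snake_case(name):
    """Normalize a JSON key to Protobuf-conventional snake_case."""
    name = name.replace("-", "_")                         # kebab-case -> snake_case
    name = re.sub(r"(?<=[a-z0-9])([A-Z])", r"_\1", name)  # split at camelCase boundaries
    return name.lower()
```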
How to use it
Three steps. Works with any well-formed JSON object — API responses, log entries, fixture files, whatever.
Paste your JSON
Drop the JSON into the left editor. The root must be an object ({ ... }) — wrap an array in an object first if your data is array-rooted, e.g. { "items": [...] }. Use realistic data: the more representative the sample, the better the inferred types match what you want long-term.
If your JSON has unquoted keys, trailing commas, or other quirks, run it through the JSON Fixer first — the converter needs a well-formed object to work from.

Hit Convert
Click the green Convert button. The converter walks every key in the JSON, picks a Protobuf type, builds nested message blocks for nested objects, and emits the schema with syntax = "proto3"; at the top. Field numbers are assigned in source order.
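For a hypothetical input like { "id": 7, "displayName": "Ada", "tags": ["a", "b"] }, the output would look roughly like this (the root message name may differ):

```proto
syntax = "proto3";

message Root {
  int32 id = 1;
  string display_name = 2;
  repeated string tags = 3;
}
```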
Use the .proto
Copy the schema into a .proto file in your repo. Review the inferred types — for fields where the JSON sample was empty (empty array, null) you will see a comment flagging that the type was guessed. Adjust as needed, then run protoc or buf generate to produce code in your language.
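From there, code generation is a single command; the file name and target language below are hypothetical examples, not output of this tool:

```shell
# Generate Go code from the saved schema (requires protoc and protoc-gen-go).
protoc --go_out=. --go_opt=paths=source_relative api_response.proto

# Or, if the schema lives in a Buf module with a buf.gen.yaml configured:
buf generate
```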
When this actually saves time
Modelling a third-party API as Protobuf
A vendor returns JSON. Your service stores Protobuf. Take a real response, paste it here, get a starting schema for the type, then refine. Beats reading the docs and typing 50 fields by hand.
Migrating a JSON-based service to gRPC
You are moving an HTTP+JSON microservice to gRPC. Each request and response shape needs a .proto. Convert each captured payload to a Protobuf schema, paste them into a single file, and you have the contract sketched out.
Bootstrapping a Buf module
Setting up a new Buf module and need realistic schemas to start with? Convert your existing JSON fixtures and use the output as the seed for your .proto files — much faster than typing them from scratch.
Writing test fixtures for Protobuf code
Your team has JSON test data. The new code consumes Protobuf. Generate the .proto from the JSON, then have your codegen build types — your fixtures and your code stay in sync.
Common questions
Is my JSON sent anywhere?
No. The converter runs entirely in your browser as JavaScript. Your JSON — keys, values, anything sensitive — never leaves your machine. Open DevTools and check the Network tab while you click Convert. Zero requests.
How does it pick int32 vs int64 vs double?
For integer values, it checks if the value fits in a signed 32-bit range (-2^31 to 2^31-1). If yes, int32. If not, int64. Non-integer numbers always become double. If you know your data is unsigned or you want a specific width like fixed32, edit the output — see the scalar type table for all available numeric types and their wire-encoding tradeoffs.
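A minimal sketch of that boundary check (the tool itself is JavaScript; names here are illustrative). One practical consequence: millisecond Unix timestamps already exceed the signed 32-bit range, so they infer as int64.

```python
# Signed 32-bit bounds that decide int32 vs int64.
INT32_MIN, INT32_MAX = -(2**31), 2**31 - 1   # -2147483648 .. 2147483647

def width_for(value: int) -> str:
    return "int32" if INT32_MIN <= value <= INT32_MAX else "int64"

print(width_for(2147483647))     # largest value that stays int32
print(width_for(2147483648))     # one past the boundary
print(width_for(1700000000000))  # e.g. a millisecond Unix timestamp
```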
When does an object become a map vs a nested message?
Always a nested message — never a map. JSON does not distinguish a struct from a map, so either guess would frequently be wrong. If your data is actually a key-value map (e.g. metadata, headers, feature flags), open the output and swap message Foo { ... } for map&lt;string, V&gt; foo = N; by hand. The fix is mechanical and obvious once you see your data.
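The before/after looks like this for a hypothetical metadata field (names and field numbers are illustrative):

```proto
// As emitted — the object inferred as a nested message:
message Metadata {
  string region = 1;
  string owner = 2;
}
Metadata metadata = 4;

// Hand-edited — when the keys are open-ended rather than a fixed struct:
map<string, string> metadata = 4;
```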
What about null and empty arrays?
Both produce a comment in the output flagging that the type was guessed from a degenerate sample. null defaults to string with a "nullable" note. Empty arrays default to repeated string with an "empty array" note. Replace those types with whatever you actually expect.
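The flagged fields look roughly like this (field names and comment wording here are illustrative, not the tool's exact text):

```proto
message Sample {
  // sample value was null: type guessed, defaulted to string
  string nickname = 1;
  // sample array was empty: element type guessed, defaulted to string
  repeated string tags = 2;
}
```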
Why are arrays of mixed types rendered as repeated string?
Protobuf does not support heterogeneous lists directly. If your JSON array has mixed types (some strings, some numbers), there is no clean Protobuf equivalent — you need either google.protobuf.Value, a oneof, or to refactor the data shape. The converter flags it with a comment so you can decide.
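If you take the google.protobuf.Value route, the hand-written replacement looks like this (message and field names are hypothetical):

```proto
syntax = "proto3";

import "google/protobuf/struct.proto";

message Event {
  // A mixed JSON array like ["retry", 3, true] has no single scalar proto
  // type; google.protobuf.Value can hold any JSON-shaped element.
  repeated google.protobuf.Value attributes = 1;
}
```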
Does it handle deeply nested JSON?
Yes. Each nested object becomes a nested message with a derived PascalCase name. Nesting depth is limited only by the browser's call stack, not by the converter — even deeply nested API responses convert cleanly.
Can I round-trip JSON ↔ Protobuf with these two tools?
Mostly. JSON to Protobuf gives you a schema; Protobuf to JSON gives you a sample. The shapes match for fields where the JSON sample had a representative value. Where the JSON had null or empty arrays, the inferred Protobuf type is a guess and the round-trip will only be exact after you fix the type.
Related tools
If you are wrangling JSON and schemas, these pair well: