What this tool does

You have a Protocol Buffers schema and a TypeScript frontend that needs to talk to a gRPC or HTTP-transcoded backend serving those messages. The official codegen toolchain (protoc with ts-proto or protobuf-ts) requires installing tools, configuring plugins, and wiring up a build step. This converter does the same job in your browser — paste, copy the output, drop it into your project.

Type mapping follows what hand-rolled types look like in real codebases: string → string, bytes → Uint8Array, bool → boolean, the smaller numeric types (int32, uint32, float, double) → number, and the 64-bit integer types (int64, uint64, fixed64, sfixed64, sint64) → string to match the proto3 JSON mapping spec. repeated T becomes T[], map<K, V> becomes Record<K, V>, and nested messages become nested interface declarations.
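The mapping above can be seen in one small example. The message and field names here are illustrative, not taken from any real schema:

```typescript
// Hypothetical .proto input:
//
//   message Order {
//     string order_id = 1;
//     int64 total_cents = 2;
//     repeated string tags = 3;
//     map<string, int32> line_counts = 4;
//   }
//
// What the converter would emit:
export interface Order {
  orderId: string;                    // string  → string
  totalCents: string;                 // int64   → string (proto3 JSON mapping)
  tags: string[];                     // repeated string → string[]
  lineCounts: Record<string, number>; // map<string, int32> → Record<string, number>
}

// A value of that shape, as a proto3 JSON encoder would produce it:
const order: Order = {
  orderId: "ord_123",
  totalCents: "2499",
  tags: ["rush", "gift"],
  lineCounts: { "sku-1": 2 },
};
```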

Field names are converted from snake_case (Protobuf convention) to camelCase (JavaScript/TypeScript convention) — matching the proto3 JSON encoder's default behaviour. Enums become string-literal union types (type OrderStatus = 'ORDER_STATUS_UNSPECIFIED' | 'ORDER_STATUS_PENDING' | ...), which is what most TypeScript codebases actually want — no need for the runtime overhead of an enum. The converter runs entirely in your browser; nothing about your schema leaves the page.
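For the enum case, here is what the string-literal union looks like for a hypothetical OrderStatus enum:

```typescript
// Hypothetical .proto enum:
//
//   enum OrderStatus {
//     ORDER_STATUS_UNSPECIFIED = 0;
//     ORDER_STATUS_PENDING = 1;
//     ORDER_STATUS_SHIPPED = 2;
//   }
//
// Emitted as a string-literal union — no runtime value is generated:
export type OrderStatus =
  | 'ORDER_STATUS_UNSPECIFIED'
  | 'ORDER_STATUS_PENDING'
  | 'ORDER_STATUS_SHIPPED';

// The union members match what the proto3 JSON encoder sends on the wire:
const status: OrderStatus = 'ORDER_STATUS_PENDING';
```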

How to use it

Three steps. Output is ready to paste into a .ts file in seconds.

1. Paste your .proto schema

Drop the schema into the left editor. syntax = "proto3"; at the top is fine but optional. The parser handles nested message blocks, enum declarations, oneof, map<K, V>, and field options. Imports are recognised but skipped — paste imported types inline if you need them.

Field name conversion is automatic: order_id in .proto becomes orderId in TypeScript. Message and enum names stay as-is (already PascalCase).
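The rename is mechanical. A minimal sketch of the snake_case → camelCase conversion (the converter's actual implementation may differ):

```typescript
// Lowercase letter after an underscore becomes uppercase; the underscore is dropped.
function snakeToCamel(name: string): string {
  return name.replace(/_([a-z])/g, (_, c: string) => c.toUpperCase());
}

console.log(snakeToCamel("order_id"));      // "orderId"
console.log(snakeToCamel("created_at_ms")); // "createdAtMs"
```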

2. Read the output

On the right: TypeScript with export interface for each message and export type with a string-literal union for each enum. Nested types come before their parent so the file is in declaration order. Add the file to your project and import the interfaces from your gRPC client or fetch handler.

3. Use the types

Wire the interfaces to your fetch / gRPC-Web / Connect-RPC client. The shape matches the proto3 JSON encoding, so JSON responses parse straight into the typed shape without manual conversion. Adjust int64 handling if your server uses non-standard JSON encoding.
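Concretely, because the shape matches the proto3 JSON encoding, the response body parses straight into the interface. The Order interface and field names below are hypothetical; in a real client the body would come from fetch:

```typescript
interface Order {
  orderId: string;
  totalCents: string; // int64 → string per the proto3 JSON mapping
}

// In a real client:
//   const order: Order = await (await fetch("/v1/orders/123")).json();
// The same thing with a canned body — no manual conversion step needed:
const body = '{"orderId":"ord_123","totalCents":"2499"}';
const order: Order = JSON.parse(body);
```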

When this actually saves time

Sketching types for a new gRPC frontend

You are building a new TS app on top of an existing gRPC service. You do not need full codegen yet — just the interface shapes for typing your fetch calls. Paste the .proto, drop the output in types.ts, you are typed.

Reviewing a Protobuf API change

A backend teammate added fields to a message. You want to see how that affects the frontend types without running the build. Paste the new .proto, diff the TypeScript output against your current types, leave a focused review comment.

Cross-checking generated types

Your build uses protobuf-ts or ts-proto, which produce types with their own conventions. Paste the schema here for a clean reference of what plain TS interfaces look like, useful for documentation or migration planning.

Throwaway scripts and one-off integrations

You are writing a quick Node script that POSTs JSON to a gRPC-gateway. Setting up the full Protobuf toolchain for a 30-line script is overkill. Grab the interfaces from here and you have type safety without the ceremony.

Common questions

Is my schema sent anywhere?

No. The parser and TS emitter run entirely in your browser as JavaScript. Open DevTools and watch the Network tab while you paste — zero requests. Useful when your schema includes internal type names, package paths, or anything you would not want to ship to a third-party service.

Why are int64 fields typed as string?

JavaScript Numbers are IEEE-754 doubles, which lose precision above 2^53. The official proto3 JSON mapping requires int64, uint64, fixed64, sfixed64, and sint64 to be encoded as JSON strings. So the TS interface uses string for those — matching what your server actually sends. If you need bigint instead, find-replace in the output.
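You can see the precision loss directly, and the bigint conversion is a one-liner at the boundary:

```typescript
// JS numbers are IEEE-754 doubles: integers above 2^53 − 1 lose precision.
const big = "9007199254740993"; // 2^53 + 1, exactly as the server sends it

console.log(Number(big));            // 9007199254740992 — off by one
console.log(BigInt(big).toString()); // "9007199254740993" — exact

// If you prefer bigint in your own types, convert at the boundary:
const exact: bigint = BigInt(big);
```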

Why are enums string unions instead of TS enums?

Most TypeScript projects prefer string-literal unions over TS enums these days — no runtime cost, better tree-shaking, and they match the proto3 JSON encoding (which serialises enums as their string name). If you want a const enum or a numeric enum, the conversion from the union is mechanical.
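One practical benefit of the union: the compiler can check exhaustiveness for free. The OrderStatus values here are illustrative:

```typescript
type OrderStatus =
  | 'ORDER_STATUS_UNSPECIFIED'
  | 'ORDER_STATUS_PENDING'
  | 'ORDER_STATUS_SHIPPED';

// The `never` assignment in the default branch fails to compile
// if a switch misses a variant — no runtime enum object required.
function label(s: OrderStatus): string {
  switch (s) {
    case 'ORDER_STATUS_UNSPECIFIED': return 'unknown';
    case 'ORDER_STATUS_PENDING':     return 'pending';
    case 'ORDER_STATUS_SHIPPED':     return 'shipped';
    default: {
      const exhaustive: never = s;
      return exhaustive;
    }
  }
}

console.log(label('ORDER_STATUS_SHIPPED')); // "shipped"
```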

How does it handle map<K, V>?

Renders as Record<K, V>. Protobuf maps with non-string keys (e.g. map<int32, string>) become Record<number, string>. JSON only has string keys, so at runtime the keys will be strings even if the proto says int — this is a quirk of the proto3 JSON spec, not the converter.
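The string-keys-at-runtime quirk is easy to demonstrate — numeric index access still works because JS coerces the index, but iteration yields strings:

```typescript
// The proto might say map<int32, string>, but JSON object keys are strings:
const counts: Record<number, string> = JSON.parse('{"1":"a","2":"b"}');

console.log(counts[1]); // "a" — numeric access works via key coercion

for (const key of Object.keys(counts)) {
  console.log(typeof key); // "string" — use Number(key) if you need the int back
}
```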

Are fields marked optional?

No. proto3 fields are always present in JSON output (with defaults — empty string, 0, false, [], {}), so the TypeScript interface marks every field as required. If you actually want optional fields (because your runtime might omit them), add ? manually after each field name.

Does it handle oneof?

Each oneof field is emitted as a regular interface field. The output does not enforce the "exactly one" constraint that oneof implies — for that you would need a discriminated union, which depends on your runtime semantics. Edit the output by hand if you need stricter types.
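If you do want the constraint enforced, a hand-written discriminated union is the usual shape. Everything below (the payment fields, the kind discriminant) is a hypothetical illustration, not converter output:

```typescript
// Roughly what the converter emits — both members are plain sibling fields,
// so nothing stops you from setting both:
interface PaymentLoose {
  cardNumber?: string;
  iban?: string;
}

// Hand-tightened: a discriminant makes "exactly one" a compile-time guarantee.
type Payment =
  | { kind: 'card'; cardNumber: string }
  | { kind: 'iban'; iban: string };

function describe(p: Payment): string {
  // Narrowing on `kind` gives access to exactly the fields of that variant.
  return p.kind === 'card' ? `card ${p.cardNumber}` : `iban ${p.iban}`;
}

console.log(describe({ kind: 'iban', iban: 'DE00TEST' })); // "iban DE00TEST"
```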

Related tools

If you are working with Protobuf, JSON, and TypeScript, these pair well: