Input (.proto schema)

Output (Go)

What this tool does

Go is the protobuf-native language — most gRPC services in production are written in it. Normally you generate Go from .proto using protoc-gen-go or buf, which means installing the toolchain, configuring a generator, and running a build step. This converter does the same job in your browser — paste, copy, drop into your repo.

Type mapping follows what protoc-gen-go emits: string → string, bool → bool, bytes → []byte, the integer types map to their int32/int64/uint32/uint64 equivalents (no precision loss like in JavaScript), double → float64, float → float32. Singular message fields are pointers (matching the official Go protobuf bindings convention), repeated T becomes []T, and map<K, V> becomes map[K]V.

Field names get the Go-canonical PascalCase treatment, with common acronyms uppercased (order_id → OrderID, api_url → APIURL) per Go review style. Each field gets paste-ready struct tags: protobuf:"varint,3,opt,name=status,proto3" for the wire format and json:"status,omitempty" for JSON marshalling. Conversion is local — your .proto never leaves the browser. For production code you should still run real codegen so you get methods, descriptors, and reflection plumbing — but for sketches, reviews, and one-off scripts this is faster.

How to use it

Three steps. Output is paste-ready Go that compiles as-is with the standard library.

1

Paste your .proto schema

Drop the schema into the left editor. syntax = "proto3"; at the top is fine but optional. The parser handles nested message blocks, enum declarations, oneof, map<K, V>, and field options. Imports are recognised but skipped.
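A made-up schema exercising the constructs listed above (nested messages, an enum, oneof, a map field) looks like this:

```
syntax = "proto3";            // fine, but optional

message User {
  string user_id = 1;
  map<string, string> labels = 2;

  message Address {           // nested message block
    string city = 1;
  }
  Address address = 3;

  oneof contact {             // oneof declaration
    string email = 4;
    string phone = 5;
  }
}

enum Role {
  ROLE_UNSPECIFIED = 0;
  ROLE_ADMIN = 1;
}
```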

Field names auto-convert from snake_case to PascalCase. The converter uppercases common acronym suffixes (Id → ID, Url → URL) so the output passes revive / golint without complaints.

2

Read the output

On the right: Go with one type X struct per message and type X int32 + const block per enum. package proto at the top is a placeholder — change it to your actual package name.
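For the enum side of the output, here is a sketch under an invented Status enum; the converter's described convention is an int32 typedef plus a PascalCase const block carrying the numbers straight from the .proto:

```go
package main

import "fmt"

// Hypothetical input:
//
//   enum Status {
//     STATUS_UNSPECIFIED = 0;
//     STATUS_ACTIVE = 1;
//     STATUS_DISABLED = 2;
//   }
//
// Emitted as an int32 typedef with a const block:
type Status int32

const (
	StatusUnspecified Status = 0
	StatusActive      Status = 1
	StatusDisabled    Status = 2
)

func main() {
	fmt.Println(StatusActive) // 1
}
```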

3

Wire it into your project

Drop the file into your project, fix the package declaration, and import. For real gRPC code you will still want to run protoc-gen-go to get the marshal/unmarshal methods. This output is intended for typed JSON handling, struct sketches, and reviews — the protobuf wire-format methods are not generated here.

When this actually saves time

Sketching a Go service from an existing .proto

You are scaffolding a Go service that consumes Protobuf messages from another team. You want the struct shapes for handler signatures and JSON responses without setting up the full codegen pipeline. Paste, drop into types.go, you are typed.

Reviewing a Protobuf API change for a Go consumer

A backend teammate added fields to a message. Paste the new .proto, diff the Go output against your current types.go, leave a focused review. Faster than spinning up the toolchain just to look at the change.

Cross-language sanity check

You have .proto consumed by both Go and TypeScript clients. Use this side-by-side with the Protobuf to TypeScript converter to confirm both languages will see compatible field names and types after JSON encoding.

One-off integration scripts

You are writing a 50-line Go script that hits a gRPC-gateway endpoint. Setting up protoc, buf, and a generator config for one script is overkill. Generate the structs here, drop them in, ship the script.

Common questions

Is this a replacement for protoc-gen-go?

No. protoc-gen-go emits the binary marshal/unmarshal methods, file descriptors, and the reflection plumbing needed for real gRPC. This converter only emits the struct shapes and tags. If you are writing a real gRPC service, run the official codegen. If you just need types for JSON responses, hand-rolled scripts, or sketches — this is faster.

Why are message-typed fields pointers?

It mirrors what protoc-gen-go does for proto3 message fields — they are pointers so the zero value is nil (distinguishable from a present-but-empty message). Scalar fields stay as values because their zero values are themselves valid (empty string, 0, false). If you prefer non-pointer for some reason, find-replace the asterisks in the output.
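The nil-versus-empty distinction is easy to see in plain Go; the User and Profile types here are invented, but the pointer behaviour is exactly what the answer above describes:

```go
package main

import "fmt"

type Profile struct {
	Bio string `json:"bio,omitempty"`
}

type User struct {
	Name    string   `json:"name,omitempty"`    // scalar: zero value "" is valid on its own
	Profile *Profile `json:"profile,omitempty"` // message: pointer, so the zero value is nil
}

func main() {
	var u User
	// nil distinguishes "absent" from "present but empty".
	fmt.Println(u.Profile == nil) // true: no profile at all
	u.Profile = &Profile{}
	fmt.Println(u.Profile == nil) // false: present, just empty
}
```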

How are enums emitted?

As Go int32 typedefs with a const block, matching protoc-gen-go conventions. Each enum value becomes a PascalCase Go constant of that type. The numeric assignments come straight from the .proto.

What about the protobuf struct tags?

Each field gets a protobuf:"..." tag with the wire type (varint, fixed32, fixed64, bytes), field number, label (opt/rep), name, and proto3 marker. Plus a json:"name,omitempty" tag using the original snake_case name. map<K, V> tags are simplified — for strict wire-format compatibility, run real codegen.

How are field names converted?

snake_case → PascalCase, with common acronym suffixes uppercased: order_id → OrderID, api_url → APIURL, data_json → DataJSON. This matches Go review-style conventions so the output passes lint without manual cleanup.

What if my schema imports another .proto?

import statements are recognised and skipped — cross-file message types render with their leaf name (foo.Bar → Bar), which Go will not resolve unless that type also exists in the same package. Either paste the imported messages inline, or expect to fix the references in the output.

Related tools

If you are working with Protobuf, JSON, and Go, these pair well: