Protobuf to C# Converter
Paste a .proto schema. Get C# classes with auto-properties — drop them straight into a project, no protoc plugin required.
Input (.proto schema)
Output (C#)
What this tool does
You have a Protocol Buffers schema and a C# service or client that needs the matching DTOs. The official path is to install protoc with the C# plugin (or wire Grpc.Tools into your .csproj) and let MSBuild generate the partial classes during build. That works, but it is overkill when you just want to read the schema, sketch types, or paste a message into a Razor page or a one-off integration. This converter covers that lighter case: paste the schema, copy the classes, drop them into Types.cs.
Type mapping is the boring-but-correct kind. string stays string (with = "" initialiser so nullable-reference projects do not complain). bool, int32, int64, uint32, uint64, float, double map to bool, int, long, uint, ulong, float, double respectively. bytes becomes byte[]. repeated T becomes List<T>, map<K, V> becomes Dictionary<K, V>, and well-known wrappers like google.protobuf.Timestamp are emitted as string (proto3 JSON encodes timestamps as RFC 3339 strings, so once the message is JSON-encoded a timestamp is just a string — see the proto3 JSON spec if you need the details).
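The scalar mapping above boils down to a lookup table plus a List<T> wrapper for repeated fields. A minimal sketch in TypeScript (the language the converter runs in) — the names SCALAR_MAP and csharpType are illustrative, not the tool's actual internals:

```typescript
// Illustrative scalar-type table mirroring the mapping described above.
const SCALAR_MAP: Record<string, string> = {
  string: "string",
  bool: "bool",
  int32: "int",
  int64: "long",
  uint32: "uint",
  uint64: "ulong",
  float: "float",
  double: "double",
  bytes: "byte[]",
  "google.protobuf.Timestamp": "string", // RFC 3339 string in proto3 JSON
};

// Resolve a field's C# type, wrapping repeated fields in List<T>.
// Unknown names are assumed to be message/enum types and pass through.
function csharpType(protoType: string, repeated = false): string {
  const base = SCALAR_MAP[protoType] ?? protoType;
  return repeated ? `List<${base}>` : base;
}
```

map<K, V> fields work the same way, resolving each of K and V through the table before emitting Dictionary<K, V>.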
Field names get the standard PascalCase treatment — order_id → OrderId, shipping_address → ShippingAddress — matching what the official C# protobuf reference generates. Enum values lose the SCREAMING_SNAKE prefix when the schema follows the convention of prefixing each value with the enum name (so ORDER_STATUS_PENDING in enum OrderStatus becomes OrderStatus.Pending). Output classes are flat — every nested message is hoisted to top level so you can split or rearrange them without untangling scopes. The conversion happens entirely in your browser; nothing about the schema is uploaded.
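The snake_case-to-PascalCase step is simple to sketch. This is not the converter's source, just an illustrative TypeScript function (snakeToPascal is a made-up name) showing the transformation the paragraph describes:

```typescript
// order_id -> OrderId, shipping_address -> ShippingAddress
function snakeToPascal(name: string): string {
  return name
    .split("_")
    .filter((part) => part.length > 0)   // tolerate stray underscores
    .map((part) => part[0].toUpperCase() + part.slice(1).toLowerCase())
    .join("");
}
```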
How to use it
Three steps. The output is ready to compile — paste it into a Types.cs file in your project.
Paste your .proto schema
Drop the schema into the left editor. syntax = "proto3"; at the top is optional. The parser handles nested message blocks, enum declarations, oneof, map<K, V>, field options, and the usual package/import/option directives. Imports are recognised but skipped, so paste any imported types inline if your schema depends on them.
Field name conversion is automatic: customer_name in .proto becomes CustomerName in C#. Class and enum names stay as-is (already PascalCase by convention).
Read the output
On the right: public class declarations with { get; set; } auto-properties for each message, plus public enum declarations for each enum. A using System.Collections.Generic; line is added when the schema uses repeated or map fields (so List and Dictionary resolve). Drop the file into a project, wrap it in your namespace, and you are done.
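Each property line the output contains follows one template: mapped C# type, PascalCased name, auto-property, plus an = "" initialiser for strings. A sketch of that emit step in TypeScript — emitProperty and the Field shape are hypothetical names for illustration, not the tool's real code:

```typescript
interface Field {
  name: string;   // snake_case proto field name, e.g. "customer_name"
  csType: string; // already-mapped C# type, e.g. "string" or "List<string>"
}

// Emit one auto-property line matching the output shape described above.
// Strings get an = "" initialiser so nullable-reference projects stay quiet.
function emitProperty(f: Field): string {
  const pascal = f.name
    .split("_")
    .filter((p) => p.length > 0)
    .map((p) => p[0].toUpperCase() + p.slice(1).toLowerCase())
    .join("");
  const init = f.csType === "string" ? ' = "";' : "";
  return `public ${f.csType} ${pascal} { get; set; }${init}`;
}
```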
Wire it up
For pure DTO use, the classes are ready to serialise with System.Text.Json or Newtonsoft.Json. For real gRPC C# work — service implementations, streaming, deadlines — keep using Grpc.Tools for the wire-format types, and use this converter for hand-written wrappers, mapping layers, or test fixtures alongside the generated code.
When this actually saves time
Sketching DTOs for a new gRPC service
You are starting an ASP.NET Core gRPC service and want to read what the messages will look like in C# before committing to Grpc.Tools. Paste the .proto, eyeball the classes, decide if the field naming matches your team's style, then wire up codegen properly.
Hand-written mapping layer
Your generated gRPC types live in their own namespace and you want plain DTOs for your domain layer. The output here gives you clean classes without the metadata clutter of generated code — easy to map into and out of with AutoMapper or hand-rolled converters.
Reviewing a Protobuf schema change
A teammate added fields to a message in a PR. You want to see the C# shape implications without checking out the branch and running a build. Paste the new schema, diff against the current C# types, leave a focused review comment.
Test fixtures and quick scripts
You are writing a one-off LinqPad script or a console app that POSTs to a gRPC-gateway. Wiring up the full Protobuf toolchain for 50 lines of test code is overkill. Grab the classes from here, JSON-serialise them, send the request, move on.
Common questions
Is my schema sent anywhere?
No. The parser and C# emitter run entirely in your browser as JavaScript. Open DevTools and watch the Network tab while you paste — zero requests. Useful when your schema includes internal type names, package paths, or anything you would not want to ship to a third-party service.
Do these classes work with Grpc.Tools generated code?
They produce equivalent shapes, but they are not byte-for-byte identical. Grpc.Tools generates partial classes, parser registrations, descriptor wiring, and an implementation of Google.Protobuf.IMessage — none of that is here. For real gRPC wire protocol work, use Grpc.Tools. For DTO-only code (e.g. JSON-over-HTTP gateways, mapping layers, test data), this output is fine.
Why are int64 and uint64 typed as long and ulong instead of string?
Because in C# they fit. Unlike JavaScript Numbers (which lose precision above 2^53), C# long handles the full int64 range natively, so there is no reason to fall back to string. If you are deserialising proto3 JSON where 64-bit ints arrive as strings (per the JSON mapping spec), System.Text.Json with JsonNumberHandling.AllowReadingFromString handles the conversion.
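The precision cliff that forces proto3 JSON to use strings is easy to demonstrate in JavaScript/TypeScript. Nothing below is part of the converter; it just shows why the string fallback exists in JS and why C# long does not need it:

```typescript
// JS Numbers are IEEE-754 doubles: above 2^53, adjacent integers
// are no longer representable, so int64 values would silently round.
const limit = Number.MAX_SAFE_INTEGER; // 2^53 - 1 = 9007199254740991
const above = 2 ** 53;                 // first unsafe integer

console.log(above === above + 1);         // true — two values, one double
console.log(Number.isSafeInteger(limit)); // true
console.log(Number.isSafeInteger(above)); // false
```

C# long is a true 64-bit integer, so the full int64 range round-trips without any of this.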
How does it handle the SCREAMING_SNAKE enum value convention?
The Protobuf style guide recommends prefixing every enum value with the enum name in SCREAMING_SNAKE form. The converter detects this and strips it. ORDER_STATUS_UNSPECIFIED in enum OrderStatus becomes OrderStatus.Unspecified. If your enum does not follow that convention (some values prefixed, some not), the prefix-strip is skipped and values are PascalCased as-is.
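The all-or-nothing rule can be sketched as: derive the expected prefix from the enum name, strip it only if every value carries it, then PascalCase whatever remains. A TypeScript illustration (stripEnumPrefix is a hypothetical name; the tool's real logic may differ in detail):

```typescript
// Strip the SCREAMING_SNAKE enum-name prefix only when every value has it.
// "OrderStatus" -> expected prefix "ORDER_STATUS_"
function stripEnumPrefix(enumName: string, values: string[]): string[] {
  const prefix =
    enumName.replace(/([a-z0-9])([A-Z])/g, "$1_$2").toUpperCase() + "_";
  const allPrefixed = values.every((v) => v.startsWith(prefix));
  return values.map((v) => {
    const body = allPrefixed ? v.slice(prefix.length) : v;
    return body
      .split("_")
      .filter((p) => p.length > 0)
      .map((p) => p[0].toUpperCase() + p.slice(1).toLowerCase())
      .join("");
  });
}
```

With a mixed enum, nothing is stripped: ORDER_STATUS_PENDING alongside SHIPPED yields OrderStatusPending and Shipped.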
Does it handle oneof?
Each oneof field is emitted as a regular auto-property. The output does not enforce the "exactly one" constraint that oneof implies — the C# language does not have native discriminated unions yet. If you need stricter modelling, look at libraries like OneOf on NuGet, or hand-edit the output to use a base class plus subclasses.
What about google.protobuf.Timestamp and well-known types?
Timestamp is emitted as string (proto3 JSON encodes timestamps as RFC 3339 strings). Empty and Any become object placeholders — adjust to Google.Protobuf.WellKnownTypes.Any or a stronger type if your project pulls in the WKT package. The output is intentionally framework-free so you can drop it into any C# project without forcing a NuGet dependency.
Related tools
If you are juggling Protobuf, JSON, and C#, these pair well: