The first time I shipped a JSON API on
Cloudflare Workers,
I deployed it from my laptop at 11pm and had it running in 300+ data centers before I finished my tea.
No Dockerfile, no Kubernetes cluster, no cold-start drama. A single wrangler deploy
and a 1.2 KB bundle. That experience is why Workers has become my default for JSON-in, JSON-out services —
webhooks, proxies, API aggregators, edge auth. If 80% of your backend is "parse JSON, do a thing, return JSON",
this is the article I wish I'd had when I started.
A Cloudflare Worker is basically a single JavaScript (or TypeScript) function that runs on V8 isolates at Cloudflare's edge. It receives a Request, returns a Response, and has access to the standard Fetch API. If you've used fetch() in a browser, you already know 90% of the runtime. What you don't know is the small set of patterns that separate a toy Worker from one you can actually run in production. That's what this article is about.
Your First JSON Endpoint
Here's the smallest useful JSON endpoint. It returns a single object with a timestamp and a
message. Save it as src/index.ts in a Wrangler project:
export default {
  async fetch(request, env, ctx) {
    const payload = {
      message: 'Hello from the edge',
      servedAt: new Date().toISOString(),
      colo: request.cf?.colo ?? 'unknown',
    };
    return Response.json(payload);
  },
};

Two things to notice. First, Response.json() is a static helper that serializes
the object and sets Content-Type: application/json for you. Don't roll your own
new Response(JSON.stringify(x)) unless you need a custom content type — you'll just
forget the header eventually. Second, request.cf.colo tells you which Cloudflare
data centre is serving the request. A request from Berlin will show FRA,
from Tokyo it'll show NRT. That's the whole "edge" pitch in one field.
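One more detail worth knowing: Response.json() accepts the same init object as the Response constructor, so you can set a status code or extra headers without losing the JSON content type. A quick sketch (the id and Location path are invented for illustration):

```javascript
// Response.json(data, init) takes the same init as new Response().
// The JSON content type is set for you; extra headers are merged in.
const created = Response.json(
  { id: 'usr_123', ok: true }, // 'usr_123' is a made-up example id
  { status: 201, headers: { 'Location': '/users/usr_123' } },
);
// created.status is 201, and Content-Type is still application/json
```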
Parsing a JSON Request Body
POST endpoints need to read a body. The Fetch API gives you request.json(), which reads the body stream and parses it in one call:
export default {
  async fetch(request) {
    if (request.method !== 'POST') {
      return new Response('Method not allowed', { status: 405 });
    }
    const body = await request.json();
    // body is now a regular JavaScript object
    const { email, plan } = body;
    return Response.json({
      received: { email, plan },
      ok: true,
    });
  },
};

Looks clean, but this code has a bug you'll hit within 24 hours of shipping it:
if the client sends an empty body or malformed JSON, request.json() throws
a SyntaxError, your Worker crashes, and Cloudflare returns a generic 500.
That's not the response you want in front of customers.
Handling Malformed JSON — Don't Let It 500
Always wrap body parsing in a try/catch and return a proper 400. Here's the pattern I use in every Worker:
async function readJson(request) {
  try {
    return { ok: true, data: await request.json() };
  } catch (err) {
    return {
      ok: false,
      error: 'Invalid JSON body',
      detail: err.message,
    };
  }
}

export default {
  async fetch(request) {
    const result = await readJson(request);
    if (!result.ok) {
      return Response.json(result, { status: 400 });
    }
    const { email, plan } = result.data;
    if (!email || !plan) {
      return Response.json(
        { ok: false, error: 'email and plan are required' },
        { status: 422 },
      );
    }
    return Response.json({ ok: true, email, plan });
  },
};

CORS for JSON APIs
If your Worker will be called from a browser on a different origin — which is
the normal case — you need
CORS
headers. Browsers send an OPTIONS preflight before the real request for anything
more than a simple GET. Handle both in one place:
const CORS_HEADERS = {
  'Access-Control-Allow-Origin': 'https://app.example.com',
  'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
  'Access-Control-Allow-Headers': 'Content-Type, Authorization',
  'Access-Control-Max-Age': '86400',
};

export default {
  async fetch(request) {
    if (request.method === 'OPTIONS') {
      return new Response(null, { status: 204, headers: CORS_HEADERS });
    }
    const data = { ping: 'pong', at: Date.now() };
    return Response.json(data, { headers: CORS_HEADERS });
  },
};

Avoid Access-Control-Allow-Origin: * on anything that reads credentials
or returns user data. It's one of those shortcuts that looks harmless in dev and turns into
a security incident in prod. Hard-code the origins you actually serve, or read them from an
allow-list in env.
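One way to do that allow-list, as a sketch: assume a comma-separated ALLOWED_ORIGINS variable (the name is mine; set it via wrangler.toml vars or a secret) and echo back only an Origin that matches.

```javascript
// Sketch: build CORS headers from an allow-list in env.
// ALLOWED_ORIGINS is an assumed var, e.g.
// "https://app.example.com,https://admin.example.com".
function corsHeaders(request, env) {
  const origin = request.headers.get('Origin');
  const allowed = (env.ALLOWED_ORIGINS ?? '')
    .split(',')
    .map((s) => s.trim())
    .filter(Boolean);
  if (!origin || !allowed.includes(origin)) {
    return {}; // no CORS headers: the browser will block the cross-origin read
  }
  return {
    'Access-Control-Allow-Origin': origin,
    // Vary tells caches that the response differs per Origin
    'Vary': 'Origin',
    'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
    'Access-Control-Allow-Headers': 'Content-Type, Authorization',
  };
}
```

The Vary: Origin header matters once responses get cached: without it, a response stamped with one origin could be served to a different one.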
Forwarding JSON to an Upstream API
One of the most common uses of a Worker is as a thin proxy: hide an API key, reshape a response, strip fields a client doesn't need, or stitch two upstream calls into one. Here's a Worker that calls an upstream service, picks out just the fields we care about, and returns a cleaner JSON payload:
export default {
  async fetch(request, env) {
    const url = new URL(request.url);
    const userId = url.searchParams.get('id');
    if (!userId) {
      return Response.json({ error: 'id required' }, { status: 400 });
    }

    const upstream = await fetch(
      `https://api.internal.example.com/users/${userId}`,
      {
        headers: { 'Authorization': `Bearer ${env.UPSTREAM_TOKEN}` },
      },
    );
    if (!upstream.ok) {
      return Response.json(
        { error: 'upstream failed', status: upstream.status },
        { status: 502 },
      );
    }

    const full = await upstream.json();
    // Strip internal fields before returning to the client
    const safe = {
      id: full.id,
      displayName: full.display_name,
      avatarUrl: full.avatar_url,
      joinedAt: full.created_at,
    };
    return Response.json(safe);
  },
};

Two things to pay attention to. First, always check upstream.ok
before calling .json() — a 500 from upstream usually carries an HTML error page,
and calling .json() on it throws the same way as any other malformed JSON.
Second, keep secrets like UPSTREAM_TOKEN in Wrangler secrets
(wrangler secret put UPSTREAM_TOKEN) — never in wrangler.toml
and never committed to git.
Caching JSON Responses at the Edge
When an upstream is slow or expensive, the Cache API lets you memoize JSON at the edge. Each data centre keeps its own cache, so the first user in Frankfurt pays the upstream cost, and the next 10,000 get it in under 5ms from RAM nearby:
export default {
  async fetch(request, env, ctx) {
    const cache = caches.default;
    const cacheKey = new Request(request.url, request);

    let response = await cache.match(cacheKey);
    if (response) {
      return response;
    }

    const upstream = await fetch('https://api.example.com/popular-items');
    if (!upstream.ok) {
      // Don't cache failures; surface them instead
      return Response.json({ error: 'upstream failed' }, { status: 502 });
    }
    const data = await upstream.json();

    response = Response.json(data, {
      headers: {
        'Cache-Control': 'public, max-age=60',
      },
    });

    // Don't block the response on the cache write
    ctx.waitUntil(cache.put(cacheKey, response.clone()));
    return response;
  },
};

The ctx.waitUntil() is the non-obvious part. Without it you have two bad options: await
cache.put() and make every cache-miss response wait on a write the client doesn't care
about, or fire it unawaited and risk the runtime cancelling the write once the response
returns. waitUntil lets you return the response immediately while the runtime keeps the
cache write alive in the background. It's the same pattern you'd use for analytics
beacons, log forwarding, anything fire-and-forget.
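As a sketch of that fire-and-forget use, here is log forwarding through waitUntil; the collector URL and payload shape are invented for illustration:

```javascript
// Sketch: forward a log entry without delaying the response.
// https://logs.example.com/ingest is a hypothetical collector endpoint.
function forwardLog(ctx, entry) {
  const task = fetch('https://logs.example.com/ingest', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ ...entry, at: Date.now() }),
  }).catch(() => {
    // Swallow logging failures: they must never break the response path
  });
  ctx.waitUntil(task); // keeps the request alive until the POST settles
}

export default {
  async fetch(request, env, ctx) {
    const response = Response.json({ ok: true });
    forwardLog(ctx, {
      path: new URL(request.url).pathname,
      status: response.status,
    });
    return response; // returned immediately; the log POST finishes in the background
  },
};
```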
Local Development with Wrangler
You don't need a Cloudflare account to iterate. Install Wrangler, scaffold a project, and you get a local Workers runtime that matches production closely:
npm create cloudflare@latest my-json-api
cd my-json-api
npm run dev

# Worker is now live at http://localhost:8787
# Hit it from another terminal:
curl -X POST http://localhost:8787 \
  -H "Content-Type: application/json" \
  -d '{"email":"[email protected]","plan":"pro"}'

The local runtime uses workerd, the same engine Cloudflare runs in production.
The behaviours that differ (KV latency, cache semantics, request.cf fields)
are well-documented and rarely bite you for simple JSON APIs. Deploy with
wrangler deploy and the same code is live globally in seconds.
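For reference, the generated wrangler.toml is small; a minimal sketch with placeholder values (the scaffold's actual output varies by template version) looks like:

```toml
# Minimal Worker config (illustrative values)
name = "my-json-api"
main = "src/index.ts"
compatibility_date = "2024-01-01"

[vars]
# Non-secret config lives here; secrets go through `wrangler secret put`
ALLOWED_ORIGINS = "https://app.example.com"
```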
Useful Tools for Building Worker JSON APIs
A few tools I reach for constantly when building Workers that deal with JSON: JSON Formatter to pretty-print an ugly upstream response I'm trying to reverse-engineer, JSON Validator when a POST payload fails and I need to know exactly where, JSON Path to plan the field-picking logic before I write it, and JSON Minifier when I want to check whether the wire size actually matters for a given endpoint.
The JSON format itself is specified in
RFC 8259
— worth a skim if you ever hit an edge case like "does my parser allow NaN?"
(answer: it shouldn't). Cloudflare's own
Workers examples gallery
has recipes for JWT verification, A/B testing, HTML rewriting, and a dozen other patterns
once you've outgrown the basics in this article.
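That NaN edge case is easy to confirm from any JavaScript runtime, since JSON.parse and JSON.stringify follow the RFC's grammar:

```javascript
// RFC 8259 has no NaN or Infinity literals, so JSON.parse rejects them...
let parseError = null;
try {
  JSON.parse('{"value": NaN}');
} catch (err) {
  parseError = err; // SyntaxError
}

// ...and JSON.stringify writes them out as null rather than emit invalid JSON
const wire = JSON.stringify({ value: NaN });
// wire === '{"value":null}'
```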
Wrapping Up
Cloudflare Workers is a very good fit for JSON APIs — small, fast, globally
distributed, and cheap enough that you can leave side projects running. The happy path is
just request.json() and Response.json(), but the production path
involves four extra habits: wrap body parsing in try/catch, add CORS headers intentionally,
check upstream.ok before parsing proxied responses, and use
ctx.waitUntil for cache writes and other background work. Get those four right
and you'll ship Workers that stay up.