If you've been writing JavaScript for more than a few months, you've felt the pain of deeply nested callbacks and tangled `.then()` chains. `async/await`, introduced in ES2017, fixed all of that, and yet developers still run into the same three or four traps with it. Let's go through the whole thing properly: how it works, how to handle errors well, and the parallel execution patterns that actually matter for performance.
## The Basics: async Functions and await
An `async` function always returns a Promise. Inside it, `await` pauses that function's execution (the rest of the program keeps running) until the awaited Promise settles. That's the whole mental model:
```javascript
async function fetchUserProfile(userId) {
  const response = await fetch(`https://api.example.com/users/${userId}`);
  const profile = await response.json();
  return profile; // wrapped in a Promise automatically
}

// Calling an async function gives you a Promise
const profilePromise = fetchUserProfile(42);
profilePromise.then(profile => console.log(profile.name));

// Or use await at the call site
const profile = await fetchUserProfile(42);
console.log(profile.name);
```

The `await` keyword can only be used inside an `async` function, or at the top level of an ES module (more on that later). Using it anywhere else is a syntax error.
## Error Handling: The Right Way and the Wrong Way
This is where most tutorials go off the rails. The instinct is to wrap everything in try/catch and call it done. That works, but empty catch blocks are a code smell that masks real bugs:
```javascript
// ❌ Don't do this — silent failure, impossible to debug
async function loadConfig() {
  try {
    const res = await fetch('/api/config');
    return await res.json();
  } catch (err) {
    // swallowed — you'll never know what broke
  }
}
```
```javascript
// ✅ Do this — handle errors explicitly, return something meaningful
async function loadConfig() {
  try {
    const res = await fetch('/api/config');
    if (!res.ok) {
      throw new Error(`Config fetch failed: ${res.status} ${res.statusText}`);
    }
    return await res.json();
  } catch (err) {
    console.error('loadConfig error:', err.message);
    return null; // caller can check for null
  }
}
```

I prefer a pattern where async functions either return `null` on failure or throw intentional errors. What I avoid is catching an error, logging it, and then returning a value that makes the caller think the request succeeded.
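Another convention some teams use resolves to `[error, data]` tuples instead of throwing, in the spirit of the community await-to-js pattern. A minimal sketch of such a wrapper (the `to` name and the `loadConfigTuple` usage are illustrative):

```javascript
// Wrap a promise so it never throws; resolve to [error, data] instead
function to(promise) {
  return promise
    .then(data => [null, data])
    .catch(err => [err, null]);
}

// Usage: destructure and branch, no try/catch needed
async function loadConfigTuple() {
  const [err, res] = await to(fetch('/api/config'));
  if (err) {
    console.error('loadConfig error:', err.message);
    return null;
  }
  return res.json();
}
```

The error path is now an ordinary `if` check, which keeps the happy path flat.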
A helper like `await to(promise)` that returns `[error, data]` tuples gives you the same explicitness without a try/catch at every call site.

## Sequential vs Parallel: The Performance Trap
This is the mistake I see most often in production code. When you await
each call one after the other, you're running them sequentially — even when they're completely
independent of each other:
```javascript
// ❌ Sequential — takes ~900ms total (300 + 300 + 300)
async function loadDashboard(userId) {
  const user = await fetchUser(userId);         // 300ms
  const orders = await fetchOrders(userId);     // 300ms
  const settings = await fetchSettings(userId); // 300ms
  return { user, orders, settings };
}

// ✅ Parallel with Promise.all — takes ~300ms total
async function loadDashboard(userId) {
  const [user, orders, settings] = await Promise.all([
    fetchUser(userId),
    fetchOrders(userId),
    fetchSettings(userId)
  ]);
  return { user, orders, settings };
}
```

`Promise.all()` fires all three requests at the same moment and waits for all of them to complete. If any one of them rejects, the whole thing rejects. That's usually what you want for dashboard-style loading where all data is required.
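That fail-fast behavior also means a single try/catch around the `Promise.all` covers any of the three failing. A sketch with stand-in fetchers (the fetcher names and failure are illustrative):

```javascript
// Stand-in fetchers: one rejects to demonstrate fail-fast
const fetchUser = () => Promise.resolve({ id: 1 });
const fetchOrders = () => Promise.reject(new Error('orders service down'));
const fetchSettings = () => Promise.resolve({ theme: 'dark' });

async function loadDashboardSafe() {
  try {
    const [user, orders, settings] = await Promise.all([
      fetchUser(), fetchOrders(), fetchSettings()
    ]);
    return { user, orders, settings };
  } catch (err) {
    // Promise.all rejects with the first rejection it sees
    return { error: err.message };
  }
}
```

One catch block, one error path, regardless of which request failed.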
## Promise.allSettled: When Partial Failure Is OK
Sometimes you want to fire multiple requests and use whatever comes back successfully,
even if some fail. Promise.allSettled()
is built for exactly that:
```javascript
async function loadWidgets(widgetIds) {
  const results = await Promise.allSettled(
    widgetIds.map(id => fetchWidget(id))
  );

  const widgets = [];
  const errors = [];
  for (const result of results) {
    if (result.status === 'fulfilled') {
      widgets.push(result.value);
    } else {
      errors.push(result.reason.message);
    }
  }

  if (errors.length > 0) {
    console.warn('Some widgets failed to load:', errors);
  }
  return widgets; // return whatever succeeded
}
```

This pattern is great for non-critical UI elements, like a sidebar with multiple independent sections. If one fails, you show the rest instead of blanking the whole page.
## async in Loops: The forEach Gotcha
This one has burned everyone at least once. Array.forEach() does not
await async callbacks — it fires them and immediately moves on. The loop finishes before any
of the async work is done:
```javascript
const orderIds = [101, 102, 103, 104];

// ❌ forEach ignores async — all requests fire in parallel uncontrolled,
// and code after the forEach runs before any complete
orderIds.forEach(async (id) => {
  await processOrder(id); // NOT awaited by forEach
});
console.log('done?'); // prints before any order is processed

// ✅ for...of — sequential, fully awaited
for (const id of orderIds) {
  await processOrder(id);
}
console.log('done'); // prints after all orders are processed

// ✅ Parallel but controlled — all fire at once, await all completions
await Promise.all(orderIds.map(id => processOrder(id)));
console.log('done'); // prints after all orders are processed
```

Use `for...of` when order matters or when you need to throttle requests (process one at a time). Use `Promise.all(map(...))` when you want maximum parallelism and don't need sequential guarantees.
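Between those two extremes you can cap concurrency: run at most N tasks in flight at once. A minimal pool sketch with no library (the `mapWithLimit` name is illustrative):

```javascript
// Run task over items with at most `limit` tasks in flight at a time
async function mapWithLimit(items, limit, task) {
  const results = new Array(items.length);
  let next = 0;

  // Each worker claims the next unclaimed index until items run out
  async function worker() {
    while (next < items.length) {
      const i = next++;
      results[i] = await task(items[i]);
    }
  }

  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results; // same order as the input
}
```

For example, `await mapWithLimit(orderIds, 2, processOrder)` processes at most two orders at a time while preserving result order, a useful middle ground when the API you're calling rate-limits you.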
## Top-Level await in ES Modules
Since ES2022, you can use await at the top level of an ES module — no
wrapper function needed. This is a big deal for module initialization that depends on async data:
```javascript
// config.js (ES module)
const response = await fetch('/api/runtime-config');
const config = await response.json();

export const API_BASE_URL = config.apiBaseUrl;
export const FEATURE_FLAGS = config.featureFlags;
```

```javascript
// main.js — imports wait for config.js to fully resolve
import { API_BASE_URL, FEATURE_FLAGS } from './config.js';

console.log(API_BASE_URL); // guaranteed to be loaded
```

Top-level await works in Node.js 14.8+ with `"type": "module"` in package.json, and in all modern browsers via native ES modules. The importing module's execution is paused until the awaited module fully resolves, which is exactly the guarantee you need.
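If you're stuck in CommonJS (or targeting an older bundler) where top-level await isn't available, the usual workaround is to wrap startup in an async function. A sketch with the config loader injected so it's easy to test (the `bootstrap` name and shape are illustrative):

```javascript
// CommonJS fallback: no top-level await, so wrap startup in an async call
function bootstrap(loadConfig) {
  return (async () => {
    const config = await loadConfig(); // e.g. fetch + res.json()
    return { started: true, config };  // hand off to your app here
  })();
}

// In a real module you'd invoke it once at the top level:
// bootstrap(() => fetch('/api/runtime-config').then(r => r.json()))
//   .catch(err => { console.error('Startup failed:', err); process.exit(1); });
```

The trailing `.catch` matters: an unhandled rejection at startup will crash modern Node anyway, so catch it and fail with a useful message.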
## A Real Pipeline: Fetch, Parse, and Transform
Here's a realistic async pipeline that combines everything: fetching from an API, handling HTTP errors, transforming the data, and falling back gracefully on failure:
```javascript
async function getProductCatalog(categoryId) {
  // Step 1: fetch raw data
  const response = await fetch(
    `https://api.shop.example.com/categories/${categoryId}/products`,
    { headers: { Authorization: `Bearer ${getAuthToken()}` } }
  );
  if (!response.ok) {
    throw new Error(`Catalog fetch failed: ${response.status}`);
  }

  // Step 2: parse JSON
  const raw = await response.json();

  // Step 3: filter out anything that's been discontinued
  // (do this on the raw items, which actually carry the flag)
  const available = raw.items.filter(item => !item.discontinued);

  // Step 4: transform into the shape your UI needs
  return available.map(item => ({
    id: item.product_id,
    name: item.display_name,
    price: (item.price_cents / 100).toFixed(2),
    inStock: item.inventory_count > 0,
    imageUrl: item.media?.[0]?.url ?? '/images/placeholder.png'
  }));
}
```

```javascript
// Usage
try {
  const catalog = await getProductCatalog('electronics');
  renderProductGrid(catalog);
} catch (err) {
  showErrorBanner(`Could not load products: ${err.message}`);
}
```

## Useful Tools
When you're debugging async code that processes JSON payloads, these tools help: JSON Formatter to inspect API responses, JSON Validator to catch malformed payloads before they hit your code, and JS Formatter to clean up async function code. For the full async/await behavior, the MDN async functions guide is the most thorough reference, and the TC39 spec covers the exact semantics if you need them.
## Wrapping Up
async/await makes asynchronous JavaScript readable — but the traps
are real. Always check response.ok before parsing, never let catch blocks swallow
errors silently, use Promise.all() for independent parallel calls, and stay away
from forEach with async callbacks. Get those habits locked in and your async code
will be both fast and debuggable.