Python and JSON are a natural pair. Whether you're building a REST API with FastAPI or Django, processing data pipelines, or just reading a config file, you'll work with JSON constantly. The good news: Python's standard library has everything you need in the json module. No pip install required.
The Four Functions You Actually Use
The json module gives you four functions for day-to-day work:
- json.loads(str) — parse a JSON string into a Python object
- json.dumps(obj) — convert a Python object into a JSON string
- json.load(file) — parse JSON directly from a file object
- json.dump(obj, file) — write a Python object as JSON to a file
The s in loads / dumps stands for string.
The ones without the s work with file objects. Easy to remember once you know the rule.
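The rule is easy to verify with io.StringIO, which lets an in-memory string stand in for a file object (a quick sketch for illustration, not something you'd normally need in production):

```python
import io
import json

payload = '{"name": "Alice", "age": 30}'

# loads() works on the string directly...
from_string = json.loads(payload)

# ...while load() expects a file-like object
from_file = json.load(io.StringIO(payload))

print(from_string == from_file)  # True — same parser, different input source
```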
json.loads() — Parsing a JSON String
import json
json_string = '{"name": "Alice", "age": 30, "active": true, "score": 98.5}'
user = json.loads(json_string)
print(user["name"])    # Alice
print(user["age"])     # 30
print(user["active"])  # True
print(type(user))      # <class 'dict'>

Notice the type mapping: JSON true becomes Python True,
JSON false becomes Python False, JSON null becomes Python None.
JSON objects become Python dict, JSON arrays become Python list.
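The full mapping is easy to confirm by parsing one small document that exercises every JSON type (the keys here are just illustrative):

```python
import json

doc = '{"s": "text", "i": 7, "f": 1.5, "b": true, "n": null, "a": [1, 2], "o": {"k": "v"}}'
parsed = json.loads(doc)

print(type(parsed["s"]).__name__)  # str
print(type(parsed["i"]).__name__)  # int
print(type(parsed["f"]).__name__)  # float
print(parsed["b"], parsed["n"])    # True None
print(type(parsed["a"]).__name__)  # list
print(type(parsed["o"]).__name__)  # dict
```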
json.dumps() — Serialising to a JSON String
import json
user = {
    "name": "Bob",
    "age": 25,
    "roles": ["admin", "editor"],
    "active": True,
    "extra": None
}
# Compact (good for network transmission)
compact = json.dumps(user)
print(compact)
# {"name": "Bob", "age": 25, "roles": ["admin", "editor"], "active": true, "extra": null}
# Pretty-printed (good for logs and human inspection)
pretty = json.dumps(user, indent=2)
print(pretty)
# {
#   "name": "Bob",
#   "age": 25,
#   "roles": [
#     "admin",
#     "editor"
#   ],
#   "active": true,
#   "extra": null
# }

Notice the reverse type mapping: Python True → JSON true,
Python None → JSON null. Python handles this automatically.
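Because the two mappings mirror each other, a dumps() followed by loads() round-trips cleanly for these types — a quick sketch:

```python
import json

original = {"active": True, "extra": None, "scores": [1, 2.5, "three"]}
roundtrip = json.loads(json.dumps(original))

print(roundtrip == original)  # True
```

One caveat: JSON has no tuple type, so a Python tuple serialises to a JSON array and comes back as a list — (1, 2) round-trips to [1, 2].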
Reading JSON from a File
This is probably the most common use case — reading a config file or data file at startup:
import json
# Read and parse in one step
with open("config.json", "r", encoding="utf-8") as f:
    config = json.load(f)

print(config["database"]["host"])  # localhost
print(config["database"]["port"])  # 5432

Always specify encoding="utf-8" when opening JSON files. JSON is specified as UTF-8
by RFC 8259,
and omitting it can cause issues on Windows where the default encoding is sometimes cp1252.
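To make the point concrete, here's a minimal sketch that writes and re-reads non-ASCII text with an explicit encoding (the filename notes.json is just for illustration):

```python
import json

data = {"city": "Münich", "greeting": "こんにちは"}

# Write with explicit UTF-8 so the bytes on disk match what RFC 8259 expects
with open("notes.json", "w", encoding="utf-8") as f:
    json.dump(data, f, ensure_ascii=False)

# Read it back the same way — this works regardless of the OS default encoding
with open("notes.json", "r", encoding="utf-8") as f:
    print(json.load(f) == data)  # True
```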
Writing JSON to a File
import json
results = {
    "timestamp": "2024-01-15T09:30:00Z",
    "total": 1523,
    "processed": 1521,
    "failed": 2,
    "errors": [
        {"id": 42, "reason": "missing field"},
        {"id": 99, "reason": "invalid format"}
    ]
}

with open("results.json", "w", encoding="utf-8") as f:
    json.dump(results, f, indent=2)

print("Results saved to results.json")

Handling Errors Properly
json.loads() raises
json.JSONDecodeError
(a subclass of ValueError) when the input isn't valid JSON. Always handle it when parsing
data you don't control:
import json
def safe_parse(json_str):
    try:
        return json.loads(json_str)
    except json.JSONDecodeError as e:
        print(f"Invalid JSON at line {e.lineno}, column {e.colno}: {e.msg}")
        return None

data = safe_parse('{"name": "Alice"}')  # works fine
bad = safe_parse('not json at all')     # prints error, returns None
also_bad = safe_parse('{"key": }')      # prints error with position info

JSONDecodeError gives you the exact line and column where the parse failed,
which is useful when debugging large JSON files.
Useful dumps() Options
import json
data = {
    "z_key": 1,
    "a_key": 2,
    "price": 9.999999999
}
# Sort keys alphabetically (great for reproducible output / diffs)
print(json.dumps(data, sort_keys=True, indent=2))
# {
#   "a_key": 2,
#   "price": 9.999999999,
#   "z_key": 1
# }
# Ensure non-ASCII characters are preserved (default: escaped to \uXXXX)
data2 = {"city": "Münich", "greeting": "こんにちは"}
print(json.dumps(data2, ensure_ascii=False))
# {"city": "Münich", "greeting": "こんにちは"}
# With ensure_ascii=True (default):
print(json.dumps(data2))
# {"city": "M\u00fcnich", "greeting": "\u3053\u3093\u306b\u3061\u306f"}

ensure_ascii=False is something I always add when writing JSON files that contain non-ASCII text. The escaped version is technically valid JSON but much harder to read in a text editor.
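These options combine well. When I need byte-for-byte stable output (for hashing or change detection, say), I pair sort_keys with compact separators — a sketch of one way to do it, with helper names of my own choosing:

```python
import json
import hashlib

def canonical_json(obj):
    # sort_keys + compact separators → identical text for equal inputs
    return json.dumps(obj, sort_keys=True, separators=(",", ":"), ensure_ascii=False)

a = canonical_json({"b": 1, "a": 2})
b = canonical_json({"a": 2, "b": 1})
print(a)       # {"a":2,"b":1}
print(a == b)  # True — key order no longer matters

# Stable text means stable hashes, handy for cache keys
print(hashlib.sha256(a.encode("utf-8")).hexdigest()[:12])
```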
Serialising Custom Objects
By default, json.dumps() can't serialise custom class instances or datetime objects. You have two options: subclass json.JSONEncoder, or convert to a dict first:
import json
from datetime import datetime, date
# Option 1: custom encoder class
class AppEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, (datetime, date)):
            return obj.isoformat()
        return super().default(obj)
data = {"name": "Alice", "created_at": datetime(2024, 1, 15, 9, 30)}
print(json.dumps(data, cls=AppEncoder, indent=2))
# {
#   "name": "Alice",
#   "created_at": "2024-01-15T09:30:00"
# }
# Option 2: default= parameter (simpler for one-off conversions)
print(json.dumps(data, default=str, indent=2))  # converts anything unknown to str

A Practical Pattern: Config File Loading
Here's a real-world pattern I use in almost every Python project — a config loader that reads a JSON config file with sensible defaults:
import json
import copy
from pathlib import Path

DEFAULTS = {
    "database": {"host": "localhost", "port": 5432},
    "debug": False,
    "log_level": "INFO"
}

def load_config(path="config.json"):
    # deepcopy so merging never mutates the shared DEFAULTS dict
    config = copy.deepcopy(DEFAULTS)
    config_path = Path(path)
    if config_path.exists():
        with open(config_path, "r", encoding="utf-8") as f:
            try:
                user_config = json.load(f)
                # Merge one level deep: user settings override defaults
                for key, value in user_config.items():
                    if isinstance(value, dict) and key in config:
                        config[key].update(value)
                    else:
                        config[key] = value
            except json.JSONDecodeError as e:
                print(f"Warning: config.json is invalid ({e.msg}), using defaults")
    return config

config = load_config()
print(config["database"]["host"])  # localhost (or overridden value)

Wrapping Up
Python's json module covers everything you need without any dependencies.
The key rules: use loads()/dumps() for strings, load()/dump()
for files, always handle JSONDecodeError when parsing external data, and add
ensure_ascii=False when your data contains non-ASCII characters.
For debugging JSON data, the JSON Formatter and
JSON Validator can save you a lot of time.