Lightweight Django middleware for APM-style request profiling
Measure DB vs App time and query count with near-zero configuration.
- 🔍 DevTools visibility: see DB vs app/serialization time in Chrome DevTools via `Server-Timing`.
- 🚀 Zero-agent: no daemon, no SaaS, just one Django middleware.
- 🧩 Drop-in: Near-zero configuration (add middleware and go).
- 🔒 Privacy-first: Exposes timing + query counts only (no query contents stored).
Goal: make performance bottlenecks “visible” (DB vs app/serialization) without heavyweight APM.
Here's how django-xbench exposes the request timing breakdown via the `Server-Timing` header: it adds `Server-Timing` and `X-Bench-Queries` response headers and optionally logs per-request metrics.
- ✅ Measures total request time and DB time (via `connection.execute_wrapper`)
- ✅ Calculates app time (= total - db)
- ✅ Counts DB queries
- ✅ Adds response headers:
```
Server-Timing: xbench-total;dur=..., xbench-db;dur=..., xbench-app;dur=...
X-Bench-Queries: <int>
```
- ✅ Optional logging:
```
[XBENCH] GET /path | xbench_total=...ms xbench_db=...ms xbench_app=...ms q=...
```
- ✅ Slow endpoint aggregation (in-memory, per process) + simple dashboard (experimental)
- ✅ Tested with `pytest` + `pytest-django`
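As a rough illustration of the measurement approach (a sketch, not django-xbench's actual implementation), a callable passed to `connection.execute_wrapper` can accumulate DB time and query count for one request:

```python
import time


class QueryTimer:
    """Accumulates DB time and query count for one request.

    Sketch of the execute_wrapper approach; the real middleware
    internals may differ.
    """

    def __init__(self):
        self.db_time = 0.0  # seconds spent inside the DB driver
        self.queries = 0    # number of executed statements

    def __call__(self, execute, sql, params, many, context):
        start = time.perf_counter()
        try:
            return execute(sql, params, many, context)
        finally:
            self.db_time += time.perf_counter() - start
            self.queries += 1


# Hypothetical usage inside a middleware:
#     from django.db import connection
#     timer = QueryTimer()
#     with connection.execute_wrapper(timer):
#         response = self.get_response(request)
#     app_time = max(0.0, total_time - timer.db_time)
```

Because the wrapper only sees timings and counts, no query contents need to be stored, which matches the privacy goal above.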
```
pip install django-xbench
```

For local development (this repository), an editable install works:

```
pip install -e .
```
- Add the middleware in your `settings.py`:

```python
MIDDLEWARE = [
    # Recommended: place near the top to approximate end-to-end server time
    # (includes other middleware overhead).
    "django_xbench.middleware.XBenchMiddleware",
    # ... other middleware ...
]
```

- Run your server and hit any endpoint:
In your project:

```
python manage.py runserver
curl -I http://127.0.0.1:8000/<your-endpoint>/
```

In this repo (demo):
```shell
# macOS / Linux
export DJANGO_SECRET_KEY="dev"
python -m examples.manage runserver --noreload
curl -I http://127.0.0.1:8000/db-heavy/
```

```powershell
# Windows PowerShell
$env:DJANGO_SECRET_KEY="dev"
python -m examples.manage runserver --noreload
curl -I http://127.0.0.1:8000/db-heavy/
```

You should see headers similar to:
```
Server-Timing: xbench-total;dur=12.345, xbench-db;dur=1.234, xbench-app;dur=11.111
X-Bench-Queries: 3
```
Example:

```
Server-Timing: xbench-total;dur=52.300, xbench-db;dur=14.100, xbench-app;dur=38.200
```
- `xbench-total`: whole request duration
- `xbench-db`: total DB time measured by the wrapper
- `xbench-app`: `max(0, total - db)` (serialization/template/Python time, etc.)
You can inspect this in Chrome DevTools → Network → Timing
(or any browser that supports the Server-Timing spec).
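For scripted checks outside the browser (tests, CI), the header value is easy to parse. The helper below is illustrative only and not part of django-xbench:

```python
def parse_server_timing(header: str) -> dict:
    """Parse a Server-Timing value like
    'xbench-total;dur=52.300, xbench-db;dur=14.100'
    into {'xbench-total': 52.3, 'xbench-db': 14.1}.
    """
    metrics = {}
    for entry in header.split(","):
        # Each entry is "<name>;<param>=<value>[;...]"
        name, _, params = entry.strip().partition(";")
        for param in params.split(";"):
            key, _, value = param.partition("=")
            if key.strip() == "dur":
                metrics[name] = float(value)
    return metrics
```

With the example header above, `parse_server_timing(...)` returns the three `xbench-*` durations keyed by metric name.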
django-xbench supports two configuration styles.
Use a single XBENCH dictionary to keep settings compact and grouped:
```python
XBENCH = {
    "ENABLED": True,      # default: True
    "LOG": False,         # default: False
    "LOG_LEVEL": "info",  # "info" or "debug"
    "SLOW_AGG": False,    # default: False
}
```

Older flat settings are still supported:
```python
XBENCH_ENABLED = True
XBENCH_LOG_ENABLED = True
XBENCH_LOG_LEVEL = "debug"
XBENCH_SLOW_AGG_ENABLED = True
```

This feature keeps an in-memory rolling window of endpoint timings (per process) and shows the slowest endpoints by "damage" (total accumulated latency).
```python
XBENCH = {"SLOW_AGG": True}
```

In your project's `urls.py`:
```python
from django.urls import include, path

urlpatterns = [
    # ... your urls ...
    path("__xbench__/", include("django_xbench.slowagg.urls")),
]
```

- JSON snapshot: `GET /__xbench__/slow/?n=20`
- HTML dashboard: `GET /__xbench__/slow/ui/?n=20`
- Aggregation is in-memory per process. If you run multiple workers/processes, each one has its own rolling window.
- Intended for debugging / internal visibility, not as a full distributed APM.
- DB%: db_total / total
- Avg Q: average DB queries per request
- Damage: total accumulated latency in the window (sum of durations)
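The three columns can be reproduced from raw per-request samples. Here is a toy aggregator for illustration only (it is not django-xbench's internal code, and names like `EndpointAgg` are made up):

```python
from collections import defaultdict


class EndpointAgg:
    """Toy per-endpoint aggregator for the dashboard columns."""

    def __init__(self):
        self._s = defaultdict(lambda: {"total": 0.0, "db": 0.0, "q": 0, "n": 0})

    def record(self, endpoint, total_ms, db_ms, queries):
        s = self._s[endpoint]
        s["total"] += total_ms  # "Damage" accumulates here
        s["db"] += db_ms
        s["q"] += queries
        s["n"] += 1

    def rows(self, n=20):
        out = [
            {
                "endpoint": ep,
                "damage_ms": s["total"],
                "db_pct": 100.0 * s["db"] / s["total"] if s["total"] else 0.0,
                "avg_q": s["q"] / s["n"],
            }
            for ep, s in self._s.items()
        ]
        # Slowest endpoints first, ranked by accumulated latency.
        out.sort(key=lambda r: r["damage_ms"], reverse=True)
        return out[:n]
```

Ranking by damage (sum of durations) rather than average latency surfaces endpoints that are moderately slow but hit very often, which is usually where the real cost is.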
The dashboard only shows data after requests occur.
If you see "No data yet":
- Make sure `SLOW_AGG` is enabled
- Hit some endpoints (e.g. `/db-heavy/`)
- Refresh the dashboard
If using Django runserver with auto-reload, aggregation resets on reload.
```python
XBENCH = {
    "SLOW_AGG": True,
    "SLOW_BUCKET_SECONDS": 10,  # bucket size in seconds
    "SLOW_BUCKET_COUNT": 60,    # number of buckets (window = bucket_seconds * bucket_count)
    "SLOW_ENDPOINT_CAP": 200,   # max unique endpoints per bucket (overflow goes to "__other__")
}
```

Note: this repo includes a bundled `examples/` Django project used by `pytest-django`. In CI, we set `PYTHONPATH=examples` to ensure `examples.config.settings` can be imported reliably.
If you want to see logs while testing:
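One way to do that (a sketch; adapt it to your own logging setup) is to turn on the `LOG` flag and route everything to the console via Django's `LOGGING` setting:

```python
# In settings.py (sketch): enable per-request log lines and
# send log output to the console. Handler/level choices here
# are generic defaults, not django-xbench requirements.
XBENCH = {
    "ENABLED": True,
    "LOG": True,
    "LOG_LEVEL": "debug",  # or "info"
}

LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {"console": {"class": "logging.StreamHandler"}},
    "root": {"handlers": ["console"], "level": "DEBUG"},
}
```

With this in place you should see the `[XBENCH] ...` lines described above while exercising endpoints.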
This repository includes an examples/ Django project for manual testing.
Run it from the repository root:
```shell
# macOS / Linux
export DJANGO_SECRET_KEY="dev"
python -m examples.manage runserver --noreload
```

```powershell
# Windows PowerShell
$env:DJANGO_SECRET_KEY="dev"
python -m examples.manage runserver --noreload
```

Try a few endpoints:

```shell
curl -I http://127.0.0.1:8000/db-heavy/
curl -I http://127.0.0.1:8000/app-heavy/
curl -I http://127.0.0.1:8000/admin/login/
```

- Python: 3.9+
- Django: 3.2+ (tested on 5.2)
Issues and PRs are welcome.
If you propose new metrics, please include:
- minimal reproducible example
- tests
- documentation update
MIT