# Sherlock: LLM API Traffic Inspector & Token Usage Dashboard
Installation • Quick Start • Features • Commands • Contributing
Sherlock is a transparent proxy that intercepts HTTPS traffic to LLM APIs and displays real-time token usage in a beautiful terminal dashboard. Track your AI costs, debug prompts, and monitor context window usage across your development session.
## Features

- **Track Token Usage**: See exactly how many tokens each request consumes
- **Monitor Context Windows**: Visual fuel gauge shows cumulative usage against your limit
- **Debug Prompts**: Automatically saves every prompt as markdown and JSON for review
- **Zero Code Changes**: Works with any tool that respects proxy environment variables
## Installation

```bash
# Clone the repository
git clone https://github.com/yourusername/sherlock.git
cd sherlock

# Install in development mode
pip install -e .
```

### Requirements

- Python 3.10+
- Node.js (for intercepting Node.js applications like Claude Code)
## Quick Start

On first run, Sherlock will:

- Generate the mitmproxy CA certificate
- Prompt you to install it in your system trust store
- Ask where to save intercepted prompts

In a separate terminal, use Sherlock to proxy your commands:

```bash
# For Claude Code
sherlock claude

# For any command
sherlock run --node your-llm-tool
```

That's it! Watch the dashboard update in real-time as you interact with LLM APIs.
```
┌─────────────────────────────────────────────────────────────┐
│ 🔍 SHERLOCK - LLM Traffic Inspector                          │
├─────────────────────────────────────────────────────────────┤
│ Context Usage  ████████████░░░░░░░░░░░░░░░░  42%            │
│                (84,231 / 200,000 tokens)                    │
├─────────────────────────────────────────────────────────────┤
│ Time      Provider   Model                       Tokens     │
│ 14:23:01  Anthropic  claude-sonnet-4-20250514    12,847     │
│ 14:23:45  Anthropic  claude-sonnet-4-20250514     8,234     │
│ 14:24:12  Anthropic  claude-sonnet-4-20250514    15,102     │
├─────────────────────────────────────────────────────────────┤
│ Last Prompt: "Can you help me refactor this function..."    │
└─────────────────────────────────────────────────────────────┘
```
Every intercepted request is saved as:

- **Markdown**: Human-readable format with metadata
- **JSON**: Raw API request body for debugging
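As an illustration, here is a minimal sketch of writing one intercepted request as such a pair. The `event` dict shape (`provider`, `model`, `body`) and the function name are assumptions for the example, not Sherlock's internal API; the timestamped filename pattern follows the examples Sherlock produces (e.g. `2024-01-15_14-23-01_anthropic.md`).

```python
import json
from datetime import datetime
from pathlib import Path


def save_prompt(event: dict, out_dir: Path) -> None:
    """Persist one intercepted request as a Markdown/JSON pair.

    `event` is a hypothetical dict with 'provider', 'model', and the
    raw request 'body'. Filenames use the YYYY-MM-DD_HH-MM-SS_<provider>
    pattern shown in Sherlock's output directory.
    """
    stamp = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
    base = out_dir / f"{stamp}_{event['provider'].lower()}"

    # JSON: the raw API request body, for debugging
    base.with_suffix(".json").write_text(json.dumps(event["body"], indent=2))

    # Markdown: human-readable metadata
    md = (
        f"# {event['provider']} request\n\n"
        f"- Model: {event['model']}\n"
        f"- Captured: {stamp}\n"
    )
    base.with_suffix(".md").write_text(md)
```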
Visual progress bar with color-coded warnings:
- 🟢 Green: < 50% usage
- 🟡 Yellow: 50-80% usage
- 🔴 Red: > 80% usage
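The three bands amount to a small threshold function; a sketch of the logic (hypothetical, not Sherlock's actual implementation):

```python
def gauge_color(used: int, limit: int) -> str:
    """Map cumulative token usage to a dashboard color.

    Thresholds mirror the README: green below 50%, yellow from
    50% through 80%, red above 80%.
    """
    pct = used / limit * 100
    if pct < 50:
        return "green"
    if pct <= 80:
        return "yellow"
    return "red"
```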
## Commands

| Command | Description |
|---|---|
| `sherlock` | Start the proxy and dashboard |
| `sherlock start` | Same as above (explicit) |
| `sherlock claude` | Run Claude Code with proxy configured |
| `sherlock run <cmd>` | Run any command with proxy configured |
| `sherlock run --node <cmd>` | Run Node.js app with proxy configured |
| `sherlock check-certs` | Verify CA certificate installation |
| `sherlock install-certs` | Show certificate installation instructions |
| `sherlock env` | Print proxy environment variables |
### `sherlock start` options

```
sherlock start [OPTIONS]

Options:
  -p, --port NUM       Proxy port (default: 8080)
  -l, --limit NUM      Token limit for fuel gauge (default: 200000)
  --persist            Save token history across sessions
  --skip-cert-check    Skip certificate verification
```

## How It Works

```
┌──────────────────────────────────────────────────────────────────┐
│                       Your LLM Application                       │
│                (with proxy environment variables)                │
└─────────────────────────────┬────────────────────────────────────┘
                              │ HTTPS
                              ▼
┌──────────────────────────────────────────────────────────────────┐
│                      mitmproxy (port 8080)                       │
│                      + Sherlock Interceptor                      │
└─────────────────────────────┬────────────────────────────────────┘
                              │ Parsed events
                              ▼
┌──────────────────────────────────────────────────────────────────┐
│                        Sherlock Dashboard                        │
│          Token tracking • Request log • Prompt preview           │
└─────────────────────────────┬────────────────────────────────────┘
                              │
                              ▼
                       ~/.sherlock/prompts/
                       ├── 2024-01-15_14-23-01_anthropic.md
                       └── 2024-01-15_14-23-01_anthropic.json
```
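The interceptor stage above can be sketched as a minimal mitmproxy addon. This is a hedged illustration, not Sherlock's actual code: the class name, the `on_event` callback, and the event shape are assumptions; only the `response(flow)` hook, `flow.request.pretty_host`, and `flow.response.get_text()` are real mitmproxy APIs, and `usage.input_tokens` / `usage.output_tokens` are the token fields in Anthropic Messages API responses.

```python
import json

# Hypothetical host constant; in Sherlock, API hosts live in sherlock/config.py
ANTHROPIC_HOST = "api.anthropic.com"


class LLMInterceptor:
    """Sketch of a mitmproxy addon: mitmproxy calls `response(flow)`
    once for each completed exchange through the proxy."""

    def __init__(self, on_event):
        self.on_event = on_event  # callback that feeds parsed events to the dashboard

    def response(self, flow):
        if flow.request.pretty_host != ANTHROPIC_HOST:
            return  # not LLM traffic; pass it through untouched
        body = json.loads(flow.response.get_text())
        usage = body.get("usage", {})
        self.on_event({
            "provider": "Anthropic",
            "model": body.get("model", "unknown"),
            "tokens": usage.get("input_tokens", 0) + usage.get("output_tokens", 0),
        })
```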
## Supported Providers

| Provider | Status |
|---|---|
| Anthropic (Claude) | ✅ Supported |
| OpenAI | 🔜 Coming soon |
| Google Gemini | 🔜 Coming soon |
## Certificate Setup

Sherlock uses mitmproxy to intercept HTTPS traffic. On first run, it will guide you through installing the CA certificate.

**macOS:**

```bash
sudo security add-trusted-cert -d -r trustRoot \
  -k /Library/Keychains/System.keychain \
  ~/.mitmproxy/mitmproxy-ca-cert.pem
```

**Ubuntu/Debian:**

```bash
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem \
  /usr/local/share/ca-certificates/mitmproxy-ca-cert.crt
sudo update-ca-certificates
```

For manual proxy configuration:

```bash
export HTTP_PROXY="http://127.0.0.1:8080"
export HTTPS_PROXY="http://127.0.0.1:8080"
export NODE_EXTRA_CA_CERTS="$HOME/.mitmproxy/mitmproxy-ca-cert.pem"
```

Or use the helper: `sherlock env`.
## Contributing

Contributions are welcome! Here's how you can help:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

### Development Setup

```bash
git clone https://github.com/yourusername/sherlock.git
cd sherlock
python -m venv venv
source venv/bin/activate
pip install -e .
```

To add support for a new LLM provider:

1. Add the API host to `sherlock/config.py`
2. Create a parser function in `sherlock/parser.py`
3. Update the `parse_request()` function to route to your parser
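To make steps 2 and 3 concrete, a provider parser and its routing might look like the sketch below. This is illustrative only: `parse_anthropic`, the `PARSERS` dict, and the returned event shape are assumptions, though `parse_request()` is the real hook named above and the `model`/`messages` fields match Anthropic's Messages API request format.

```python
import json


def parse_anthropic(body: bytes) -> dict:
    """Hypothetical provider parser: pull the model name and a prompt
    preview out of an Anthropic Messages API request body."""
    data = json.loads(body)
    preview = ""
    if data.get("messages"):
        content = data["messages"][-1].get("content", "")
        if isinstance(content, list):  # content may be a list of blocks
            content = " ".join(b.get("text", "") for b in content
                               if b.get("type") == "text")
        preview = str(content)
    return {
        "provider": "Anthropic",
        "model": data.get("model", "unknown"),
        "prompt_preview": preview[:60],
    }


# Hypothetical routing table: parse_request() dispatches on the API host
PARSERS = {"api.anthropic.com": parse_anthropic}


def parse_request(host: str, body: bytes):
    parser = PARSERS.get(host)
    return parser(body) if parser else None
```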
## License

This project is licensed under the MIT License - see the LICENSE file for details.
See what's really being sent to the LLM. Learn. Optimize. Repeat.