Show HN: A MitM proxy to see what your LLM tools are sending

Original link: https://github.com/jmuncor/sherlock

## Sherlock: LLM API Traffic and Token Usage Inspector

Sherlock is a tool for monitoring and optimizing your interactions with large language models (LLMs). It acts as a transparent proxy, intercepting HTTPS traffic to LLM APIs such as Claude's and displaying real-time token usage in a terminal dashboard.

**Key features:**

* **Token tracking:** monitors token consumption per request.
* **Context window monitoring:** visualizes cumulative token usage with a color-coded fuel gauge.
* **Prompt debugging:** automatically saves prompts in Markdown and JSON formats.
* **Zero-code integration:** works with existing tools via proxy environment variables.

**Installation and usage:**

1. Clone the repository and install in development mode (`pip install -e .`). Requires Python 3.10+, plus Node.js for some applications.
2. On first run, Sherlock guides you through installing the mitmproxy CA certificate.
3. Run commands with `sherlock run <your LLM tool>` or a provider-specific command such as `sherlock claude`.

The dashboard gives a live view of token usage, the request log, and saved prompts, helping you understand and optimize LLM costs and prompts. Contributions via pull request are welcome!

## Sherlock: A MitM Proxy for LLM Insight

A new tool called Sherlock lets users inspect the communication between LLM tools (such as Claude Code and Gemini) and their APIs, revealing exactly which prompts are being sent and tracking token usage in real time. Built by jmuncor out of curiosity, Sherlock runs as a man-in-the-middle proxy, capturing requests and saving prompts in Markdown and JSON formats.

While it is easy to use through a simple CLI, commenters raised concerns that the tool unconditionally disables TLS verification, which poses a security risk. The creator acknowledged this and is exploring security improvements, including possibly using an HTTP relay. Users also suggested integrating Sherlock with established tooling such as mitmproxy to improve trust and functionality.

The tool has proven useful for debugging expensive prompts, optimizing context window usage, and understanding LLM behavior, and some users are building similar solutions with tools like Envoy Proxy and Docker. The developer is actively seeking feedback and considering features such as OpenTelemetry integration and improved context window management.

## LLM API Traffic Inspector & Token Usage Dashboard




Sherlock is a transparent proxy that intercepts HTTPS traffic to LLM APIs and displays real-time token usage in a beautiful terminal dashboard. Track your AI costs, debug prompts, and monitor context window usage across your development session.

  • Track Token Usage: See exactly how many tokens each request consumes
  • Monitor Context Windows: Visual fuel gauge shows cumulative usage against your limit
  • Debug Prompts: Automatically saves every prompt as markdown and JSON for review
  • Zero Code Changes: Works with any tool that respects proxy environment variables
```shell
# Clone the repository
git clone https://github.com/yourusername/sherlock.git
cd sherlock

# Install in development mode
pip install -e .
```
  • Python 3.10+
  • Node.js (for intercepting Node.js applications like Claude Code)

On first run, Sherlock will:

  • Generate the mitmproxy CA certificate
  • Prompt you to install it in your system trust store
  • Ask where to save intercepted prompts

In a separate terminal, use Sherlock to proxy your commands:

```shell
# For Claude Code
sherlock claude

# For any command
sherlock run --node your-llm-tool
```

That's it! Watch the dashboard update in real-time as you interact with LLM APIs.

┌─────────────────────────────────────────────────────────────┐
│  🔍 SHERLOCK - LLM Traffic Inspector                        │
├─────────────────────────────────────────────────────────────┤
│  Context Usage  ████████████░░░░░░░░░░░░░░░░  42%           │
│                 (84,231 / 200,000 tokens)                   │
├─────────────────────────────────────────────────────────────┤
│  Time     Provider    Model                      Tokens     │
│  14:23:01 Anthropic   claude-sonnet-4-20250514   12,847     │
│  14:23:45 Anthropic   claude-sonnet-4-20250514   8,234      │
│  14:24:12 Anthropic   claude-sonnet-4-20250514   15,102     │
├─────────────────────────────────────────────────────────────┤
│  Last Prompt: "Can you help me refactor this function..."   │
└─────────────────────────────────────────────────────────────┘

Every intercepted request is saved as:

  • Markdown - Human-readable format with metadata
  • JSON - Raw API request body for debugging

Visual progress bar with color-coded warnings:

  • 🟢 Green: < 50% usage
  • 🟡 Yellow: 50-80% usage
  • 🔴 Red: > 80% usage
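The thresholds above can be sketched as a small helper. This is illustrative only; `gauge_color` and `render_gauge` are hypothetical names, not Sherlock's actual API:

```python
def gauge_color(used: int, limit: int) -> str:
    """Map cumulative token usage to the warning color described above."""
    pct = 100 * used / limit
    if pct < 50:
        return "green"   # plenty of context window left
    if pct <= 80:
        return "yellow"  # getting close to the limit
    return "red"         # over 80% of the context window used

def render_gauge(used: int, limit: int, width: int = 30) -> str:
    """Render a text fuel gauge like the dashboard's, e.g. '████░░░ 42%'."""
    filled = int(width * used / limit)
    return "█" * filled + "░" * (width - filled) + f" {100 * used // limit}%"
```

With the numbers from the dashboard above, `render_gauge(84231, 200000)` yields a 42% gauge.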
| Command | Description |
|---|---|
| `sherlock` | Start the proxy and dashboard |
| `sherlock start` | Same as above (explicit) |
| `sherlock claude` | Run Claude Code with proxy configured |
| `sherlock run <cmd>` | Run any command with proxy configured |
| `sherlock run --node <cmd>` | Run Node.js app with proxy configured |
| `sherlock check-certs` | Verify CA certificate installation |
| `sherlock install-certs` | Show certificate installation instructions |
| `sherlock env` | Print proxy environment variables |
```shell
sherlock start [OPTIONS]

Options:
  -p, --port NUM          Proxy port (default: 8080)
  -l, --limit NUM         Token limit for fuel gauge (default: 200000)
  --persist               Save token history across sessions
  --skip-cert-check       Skip certificate verification
```
┌──────────────────────────────────────────────────────────────────┐
│                      Your LLM Application                        │
│              (with proxy environment variables)                  │
└─────────────────────────────┬────────────────────────────────────┘
                              │ HTTPS
                              ▼
┌──────────────────────────────────────────────────────────────────┐
│                     mitmproxy (port 8080)                        │
│                   + Sherlock Interceptor                         │
└─────────────────────────────┬────────────────────────────────────┘
                              │ Parsed events
                              ▼
┌──────────────────────────────────────────────────────────────────┐
│                    Sherlock Dashboard                            │
│              Token tracking • Request log • Prompt preview       │
└──────────────────────────────────────────────────────────────────┘
                              │
                              ▼
                    ~/.sherlock/prompts/
                    ├── 2024-01-15_14-23-01_anthropic.md
                    └── 2024-01-15_14-23-01_anthropic.json
| Provider | Status |
|---|---|
| Anthropic (Claude) | ✅ Supported |
| OpenAI | 🔜 Coming soon |
| Google Gemini | 🔜 Coming soon |

Sherlock uses mitmproxy to intercept HTTPS traffic. On first run, it will guide you through installing the CA certificate.

macOS:

```shell
sudo security add-trusted-cert -d -r trustRoot \
  -k /Library/Keychains/System.keychain \
  ~/.mitmproxy/mitmproxy-ca-cert.pem
```

Ubuntu/Debian:

```shell
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem \
  /usr/local/share/ca-certificates/mitmproxy-ca-cert.crt
sudo update-ca-certificates
```

For manual proxy configuration:

```shell
export HTTP_PROXY="http://127.0.0.1:8080"
export HTTPS_PROXY="http://127.0.0.1:8080"
export NODE_EXTRA_CA_CERTS="$HOME/.mitmproxy/mitmproxy-ca-cert.pem"
```

Or use the helper: `sherlock env`.

Contributions are welcome! Here's how you can help:

  1. Fork the repository
  2. Create a feature branch (`git checkout -b feature/amazing-feature`)
  3. Commit your changes (`git commit -m 'Add amazing feature'`)
  4. Push to the branch (`git push origin feature/amazing-feature`)
  5. Open a Pull Request
```shell
git clone https://github.com/yourusername/sherlock.git
cd sherlock
python -m venv venv
source venv/bin/activate
pip install -e .
```

To add support for a new LLM provider:

  1. Add the API host to `sherlock/config.py`
  2. Create a parser function in `sherlock/parser.py`
  3. Update the `parse_request()` function to route to your parser

This project is licensed under the MIT License - see the LICENSE file for details.


See what's really being sent to the LLM. Learn. Optimize. Repeat.
