Show HN: tomcp.org – Turn any URL into an MCP server

Original link: https://github.com/Ami3466/tomcp

## Tomcp.org: Chat with Any Website Using AI

Tomcp.org makes it easy to turn any website into a resource an AI can chat with. Simply prefix a URL with `tomcp.org/` (e.g. `tomcp.org/docs.stripe.com`) to generate an MCP (Model Context Protocol) config file compatible with tools such as Cursor, Claude, and VS Code, giving your AI access to the site's content. Alternatively, you can paste a URL directly on tomcp.org and ask questions to chat with the website. Tomcp.org offers **free** models (Llama 3, Hermes, Mistral, Gemma; limited to 5 requests per day without an API key) and **premium** models (larger Llama models, DeepSeek, Mistral Large, Gemma 3, GPT OSS), the latter unlocked with a Cloudflare Workers AI API key. The key bypasses the rate limit and uses your own Cloudflare quota. The service is built on Cloudflare Workers AI, is free to use, works with any public URL, and requires no complex setup. Your API key is stored securely in your browser's local storage.

## tomcp.org: Turn URLs into MCP Servers - Summary

tomcp.org is a new tool that simplifies feeding web content to AI models via the MCP (Model Context Protocol) framework. Prefixing any URL with "tomcp.org/" instantly creates an MCP server for that page. The main advantage is cleaner data: unlike raw web scraping or crawling, tomcp.org converts pages into concise Markdown, which reduces token usage and improves AI comprehension. It also allows "pinning" documentation as a permanent, always-available resource, unlike on-demand scraping. The project leverages Cloudflare Workers for free, high-volume processing (100,000 requests per day). Although simple, it works best on static sites, since it cannot effectively handle JavaScript-heavy or login-protected pages. The creator acknowledges this simplicity is deliberate, prioritizing clean data delivery over complex features. Discussion highlighted its potential for quickly integrating web documentation with LLMs, contrasting it with more resource-intensive approaches such as headless Chrome.

Original text

Turn any website into an MCP server + Chat with any website.

Convert any website URL into an MCP (Model Context Protocol) server config for your AI tools, or chat directly with any website's content.

Simply add tomcp.org/ before any URL:

tomcp.org/docs.stripe.com
tomcp.org/react.dev
tomcp.org/your-docs.com/api

Visit tomcp.org, paste a URL, and start chatting with any website's content using AI.

Generated configs work with the following tools (config file locations shown):

  • Cursor - ~/.cursor/mcp.json
  • Claude Desktop - ~/.claude/claude_desktop_config.json
  • Windsurf - ~/.codeium/windsurf/mcp_config.json
  • VS Code - .vscode/mcp.json
  • Cline - ~/.cline/mcp_settings.json

To set up an MCP server:

  1. Visit tomcp.org
  2. Enter any website URL
  3. Select your AI tool
  4. Copy the generated MCP config
  5. Add it to your tool's config file
  6. Restart your AI tool

To chat with a website:

  1. Visit tomcp.org
  2. Paste any website URL
  3. Click "Start Chat"
  4. Ask questions about the website's content

Example generated config:
{
  "mcpServers": {
    "docs-stripe-com": {
      "url": "https://tomcp.org/docs.stripe.com"
    }
  }
}
You can also chat via the HTTP API:

curl -X POST https://tomcp.org/chat \
  -H "Content-Type: application/json" \
  -d '{"url": "docs.stripe.com", "message": "How do I create a payment intent?"}'

Free Models (No API Key Required)

These models are available for everyone with no setup:

  • Llama 3.1 8B (Meta) - Default model, fast and capable
  • Hermes 2 Pro (NousResearch) - Great for reasoning
  • Mistral 7B (Mistral) - Efficient instruction-following
  • Gemma 7B LoRA (Google) - Lightweight and fast

Premium Models (API Key Required)

Add your Cloudflare Workers AI API key to unlock these models:

  • Llama 3.3 70B (Meta) - Most powerful Llama model
  • DeepSeek R1 32B (DeepSeek) - Advanced reasoning
  • Mistral Large (Mistral) - Enterprise-grade
  • Gemma 3 12B (Google) - Latest Gemma
  • GPT OSS 120B/20B (OpenAI) - Open-source GPT variants

You can add your own Cloudflare Workers AI API key to:

  1. Unlock all premium models - Access larger, more capable models
  2. Bypass rate limits - No daily request limits
  3. Use your own quota - Charges go to your Cloudflare account
To get a Cloudflare Workers AI API key:

  1. Go to Cloudflare Workers AI
  2. Create an API token with Workers AI permissions
  3. Copy the token

To add your key:

  1. Start a chat session on tomcp.org
  2. Below the chat input, you'll see "Add API key from Cloudflare Workers AI"
  3. Paste your API key and click "Save"
  4. Premium models will now be unlocked in the dropdown

Where Is the API Key Stored?

  • Your API key is stored locally in your browser using localStorage
  • Key name: tomcp_api_key
  • The key is sent with each chat request but never stored on our servers
  • You can remove it anytime by clicking "Remove" in the API key section
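The browser-side key handling described above might look like the following sketch. Only the `tomcp_api_key` name comes from the text; the function names and the storage shim (so the sketch also runs outside a browser) are illustrative, not tomcp.org's actual code.

```javascript
// Sketch of browser-side API key handling. The site stores the key under
// "tomcp_api_key" in localStorage; the Map-based shim below is only a
// fallback so this sketch runs in non-browser environments too.
const storage = globalThis.localStorage ?? (() => {
  const m = new Map();
  return {
    getItem: (k) => (m.has(k) ? m.get(k) : null), // null, like real localStorage
    setItem: (k, v) => { m.set(k, String(v)); },
    removeItem: (k) => { m.delete(k); },
  };
})();

const KEY_NAME = "tomcp_api_key";

function saveApiKey(key) {
  storage.setItem(KEY_NAME, key);
}

function loadApiKey() {
  return storage.getItem(KEY_NAME); // null if no key has been saved
}

function removeApiKey() {
  storage.removeItem(KEY_NAME); // what the "Remove" button would do
}
```

Because the key lives only in localStorage, clearing site data in the browser also removes it.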

The available models are fetched dynamically from the Cloudflare Workers AI API:

  1. Frontend calls GET /models endpoint on page load
  2. Worker fetches models from api.cloudflare.com/client/v4/accounts/{id}/ai/models/search
  3. Models are filtered to "Text Generation" tasks and cached for 5 minutes
  4. Frontend displays free models as enabled, premium models as disabled (until API key is added)
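The 5-minute cache and "Text Generation" filter above could be sketched like this. This is a hypothetical implementation: the injected `fetchModels` function, the `task` field name, and the in-memory cache are assumptions, not the Worker's real code.

```javascript
// Sketch of a 5-minute in-memory cache around a model-list fetcher.
// `fetchModels` is injected; in the real Worker it would call
// api.cloudflare.com/client/v4/accounts/{id}/ai/models/search.
const CACHE_TTL_MS = 5 * 60 * 1000;

function makeModelCache(fetchModels, now = Date.now) {
  let cached = null;          // last filtered model list
  let fetchedAt = -Infinity;  // forces a fetch on first call

  return async function getModels() {
    if (now() - fetchedAt < CACHE_TTL_MS) return cached; // serve from cache
    const models = await fetchModels();
    // Keep only text-generation models, as the text describes
    // (the `task` field name is an assumption).
    cached = models.filter((m) => m.task === "Text Generation");
    fetchedAt = now();
    return cached;
  };
}
```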
Chat requests are handled as follows:

  1. User enters a URL and starts chatting
  2. Worker fetches the website content and converts HTML to Markdown
  3. Content is sent to the selected AI model with the user's message
  4. Response is returned to the user
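The HTML-to-Markdown step in that flow could be sketched as below. This is a toy regex-based converter for illustration only; the real Worker likely uses a proper HTML parser.

```javascript
// Toy HTML-to-Markdown converter illustrating the conversion step.
// Handles only a few common tags; a production converter would parse
// HTML properly instead of using regexes.
function htmlToMarkdown(html) {
  return html
    .replace(/<h1[^>]*>(.*?)<\/h1>/gis, "# $1\n\n")
    .replace(/<h2[^>]*>(.*?)<\/h2>/gis, "## $1\n\n")
    .replace(/<li[^>]*>(.*?)<\/li>/gis, "- $1\n")
    .replace(/<a[^>]*href="([^"]*)"[^>]*>(.*?)<\/a>/gis, "[$2]($1)")
    .replace(/<p[^>]*>(.*?)<\/p>/gis, "$1\n\n")
    .replace(/<[^>]+>/g, "")      // strip any remaining tags
    .replace(/\n{3,}/g, "\n\n")   // collapse extra blank lines
    .trim();
}
```

The payoff is the one the summary describes: Markdown output is far smaller than raw HTML, so fewer tokens reach the model.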

Rate Limiting (Free Tier)

Without an API key:

  • 5 requests per IP per day

With your API key:

  • No rate limits (uses your Cloudflare account quota)
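The free-tier limit above could be enforced with a per-IP daily counter. A minimal in-memory sketch follows; the real Worker presumably uses durable storage (e.g. KV), and everything here beyond the 5-per-day figure is an assumption.

```javascript
// Sketch of a 5-requests-per-IP-per-day limiter (in-memory, illustrative).
const DAILY_LIMIT = 5;

function makeRateLimiter(limit = DAILY_LIMIT, now = Date.now) {
  const counts = new Map(); // ip -> { day, count }

  return function allow(ip) {
    const day = Math.floor(now() / 86_400_000); // UTC day number
    const entry = counts.get(ip);
    if (!entry || entry.day !== day) {
      counts.set(ip, { day, count: 1 }); // first request of the day
      return true;
    }
    if (entry.count >= limit) return false; // over the daily limit
    entry.count++;
    return true;
  };
}
```

A request carrying a user-supplied API key would simply skip this check, which is why bringing your own key removes the limit.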
Tech Stack

  • Frontend: Vanilla HTML/CSS/JS with Tailwind CSS
  • Backend: Cloudflare Workers
  • AI: Cloudflare Workers AI (multiple models)

Features

  • Works with any public URL
  • No setup required - just paste the config
  • Free forever - powered by Cloudflare Workers
  • Chat with any website using AI
  • Side-by-side MCP Config + Chat interface
  • Multiple AI models - Choose from Llama, Mistral, Gemma, and more
  • Bring your own API key - Unlock premium models and bypass rate limits

Apache 2.0
