MCP in LM Studio

Original link: https://lmstudio.ai/blog/lmstudio-v0.3.17

LM Studio 0.3.17 introduces Model Context Protocol (MCP) support, enabling connection to MCP servers for enhanced LLM functionality with tools and resources. Users can add local or remote MCP servers by editing `mcp.json` or via the "Add to LM Studio" button. A Hugging Face MCP server example is provided, along with security cautions about installing MCPs. Tool calls require user confirmation, with options to always allow, always deny, or allow once. Other updates in 0.3.17 include:

  • Support for 11 new languages.
  • A new "Solarized Dark" theme.
  • A deeplink button for easy MCP server installation.
  • Message generation stats, URL opening, a system prompt editor shortcut, and a pinnable downloads panel.
  • Various bug fixes and UI improvements, including better tool call handling.

LM Studio, a tool for running local large language models (LLMs), is rapidly gaining popularity, especially on Apple Silicon Macs, thanks to its user-friendly interface and its support for MLX models. Some users have even invested in high-end hardware such as a Mac Studio to get the most out of it. The discussion highlights LM Studio's advantages over command-line alternatives such as vllm/ollama, in particular its ability to customize output settings. However, its closed-source nature and a UI that some users find overly flashy are points of contention. The recently added MCP (Model Context Protocol) support is also a topic of debate: some consider it useful for leveraging "Tools as a Service", while others see it as a solution in search of a problem, essentially a repackaging of the existing "tools" pattern. Alternatives like Open WebUI are also mentioned, which can integrate with backends such as Ollama and offer similar functionality. The conversation also touches on the importance of RAM and VRAM for running large models, and the challenges of serving LLMs over a local network.

Original post

LM Studio 0.3.17 introduces Model Context Protocol (MCP) support, allowing you to connect your favorite MCP servers to the app and use them with local models.

LM Studio supports both local and remote MCP servers. You can add MCPs by editing the app's mcp.json file or, when available, via the new "Add to LM Studio" button.

Also new in this release:

  • Support for 11 new languages, thanks to our community localizers. LM Studio is now available in 33 languages.
  • Many bug fixes, as well as improvements to the UI, including a new theme: 'Solarized Dark'.
Upgrade via in-app update, or from https://lmstudio.ai/download.

Model Context Protocol (MCP) is a set of interfaces for providing LLMs access to tools and other resources. It was originally introduced by Anthropic and is developed on GitHub.

Terminology:

  • "MCP Server": a program that provides tools and access to resources. For example, Stripe, GitHub, and Notion publish MCP servers.
  • "MCP Host": an application (like LM Studio or Claude Desktop) that can connect to MCP servers and make their resources available to models.
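Under the hood, hosts and servers exchange JSON-RPC 2.0 messages. As a simplified sketch (the message shapes follow the MCP specification, but the tool shown here is hypothetical), a host asking a server for its tools sends:

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```

and gets back something roughly like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "model_search",
        "description": "Search for models on the Hub",
        "inputSchema": {
          "type": "object",
          "properties": { "query": { "type": "string" } }
        }
      }
    ]
  }
}
```

The host then exposes each advertised tool to the model, and invokes it on the model's behalf via a corresponding `tools/call` message.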

Install new servers: mcp.json

Switch to the "Program" tab in the right-hand sidebar. Click Install > Edit mcp.json.

This will open the mcp.json file in the in-app editor. You can add MCP servers by editing this file.

Edit mcp.json using the in-app editor

LM Studio currently follows Cursor's mcp.json notation.
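In this notation, remote servers are declared with a `url` entry (as in the Hugging Face example below), while local servers are launched as subprocesses with a `command` entry. A sketch of a hypothetical local entry (the server name and package are placeholders, not a real published server):

```json
{
  "mcpServers": {
    "my-local-server": {
      "command": "npx",
      "args": ["-y", "@example/mcp-server"],
      "env": {
        "EXAMPLE_API_KEY": "<YOUR_KEY>"
      }
    }
  }
}
```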

Example MCP to try: Hugging Face MCP Server

This MCP server provides access to functions like model and dataset search.

{
  "mcpServers": {
    "hf-mcp-server": {
      "url": "https://huggingface.co/mcp",
      "headers": {
        "Authorization": "Bearer <YOUR_HF_TOKEN>"
      }
    }
  }
}
You will need to replace <YOUR_HF_TOKEN> with your actual Hugging Face token. Learn more here.

Be cautious

Never install MCPs from untrusted sources.

Some MCP servers can run arbitrary code, access your local files, and use your network connection. Always be cautious when installing and using MCP servers. If you don't trust the source, don't install it.


When a model calls a tool, LM Studio shows a confirmation dialog. This lets you review, and if needed edit, the tool call's arguments before the call is executed.

Tool call confirmation dialog

You can choose to always allow a given tool, always deny it, or allow it only once.

If you choose to always allow a tool, LM Studio will not show the confirmation dialog for that tool in the future. You can manage this later in App Settings > Tools & Integrations.
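The allow/deny flow described above can be sketched in Python. This is an illustrative sketch of the per-tool decision logic, not LM Studio's actual implementation; the function and preference names are hypothetical.

```python
# Hypothetical sketch of per-tool confirmation with
# "always allow" / "always deny" / "allow once" semantics.
def confirm_tool_call(tool_name, args, saved_prefs, ask_user):
    """Return True if the tool call may proceed.

    saved_prefs: dict mapping tool name -> "always_allow" | "always_deny"
    ask_user: callback shown the tool name and arguments; returns
              "allow_once", "deny_once", "always_allow", or "always_deny"
    """
    pref = saved_prefs.get(tool_name)
    if pref == "always_allow":
        return True   # no dialog shown for this tool anymore
    if pref == "always_deny":
        return False
    choice = ask_user(tool_name, args)
    if choice in ("always_allow", "always_deny"):
        saved_prefs[tool_name] = choice  # persisted; editable later in settings
    return choice in ("allow_once", "always_allow")
```

Once a tool is marked "always_allow", subsequent calls skip the prompt entirely, which matches the behavior described above.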

MCP support: more technical details

  • When you save the mcp.json file, LM Studio will automatically load the MCP servers defined in it. We spawn a separate process for each MCP server.

  • For local MCP servers that rely on npx or uvx (or any other program on your machine), you need to ensure those tools are installed and available in your system's PATH.

  • mcp.json lives in ~/.lmstudio/mcp.json on macOS and Linux, and in %USERPROFILE%\.lmstudio\mcp.json on Windows. It's recommended to use the in-app editor to edit this file.

If you're running into bugs, please open an issue on our bug tracker: https://github.com/lmstudio-ai/lmstudio-bug-tracker/issues.


We're also introducing a one-click way to add MCP servers to LM Studio using a deeplink button.

Enter your MCP JSON entry to generate a deeplink for the Add to LM Studio button.
