Show HN: A fast CLI and MCP server for managing Lambda cloud GPU instances

Original link: https://github.com/Strand-AI/lambda-cli

## Lambda CLI & MCP Server Summary

Lambda is a community-built tool for managing Lambda cloud GPU instances, offered as both a command-line interface (CLI, `lambda`) and an AI-driven management server (MCP, `lambda-mcp`). It can be installed via Brew, Cargo, or GitHub Releases, and requires a Lambda API key (set through an environment variable or a secret-manager command). The `lambda` CLI gives direct control: listing GPUs, starting and stopping instances, and waiting for a GPU type to become available. The `lambda-mcp` server enables management through AI assistants such as Claude, responding to natural-language requests (e.g. "launch an H100 instance"). Slack, Discord, or Telegram notifications can be configured to fire when an instance is ready; the MCP server sends them automatically once the environment variables are set. Key commands include `lambda list`, `lambda start`, `lambda stop`, and `lambda find`. `lambda-mcp` integrates with Claude via `npx @strand-ai/lambda-mcp`, enabling hands-off infrastructure management. The project is under active development and ships with build and test instructions.

A developer built an unofficial command-line interface (CLI) and MCP server (written in Rust, MIT-licensed) for managing Lambda cloud GPU instances, letting AI agents automate instance management. The tool allows users to launch, terminate, and SSH into GPUs such as the H100 through simple commands in AI environments like Claude Code or Cursor. Key features include Slack, Discord, or Telegram notifications when an instance becomes ready, plus 1Password integration for API key management. A standalone CLI provides the same functionality. The project is a community effort and is not affiliated with Lambda. In the discussion, a comment criticizing the choice of Rust drew a moderator response reminding users to keep interactions respectful and constructive, with a pointer to the site's guidelines.

Caution

UNOFFICIAL PROJECT — This is a community-built tool, not affiliated with or endorsed by Lambda.


A fast CLI and MCP server for managing Lambda cloud GPU instances.

Two ways to use it:

  • CLI (lambda) - Direct terminal commands for managing GPU instances
  • MCP Server (lambda-mcp) - Let AI assistants like Claude manage your GPU infrastructure
brew install strand-ai/tap/lambda-cli
cargo install --git https://github.com/Strand-AI/lambda-cli

Download from GitHub Releases.

Get your API key from the Lambda dashboard.

Option 1: Environment Variable

export LAMBDA_API_KEY=<your-key>

Option 2: Command (1Password, etc.)

export LAMBDA_API_KEY_COMMAND="op read op://Personal/Lambda/api-key"

The command is executed at startup and its output is used as the API key. This works with any secret manager.
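As an illustrative sketch (using `echo` as a stand-in for a real secret-manager command), the resolution amounts to running the configured command in a shell and using its stdout as the key:

```shell
# Stand-in for a real secret-manager command (e.g. `op read ...`):
export LAMBDA_API_KEY_COMMAND="echo fake-key-123"

# Roughly equivalent to what the tool does: run the command in a shell
# and use its stdout as the API key.
LAMBDA_API_KEY="$(sh -c "$LAMBDA_API_KEY_COMMAND")"
echo "$LAMBDA_API_KEY"
```

Any program that prints the key on stdout works here, which is why this composes with arbitrary secret managers.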

Get notified on Slack, Discord, or Telegram when your instance is ready and SSH-able.

Set one or more of these environment variables:

# Slack (incoming webhook)
export LAMBDA_NOTIFY_SLACK_WEBHOOK="https://hooks.slack.com/services/T00/B00/XXX"

# Discord (webhook URL)
export LAMBDA_NOTIFY_DISCORD_WEBHOOK="https://discord.com/api/webhooks/123/abc"

# Telegram (bot token + chat ID)
export LAMBDA_NOTIFY_TELEGRAM_BOT_TOKEN="123456:ABC-DEF..."
export LAMBDA_NOTIFY_TELEGRAM_CHAT_ID="123456789"

Slack: Create an Incoming Webhook in your workspace.

Discord: In channel settings → Integrations → Webhooks → New Webhook → Copy Webhook URL.

Telegram:

  1. Message @BotFather and send /newbot → copy the token
  2. Message your bot, then visit https://api.telegram.org/bot<TOKEN>/getUpdates to find your chat ID
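For step 2, the chat ID sits at `result[].message.chat.id` in the `getUpdates` JSON. A hedged illustration using a made-up sample response:

```shell
# Made-up sample of a getUpdates response; the real one comes from
# https://api.telegram.org/bot<TOKEN>/getUpdates after you message the bot.
response='{"ok":true,"result":[{"update_id":1,"message":{"chat":{"id":123456789,"type":"private"},"text":"hi"}}]}'

# Pull out result[0].message.chat.id (sed keeps this dependency-free;
# with jq installed you could use: jq '.result[0].message.chat.id')
chat_id="$(printf '%s' "$response" | sed -n 's/.*"chat":{"id":\([0-9]*\).*/\1/p')"
echo "$chat_id"
```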

| Command | Description |
| --- | --- |
| `lambda list` | Show available GPU types with pricing and availability |
| `lambda running` | Show your running instances |
| `lambda start` | Launch a new instance |
| `lambda stop` | Terminate an instance |
| `lambda find` | Poll until a GPU type is available, then launch |

List available GPUs:

lambda list

Start an instance:

lambda start --gpu gpu_1x_a10 --ssh my-key --name "dev-box"

Stop an instance:

lambda stop --instance-id <id>

Wait for availability and auto-launch:

lambda find --gpu gpu_8x_h100 --ssh my-key --interval 30

Flags for `lambda start`:

| Flag | Description |
| --- | --- |
| `-g, --gpu` | Instance type (required) |
| `-s, --ssh` | SSH key name (required) |
| `-n, --name` | Instance name |
| `-r, --region` | Region (auto-selects if omitted) |
| `-f, --filesystem` | Filesystem to attach (must be in same region) |
| `--no-notify` | Disable notifications even if env vars are set |
Flags for `lambda find`:

| Flag | Description |
| --- | --- |
| `-g, --gpu` | Instance type to wait for (required) |
| `-s, --ssh` | SSH key name (required) |
| `--interval` | Poll interval in seconds (default: 10) |
| `-n, --name` | Instance name when launched |
| `-f, --filesystem` | Filesystem to attach when launched |
| `--no-notify` | Disable notifications even if env vars are set |
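The `lambda find` loop boils down to poll-until-available, then launch. A minimal sketch (the availability check is faked so the example terminates; the real tool queries the Lambda API and sleeps `--interval` seconds between polls):

```shell
attempts=0
check_availability() {
  # Stand-in for the real API call; pretend the GPU appears on the 3rd poll.
  attempts=$((attempts + 1))
  [ "$attempts" -ge 3 ]
}

until check_availability; do
  :  # the real tool would `sleep "$interval"` here
done
echo "gpu_8x_h100 available after $attempts polls; launching"
```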

Notifications are automatic when env vars are configured. Use --no-notify to disable:

lambda start --gpu gpu_1x_a10 --ssh my-key --no-notify

The lambda-mcp binary is an MCP (Model Context Protocol) server that lets AI assistants manage your Lambda infrastructure.

The easiest way to use lambda-mcp is via npx—no installation required:

npx @strand-ai/lambda-mcp

| Flag | Description |
| --- | --- |
| `--eager` | Execute API key command at startup instead of on first use |

When using LAMBDA_API_KEY_COMMAND, the MCP server defers command execution until the first API request by default. This avoids unnecessary delays when starting Claude Code if you don't use Lambda tools in every session.

Use --eager to execute the command at startup instead:

npx @strand-ai/lambda-mcp --eager

Note: The CLI (lambda) always executes the API key command at startup since it's used for immediate operations.
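A hedged sketch of this deferred resolution (the caching pattern only, not the actual Rust implementation):

```shell
export LAMBDA_API_KEY_COMMAND="echo deferred-key"

resolved_key=""
get_api_key() {
  # Resolve on first use, then reuse the cached value. This mirrors the
  # MCP server's default; --eager would move the resolution to startup.
  if [ -z "$resolved_key" ]; then
    resolved_key="$(sh -c "$LAMBDA_API_KEY_COMMAND")"
  fi
  printf '%s\n' "$resolved_key"
}

get_api_key   # first call runs the command
get_api_key   # later calls reuse the cache
```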

| Tool | Description |
| --- | --- |
| `list_gpu_types` | List all available GPU instance types with pricing, specs, and current availability |
| `start_instance` | Launch a new GPU instance (auto-notifies if configured) |
| `stop_instance` | Terminate a running instance |
| `list_running_instances` | Show all running instances with status and connection details |
| `check_availability` | Check if a specific GPU type is available |

When notification environment variables are configured, the MCP server automatically sends notifications when instances become SSH-able. No additional flags needed—just set the LAMBDA_NOTIFY_* env vars and launch instances as usual.
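Concretely, the notification is just a webhook POST. A hedged sketch of a Slack-style payload (the message wording and instance details are made up; Slack incoming webhooks accept a minimal `{"text": ...}` JSON body):

```shell
instance_name="dev-box"
instance_ip="203.0.113.7"   # made-up example address
payload="{\"text\": \"Instance $instance_name is ready: ssh ubuntu@$instance_ip\"}"
echo "$payload"

# The POST itself would look roughly like:
#   curl -X POST -H 'Content-Type: application/json' \
#        -d "$payload" "$LAMBDA_NOTIFY_SLACK_WEBHOOK"
```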

claude mcp add lambda -s user -e LAMBDA_API_KEY=your-api-key -- npx -y @strand-ai/lambda-mcp

With 1Password CLI:

claude mcp add lambda -s user -e LAMBDA_API_KEY_COMMAND="op read op://Personal/Lambda/api-key" -- npx -y @strand-ai/lambda-mcp

Then restart Claude Code.
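For editors that read an MCP JSON config (e.g. Cursor or VS Code), an equivalent entry might look like the fragment below. This follows the common `mcpServers` client convention; the exact file location and schema vary by editor, so check your editor's MCP documentation:

```json
{
  "mcpServers": {
    "lambda": {
      "command": "npx",
      "args": ["-y", "@strand-ai/lambda-mcp"],
      "env": { "LAMBDA_API_KEY": "your-api-key" }
    }
  }
}
```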

Once configured, you can ask Claude things like:

  • "What GPUs are currently available on Lambda?"
  • "Launch an H100 instance with my ssh key 'macbook'"
  • "Show me my running instances"
  • "Check if any A100s are available"
  • "Terminate instance i-abc123"

# Build
cargo build

# Run tests
cargo test

# Run CLI
cargo run --bin lambda -- list

# Run MCP server
cargo run --bin lambda-mcp

To create a release:

  1. Update the version in Cargo.toml
  2. Merge to main — this automatically:
    • Creates a git tag
    • Builds binaries for all platforms
    • Publishes to npm
    • Updates the Homebrew formula