Show HN: LocalGPT – A local-first AI assistant in Rust with persistent memory

Original link: https://github.com/localgpt-app/localgpt

## LocalGPT: A Private, Autonomous AI Assistant

LocalGPT is a Rust-based AI assistant designed for local, private use. It ships as a single ~27MB binary (no Node.js or Python dependencies) and protects data privacy by running entirely on your machine.

It uses markdown files as persistent memory — long-term knowledge (MEMORY.md), tasks (HEARTBEAT.md), and personality (SOUL.md) — indexed with SQLite for fast search. LocalGPT supports multiple LLM providers, such as Anthropic, OpenAI, and Ollama, and is compatible with the OpenClaw format.

Key features include a heartbeat for autonomous task execution, CLI and GUI interfaces, and an HTTP API when run as a daemon. Configuration is handled through `config.toml`, which allows customizing the default model and provider keys. It is built with Tokio, Axum, and fastembed, and is licensed under Apache-2.0.

## LocalGPT: A Local-First AI Assistant

LocalGPT is a new Rust-based AI assistant designed for persistent, private knowledge management. Built over four nights, it mirrors the OpenClaw assistant pattern but avoids dependencies on Node.js, Docker, and Python, compiling to a compact ~27MB binary.

Key features include markdown-based persistent memory (OpenClaw-compatible), full-text and semantic search (using local embeddings — no API key required), and an autonomous task runner. It offers a CLI, a web interface, and a desktop GUI, and supports multiple AI providers, including Anthropic, OpenAI, and Ollama.

The developer uses LocalGPT daily for research, side projects, and knowledge accumulation, and notes that its memory improves with every session. It is available on GitHub ([https://github.com/localgpt-app/localgpt](https://github.com/localgpt-app/localgpt)) along with a website ([https://localgpt.app](https://localgpt.app)), and the creator welcomes feedback on the architecture and potential features. Installation is simple: `cargo install localgpt`.

## Original

A local-first AI assistant built in Rust — persistent memory, autonomous tasks, ~27MB binary. Inspired by and compatible with OpenClaw.

cargo install localgpt

  • Single binary — no Node.js, Docker, or Python required
  • Local-first — runs entirely on your machine; your memory data stays yours
  • Persistent memory — markdown-based knowledge store with full-text and semantic search
  • Autonomous heartbeat — delegate tasks and let it work in the background
  • Multiple interfaces — CLI, web UI, desktop GUI
  • Multiple LLM providers — Anthropic (Claude), OpenAI, Ollama
  • OpenClaw compatible — works with SOUL, MEMORY, HEARTBEAT markdown files and skills format
# Initialize configuration
localgpt config init

# Start interactive chat
localgpt chat

# Ask a single question
localgpt ask "What is the meaning of life?"

# Run as a daemon with heartbeat, HTTP API, and web UI
localgpt daemon start

LocalGPT uses plain markdown files as its memory:

~/.localgpt/workspace/
├── MEMORY.md            # Long-term knowledge (auto-loaded each session)
├── HEARTBEAT.md         # Autonomous task queue
├── SOUL.md              # Personality and behavioral guidance
└── knowledge/           # Structured knowledge bank (optional)
    ├── finance/
    ├── legal/
    └── tech/

Files are indexed with SQLite FTS5 for fast keyword search, and with sqlite-vec for semantic search using local embeddings.
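As a rough sketch of the keyword-search half of that pipeline — Python's stdlib `sqlite3` stands in for LocalGPT's actual Rust indexer, and the table layout and file contents here are hypothetical:

```python
import sqlite3

# In-memory stand-in for the workspace index; the real schema is not documented here
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE memory USING fts5(path, content)")

# Index two hypothetical workspace files
conn.executemany(
    "INSERT INTO memory (path, content) VALUES (?, ?)",
    [
        ("MEMORY.md", "Prefers Rust for systems work; tracking sqlite-vec for embeddings"),
        ("knowledge/tech/rust.md", "Notes on Tokio runtimes and Axum routing"),
    ],
)

# FTS5 MATCH gives ranked keyword search over the markdown contents
rows = conn.execute(
    "SELECT path FROM memory WHERE memory MATCH ? ORDER BY rank", ("tokio",)
).fetchall()
print(rows)  # [('knowledge/tech/rust.md',)]
```

The semantic-search half would replace `MATCH` with a nearest-neighbor lookup over sqlite-vec embeddings, which is out of scope for this sketch.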

Stored at ~/.localgpt/config.toml:

[agent]
default_model = "claude-cli/opus"

[providers.anthropic]
api_key = "${ANTHROPIC_API_KEY}"

[heartbeat]
enabled = true
interval = "30m"
active_hours = { start = "09:00", end = "22:00" }

[memory]
workspace = "~/.localgpt/workspace"
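A minimal sketch of how the `active_hours` gate above could behave — the scheduling logic here is illustrative, not LocalGPT's actual implementation:

```python
from datetime import datetime, time

def heartbeat_due(now: datetime, enabled: bool, start: time, end: time) -> bool:
    """Return True if a heartbeat cycle may run at `now` under the config above."""
    if not enabled:
        return False
    # Only tick inside the configured active window (09:00-22:00 in the example)
    return start <= now.time() <= end

cfg_start, cfg_end = time(9, 0), time(22, 0)
print(heartbeat_due(datetime(2024, 5, 1, 10, 30), True, cfg_start, cfg_end))  # True
print(heartbeat_due(datetime(2024, 5, 1, 23, 0), True, cfg_start, cfg_end))   # False
```

The `interval = "30m"` setting would additionally rate-limit ticks inside that window; duration parsing is omitted here.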
# Chat
localgpt chat                     # Interactive chat
localgpt chat --session <id>      # Resume session
localgpt ask "question"           # Single question

# Daemon
localgpt daemon start             # Start background daemon
localgpt daemon stop              # Stop daemon
localgpt daemon status            # Show status
localgpt daemon heartbeat         # Run one heartbeat cycle

# Memory
localgpt memory search "query"    # Search memory
localgpt memory reindex           # Reindex files
localgpt memory stats             # Show statistics

# Config
localgpt config init              # Create default config
localgpt config show              # Show current config

When the daemon is running:

| Endpoint | Description |
|---|---|
| `GET /health` | Health check |
| `GET /api/status` | Server status |
| `POST /api/chat` | Chat with the assistant |
| `GET /api/memory/search?q=<query>` | Search memory |
| `GET /api/memory/stats` | Memory statistics |
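As an illustration, the endpoints above could be exercised from a script like the following. The port and the JSON field name `message` are assumptions — neither is documented here, so check the output of `localgpt daemon status`:

```python
import json
from urllib import request
from urllib.parse import urlencode

BASE = "http://127.0.0.1:8080"  # hypothetical port; confirm against your daemon's output

# GET /api/memory/search?q=<query> — search over the indexed workspace
search_url = f"{BASE}/api/memory/search?" + urlencode({"q": "heartbeat tasks"})

# POST /api/chat — the 'message' field is a guess at the request shape
chat_req = request.Request(
    f"{BASE}/api/chat",
    data=json.dumps({"message": "What is on HEARTBEAT.md?"}).encode(),
    headers={"Content-Type": "application/json"},
)

print(search_url)
# With the daemon running: request.urlopen(chat_req).read()
```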

Built with Rust, Tokio, Axum, SQLite (FTS5 + sqlite-vec), fastembed, and eframe.

Apache-2.0
