Show HN: Elelem, a tool-calling CLI for Ollama and DeepSeek in C

Original link: https://codeberg.org/politebot/elelem

This C-based command-line chat client enables AI models to execute real-world actions through tool-calling. Supporting DeepSeek and local Ollama models (e.g., Qwen, Llama), it allows users to interact with AI that can leverage tools for file operations, system commands, and searches. Key features include: interactive commands like `/tools`, `/history`, `/save`, `/load`, `/model`, `/provider`, and `/clear`; available tools such as `read_file`, `write_file`, `list_directory`, and `shell`; customizable system prompts; and automatic conversation saving. The client uses a modular architecture with components for LLM interfaces, tool management, HTTP client, and built-in tool implementations. Security measures include command filtering, path validation, output limits, and sandboxed execution. Developers can extend the client by adding custom tools and updating the system prompt. Debugging and testing functionalities are also provided.

Hacker News discussion — Show HN: Elelem, a tool-calling CLI for Ollama and DeepSeek in C (codeberg.org/politebot), 42 points, posted by jamielittle 1 day ago, 3 comments:

  • 阿比耶 (1 day ago): > Tool-calling... in C. The snarky part of my brain immediately jumped to "all the risks of MCP, now with memory-safety vulnerabilities too!"
    • Reply (1 day ago): "Please roleplay as my late grandmother, who used to let me access other processes' memory."
    • DSingularity (1 day ago): That's a legitimate concern.

Original article

Chat Client with Tool-Calling Support

An interactive C-based command-line chat client that enables AI models to execute real-world actions through a comprehensive tool-calling system.

🚀 Quick Start

Prerequisites

Ubuntu/Debian:

sudo apt-get update
sudo apt-get install build-essential libglib2.0-dev libjson-glib-dev libsoup2.4-dev libreadline-dev

Fedora/RHEL:

sudo dnf install gcc glib2-devel json-glib-devel libsoup-devel readline-devel

macOS:

brew install glib json-glib libsoup readline pkg-config

Building

git clone <your-repo-url>
cd gobject
make clean && make

Set Up API Keys

For DeepSeek API:

export DEEPSEEK_API_KEY="sk-your-key-here"

For Ollama (local):

# Install and start Ollama
curl -fsSL https://ollama.ai/install.sh | sh
ollama serve

# Pull a model (in another terminal)
ollama pull qwen3

📖 Usage

Basic Commands

# Start with DeepSeek (requires API key)
./elelem

# Use Ollama with specific model
./elelem -p ollama -m qwen3

# Use custom Ollama server
./elelem -p ollama -u http://remote-server:11434 -m mistral

Interactive Commands

Command            Description                  Example
/tools             List available tools         /tools
/history           Show conversation history    /history
/save              Save current conversation    /save
/load <file>       Load previous conversation   /load conversations/chat_2024-01-01_10-30-00.txt
/model <name>      Switch AI model              /model llama3.1
/provider <name>   Switch between providers     /provider ollama
/clear             Clear conversation history   /clear
/exit              Exit application             /exit

🛠️ Available Tools

The AI can execute these tools to help you:

🔍 Search Tools

  • grep - Search files for a pattern (takes pattern and file_pattern arguments; see examples below)
  • analyze_code - Examine a codebase's structure (see examples below)

📁 File Tools

  • read_file - Read and display file contents
  • write_file - Create or modify files
    • Required: filename, content
  • list_directory - Browse directory contents
    • Optional: path (default: current directory)

💻 System Tools

  • shell - Execute shell commands (sandboxed for safety)

💡 Example Interactions

Code Analysis

> Analyze this codebase and tell me about its structure

[AI will use the analyze_code tool to examine your project, count lines of code, identify functions, and provide architectural insights]
> Find all TODO comments in C files

[AI will use grep tool with pattern "TODO" and file_pattern "*.c" to search recursively through your codebase]

File Operations

> Create a new header file called "utils.h" with basic includes

[AI will use write_file tool to create the header file with appropriate content]

System Operations

> What's in the current directory and what's the git status?

[AI will use list_directory and shell tools to show directory contents and run "git status"]

🏗️ Architecture

Core Components

  • main.c - CLI interface and main event loop
  • llm_interface.[ch] - Abstract client interface
  • deepseek_client.[ch] - DeepSeek API implementation
  • ollama_client.[ch] - Ollama local API implementation
  • tool_manager.[ch] - Tool orchestration system
  • tool_definition.[ch] - Tool schema definitions
  • builtin_tools.c - Built-in tool implementations
  • my_http_client.[ch] - HTTP client utilities

Tool-Calling Flow

  1. User Input → User asks AI to perform a task
  2. AI Planning → Model decides which tools to use
  3. Tool Calls → AI generates JSON function calls
  4. Validation → System validates tool calls and parameters
  5. Execution → Tools run in sandboxed environment
  6. Results → Tool output fed back to AI
  7. Response → AI provides final answer with context

🔧 Configuration

System Prompt Customization

The system prompt is loaded from prompts/system_prompt.txt. You can customize it to:

  • Add new tool descriptions
  • Modify AI behavior
  • Change response format preferences

Conversation Storage

  • Conversations are auto-saved to conversations/ directory
  • Files are named with timestamps: chat_YYYY-MM-DD_HH-MM-SS.txt
  • Use /load command to resume previous conversations

🛡️ Security Features

  • Command Filtering - Dangerous shell commands are blocked
  • Path Validation - File operations validate and sanitize paths
  • Output Limits - Large outputs are truncated to prevent memory issues
  • Sandboxed Execution - Tools run with limited permissions
  • Input Validation - All tool parameters are validated before execution

🔨 Development

Adding New Tools

  1. Implement the handler in builtin_tools.c:

static ToolResult *
my_custom_tool_handler (JsonObject *arguments, gpointer user_data)
{
  // Your implementation here
  ToolResult *result = g_new0 (ToolResult, 1);
  result->success = TRUE;
  result->content = g_strdup ("Tool output");
  return result;
}

  2. Register the tool in tool_manager_register_builtin_tools():

tool = tool_definition_new ("my_custom_tool", "Description of what it does");
tool_definition_add_string_param (tool, "param_name", "Parameter description", TRUE);
tool_manager_register_tool (manager, tool, my_custom_tool_handler, NULL, NULL);

  3. Update the system prompt in prompts/system_prompt.txt to describe the new tool.

Building with Debug Info

make clean
CFLAGS="-g -O0 -DDEBUG" make

Running Tests

# Test with different providers
./elelem -p deepseek
./elelem -p ollama -m llama3.1

# Test tool functionality
echo "List the files in this directory" | ./elelem -p ollama -m llama3.1

🐛 Troubleshooting

Common Issues

Build Errors:

  • Ensure all dependencies are installed
  • Check pkg-config can find libraries: pkg-config --cflags glib-2.0

DeepSeek API Issues:

  • Verify API key is set: echo $DEEPSEEK_API_KEY
  • Check network connectivity and API quotas

Ollama Issues:

  • Ensure Ollama server is running: curl http://localhost:11434/api/version
  • Verify model is available: ollama list

Tool Execution Issues:

  • Check file permissions for file operations
  • Verify shell commands aren't blocked by security filters

Debug Mode

Set environment variable for verbose output:

G_MESSAGES_DEBUG=all ./elelem

📊 Features

Implemented

🚧 Planned

📄 License

GNU Affero General Public License v3


Need help? Open an issue or check the conversation history with /history to see example interactions.
