Chat Client with Tool-Calling Support
An interactive C-based command-line chat client that enables AI models to execute real-world actions through a comprehensive tool-calling system.
🚀 Quick Start
Prerequisites
Ubuntu/Debian:
sudo apt-get update
sudo apt-get install build-essential libglib2.0-dev libjson-glib-dev libsoup2.4-dev libreadline-dev
Fedora/RHEL:
sudo dnf install gcc glib2-devel json-glib-devel libsoup-devel readline-devel
macOS:
brew install glib json-glib libsoup readline pkg-config
Building
git clone <your-repo-url>
cd gobject
make clean && make
Set Up API Keys
For DeepSeek API:
export DEEPSEEK_API_KEY="sk-your-key-here"
For Ollama (local):
# Install and start Ollama
curl -fsSL https://ollama.ai/install.sh | sh
ollama serve
# Pull a model (in another terminal)
ollama pull qwen3
📖 Usage
Basic Commands
# Start with DeepSeek (requires API key)
./elelem
# Use Ollama with specific model
./elelem -p ollama -m qwen3
# Use custom Ollama server
./elelem -p ollama -u http://remote-server:11434 -m mistral
Interactive Commands
Command | Description | Example |
---|---|---|
/tools | List available tools | /tools |
/history | Show conversation history | /history |
/save | Save current conversation | /save |
/load <file> | Load previous conversation | /load conversations/chat_2024-01-01_10-30-00.txt |
/model <name> | Switch AI model | /model llama3.1 |
/provider <name> | Switch between providers | /provider ollama |
/clear | Clear conversation history | /clear |
/exit | Exit application | /exit |
🛠️ Available Tools
The AI can execute these tools to help you:
🔍 Search Tools
- grep - Search files recursively for a text pattern (e.g. pattern "TODO" with file_pattern "*.c"; see the File Search example below)
📁 File Tools
- read_file - Read and display file contents - Required: filename
- write_file - Create or modify files - Required: filename, content
- list_directory - Browse directory contents - Optional: path (default: current directory)
💻 System Tools
- shell - Execute shell commands (sandboxed for safety)
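Each tool call carries its parameters as a JSON object whose members match the names above. As a purely illustrative json-glib sketch (not code from the project), the arguments for a write_file call could be assembled like this:
#include <json-glib/json-glib.h>

/* Illustrative only: build the {"filename": ..., "content": ...}
 * arguments object that a write_file tool call carries. */
static JsonObject *
build_write_file_args (const gchar *filename, const gchar *content)
{
  JsonObject *args = json_object_new ();

  json_object_set_string_member (args, "filename", filename);
  json_object_set_string_member (args, "content", content);
  return args;
}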
💡 Example Interactions
Code Analysis
> Analyze this codebase and tell me about its structure
[AI will use the analyze_code tool to examine your project, count lines of code, identify functions, and provide architectural insights]
File Search
> Find all TODO comments in C files
[AI will use grep tool with pattern "TODO" and file_pattern "*.c" to search recursively through your codebase]
File Operations
> Create a new header file called "utils.h" with basic includes
[AI will use write_file tool to create the header file with appropriate content]
System Operations
> What's in the current directory and what's the git status?
[AI will use list_directory and shell tools to show directory contents and run "git status"]
🏗️ Architecture
Core Components
- main.c - CLI interface and main event loop
- llm_interface.[ch] - Abstract client interface
- deepseek_client.[ch] - DeepSeek API implementation
- ollama_client.[ch] - Ollama local API implementation
- tool_manager.[ch] - Tool orchestration system
- tool_definition.[ch] - Tool schema definitions
- builtin_tools.c - Built-in tool implementations
- my_http_client.[ch] - HTTP client utilities
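For orientation, the provider clients plug in behind the abstract interface so main.c never talks to DeepSeek or Ollama directly. A rough, hypothetical sketch of that shape (the real field names and signatures in llm_interface.h will differ):
#include <glib.h>

/* Hypothetical sketch of the abstraction layer: each provider client
 * (DeepSeek, Ollama) supplies these callbacks and the CLI only uses
 * this struct. Names are illustrative, not the project's actual API. */
typedef struct _LlmClient LlmClient;

typedef struct {
  gchar *(*send_message) (LlmClient *client, const gchar *prompt, GError **error);
  void   (*set_model)    (LlmClient *client, const gchar *model);
  void   (*destroy)      (LlmClient *client);
} LlmClientVTable;

struct _LlmClient {
  const LlmClientVTable *vtable;  /* provider-specific implementation */
  gchar                 *model;   /* currently selected model */
};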
Tool-Calling Flow
- User Input → User asks AI to perform a task
- AI Planning → Model decides which tools to use
- Tool Calls → AI generates JSON function calls
- Validation → System validates tool calls and parameters
- Execution → Tools run in sandboxed environment
- Results → Tool output fed back to AI
- Response → AI provides final answer with context
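To make the Tool Calls, Validation, and Execution steps concrete: the model emits a JSON object naming the tool and its arguments, which the client parses before dispatching. Below is a minimal json-glib sketch of that parsing step; execute_tool_call is a placeholder for the tool manager's internal dispatch, not a real function from the project:
#include <json-glib/json-glib.h>

/* Placeholder for the tool manager's internal dispatch. */
static gchar *execute_tool_call (const gchar *name, JsonObject *args);

/* Parse a model-generated call such as
 * {"name": "read_file", "arguments": {"filename": "main.c"}}
 * and hand it off for validation and execution. */
static gchar *
handle_tool_call (const gchar *json_text, GError **error)
{
  JsonParser *parser = json_parser_new ();
  gchar *result = NULL;

  if (json_parser_load_from_data (parser, json_text, -1, error))
    {
      JsonObject *call = json_node_get_object (json_parser_get_root (parser));
      const gchar *name = json_object_get_string_member (call, "name");
      JsonObject *args = json_object_get_object_member (call, "arguments");

      result = execute_tool_call (name, args);
    }

  g_object_unref (parser);
  return result;
}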
🔧 Configuration
System Prompt Customization
The system prompt is loaded from prompts/system_prompt.txt. You can customize it to:
- Add new tool descriptions
- Modify AI behavior
- Change response format preferences
Conversation Storage
- Conversations are auto-saved to the conversations/ directory
- Files are named with timestamps: chat_YYYY-MM-DD_HH-MM-SS.txt
- Use the /load command to resume previous conversations
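For reference, that timestamped naming scheme maps directly onto GLib's date-time formatting; a small illustrative sketch (not necessarily how the client builds the name):
#include <glib.h>

/* Illustrative: produce a chat_YYYY-MM-DD_HH-MM-SS.txt style filename. */
static gchar *
make_conversation_filename (void)
{
  GDateTime *now = g_date_time_new_now_local ();
  gchar *name = g_date_time_format (now, "chat_%Y-%m-%d_%H-%M-%S.txt");

  g_date_time_unref (now);
  return name;  /* caller frees with g_free() */
}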
🛡️ Security Features
- Command Filtering - Dangerous shell commands are blocked
- Path Validation - File operations validate and sanitize paths
- Output Limits - Large outputs are truncated to prevent memory issues
- Sandboxed Execution - Tools run with limited permissions
- Input Validation - All tool parameters are validated before execution
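As an illustration of what command filtering means in practice, a blocklist-style check of roughly this shape refuses obviously destructive commands before the shell tool runs them. The patterns below are examples only; the real filter and its rules live in the tool implementation:
#include <glib.h>
#include <string.h>

/* Example-only blocklist check; the project's actual filter rules differ. */
static gboolean
command_is_blocked (const gchar *command)
{
  static const gchar *blocked[] = { "rm -rf /", "mkfs", "shutdown", "> /dev/sda", NULL };

  for (guint i = 0; blocked[i] != NULL; i++)
    {
      if (strstr (command, blocked[i]) != NULL)
        return TRUE;  /* refuse to execute */
    }

  return FALSE;
}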
🔨 Development
Adding New Tools
- Implement the handler in builtin_tools.c (an expanded sketch follows these steps):
static ToolResult *
my_custom_tool_handler (JsonObject *arguments, gpointer user_data)
{
// Your implementation here
ToolResult *result = g_new0 (ToolResult, 1);
result->success = TRUE;
result->content = g_strdup ("Tool output");
return result;
}
- Register the tool in tool_manager_register_builtin_tools():
tool = tool_definition_new ("my_custom_tool", "Description of what it does");
tool_definition_add_string_param (tool, "param_name", "Parameter description", TRUE);
tool_manager_register_tool (manager, tool, my_custom_tool_handler, NULL, NULL);
- Update the system prompt in prompts/system_prompt.txt to describe the new tool.
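Expanding on the handler skeleton above, a handler typically reads its registered parameters from the arguments JsonObject. The sketch below uses the param_name parameter registered above (illustrative, not code from the project):
#include <json-glib/json-glib.h>

/* Expanded sketch: fetch the registered string parameter before
 * building the result. */
static ToolResult *
my_custom_tool_handler (JsonObject *arguments, gpointer user_data)
{
  const gchar *param = NULL;

  if (json_object_has_member (arguments, "param_name"))
    param = json_object_get_string_member (arguments, "param_name");

  ToolResult *result = g_new0 (ToolResult, 1);
  result->success = (param != NULL);
  result->content = g_strdup_printf ("Ran with param_name=%s",
                                     param != NULL ? param : "(missing)");
  return result;
}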
Building with Debug Info
make clean
CFLAGS="-g -O0 -DDEBUG" make
Running Tests
# Test with different providers
./elelem -p deepseek
./elelem -p ollama -m llama3.1
# Test tool functionality
echo "List the files in this directory" | ./elelem -p ollama -m llama3.1
🐛 Troubleshooting
Common Issues
Build Errors:
- Ensure all dependencies are installed
- Check that pkg-config can find the libraries: pkg-config --cflags glib-2.0
DeepSeek API Issues:
- Verify the API key is set: echo $DEEPSEEK_API_KEY
- Check network connectivity and API quotas
Ollama Issues:
- Ensure the Ollama server is running: curl http://localhost:11434/api/version
- Verify the model is available: ollama list
Tool Execution Issues:
- Check file permissions for file operations
- Verify shell commands aren't blocked by security filters
Debug Mode
Set the G_MESSAGES_DEBUG environment variable for verbose output:
G_MESSAGES_DEBUG=all ./elelem
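G_MESSAGES_DEBUG=all enables GLib's debug-level log messages. If you add tools of your own, messages emitted with g_debug() appear under the same switch, for example:
#include <glib.h>

/* g_debug() output is hidden by default and printed when
 * G_MESSAGES_DEBUG=all (or the matching log domain) is set. */
static void
trace_tool_call (const gchar *tool_name)
{
  g_debug ("dispatching tool: %s", tool_name);
}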
📊 Features
✅ Implemented
🚧 Planned
📄 License
GNU Affero General Public License v3
Need help? Open an issue or check the conversation history with /history to see example interactions.