Introducing the RLAMA CLI
RLAMA
A powerful document question-answering tool that connects to your local Ollama models. Create, manage, and interact with RAG systems for all your document needs.
Available for macOS, Linux, and Windows
Document Indexing
Index any document folder for intelligent retrieval and querying.
Multi-Format Support
Support for text, code, PDF, DOCX, and many other document formats.
Local Processing
Process everything locally with Ollama models. No data leaves your machine.
Interactive Sessions
Create interactive RAG sessions to query your document knowledge base.
Easy Management
Simple commands to create, list, and delete your RAG systems.
Developer Friendly
Built with Go and designed for developers and technical users.
# Create a new RAG system named "documentation" using the llama3 model
# and indexing all documents in the ./docs folder
rlama rag llama3 documentation ./docs
# You'll see progress as documents are processed
Processing file: docs/installation.md
Processing file: docs/commands.md
Processing file: docs/troubleshooting.pdf
...
RAG system "documentation" created successfully!
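Once the RAG system is created, you can open an interactive session against it and ask questions. A sketch of what that looks like (the question and answer shown are illustrative, not actual output):

```shell
# Start an interactive session with the "documentation" RAG system
rlama run documentation

# You can now type questions at the prompt, for example:
# > How do I install the tool?
# RLAMA retrieves the relevant passages from your indexed
# documents and answers using the local llama3 model.
```

Everything happens locally: retrieval runs against the index built in the previous step, and generation goes through your local Ollama instance.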
Technical Documentation
Query your project documentation, manuals, and specifications with ease.
Private Knowledge Base
Create secure RAG systems for sensitive documents with full privacy.
Research Assistant
Query research papers, textbooks, and study materials for faster learning.
Command Reference
Create a new RAG system from documents
Start an interactive session with a RAG system
List all available RAG systems
Delete a RAG system
Update RLAMA to the latest version
Display RLAMA version
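The entries above map to `rlama` subcommands. A sketch of the corresponding invocations, using the `documentation` RAG from the earlier example (argument names are illustrative; run `rlama --help` for the authoritative syntax):

```shell
# Create a new RAG system from documents
rlama rag llama3 documentation ./docs

# Start an interactive session with a RAG system
rlama run documentation

# List all available RAG systems
rlama list

# Delete a RAG system
rlama delete documentation

# Update RLAMA to the latest version
rlama update

# Display RLAMA version
rlama --version
```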
Troubleshooting
Common issues and their solutions