oli - Open Local Intelligent Assistant
oli is an open-source alternative to Claude Code with powerful agentic capabilities for coding assistance.
Features:
- A modern hybrid architecture:
  - Rust backend for performance and core functionality
  - React/Ink frontend for a beautiful, interactive terminal UI
- Support for both cloud APIs (Anthropic Claude 3.7 Sonnet and OpenAI GPT-4o) and local LLMs (via Ollama)
- Strong agentic capabilities, including file search, code editing, and command execution
- Tool use support across all model providers (Anthropic, OpenAI, and Ollama)
⚠️ This project is at a very early stage and is prone to bugs! Please report issues as you encounter them.
Installation
Using npm
This will install the latest version from npm and handle all dependencies automatically.
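Assuming the package is published to npm under the name `oli` (an assumption based on the project name, not confirmed by this README), a global install would look like:

```shell
# Install oli globally from npm (package name assumed to be "oli")
npm install -g oli
```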
Using Homebrew (macOS)
Homebrew will download and install the latest version directly.
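If a Homebrew formula exists for the project (the formula name below is an assumption), installation might look like:

```shell
# Install via Homebrew (formula name assumed)
brew install oli
```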
From Source
Clone the repository, build both the backend and the frontend, then run the hybrid application.
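A sketch of the source build with standard Cargo and npm tooling (the repository URL, UI directory name, and npm scripts are assumptions, not confirmed by this README):

```shell
# Clone the repository (substitute the actual URL)
git clone <repository-url> oli
cd oli

# Build the Rust backend
cargo build --release

# Build the React/Ink frontend (UI directory name assumed)
cd ui && npm install && npm run build && cd ..

# Run the hybrid application
cargo run --release
```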
Environment Setup
Development Setup
Install the Python dependencies for pre-commit, install the pre-commit hooks, then run Rust linting and formatting plus the TypeScript checks in the UI directory.
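These steps can be sketched with standard tooling as follows (the exact lint and check commands used by the project are assumptions):

```shell
# Install Python dependencies (for pre-commit)
pip install pre-commit

# Install pre-commit hooks
pre-commit install

# Run Rust linting and formatting
cargo clippy --all-targets
cargo fmt --all

# Run TypeScript checks in the UI directory (directory name assumed)
cd ui && npx tsc --noEmit
```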
Cloud API Models
For API-based features, set up your environment variables:
You can either create a .env file in the project root or export the variables in your shell.
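For example (the variable names follow the Anthropic and OpenAI SDK conventions; the key values shown are placeholders):

```shell
# Option 1: create a .env file in the project root
cat > .env <<'EOF'
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
EOF

# Option 2: export the variables directly in your shell
export ANTHROPIC_API_KEY=your-anthropic-key
export OPENAI_API_KEY=your-openai-key
```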
Using Anthropic Claude 3.7 Sonnet (Recommended)
Claude 3.7 Sonnet provides the most reliable and advanced agent capabilities:
- Obtain an API key from Anthropic
- Set the ANTHROPIC_API_KEY environment variable
- Select the "Claude 3.7 Sonnet" model in the UI
This implementation includes:
- Optimized system prompts for Claude 3.7
- JSON schema output formatting for structured responses
- Improved error handling and retry mechanisms
Using Ollama Models
oli supports local models through Ollama:
- Install Ollama if you haven't already
- Start the Ollama server:
- Pull the model you want to use (we recommend models with tool use capabilities):
- Start oli and select the Ollama model from the model selection menu
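The server and model-pull steps above can be sketched as follows (qwen2.5-coder and llama3.1 are examples of tool-capable models from the Ollama library; the tags available depend on your Ollama version):

```shell
# Start the Ollama server (in a separate terminal)
ollama serve

# Examples of compatible models
ollama pull qwen2.5-coder
ollama pull llama3.1
```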
Note: For best results with tool use and agent capabilities, use models like Qwen 2.5 Coder, which support function calling.
Usage
- Start the application:
- Select a model:
  - Cloud models (Claude 3.7 Sonnet, GPT-4o) for full agent capabilities
  - Local models via Ollama (Qwen, Llama, etc.)
- Make your coding query in the chat interface:
  - Ask for file searches
  - Request code edits
  - Execute shell commands
  - Get explanations of code
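Assuming the installed binary is named `oli` (an assumption based on the project name), a session starts with:

```shell
# Launch the interactive terminal UI
oli
```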
Architecture
The application uses a hybrid architecture:
┌───────────────┐          ┌───────────────┐
│ React + Ink UI│◄────────►│ Rust Backend  │
│               │   JSON   │               │
│ - UI          │   RPC    │ - Agent       │
│ - Task Display│          │ - Tool Exec   │
│ - Loading     │          │ - Code Parse  │
└───────────────┘          └───────────────┘
- Rust Backend: Handles agent functionality, tool execution, and API calls
- React/Ink Frontend: Provides a modern, interactive terminal interface with smooth animations
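For illustration, a request from the UI to the backend over the JSON-RPC channel might look like the following (the method name and parameter fields are hypothetical, not taken from the oli codebase):

```json
{ "jsonrpc": "2.0", "id": 1, "method": "run_tool",
  "params": { "tool": "file_search", "query": "anyhow" } }
```

The backend would reply with a `result` object keyed by the same `id`, which the UI renders in the task display.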
Examples
Here are some example queries to try:
- "Explain the codebase and how to get started"
- "List all files in the project"
- "Summarize the Cargo.toml file"
- "Show me all files that import the 'anyhow' crate"
License
This project is licensed under the Apache 2.0 License - see the LICENSE file for details.
Acknowledgments
- This project is inspired by Claude Code and similar AI assistants
- Uses Anthropic's Claude 3.7 Sonnet model for optimal agent capabilities
- Backend built with Rust for performance and reliability
- Frontend built with React and Ink for a modern terminal UI experience
- Special thanks to the Rust and React communities for excellent libraries and tools