# Mermaid

An open-source AI coding assistant for the terminal. Local and cloud models via Ollama, native tool calling, and a clean TUI.
## Features
- **Local Models**: Run 100% locally with Ollama; your code never leaves your machine
- **Cloud Models**: Run models via Ollama Cloud; use the most powerful models available today with no hardware cost
- **Native Tool Calling**: Uses Ollama's tool-calling API for structured, reliable actions
- **True Agency**: Read and write files, execute commands, and manage git
- **Real-time Streaming**: See responses as they're generated
- **Session Persistence**: Conversations auto-save and can be resumed
- **Thinking Mode**: Toggle extended thinking for complex reasoning (Alt+T)
- **Message Queuing**: Type while the model generates; messages queue up and send in order
## Quick Start
### Prerequisites

- [Ollama](https://ollama.com) installed and running
- Rust toolchain (only if building from source)
### Install

```bash
# From crates.io (assuming the crate is published as `mermaid`)
cargo install mermaid

# Or from source
git clone <repository-url>
cd mermaid
cargo install --path .
```
### Run

```bash
# Start fresh session
mermaid

# Resume last session
mermaid --resume

# Use specific model
mermaid --model ollama/qwen3:30b

# List available models
mermaid --list-models
```

Flag names above are illustrative; run `mermaid --help` for the exact options.
## Keyboard Shortcuts
| Key | Action |
|---|---|
| Enter | Send message |
| Esc | Stop generation / Clear input |
| Ctrl+C | Quit |
| Alt+T | Toggle thinking mode |
| Up/Down | Scroll chat or navigate history |
| Page Up/Down | Scroll chat |
| Mouse Wheel | Scroll chat |
## Commands

Type `:` followed by a command:

| Command | Description |
|---|---|
| `:model <name>` | Switch model |
| `:clear` | Clear chat history |
| `:save` | Save conversation |
| `:load` | Load conversation |
| `:list` | List saved conversations |
| `:help` | Show all commands |
| `:quit` | Exit |
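For example, switching models mid-session (the model name here is just an example):

```
:model ollama/qwen3:30b
```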
## Configuration

Config file: `~/.config/mermaid/config.toml`

A representative example (section and key names shown here are illustrative):

```toml
# Last used model (auto-saved)
model = "ollama/qwen3:30b"

[ollama]
host = "localhost"
port = 11434
# cloud_api_key = "your-key"  # For Ollama Cloud models

[limits]
max_history = 100
max_file_size = 1048576  # 1MB
```
## Available Tools

The model can use these tools autonomously:

| Tool | Description |
|---|---|
| `read_file` | Read any file (text, PDF, images) |
| `write_file` | Create or update files |
| `delete_file` | Delete files (with backup) |
| `create_directory` | Create directories |
| `execute_command` | Run shell commands |
| `git_status` | Check repository status |
| `git_diff` | View file changes |
| `git_commit` | Create commits |
| `web_search` | Search via local SearXNG |
## Model Compatibility

Any Ollama model with tool-calling support works. Models without tool calling can still run inference in mermaid, but cannot use tools (`read_file`, `write_file`, `execute_command`, `web_search`, etc.).
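To check whether a given model supports tool calling, `ollama show` lists a model's capabilities (the model name below is just an example):

```shell
# Print model details, including its capabilities
ollama show qwen3:30b
# Compatible models list "tools" under the Capabilities section of the output
```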
## Ollama Cloud

Access large models via Ollama Cloud:

1. Configure your API key: set `cloud_api_key` in `~/.config/mermaid/config.toml` (see Configuration).
2. Use cloud models: select one with the `:model <name>` command.
## Web Search (Optional)

Web search requires a local SearXNG instance.
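A quick way to run one is the official Docker image (8080 is SearXNG's default port; adjust it to match your mermaid configuration):

```shell
# Start a local SearXNG instance in the background
docker run -d --name searxng -p 8080:8080 searxng/searxng
```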
## Development
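The standard Cargo workflow applies (assuming a checkout of the source tree):

```shell
cargo build    # compile a debug build
cargo test     # run the test suite
cargo run      # build and launch the TUI
```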
## License
MIT OR Apache-2.0
## Acknowledgments
Built with Ratatui and Ollama. Inspired by Aider and Claude Code.