Ailloy provides a unified interface for interacting with multiple AI providers from Rust — both as a CLI tool for quick tasks and scripting, and as a library for integration into your own projects.
## Quick Start
### Install the CLI
```sh
# Homebrew (macOS/Linux)
brew install ailloy

# Cargo
cargo install ailloy

# Cargo binstall (pre-built binary)
cargo binstall ailloy
```
### Configure a provider
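Run the interactive setup wizard to register a provider:

```sh
ailloy config init
```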
### Send a message
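Then chat straight from the command line (the message text here is just an example):

```sh
ailloy chat "Write a haiku about Rust"
```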
## Use as a Library
Add `ailloy` to your project without the CLI dependencies:
```toml
[dependencies]
ailloy = { version = "0.1", default-features = false }
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
anyhow = "1"
```
```rust
use ailloy::Config;
use ailloy::Message;
use ailloy::create_provider;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Load the config, build the default provider, and send a message.
    // ...
    Ok(())
}
```
Or configure a provider directly in code:
```rust
use ailloy::Message;
use ailloy::OpenAiClient;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Construct an OpenAiClient with an API key and model, then send a Message.
    // ...
    Ok(())
}
```
## Providers
| Provider | Kind | Auth | Notes |
|---|---|---|---|
| OpenAI | `openai` | API key | GPT-4o, GPT-4, etc. Works with any OpenAI-compatible endpoint |
| Ollama | `ollama` | None | Local LLMs (Llama, Mistral, etc.) |
| Local Agent | `local-agent` | None | Claude, Codex, Copilot via subprocess |
| Azure OpenAI | `azure-openai` | — | Coming soon |
## Configuration
Ailloy stores its configuration at `~/.config/ailloy/config.yaml`:
```yaml
default_provider: openai
providers:
  openai:
    kind: openai
    api_key: sk-...
    model: gpt-4o
  ollama:
    kind: ollama
    model: llama3.2
  claude:
    kind: local-agent
    binary: claude
```
## CLI Commands
| Command | Description |
|---|---|
| `ailloy chat <message>` | Send a message to the configured AI provider |
| `ailloy config init` | Interactive provider setup wizard |
| `ailloy config show` | Display current configuration |
| `ailloy providers list` | List configured providers |
| `ailloy completion <shell>` | Generate shell completions |
| `ailloy version` | Show version and banner |
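If the generated completion script follows the usual convention, it can be loaded directly into the current shell, for example in Bash:

```sh
# Generate Bash completions and source them into the current session
source <(ailloy completion bash)
```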
### Options
## Feature Flags
Ailloy uses feature flags to keep the library lean:
| Feature | Default | Description |
|---|---|---|
| `cli` | Yes | CLI binary and dependencies (clap, inquire, colored, etc.) |
Library users should disable default features:
```toml
ailloy = { version = "0.1", default-features = false }
```
## Development
## License
MIT