Ailloy provides a unified interface for interacting with multiple AI providers from Rust — both as a CLI tool for quick tasks and scripting, and as a library for integration into your own projects.
## Quick Start

### Install the CLI

```sh
# Homebrew (macOS/Linux)
brew install ailloy

# Cargo
cargo install ailloy

# Cargo binstall (pre-built binary)
cargo binstall ailloy
```
### Configure a provider

```sh
ailloy config
```
### Send a message

```sh
ailloy "Hello, what can you do?"
```
### Generate an image

```sh
# illustrative invocation; uses the node configured as the `image` default
ailloy image "a watercolor fox"
```
## Use as a Library

Add `ailloy` to your project without CLI dependencies:

```toml
[dependencies]
ailloy = { version = "0.4", default-features = false }
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
```
### Async (recommended)

A minimal sketch — `Client` and `Message` are ailloy types, while the `from_config`/`chat` method names are illustrative of the API shape:

```rust
use ailloy::{Client, Message};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Uses the node configured as the `chat` default (method names illustrative)
    let client = Client::from_config()?;
    let reply = client.chat(vec![Message::user("Hello!")]).await?;
    println!("{}", reply);
    Ok(())
}
```
### Blocking (sync)

A sketch of the synchronous path, for code without an async runtime (the blocking method name is illustrative):

```rust
use ailloy::Client;
use ailloy::Message;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::from_config()?;
    // Blocking call: no tokio runtime required (method name illustrative)
    let reply = client.chat_blocking(vec![Message::user("Hello!")])?;
    println!("{}", reply);
    Ok(())
}
```
### Programmatic (no config file needed)

A sketch that constructs a client in code instead of reading `~/.config/ailloy/config.yaml` (constructor name and arguments are illustrative):

```rust
use ailloy::{Client, Message};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Point directly at a provider; nothing is read from config.yaml
    let client = Client::openai("gpt-4o", &std::env::var("OPENAI_API_KEY")?);
    let reply = client.chat(vec![Message::user("Hello!")]).await?;
    println!("{}", reply);
    Ok(())
}
```
### Builder pattern

A sketch of the builder-style setup (method names are illustrative; `provider` takes a kind from the Providers table):

```rust
use ailloy::{Client, Message};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::builder()
        .provider("openai")
        .model("gpt-4o")
        .build()?;
    let reply = client.chat(vec![Message::user("Hello!")]).await?;
    println!("{}", reply);
    Ok(())
}
```
### Image generation

A sketch using the `image` capability (method name and return type are illustrative):

```rust
use ailloy::Client;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Uses the node configured as the `image` default
    let client = Client::from_config()?;
    let png: Vec<u8> = client.generate_image("a watercolor fox").await?;
    std::fs::write("fox.png", png)?;
    Ok(())
}
```
## Providers

| Provider | Kind | Chat | Stream | Images | Embeddings | Auth |
|---|---|---|---|---|---|---|
| OpenAI | `openai` | yes | yes | DALL-E | yes | API key |
| Anthropic | `anthropic` | yes | yes | — | — | API key |
| Azure OpenAI | `azure-openai` | yes | yes | yes | yes | API key / az CLI |
| Microsoft Foundry | `microsoft-foundry` | yes | yes | — | yes | API key / az CLI |
| Google Vertex AI | `vertex-ai` | yes | yes | Imagen | yes | gcloud CLI |
| Ollama | `ollama` | yes | yes | — | yes | None |
| LM Studio | `openai` | yes | yes | — | — | None |
| Local Agent | `local-agent` | yes | yes | — | — | None |
LM Studio uses the OpenAI-compatible API (`http://localhost:1234` by default). Local Agent delegates to CLI tools installed on your system: `claude`, `codex`, or `copilot`.
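Following the table, a Local Agent node is declared like any other node; a sketch (the node id and `model` value are illustrative, with `claude` standing in for whichever CLI tool is installed):

```yaml
nodes:
  local/claude:
    provider: local-agent
    model: claude
```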
## Configuration

Ailloy stores its configuration at `~/.config/ailloy/config.yaml`:

```yaml
nodes:
  openai/gpt-4o:
    provider: openai
    model: gpt-4o
    auth:
      env: OPENAI_API_KEY
    capabilities: [chat, image]
  anthropic/claude-sonnet-4-6:
    provider: anthropic
    model: claude-sonnet-4-6
    auth:
      env: ANTHROPIC_API_KEY
    capabilities: [chat]
  ollama/llama3.2:
    provider: ollama
    model: llama3.2
    endpoint: http://localhost:11434
    capabilities: [chat]
  lm-studio/qwen3.5:
    provider: openai
    model: qwen3.5
    endpoint: http://localhost:1234
    capabilities: [chat]
defaults:
  chat: openai/gpt-4o
  image: openai/gpt-4o
```
### Local project config

Create `.ailloy.yaml` in your project root to override or add nodes for that project. Local config is merged with global config (nodes and defaults merge; consents are global-only).
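The merge rule above can be sketched as a plain map union in which local entries win (illustrative, not ailloy's actual code):

```rust
use std::collections::HashMap;

/// Sketch of the merge described above: start from the global map and let
/// local entries override entries with the same key. Nodes and defaults both
/// merge this way; consents are read only from the global config.
fn merge_maps(
    global: &HashMap<String, String>,
    local: &HashMap<String, String>,
) -> HashMap<String, String> {
    let mut merged = global.clone();
    for (key, value) in local {
        merged.insert(key.clone(), value.clone());
    }
    merged
}

fn main() {
    let global = HashMap::from([("chat".to_string(), "openai/gpt-4o".to_string())]);
    let local = HashMap::from([("chat".to_string(), "ollama/llama3.2".to_string())]);
    let merged = merge_maps(&global, &local);
    // The project-local default shadows the global one
    println!("{}", merged["chat"]);
}
```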
## CLI Commands

| Command | Description |
|---|---|
| `ailloy <message>` | Send a message (shorthand for `ailloy chat`) |
| `ailloy chat <message>` | Send a message to the configured AI node |
| `ailloy chat -i` | Interactive conversation mode |
| `ailloy config` | Interactive node configuration |
| `ailloy config show` | Display current configuration |
| `ailloy nodes list` | List configured AI nodes |
| `ailloy nodes add` | Add a new AI node interactively |
| `ailloy nodes default <cap> <id>` | Set the default node for a capability |
| `ailloy discover` | Auto-detect available AI providers and models |
| `ailloy completion <shell>` | Generate shell completions |
| `ailloy version` | Show version and banner |
## Options
## Feature Flags

Ailloy uses feature flags to keep the library lean:

| Feature | Default | Description |
|---|---|---|
| `cli` | Yes | CLI binary and dependencies (clap, inquire, colored, etc.) |
Library users should disable default features:
```toml
ailloy = { version = "0.4", default-features = false }
```
## Development
## License
MIT