ailloy 0.1.0

An AI abstraction layer for Rust

Ailloy provides a unified interface for interacting with multiple AI providers from Rust — both as a CLI tool for quick tasks and scripting, and as a library for integration into your own projects.

Quick Start

Install the CLI

# Homebrew (macOS/Linux)
brew install mklab-se/tap/ailloy

# Cargo
cargo install ailloy

# Cargo binstall (pre-built binary)
cargo binstall ailloy

Configure a provider

ailloy config init

Send a message

ailloy chat "Explain the Rust borrow checker in one sentence"

Use as a Library

Add ailloy to your project without CLI dependencies:

[dependencies]
ailloy = { version = "0.1", default-features = false }
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
anyhow = "1"

Then call the provider from your code:

use ailloy::config::Config;
use ailloy::provider::create_provider;
use ailloy::types::Message;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let config = Config::load()?;
    let provider = create_provider(&config)?;
    let response = provider.chat(&[Message::user("Hello!")]).await?;
    println!("{}", response.content);
    Ok(())
}

Or configure a provider directly in code:

use ailloy::openai::OpenAiClient;
use ailloy::types::Message;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let client = OpenAiClient::new("sk-...", "gpt-4o", None);
    let response = client.chat(&[Message::user("Hello!")]).await?;
    println!("{}", response.content);
    Ok(())
}
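Under the hood, create_provider hands back whichever backend the configuration names, so call sites stay identical no matter which provider is active. The usual Rust pattern for this kind of unified interface is a trait behind a Box. A minimal, self-contained sketch of the pattern (illustrative names only, not ailloy's real types):

```rust
// A provider abstraction sketched with illustrative names;
// ailloy's actual trait and constructors differ.
trait Provider {
    fn chat(&self, message: &str) -> String;
}

// A toy backend that just echoes its input.
struct EchoProvider;

impl Provider for EchoProvider {
    fn chat(&self, message: &str) -> String {
        format!("echo: {message}")
    }
}

// Select a backend by its configured kind, boxed behind the trait.
fn provider_for(kind: &str) -> Option<Box<dyn Provider>> {
    match kind {
        "echo" => Some(Box::new(EchoProvider)),
        _ => None,
    }
}

fn main() {
    let provider = provider_for("echo").expect("known provider kind");
    println!("{}", provider.chat("Hello!"));
    // prints "echo: Hello!"
}
```

Dynamic dispatch is what lets a setting like default_provider swap backends without any change to the calling code.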

Providers

| Provider | Kind | Auth | Notes |
| --- | --- | --- | --- |
| OpenAI | openai | API key | GPT-4o, GPT-4, etc. Works with any OpenAI-compatible endpoint |
| Ollama | ollama | None | Local LLMs (Llama, Mistral, etc.) |
| Local Agent | local-agent | None | Claude, Codex, Copilot via subprocess |
| Azure OpenAI | azure-openai | | Coming soon |
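The local-agent kind drives an installed agent binary over a subprocess. As a rough, generic sketch of that pattern (not ailloy's actual implementation; cat stands in for a real agent binary so the example runs anywhere):

```rust
use std::io::Write;
use std::process::{Command, Stdio};

// Hypothetical sketch: write the prompt to a child process on stdin
// and read its reply from stdout.
fn subprocess_chat(binary: &str, prompt: &str) -> std::io::Result<String> {
    let mut child = Command::new(binary)
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()?;
    // Taking stdin moves it out of `child`; dropping it at the end of
    // this statement closes the pipe so the child sees EOF.
    child
        .stdin
        .take()
        .expect("stdin is piped")
        .write_all(prompt.as_bytes())?;
    let output = child.wait_with_output()?;
    Ok(String::from_utf8_lossy(&output.stdout).into_owned())
}

fn main() -> std::io::Result<()> {
    // `cat` echoes stdin back, standing in for an agent's reply.
    let reply = subprocess_chat("cat", "Hello!")?;
    println!("{reply}");
    Ok(())
}
```

A real agent wrapper would also pass flags, handle non-zero exit codes, and stream output, but the stdin/stdout round trip above is the core of the pattern.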

Configuration

Ailloy stores its configuration at ~/.config/ailloy/config.yaml:

default_provider: openai
providers:
  openai:
    kind: openai
    api_key: sk-...
    model: gpt-4o
  ollama:
    kind: ollama
    model: llama3.2
  claude:
    kind: local-agent
    binary: claude

CLI Commands

| Command | Description |
| --- | --- |
| ailloy chat <message> | Send a message to the configured AI provider |
| ailloy config init | Interactive provider setup wizard |
| ailloy config show | Display current configuration |
| ailloy providers list | List configured providers |
| ailloy completion <shell> | Generate shell completions |
| ailloy version | Show version and banner |

Options

ailloy chat "message" --provider ollama    # Use a specific provider
ailloy chat "message" --system "Be brief"  # Set a system prompt
ailloy -v chat "message"                   # Debug logging
ailloy -q chat "message"                   # Quiet mode

Feature Flags

Ailloy uses feature flags to keep the library lean:

| Feature | Default | Description |
| --- | --- | --- |
| cli | Yes | CLI binary and dependencies (clap, inquire, colored, etc.) |

Library users should disable default features:

ailloy = { version = "0.1", default-features = false }

Development

cargo build                              # Build everything
cargo build --no-default-features --lib  # Build library only
cargo test                               # Run tests
cargo clippy -- -D warnings              # Lint (zero warnings)
cargo fmt --all -- --check               # Format check
cargo run -- chat "hello"                # Run the CLI

License

MIT