ailloy 0.4.0

An AI abstraction layer for Rust

Ailloy provides a unified interface for interacting with multiple AI providers from Rust — both as a CLI tool for quick tasks and scripting, and as a library for integration into your own projects.

Quick Start

Install the CLI

```sh
# Homebrew (macOS/Linux)
brew install mklab-se/tap/ailloy

# Cargo
cargo install ailloy

# Cargo binstall (pre-built binary)
cargo binstall ailloy
```

Configure a provider

```sh
ailloy config
```

Send a message

```sh
ailloy "Explain the Rust borrow checker in one sentence"
```

Use as a Library

Add ailloy to your project without CLI dependencies:

```toml
[dependencies]
ailloy = { version = "0.4", default-features = false }
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
anyhow = "1"
```

Async (recommended)

```rust
use ailloy::{Client, Message};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let client = Client::from_config()?;
    let response = client.chat(&[Message::user("Hello!")]).await?;
    println!("{}", response.content);
    Ok(())
}
```

Blocking (sync)

```rust
use ailloy::blocking::Client;
use ailloy::Message;

fn main() -> anyhow::Result<()> {
    let client = Client::from_config()?;
    let response = client.chat(&[Message::user("Hello!")])?;
    println!("{}", response.content);
    Ok(())
}
```

Programmatic (no config file needed)

```rust
use ailloy::{Client, Message};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let client = Client::openai("sk-...", "gpt-4o")?;
    let response = client.chat(&[Message::user("Hello!")]).await?;
    println!("{}", response.content);
    Ok(())
}
```

Builder pattern

```rust
use ailloy::{Client, Message};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let client = Client::builder()
        .anthropic()
        .api_key("sk-ant-...")
        .model("claude-sonnet-4-6")
        .build()?;
    let response = client.chat(&[Message::user("Hello!")]).await?;
    println!("{}", response.content);
    Ok(())
}
```

Providers

| Provider | Kind | Chat | Stream | Images | Embeddings | Auth |
|---|---|---|---|---|---|---|
| OpenAI | `openai` | yes | yes | DALL-E | yes | API key |
| Anthropic | `anthropic` | yes | yes | | | API key |
| Azure OpenAI | `azure-openai` | yes | yes | yes | yes | API key / az CLI |
| Microsoft Foundry | `microsoft-foundry` | yes | yes | | yes | API key / az CLI |
| Google Vertex AI | `vertex-ai` | yes | yes | Imagen | yes | gcloud CLI |
| Ollama | `ollama` | yes | yes | | yes | None |
| Local Agent | `local-agent` | yes | yes | | | None |

Configuration

Ailloy stores its configuration at ~/.config/ailloy/config.yaml:

```yaml
nodes:
  openai/gpt-4o:
    provider: openai
    model: gpt-4o
    auth:
      env: OPENAI_API_KEY
    capabilities: [chat, image]

  anthropic/claude-sonnet-4-6:
    provider: anthropic
    model: claude-sonnet-4-6
    auth:
      env: ANTHROPIC_API_KEY
    capabilities: [chat]

  ollama/llama3.2:
    provider: ollama
    model: llama3.2
    endpoint: http://localhost:11434
    capabilities: [chat]

defaults:
  chat: openai/gpt-4o
  image: openai/gpt-4o
```

Local project config

Create .ailloy.yaml in your project root to override or add nodes for that project. Local config is merged with global config (nodes and defaults merge; consents are global-only).
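
For instance, a project could pin its default chat node to a local model while leaving the global config untouched. This sketch assumes `.ailloy.yaml` uses the same schema as the global config above; the node ID and model are illustrative:

```yaml
# .ailloy.yaml — project-local overrides (illustrative example)
nodes:
  ollama/codellama:
    provider: ollama
    model: codellama
    endpoint: http://localhost:11434
    capabilities: [chat]

defaults:
  chat: ollama/codellama
```

Because nodes and defaults merge, globally configured nodes remain available in the project; only the `chat` default is overridden here.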

CLI Commands

| Command | Description |
|---|---|
| `ailloy <message>` | Send a message (shorthand for `ailloy chat`) |
| `ailloy chat <message>` | Send a message to the configured AI node |
| `ailloy chat -i` | Interactive conversation mode |
| `ailloy config` | Interactive node configuration |
| `ailloy config show` | Display current configuration |
| `ailloy nodes list` | List configured AI nodes |
| `ailloy nodes add` | Add a new AI node interactively |
| `ailloy nodes default <cap> <id>` | Set the default node for a capability |
| `ailloy discover` | Auto-detect available AI providers and models |
| `ailloy completion <shell>` | Generate shell completions |
| `ailloy version` | Show version and banner |

Options

```sh
ailloy "message" --node ollama/llama3.2  # Use a specific node
ailloy "message" --system "Be brief"     # Set a system prompt
ailloy "message" --stream                # Stream response tokens
ailloy "message" --max-tokens 100        # Limit response length
ailloy "message" --temperature 0.7       # Control randomness
ailloy "message" -o response.txt         # Save response to file
ailloy "message" -o image.png            # Generate an image
ailloy "message" -o diagram.svg          # Generate SVG via chat
echo "prompt" | ailloy                   # Pipe input via stdin
ailloy -v chat "message"                 # Debug logging
ailloy -q chat "message"                 # Quiet mode
```

Feature Flags

Ailloy uses feature flags to keep the library lean:

| Feature | Default | Description |
|---|---|---|
| `cli` | Yes | CLI binary and dependencies (clap, inquire, colored, etc.) |

Library users should disable default features:

```toml
ailloy = { version = "0.4", default-features = false }
```

Development

```sh
cargo build                              # Build everything
cargo build --no-default-features --lib  # Build library only
cargo test                               # Run tests
cargo clippy -- -D warnings              # Lint (zero warnings)
cargo fmt --all -- --check               # Format check
cargo run -- chat "hello"                # Run the CLI
```

License

MIT