rai-cli 1.1.3

Run AI instructions directly from your terminal, scripts, and CI/CD pipelines.
rai-cli is a standalone command-line tool, not a library.

Install

cargo:

cargo install rai-cli

curl:

curl -sSL https://appmakes.github.io/Rai/install.sh | sh

To uninstall the curl-installed binary:

curl -sSL https://appmakes.github.io/Rai/uninstall.sh | sh

From source:

cargo install --path .

Quick start

1. First-time setup:

rai start

Pick a provider, enter your API key, and choose a default model.

2. Run a prompt:

rai "whois github.com"

3. Pipe input:

ls -la | rai "sum the file sizes"

4. Run a task file:

rai run task.md

5. Auto-approve tool calls:

rai --yes "Clean up feature flags.md"
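The task-file format is not documented in this README, so the sketch below is an illustrative assumption: it writes a small free-form markdown task file and then runs it with the documented `rai [OPTIONS] <PROMPT|FILE>` form, only if rai is on PATH.

```shell
# Create a small task file. The contents are an illustrative assumption --
# this README does not document a task-file schema.
cat > cleanup-task.md <<'EOF'
# Task: repository tidy-up
1. List files larger than 1 MB.
2. Suggest entries for .gitignore.
EOF

# Run it non-interactively if rai is on PATH (--yes auto-approves tool calls).
if command -v rai >/dev/null 2>&1; then
  rai --yes cleanup-task.md
else
  echo "rai not installed; try: rai --yes cleanup-task.md"
fi
```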

Usage

rai [OPTIONS] <PROMPT|FILE>

Command              Description
rai "prompt"         Run an ad-hoc prompt
rai run task.md      Run a task file
rai plan task.md     Preview the task structure before execution
rai create task.md   Create a task file interactively
rai start            First-time setup wizard
rai config           Open the configuration menu
rai profile list     List configuration profiles

Flag                  Description
-y, --yes             Auto-approve all tool calls
-m, --model <MODEL>   Override the AI model (e.g. gpt-4o, kimi-k2)
--profile <NAME>      Select a configuration profile
-s, --silent          Skip the follow-up input prompt
--no-tools            Disable tool calling
--bill                Print an API and token usage summary
--detail              Show detailed runtime logs
--think               Ask the provider to show its thinking chain
-v, --verbose         Enable debug logging
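These flags compose well in scripts. As a sketch, the snippet below generates a small wrapper that runs rai non-interactively; the wrapper name and the model name are illustrative assumptions, not part of rai-cli.

```shell
# Generate a tiny wrapper for scripted use: --yes auto-approves tool calls,
# --silent skips follow-up input, -m pins a model (model name is illustrative).
cat > rai-ci.sh <<'EOF'
#!/bin/sh
exec rai --yes --silent -m gpt-4o "$@"
EOF
chmod +x rai-ci.sh
```

A CI job could then call `./rai-ci.sh "Summarize this diff"` with input piped in, assuming rai is installed and configured on the runner.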

Configuration

Config files live in ~/.config/rai/:

File                    Purpose
config.toml             Default profile configuration
config.<profile>.toml   Named profile configuration
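The configuration schema itself is not documented in this README; the fragment below is purely a guess at what a named profile file such as config.work.toml might contain, and every key name in it is an illustrative assumption.

```toml
# Hypothetical ~/.config/rai/config.work.toml -- key names are
# illustrative assumptions, not documented settings.
provider = "openai"
model = "gpt-4o"
```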

API keys are stored in ~/.local/share/rai/credentials (mode 0600). Use --keyring to store in the OS keyring instead.
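Mode 0600 means the credentials file is readable and writable only by its owner. The snippet below merely illustrates that permission pattern on a throwaway file (GNU coreutils `stat` shown; on macOS use `stat -f '%Lp'`).

```shell
# Illustrate the 0600 permission pattern on a throwaway file:
umask 077                        # new files get no group/other permissions
touch demo-credentials
stat -c '%a' demo-credentials    # prints: 600 (owner read/write only)
```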

Supported providers: OpenAI, Anthropic, Google, Poe, xAI, OpenRouter, Ollama, DeepSeek, MiniMax, Kimi, ZAI, Bedrock, and any OpenAI-compatible endpoint.

Documentation

Development

# Build
cargo build

# Run tests
cargo test

# Lint
cargo clippy

Note: on Linux, --keyring requires the libdbus-1-dev and pkg-config packages.

License

See LICENSE for details.