# cliai
Small Rust library for invoking AI tools through local CLI backends such as Ollama and GitHub Copilot CLI.
cliai provides a minimal trait-based interface so your applications can swap providers without coupling to a specific vendor SDK or HTTP API.
## Features

- Lightweight crate with no runtime dependencies
- Unified backend trait
- Uses installed local CLI tools
- Current backends:
  - Ollama
  - GitHub Copilot CLI
- Simple request / response types
- Suitable for scripts, developer tooling, and automation
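The unified backend trait is what lets callers stay provider-agnostic. A self-contained sketch of the pattern (the trait and the `EchoBackend` type are illustrative, not the crate's real API; only `GenerateRequest` and `GenerateResponse` are named by this README):

```rust
// Illustrative sketch of the trait-based design; everything except the
// GenerateRequest/GenerateResponse names is an assumption for demonstration.
pub struct GenerateRequest {
    pub prompt: String,
}

pub struct GenerateResponse {
    pub text: String,
}

pub trait AiBackend {
    fn generate(&self, req: &GenerateRequest) -> Result<GenerateResponse, String>;
}

// A trivial backend used only to show how callers stay vendor-agnostic.
pub struct EchoBackend;

impl AiBackend for EchoBackend {
    fn generate(&self, req: &GenerateRequest) -> Result<GenerateResponse, String> {
        Ok(GenerateResponse {
            text: format!("echo: {}", req.prompt),
        })
    }
}

fn main() {
    let backend = EchoBackend;
    let resp = backend
        .generate(&GenerateRequest { prompt: "hello".into() })
        .unwrap();
    println!("{}", resp.text); // prints "echo: hello"
}
```

Swapping providers then means constructing a different type that implements the same trait; call sites do not change.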
## Installation

Add to your Cargo.toml:

```toml
[dependencies]
cliai = "0.1"
```
## Package Metadata

Use dual licensing in your manifest:

```toml
license = "MIT OR Apache-2.0"
```
## Requirements

This crate shells out to external binaries. The relevant CLI tool must be installed and available on your `PATH`, unless you override the executable path with `with_bin()`.

### Ollama

Install Ollama and ensure `ollama` is available on your `PATH`.

### GitHub Copilot CLI

Install the Copilot CLI and ensure the `copilot` binary is available on your `PATH`. Authenticate it first if required by your environment.
## Quick Start
### Ollama
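A minimal sketch, assuming the API surface suggested elsewhere in this README (`GenerateRequest`, `generate`, `with_model`, `AiError`); the `Ollama` type, field names, and model string are illustrative, so verify them against the crate docs:

```rust
// Assumed API shape; verify names against the cliai documentation.
use cliai::{GenerateRequest, Ollama};

fn main() -> Result<(), cliai::AiError> {
    // Model name is illustrative; use any model your local Ollama serves.
    let ai = Ollama::new().with_model("llama3");
    let response = ai.generate(&GenerateRequest { prompt: "Say hello".into() })?;
    println!("{}", response.text);
    Ok(())
}
```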
### Copilot
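The same sketch against the Copilot backend, under the same assumptions (the `Copilot` type and field names are illustrative):

```rust
// Assumed API shape; verify names against the cliai documentation.
use cliai::{Copilot, GenerateRequest};

fn main() -> Result<(), cliai::AiError> {
    let ai = Copilot::new();
    let response = ai.generate(&GenerateRequest { prompt: "Say hello".into() })?;
    println!("{}", response.text);
    Ok(())
}
```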
## Examples

Run the packaged examples with `cargo run --example <name>`.
## API Overview

- Trait: the unified backend trait implemented by each provider
- Request type: `GenerateRequest`
- Response type: `GenerateResponse`
## Backend Configuration

### Ollama

```rust
// Type name, model, and path are illustrative; see the crate docs.
let ai = Ollama::new()
    .with_model("llama3")
    .with_bin("/usr/local/bin/ollama");
```

### Copilot

```rust
// Type name and path are illustrative; see the crate docs.
let ai = Copilot::new()
    .with_bin("/usr/local/bin/copilot");
```
## Error Handling

Errors are returned as `AiError`, which covers:

- I/O failures
- UTF-8 decoding failures
- External command failures

```rust
// Sketch; response field and error variant names are illustrative.
match ai.generate(&request) {
    Ok(response) => println!("{}", response.text),
    Err(e) => eprintln!("generation failed: {e}"),
}
```
## Design Notes
This crate intentionally shells out to installed CLI tools instead of embedding provider SDKs. This is useful when:
- You already use local AI CLIs
- You want fewer dependencies
- You prefer OS-level credential handling
- You need portable internal tooling
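The shell-out pattern itself is small; here is a self-contained sketch using only the standard library, with `echo` standing in for a real AI CLI (the `run_cli` helper is hypothetical, not part of cliai). The three error paths mirror the `AiError` cases listed below under Error Handling:

```rust
use std::process::Command;

// Run an external binary with the given arguments and capture stdout.
// `echo` stands in here for a real AI CLI such as `ollama`.
fn run_cli(bin: &str, args: &[&str]) -> Result<String, String> {
    let output = Command::new(bin)
        .args(args)
        .output()
        .map_err(|e| format!("failed to spawn {bin}: {e}"))?; // I/O failure
    if !output.status.success() {
        // External command failure (non-zero exit status)
        return Err(format!("{bin} exited with {}", output.status));
    }
    // UTF-8 decoding failure
    String::from_utf8(output.stdout).map_err(|e| format!("non-UTF-8 output: {e}"))
}

fn main() {
    match run_cli("echo", &["hello"]) {
        Ok(text) => print!("{text}"),
        Err(e) => eprintln!("{e}"),
    }
}
```

Because all state lives in the spawned process, the library itself stays dependency-free and credentials remain wherever the CLI stores them.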
## Security Notes
Prompts are sent to external binaries you execute. Review the trust model, permissions, and data handling of each installed CLI and model provider before use.
## Roadmap
Potential future additions:
- Streaming responses
- Structured outputs
- Timeout configuration
- Retries
- Async support
- Additional backends
## Contributing
Issues and pull requests are welcome.
## License
Licensed under either of:
- MIT License
- Apache License 2.0
at your option.