cliai is a small Rust library for invoking AI tools through local CLI backends.
It exposes a minimal trait-based interface so applications can work with different AI backends without coupling to a provider SDK or HTTP client.
§Included backends
- Ollama — local models via the ollama CLI
- Copilot — the GitHub Copilot CLI
§Design
This crate shells out to installed executables. That keeps the dependency surface small, but it also means:
- the backend binary must be installed and available on PATH, unless overridden with with_bin()
- prompts are passed to external processes
- runtime behavior depends on the installed CLI tool
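The shell-out pattern described above can be sketched with nothing but the standard library. This is a hedged illustration of the general technique, not the crate's actual implementation; run_cli is a hypothetical helper, and echo stands in for a real AI CLI such as ollama.

```rust
use std::process::Command;

/// Hypothetical helper illustrating the shell-out pattern: invoke a binary
/// found on PATH, pass arguments, and capture stdout as the response text.
fn run_cli(bin: &str, args: &[&str]) -> Result<String, std::io::Error> {
    let output = Command::new(bin).args(args).output()?;
    Ok(String::from_utf8_lossy(&output.stdout).into_owned())
}

fn main() -> Result<(), std::io::Error> {
    // `echo` stands in for a real backend binary; a missing binary would
    // surface here as an io::Error, which is why PATH availability matters.
    let text = run_cli("echo", &["hello"])?;
    println!("{}", text.trim());
    Ok(())
}
```

Because Command::new resolves the program through PATH, a with_bin()-style override amounts to passing an explicit path as bin instead of a bare name.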
§Example
use cliai::{AiBackend, GenerateRequest, Ollama};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let ai = Ollama::new("llama3.2");
    let response = ai.generate(
        &GenerateRequest::new("Write a haiku about Rust.")
            .with_instructions("Be concise."),
    )?;
    println!("{}", response.text);
    Ok(())
}

§Re-exports
pub use backend::Copilot;
pub use backend::Ollama;
pub use request::GenerateRequest;
pub use response::GenerateResponse;
§Modules
- backend
- request
- response
§Enums
- AiError
- Error returned by AI backends.
§Traits
- AiBackend
- Common interface implemented by supported AI backends.
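A trait like AiBackend lets application code stay generic over providers, and also makes testing easy: a mock backend can implement the same interface. The sketch below uses simplified stand-in types (the request takes only a prompt, errors are plain strings); the real cliai definitions may differ, and EchoBackend is a hypothetical mock, not part of the crate.

```rust
// Simplified stand-ins for the crate's request/response types (assumptions).
struct GenerateRequest { prompt: String }
struct GenerateResponse { text: String }

// Hedged sketch of the trait shape implied by the §Example section.
trait AiBackend {
    fn generate(&self, req: &GenerateRequest) -> Result<GenerateResponse, String>;
}

/// A mock backend for testing code written against the trait:
/// it echoes the prompt back instead of shelling out to a CLI.
struct EchoBackend;

impl AiBackend for EchoBackend {
    fn generate(&self, req: &GenerateRequest) -> Result<GenerateResponse, String> {
        Ok(GenerateResponse { text: format!("echo: {}", req.prompt) })
    }
}

fn main() {
    let ai = EchoBackend;
    let resp = ai.generate(&GenerateRequest { prompt: "hi".into() }).unwrap();
    println!("{}", resp.text);
}
```

Swapping EchoBackend for a real backend requires no changes to the calling code, which is the decoupling the trait exists to provide.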