albert-api 0.1.2

Multi-provider LLM client for Albert CLI — bridges Anthropic, OpenAI, Google Gemini, Ollama, XAI and the Ternlang API with unified streaming and auth

# albert-api

Multi-provider LLM client for Albert CLI — part of the Ternary Intelligence Stack.

License: MIT

## What this crate provides

A unified async LLM client that routes requests to any supported provider, avoiding vendor lock-in:

| Provider | Protocol |
|---|---|
| Anthropic (Claude) | Messages API |
| OpenAI (GPT-4o, o1) | Chat Completions |
| Google (Gemini) | GenerativeLanguage API |
| XAI (Grok) | OpenAI-compatible |
| HuggingFace | Inference API |
| Ollama | Local OpenAI-compatible |
| Azure OpenAI | Chat Completions |
| AWS Bedrock | Messages API |
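The provider-to-protocol routing in the table above could be modeled as a small enum. This is a hypothetical sketch for illustration; the names and structure are assumptions, not albert-api's actual API surface:

```rust
// Hypothetical sketch of provider-to-protocol routing; identifiers are
// illustrative and not taken from albert-api's real code.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Provider {
    Anthropic,
    OpenAi,
    Google,
    Xai,
    HuggingFace,
    Ollama,
    AzureOpenAi,
    AwsBedrock,
}

impl Provider {
    /// Wire protocol spoken by each backend, mirroring the table above.
    fn protocol(self) -> &'static str {
        match self {
            Provider::Anthropic | Provider::AwsBedrock => "messages",
            Provider::OpenAi
            | Provider::Xai
            | Provider::Ollama
            | Provider::AzureOpenAi => "chat-completions",
            Provider::Google => "generative-language",
            Provider::HuggingFace => "inference",
        }
    }
}

fn main() {
    // XAI, Ollama, and Azure all speak the OpenAI-compatible protocol.
    assert_eq!(Provider::Xai.protocol(), "chat-completions");
    // Anthropic and Bedrock both use the Messages API.
    assert_eq!(Provider::Anthropic.protocol(), Provider::AwsBedrock.protocol());
    println!("{}", Provider::Google.protocol());
}
```

Centralizing the mapping in one `match` means adding a provider is a single-site change, and the compiler flags any variant left unhandled.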

Also handles OAuth token exchange, model discovery (`list_remote_models`), and streaming via Server-Sent Events (SSE).
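Streaming responses arrive as SSE frames, where each event payload follows a `data:` prefix and many OpenAI-compatible APIs terminate the stream with a `[DONE]` sentinel. A minimal, self-contained sketch of that framing (an illustration of the protocol, not albert-api's internals):

```rust
// Minimal SSE frame parser: collects the payload of each `data:` line
// and stops at the "[DONE]" sentinel used by several OpenAI-compatible
// streaming APIs. Illustrative only; real clients also buffer partial
// lines across network chunks.
fn parse_sse_chunk(raw: &str) -> Vec<String> {
    let mut events = Vec::new();
    for line in raw.lines() {
        if let Some(payload) = line.strip_prefix("data:") {
            let payload = payload.trim();
            if payload == "[DONE]" {
                break; // end-of-stream sentinel
            }
            if !payload.is_empty() {
                events.push(payload.to_string());
            }
        }
    }
    events
}

fn main() {
    let raw = "data: {\"delta\":\"Hel\"}\n\ndata: {\"delta\":\"lo\"}\n\ndata: [DONE]\n";
    let events = parse_sse_chunk(raw);
    assert_eq!(events.len(), 2);
    assert_eq!(events[0], "{\"delta\":\"Hel\"}");
    println!("{} events", events.len());
}
```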

## Part of the Albert ecosystem

| Crate | Role |
|---|---|
| albert-runtime | Session, MCP, auth, bash |
| albert-api | This crate — LLM client |
| albert-commands | Slash command library |
| albert-tools | Tool execution layer |
| albert-compat | Manifest extraction harness |
| albert-cli | Binary (`albert`) |