Universal LLM protocol middleware for OpenAI, Anthropic, Claude Code, and OpenAI-compatible backends.
ferryllm translates client protocol requests into a shared internal
representation, routes them by model name, and re-encodes them in the
selected provider's protocol. The crate can be embedded as a Rust library
or run as a standalone HTTP server via the ferryllm binary.
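The routing step described above can be sketched as follows. This is an illustrative example only: the `Provider` enum, `route_by_model` function, and model-name prefixes are hypothetical and do not reflect ferryllm's actual API.

```rust
// Hypothetical sketch of routing a request to a backend protocol by
// model name; names and prefixes are illustrative, not ferryllm's API.

#[derive(Debug, PartialEq)]
enum Provider {
    OpenAi,
    Anthropic,
    OpenAiCompatible,
}

/// Pick a backend protocol from the requested model name.
fn route_by_model(model: &str) -> Provider {
    if model.starts_with("claude") {
        Provider::Anthropic
    } else if model.starts_with("gpt") {
        Provider::OpenAi
    } else {
        // Unrecognized models fall through to a generic
        // OpenAI-compatible backend.
        Provider::OpenAiCompatible
    }
}

fn main() {
    assert_eq!(route_by_model("claude-sonnet-4"), Provider::Anthropic);
    assert_eq!(route_by_model("gpt-4o"), Provider::OpenAi);
    assert_eq!(route_by_model("local-llama"), Provider::OpenAiCompatible);
    println!("routing ok");
}
```

In a full implementation the routing table would typically come from configuration rather than hard-coded prefixes, so new backends can be added without recompiling.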
Main modules: