§ModelMux - Vertex AI to OpenAI Proxy Library
This crate provides a high-performance proxy server that converts OpenAI-compatible API requests into the Vertex AI (Anthropic Claude) format. While ModelMux is primarily a binary application, this crate also exposes its core functionality as a library for programmatic use.
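On the client side the proxy exposes an OpenAI-compatible API, so requests use the standard OpenAI chat-completions shape. A minimal sketch of such a request body (field names follow the public OpenAI API; the model name is illustrative, and the helper function is not part of this crate):

```rust
/// Illustrative only: builds an OpenAI-style chat-completions request
/// body of the kind the proxy accepts and translates for the
/// Anthropic Claude backend on Vertex AI.
fn example_request_body() -> String {
    r#"{
  "model": "claude-sonnet",
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
  ],
  "max_tokens": 256
}"#
    .to_string()
}

fn main() {
    // The proxy rewrites this shape into Anthropic's Messages format
    // before forwarding the call to Vertex AI.
    let body = example_request_body();
    println!("{body}");
}
```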
§Library Usage
```rust
use modelmux::{Config, create_app};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load configuration
    let config = Config::load()?;

    // Create the application
    let app = create_app(config).await?;

    // Start the server
    let listener = tokio::net::TcpListener::bind("0.0.0.0:3000").await?;
    axum::serve(listener, app).await?;

    Ok(())
}
```

§Modules
- config - Configuration management and environment variable handling
- provider - LLM backend abstraction ([LlmProviderBackend]); Vertex and OpenAI-compatible (stub)
- auth - Request auth (GCP OAuth2 or Bearer token)
- server - HTTP server setup and route handlers
- converter - Format conversion between OpenAI and Anthropic formats
- error - Error types and handling
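The converter module's core job can be illustrated with a small self-contained sketch: Anthropic's Messages API takes the system prompt as a top-level field rather than as a message with role "system", so an OpenAI-style message list must be split apart before forwarding. The types and function below are hypothetical, not the crate's actual API:

```rust
// Sketch of the OpenAI -> Anthropic message-shape conversion performed
// by the converter module. Names here are illustrative.

#[derive(Debug, Clone, PartialEq)]
struct Message {
    role: String,
    content: String,
}

/// Splits an OpenAI-style message list into Anthropic's shape:
/// the system prompt becomes a separate top-level value, and the
/// remaining user/assistant turns stay in the message list.
fn split_system(messages: Vec<Message>) -> (Option<String>, Vec<Message>) {
    let mut system = None;
    let mut rest = Vec::new();
    for m in messages {
        if m.role == "system" {
            system = Some(m.content);
        } else {
            rest.push(m);
        }
    }
    (system, rest)
}

fn main() {
    let msgs = vec![
        Message { role: "system".into(), content: "Be terse.".into() },
        Message { role: "user".into(), content: "Hello!".into() },
    ];
    let (system, rest) = split_system(msgs);
    assert_eq!(system.as_deref(), Some("Be terse."));
    assert_eq!(rest.len(), 1);
    println!("system: {system:?}, remaining messages: {}", rest.len());
}
```

The real converter also has to handle streaming responses and parameter renaming, but the system-prompt split above is the structural difference between the two formats.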
§Re-exports
- pub use config::Config;
- pub use error::ProxyError;
§Modules
- auth - Authentication for LLM backends (Vertex GCP OAuth2, Bearer token for other providers).
- config - Professional configuration management for ModelMux.
- converter - Format conversion modules for OpenAI and Anthropic API compatibility.
- error - Error handling for the Vertex AI to OpenAI proxy server.
- provider - LLM provider abstraction for multi-vendor support.
- server - HTTP server implementation for the Vertex AI to OpenAI proxy.
§Functions
- create_app - Creates a new ModelMux application with the given configuration.