LLM Edge Agent - High-performance LLM Intercepting Proxy
This crate provides the main application logic for the LLM Edge Agent, integrating all layers into a complete end-to-end system:
- Layer 1: HTTP server (Axum) with authentication and rate limiting
- Layer 2: Multi-tier caching (L1 Moka in-process + L2 Redis)
- Layer 2: Intelligent routing with circuit breakers
- Layer 3: Provider adapters (OpenAI, Anthropic)
- Cross-cutting: Observability (Prometheus metrics, OpenTelemetry tracing, logging)
Re-exports
pub use integration::check_system_health;
pub use integration::initialize_app_state;
pub use integration::AppConfig;
pub use integration::AppState;
pub use proxy::handle_chat_completions;
pub use proxy::ChatCompletionRequest;
pub use proxy::ChatCompletionResponse;
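A minimal sketch of how these re-exports might be wired together. The item names come from the list above, but the exact signatures (how `AppConfig` is constructed, the arguments to `initialize_app_state`, and the route path) are assumptions, not the crate's documented API:

```rust
// Hypothetical wiring sketch; signatures are assumed, not taken from the crate docs.
use axum::{routing::post, Router};
use llm_edge_agent::{handle_chat_completions, initialize_app_state, AppConfig};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumption: AppConfig provides some default/builder constructor.
    let config = AppConfig::default();

    // Assumption: initialize_app_state sets up the caches, router, and
    // provider adapters described in the layer overview, returning AppState.
    let state = initialize_app_state(config).await?;

    // Mount the proxy handler; the path mirrors the OpenAI-style endpoint
    // such a proxy would typically intercept.
    let app = Router::new()
        .route("/v1/chat/completions", post(handle_chat_completions))
        .with_state(state);

    let listener = tokio::net::TcpListener::bind("0.0.0.0:8080").await?;
    axum::serve(listener, app).await?;
    Ok(())
}
```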
Modules
- integration: Integration module that orchestrates all system components
- proxy: Proxy request handler
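Since the proxy exposes `ChatCompletionRequest` and `ChatCompletionResponse` types, a client would presumably send an OpenAI-style chat completion payload. The endpoint path, port, model name, and auth header below are illustrative assumptions, not values from the crate docs:

```shell
# Assumed endpoint and OpenAI-compatible payload; adjust to the deployed config.
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```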