
Module proxy


๐ŸŒ LLM Proxy - Unified interface for multiple LLM providers

This module provides a unified proxy for calling various LLMs, including OpenAI, Anthropic, Google Gemini, and local Candle-based models.

“Why talk to one AI when you can talk to them all?” - The Cheet 😺
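The items listed below outline the pieces of that unified interface. As a self-contained sketch of the provider-abstraction pattern this module describes (the method name `complete`, the field layout, and the synchronous signature here are illustrative assumptions, not this crate's actual API), a provider-agnostic call might look like:

```rust
// Standalone sketch of the unified-provider pattern. The real crate's
// signatures may differ; `complete` and the fields shown are assumptions.

/// 🎭 Role of a message author.
#[allow(dead_code)]
#[derive(Debug, Clone, PartialEq)]
enum LlmRole { System, User, Assistant }

/// 💬 A single message in a conversation.
#[allow(dead_code)]
struct LlmMessage { role: LlmRole, content: String }

/// 📝 Request passed to any provider.
#[allow(dead_code)]
struct LlmRequest { model: String, messages: Vec<LlmMessage> }

/// 📦 Response returned by any provider.
struct LlmResponse { content: String }

/// 🤖 Common interface every provider implements.
trait LlmProvider {
    fn complete(&self, req: &LlmRequest) -> LlmResponse;
}

/// A trivial mock provider standing in for OpenAI, Claude, Gemini, etc.
struct EchoProvider;

impl LlmProvider for EchoProvider {
    fn complete(&self, req: &LlmRequest) -> LlmResponse {
        // Echo the last message back, as a stand-in for a real API call.
        let last = req.messages.last().map(|m| m.content.as_str()).unwrap_or("");
        LlmResponse { content: format!("echo: {last}") }
    }
}

/// Callers depend only on the trait, so providers are interchangeable.
fn ask(provider: &dyn LlmProvider, prompt: &str) -> String {
    let req = LlmRequest {
        model: "mock-model".into(),
        messages: vec![LlmMessage { role: LlmRole::User, content: prompt.into() }],
    };
    provider.complete(&req).content
}

fn main() {
    println!("{}", ask(&EchoProvider, "hello"));
}
```

Because call sites depend only on the `LlmProvider` trait, swapping one backend for another is a matter of passing a different provider value.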

Re-exports§

pub use claude as anthropic;

Modules§

candle
🤖 Candle Local LLM Provider Implementation
claude
Claude API Client - Comprehensive Anthropic Claude integration via raw reqwest
google
🤖 Google Gemini Provider Implementation
grok
🤖 Grok Provider Implementation (X.AI)
memory
🧠 Memory Proxy - Scoped conversation history for LLMs
oauth
OAuth 2.0 + PKCE framework for proxy providers.
ollama
🦙 Ollama & LM Studio Provider - Local LLM Auto-Detection
openai
🤖 OpenAI Provider Implementation
openai_compat
🔄 OpenAI-Compatible Request/Response Types
openrouter
🌐 OpenRouter Provider Implementation
server
OpenAI-Compatible Proxy Server
token_store
Secure token storage for proxy OAuth providers.
zai
Z.AI Provider Implementation (Zhipu / GLM family)

Structs§

LlmMessage
💬 A single message in a conversation
LlmProxy
🛠️ Factory for creating LLM providers
LlmRequest
📝 Request structure for LLM completion
LlmResponse
📦 Response structure from LLM completion
LlmUsage
📊 Token usage statistics
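To illustrate how token usage statistics might be aggregated over a multi-turn session (the field names `prompt_tokens`, `completion_tokens`, and `total_tokens` are assumptions modeled on common OpenAI-style usage reporting, not necessarily this crate's definitions):

```rust
// Sketch of summing per-response token usage across a session.
// Field names are assumed; check the crate's actual LlmUsage definition.

#[derive(Debug, Default, Clone, Copy)]
struct LlmUsage {
    prompt_tokens: u32,
    completion_tokens: u32,
    total_tokens: u32,
}

impl LlmUsage {
    /// Combine usage from two responses into one running total.
    fn add(self, other: LlmUsage) -> LlmUsage {
        LlmUsage {
            prompt_tokens: self.prompt_tokens + other.prompt_tokens,
            completion_tokens: self.completion_tokens + other.completion_tokens,
            total_tokens: self.total_tokens + other.total_tokens,
        }
    }
}

fn main() {
    // Usage reported by two consecutive completions.
    let turns = [
        LlmUsage { prompt_tokens: 12, completion_tokens: 40, total_tokens: 52 },
        LlmUsage { prompt_tokens: 55, completion_tokens: 21, total_tokens: 76 },
    ];
    let total = turns.iter().copied().fold(LlmUsage::default(), LlmUsage::add);
    println!("{total:?}");
}
```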

Enums§

LlmRole
🎭 Roles in a conversation

Traits§

LlmProvider
🤖 Common interface for all LLM providers