
Module model

Available on crate feature models only.

Model integrations (Gemini, etc.).

Provides LLM implementations for the providers listed under Modules below (Anthropic, Azure, Bedrock, DeepSeek, Gemini, Groq, Ollama, OpenAI, OpenRouter).

ADK is model-agnostic: implement the Llm trait to support other providers.

Available with feature: models
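Since the module is model-agnostic via the Llm trait, the sketch below illustrates the general pattern of plugging in a custom provider. Note that the trait definition here is a hypothetical stand-in, not the crate's actual Llm signature (which is likely async and stream-based); consult the trait's own docs before implementing.

```rust
// Hypothetical stand-in for the ADK `Llm` trait, for illustration only.
// The real trait's methods and types may differ.
trait Llm {
    /// Human-readable model identifier (e.g. "my-model-v1").
    fn model_name(&self) -> &str;
    /// Send a prompt and return the model's completion.
    fn generate(&self, prompt: &str) -> Result<String, String>;
}

/// A toy provider that echoes the prompt, standing in for a real HTTP client.
struct EchoProvider {
    model: String,
}

impl Llm for EchoProvider {
    fn model_name(&self) -> &str {
        &self.model
    }

    fn generate(&self, prompt: &str) -> Result<String, String> {
        // A real implementation would call the provider's API here.
        Ok(format!("[{}] {}", self.model, prompt))
    }
}

fn main() {
    let provider = EchoProvider { model: "echo-1".to_string() };
    println!("{}", provider.generate("hello").unwrap());
}
```

Anything implementing the trait can then be handed to ADK wherever an Llm is expected, which is how the built-in providers below (Anthropic, OpenAI, Ollama, etc.) slot in.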

Modules

anthropic (feature anthropic)
Anthropic/Claude provider implementation for ADK.
azure_ai (feature azure-ai)
Azure AI Inference provider for ADK.
bedrock (feature bedrock)
Amazon Bedrock provider implementation for ADK.
deepseek (feature deepseek)
DeepSeek provider implementation for ADK.
gemini (feature gemini)
Gemini provider implementation for ADK.
groq (feature groq)
Groq provider implementation for ADK.
mock
ollama (feature ollama)
Ollama local LLM provider implementation for ADK.
openai (feature openai)
OpenAI provider implementation for ADK.
openai_compatible (feature openai)
Shared OpenAI-compatible provider implementation.
openrouter (feature openrouter)
OpenRouter provider implementation for ADK.
provider
retry
usage_tracking
Stream wrapper that records token usage on the active tracing span.

Structs

AnthropicClient
Anthropic client for Claude models.
AzureAIClient
Azure AI Inference client for models hosted on Azure AI endpoints.
AzureAIConfig
Configuration for Azure AI Inference endpoints.
AzureConfig
Configuration for Azure OpenAI Service.
AzureOpenAIClient
Azure OpenAI client.
BedrockClient
Amazon Bedrock client backed by the AWS SDK Converse API.
BedrockConfig
Configuration for Amazon Bedrock.
DeepSeekClient
DeepSeek client for deepseek-chat and deepseek-reasoner models.
DeepSeekConfig
Configuration for DeepSeek API.
GeminiModel
GroqClient
Groq client for ultra-fast LLM inference.
GroqConfig
Configuration for Groq API.
MockLlm
OllamaConfig
Configuration for connecting to an Ollama server.
OllamaModel
Ollama client for local LLM inference.
OpenAIClient
OpenAI client for standard OpenAI API and OpenAI-compatible APIs.
OpenAICompatible
Shared OpenAI-compatible client implementation.
OpenAICompatibleConfig
Configuration for OpenAI-compatible providers.
OpenAIConfig
Configuration for OpenAI API.
OpenRouterClient
Shared OpenRouter client used by the native APIs and the Llm adapter.
OpenRouterConfig
OpenRouter configuration shared by native APIs and the Llm adapter.
RetryConfig
ServerRetryHint
Hint from the server about when to retry.

Enums

ModelProvider
Canonical provider identifiers and metadata shared across ADK crates.
OpenRouterApiMode
Default API surface used by the Llm adapter.
ReasoningEffort
Reasoning effort level for OpenAI reasoning models (e.g., o1, o3).