Available on crate feature models only.

Model integrations (Gemini, etc.).

Provides LLM implementations:

- GeminiModel - Google's Gemini models

ADK is model-agnostic - implement the Llm trait for other providers.
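Since ADK is model-agnostic, a custom provider only needs to implement the Llm trait. The sketch below illustrates the pattern with a simplified stand-in trait; the real Llm trait in ADK has its own (likely async, streaming) signature, and every name here is an illustrative assumption, not ADK's actual API.

```rust
// Hypothetical sketch: `Llm`, `MyProvider`, and the method signatures below
// are simplified stand-ins, not ADK's real trait.

/// Minimal stand-in for an LLM provider abstraction.
trait Llm {
    /// Identifier of the underlying model.
    fn model_name(&self) -> &str;
    /// Produce a completion for the given prompt.
    fn generate(&self, prompt: &str) -> Result<String, String>;
}

/// Example custom provider plugged into the trait.
struct MyProvider {
    model: String,
}

impl Llm for MyProvider {
    fn model_name(&self) -> &str {
        &self.model
    }

    fn generate(&self, prompt: &str) -> Result<String, String> {
        // A real provider would call its HTTP API here; this one echoes.
        Ok(format!("[{}] echo: {}", self.model, prompt))
    }
}

fn main() {
    let provider = MyProvider { model: "my-model".into() };
    println!("{}", provider.generate("hello").unwrap());
}
```

Once a type implements the trait, it can be handed to any ADK component that accepts an Llm, just like the built-in GeminiModel.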
Modules§

- anthropic - Anthropic/Claude provider implementation for ADK.
- azure_ai - Azure AI Inference provider for ADK.
- bedrock - Amazon Bedrock provider implementation for ADK.
- deepseek - DeepSeek provider implementation for ADK.
- gemini
- groq - Groq provider implementation for ADK.
- mock
- ollama - Ollama local LLM provider implementation for ADK.
- openai - OpenAI provider implementation for ADK.
- openai_compatible - Shared OpenAI-compatible provider implementation.
- openrouter - OpenRouter provider implementation for ADK.
- provider
- retry
- usage_tracking - Stream wrapper that records token usage on the active tracing span.
Structs§

- AnthropicClient - Anthropic client for Claude models.
- AzureAIClient - Azure AI Inference client for models hosted on Azure AI endpoints.
- AzureAIConfig - Configuration for Azure AI Inference endpoints.
- AzureConfig - Configuration for Azure OpenAI Service.
- AzureOpenAIClient - Azure OpenAI client.
- BedrockClient - Amazon Bedrock client backed by the AWS SDK Converse API.
- BedrockConfig - Configuration for Amazon Bedrock.
- DeepSeekClient - DeepSeek client for deepseek-chat and deepseek-reasoner models.
- DeepSeekConfig - Configuration for the DeepSeek API.
- GeminiModel
- GroqClient - Groq client for ultra-fast LLM inference.
- GroqConfig - Configuration for the Groq API.
- MockLlm
- OllamaConfig - Configuration for connecting to an Ollama server.
- OllamaModel - Ollama client for local LLM inference.
- OpenAIClient - OpenAI client for the standard OpenAI API and OpenAI-compatible APIs.
- OpenAICompatible - Shared OpenAI-compatible client implementation.
- OpenAICompatibleConfig - Configuration for OpenAI-compatible providers.
- OpenAIConfig - Configuration for the OpenAI API.
- OpenRouterClient - Shared OpenRouter client used by the native APIs and the Llm adapter.
- OpenRouterConfig - OpenRouter configuration shared by the native APIs and the Llm adapter.
- RetryConfig
- ServerRetryHint - Hint from the server about when to retry.
Enums§

- ModelProvider - Canonical provider identifiers and metadata shared across ADK crates.
- OpenRouterApiMode - Default API surface used by the Llm adapter.
- ReasoningEffort - Reasoning effort level for OpenAI reasoning models (e.g., o1, o3).