Inference Gateway SDK for Rust
This crate provides a Rust client for the Inference Gateway API, allowing interaction with various LLM providers through a unified interface.
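A minimal usage sketch of what that unified interface could look like. The type and trait names (`InferenceGatewayClient`, `InferenceGatewayAPI`, `Message`, `MessageRole`, `Provider`) come from this crate's item listing below, but the constructor, the method names (`list_models`, `generate_content`), the `Provider` variant, and the `Message` field layout are illustrative assumptions, not the crate's confirmed API:

```rust
// Hedged sketch: type/trait names are from this crate's docs; the method
// names, signatures, and the Provider variant below are assumptions.
use inference_gateway_sdk::{
    InferenceGatewayAPI, InferenceGatewayClient, Message, MessageRole, Provider,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumed constructor: point the client at a running gateway instance.
    let client = InferenceGatewayClient::new("http://localhost:8080");

    // ListModelsResponse: enumerate the models the gateway exposes.
    let models = client.list_models().await?;
    println!("available models: {models:?}");

    // CreateChatCompletionResponse: a one-shot chat completion routed
    // through a provider. Method name and argument order are assumed.
    let response = client
        .generate_content(
            Provider::Ollama,          // assumed variant name
            "llama3",                  // assumed model identifier
            vec![Message {
                role: MessageRole::User,
                content: Some("Hello!".to_string()),
                ..Default::default()   // assumes Message: Default
            }],
        )
        .await?;
    println!("{response:?}");
    Ok(())
}
```

Check the crate's own examples and the `InferenceGatewayAPI` trait documentation for the actual signatures before relying on this shape.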
Structs§
- ChatCompletionChoice
- ChatCompletionMessageToolCall - A tool call in a message response
- ChatCompletionMessageToolCallFunction - Function details in a tool call
- ChatCompletionStreamChoice - Choice in a streaming completion response
- ChatCompletionStreamDelta - Delta content for streaming responses
- CompletionUsage - Usage statistics for the completion request
- CreateChatCompletionResponse - The response from generating content
- CreateChatCompletionStreamResponse - The response from streaming content generation
- FunctionObject - Tool function to call
- InferenceGatewayClient - Client for interacting with the Inference Gateway API
- ListModelsResponse - Response structure for listing models
- ListToolsResponse - Response structure for listing MCP tools
- MCPTool - An MCP tool definition
- Message - A message in a conversation with an LLM
- Model - Common model information
- SSEvents - Stream of Server-Sent Events (SSE) from the Inference Gateway API
- Tool - Tool to use for the LLM toolbox
- ToolCallResponse - A tool call in the response
Enums§
- ChatCompletionToolType - Type of tool that can be called
- GatewayError - Custom error types for the Inference Gateway SDK
- MessageRole
- Provider - Supported LLM providers
- ToolType - Type of tool that can be used by the model
Traits§
- InferenceGatewayAPI - Core API interface for the Inference Gateway