Inference Gateway SDK for Rust
This crate provides a Rust client for the Inference Gateway API, allowing interaction with various LLM providers through a unified interface.
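As a minimal sketch of a typical call, assuming a tokio runtime and a gateway running locally. The constructor argument, the `generate_content` method, the `Provider::Openai` variant, and the `Message` field names are assumptions for illustration; the real surface is defined by the `InferenceGatewayAPI` trait and the types listed below.

```rust
use inference_gateway_sdk::{
    InferenceGatewayAPI, InferenceGatewayClient, Message, MessageRole, Provider,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Point the client at a running Inference Gateway instance
    // (constructor signature assumed for this sketch).
    let client = InferenceGatewayClient::new("http://localhost:8080");

    // Build the conversation using the crate's Message type; the exact
    // field set is an assumption, hence the Default spread.
    let messages = vec![Message {
        role: MessageRole::User,
        content: "Hello!".to_string(),
        ..Default::default()
    }];

    // Hypothetical chat-completion call routed to a specific provider.
    let response = client
        .generate_content(Provider::Openai, "gpt-4o", messages)
        .await?;
    println!("{response:?}");
    Ok(())
}
```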
Structs

- ChatCompletionChoice
- ChatCompletionMessageToolCall - A tool call in a message response
- ChatCompletionMessageToolCallChunk - A tool call chunk in streaming responses
- ChatCompletionMessageToolCallFunction - Function details in a tool call
- ChatCompletionStreamChoice - Choice in a streaming completion response
- ChatCompletionStreamDelta - Delta content for streaming responses
- ChatCompletionTokenLogprob - Token log probability information
- ChoiceLogprobs - Log probability information for a choice
- CompletionUsage - Usage statistics for the completion request
- CreateChatCompletionResponse - The response from generating content
- CreateChatCompletionStreamResponse - The response from streaming content generation
- FunctionObject - Tool function to call
- InferenceGatewayClient - Client for interacting with the Inference Gateway API
- ListModelsResponse - Response structure for listing models
- ListToolsResponse - Response structure for listing MCP tools
- MCPTool - An MCP tool definition
- Message - A message in a conversation with an LLM
- Model - Common model information
- SSEvents - Stream of Server-Sent Events (SSE) from the Inference Gateway API
- Tool - Tool to use for the LLM toolbox
- TopLogprob - Top log probability entry
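Tool calling builds on the Tool and FunctionObject structs above. The sketch below assumes OpenAI-style field names (`r#type`, `function`, `name`, `description`, `parameters`) and a serde_json dependency; consult the struct definitions for the exact shape.

```rust
use inference_gateway_sdk::{FunctionObject, Tool, ToolType};

// A minimal sketch of defining a tool for the LLM toolbox.
// Field names here are assumptions for illustration.
fn weather_tool() -> Tool {
    Tool {
        r#type: ToolType::Function,
        function: FunctionObject {
            name: "get_weather".to_string(),
            description: Some("Return the current weather for a city".to_string()),
            // JSON Schema describing the tool's arguments.
            parameters: Some(serde_json::json!({
                "type": "object",
                "properties": {
                    "city": { "type": "string" }
                },
                "required": ["city"]
            })),
        },
    }
}
```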
Enums

- ChatCompletionToolType - Type of tool that can be called
- FinishReason - The reason the model stopped generating tokens
- GatewayError - Custom error types for the Inference Gateway SDK
- MessageRole
- Provider - Supported LLM providers
- ToolType - Type of tool that can be used by the model
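FinishReason can be matched to decide how to handle a completed response. The variant names used below (Stop, ToolCalls) follow common OpenAI-compatible conventions and are assumptions for this sketch; check the enum definition for the real set.

```rust
use inference_gateway_sdk::FinishReason;

// Hedged sketch: maps an assumed subset of FinishReason variants to a
// human-readable explanation. The wildcard arm covers any variants
// not shown here.
fn describe_finish(reason: &FinishReason) -> &'static str {
    match reason {
        FinishReason::Stop => "model finished naturally",
        FinishReason::ToolCalls => "model requested a tool call",
        _ => "generation stopped for another reason",
    }
}
```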
Traits

- InferenceGatewayAPI - Core API interface for the Inference Gateway
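Because InferenceGatewayClient implements the InferenceGatewayAPI trait, calling code can be written against the trait instead of the concrete client, which makes it easy to substitute a mock in tests. In the sketch below, the `list_models` method name is a hypothetical inferred from ListModelsResponse, and the Debug formatting assumes the response type derives Debug.

```rust
use inference_gateway_sdk::{GatewayError, InferenceGatewayAPI, ListModelsResponse};

// Generic over any gateway API implementation; `list_models` is an
// assumed method name, not confirmed by this listing.
async fn print_models<A: InferenceGatewayAPI>(api: &A) -> Result<(), GatewayError> {
    let models: ListModelsResponse = api.list_models().await?;
    println!("{models:?}");
    Ok(())
}
```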