Crate inference_gateway_sdk

Inference Gateway SDK for Rust
This crate provides a Rust client for the Inference Gateway API, allowing interaction with various LLM providers through a unified interface.
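A minimal usage sketch follows. The type names (`InferenceGatewayClient`, `InferenceGatewayAPI`, `Message`, `MessageRole`, `Provider`) come from the listings below, but the constructor, method names, and field layout shown here are assumptions for illustration, not confirmed signatures; consult the items below for the actual API.

```rust
use inference_gateway_sdk::{
    InferenceGatewayAPI, InferenceGatewayClient, Message, MessageRole, Provider,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical constructor: point the client at a running gateway instance.
    let client = InferenceGatewayClient::new("http://localhost:8080");

    // List the models exposed by the gateway (assumed method name,
    // returning the ListModelsResponse struct documented below).
    let models = client.list_models().await?;
    println!("available models: {models:?}");

    // Send a chat completion request through a provider (assumed method
    // name and signature; Message fields shown are illustrative).
    let response = client
        .generate_content(
            Provider::Ollama,
            "llama3",
            vec![Message {
                role: MessageRole::User,
                content: "Hello!".to_string(),
                ..Default::default()
            }],
        )
        .await?;
    println!("response: {response:?}");
    Ok(())
}
```

Because the trait `InferenceGatewayAPI` is implemented by `InferenceGatewayClient`, code can also be written against the trait to stay provider-agnostic.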

Structs§

ChatCompletionChoice
A choice in a chat completion response
ChatCompletionMessageToolCall
A tool call in a message response
ChatCompletionMessageToolCallChunk
A tool call chunk in streaming responses
ChatCompletionMessageToolCallFunction
Function details in a tool call
ChatCompletionStreamChoice
Choice in a streaming completion response
ChatCompletionStreamDelta
Delta content for streaming responses
ChatCompletionTokenLogprob
Token log probability information
ChoiceLogprobs
Log probability information for a choice
CompletionUsage
Usage statistics for the completion request
CreateChatCompletionResponse
The response from generating content
CreateChatCompletionStreamResponse
The response from streaming content generation
FunctionObject
Tool function to call
InferenceGatewayClient
Client for interacting with the Inference Gateway API
ListModelsResponse
Response structure for listing models
ListToolsResponse
Response structure for listing MCP tools
MCPTool
An MCP tool definition
Message
A message in a conversation with an LLM
Model
Common model information
SSEvents
Stream of Server-Sent Events (SSE) from the Inference Gateway API
Tool
Tool to use for the LLM toolbox
TopLogprob
Top log probability entry

Enums§

ChatCompletionToolType
Type of tool that can be called
FinishReason
The reason the model stopped generating tokens
GatewayError
Custom error types for the Inference Gateway SDK
MessageRole
The role of the author of a message
Provider
Supported LLM providers
ToolType
Type of tool that can be used by the model

Traits§

InferenceGatewayAPI
Core API interface for the Inference Gateway