Manages interactions with Large Language Models (LLMs) across multiple providers.
§LLM Module
This module provides the functionality for interacting with Large Language Models (LLMs).
It supports both remote LLMs (like OpenAI) and local LLMs (via llama.cpp).
The LLMClient provides a unified interface for both types of providers.
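The unified interface described above can be sketched as follows. This is a hypothetical illustration, not the crate's actual API: the type names mirror the items listed below, but every field and method signature here is an assumption.

```rust
// Sketch only: names mirror the documented items (LLMClient, LLMRequest,
// LLMResponse, LLMProviderType), but all signatures are assumptions.

/// The type of LLM provider to use (remote OpenAI-style API or local llama.cpp).
enum LLMProviderType {
    Remote,
    Local,
}

/// A request to an LLM (assumed minimal shape).
struct LLMRequest {
    model: String,
    prompt: String,
}

/// A response from an LLM (assumed minimal shape).
struct LLMResponse {
    content: String,
}

/// A client for interacting with an LLM, regardless of provider type.
struct LLMClient {
    provider: LLMProviderType,
}

impl LLMClient {
    fn new(provider: LLMProviderType) -> Self {
        Self { provider }
    }

    /// Dispatches the request to the configured provider. A real
    /// implementation would call the OpenAI API or llama.cpp here;
    /// this stub echoes the prompt so the sketch is runnable.
    fn complete(&self, request: &LLMRequest) -> LLMResponse {
        let backend = match self.provider {
            LLMProviderType::Remote => "remote",
            LLMProviderType::Local => "local",
        };
        LLMResponse {
            content: format!("[{backend}] echo: {}", request.prompt),
        }
    }
}

fn main() {
    let client = LLMClient::new(LLMProviderType::Remote);
    let response = client.complete(&LLMRequest {
        model: "gpt-4o".into(),
        prompt: "hello".into(),
    });
    println!("{}", response.content);
}
```

Callers construct one `LLMClient` and never branch on the backend themselves; the provider choice is made once at construction time.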
Structs§
- Choice
- A choice in an LLM response.
- Delta
- The delta of a streamed choice.
- DeltaFunctionCall
- A function call in a streamed delta.
- DeltaToolCall
- A tool call in a streamed delta.
- LLMClient
- A client for interacting with an LLM.
- LLMRequest
- A request to an LLM.
- LLMResponse
- A response from an LLM.
- RemoteLLMClient
- A client for interacting with a remote LLM.
- StreamChoice
- A choice in a streamed response.
- StreamChunk
- A chunk of a streamed response.
- Usage
- The usage statistics for an LLM response.
Enums§
- LLMProviderType
- The type of LLM provider to use.
Traits§
- LLMProvider
- A trait for LLM providers.
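A trait like `LLMProvider` typically lets each backend (remote API, local llama.cpp) supply its own completion logic behind one interface. The sketch below shows the usual shape of such an implementation; the method name, signature, and `RemoteLLMClient` fields are assumptions for illustration.

```rust
/// Assumed minimal request/response shapes for the sketch.
struct LLMRequest {
    prompt: String,
}
struct LLMResponse {
    content: String,
}

/// A trait for LLM providers (method signature is an assumption).
trait LLMProvider {
    fn complete(&self, request: &LLMRequest) -> LLMResponse;
}

/// Stand-in for a remote provider; a real one would hold an HTTP client.
struct RemoteLLMClient {
    endpoint: String,
}

impl LLMProvider for RemoteLLMClient {
    fn complete(&self, request: &LLMRequest) -> LLMResponse {
        // A real client would POST to `self.endpoint`; stubbed here
        // so the sketch is self-contained and runnable.
        LLMResponse {
            content: format!("{} <- {}", self.endpoint, request.prompt),
        }
    }
}

fn main() {
    // A trait object lets callers hold any provider behind one interface.
    let provider: Box<dyn LLMProvider> = Box::new(RemoteLLMClient {
        endpoint: "https://api.example.com".into(),
    });
    let response = provider.complete(&LLMRequest { prompt: "ping".into() });
    println!("{}", response.content);
}
```

Using a trait object (`Box<dyn LLMProvider>`) keeps the backend choice a runtime decision, which matches the unified-client design described at the top of this module.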