Module llm

Manages interactions with Large Language Models (LLMs) across different providers.

§LLM Module

This module provides functionality for interacting with Large Language Models (LLMs). It supports both remote LLMs (such as OpenAI) and local LLMs (via llama.cpp). The LLMClient struct provides a unified interface over both kinds of provider.
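As a rough sketch of how these pieces might fit together, the example below builds a client for one provider and sends a single completion request. The constructor, the LLMRequest fields, and the complete method are illustrative assumptions about this module's API, not confirmed signatures.

```rust
// Illustrative sketch only: LLMClient::new, the LLMRequest fields, and
// `complete` are assumptions about this module's API, not confirmed signatures.
use crate::llm::{LLMClient, LLMProviderType, LLMRequest};

async fn ask(question: &str) -> Result<String, Box<dyn std::error::Error>> {
    // Hypothetical: choose a provider and build the unified client.
    let client = LLMClient::new(LLMProviderType::OpenAI)?;

    // Hypothetical request shape; the real struct may expose different fields
    // and may not implement Default.
    let request = LLMRequest {
        model: "gpt-4o-mini".to_string(),
        prompt: question.to_string(),
        ..Default::default()
    };

    // Hypothetical `complete` call returning an LLMResponse.
    let response = client.complete(request).await?;

    // Assumes Choice carries the generated text in a `text` field.
    Ok(response.choices[0].text.clone())
}
```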

Structs§

Choice
A choice in an LLM response.
Delta
The delta of a streamed choice.
DeltaFunctionCall
A function call in a streamed delta.
DeltaToolCall
A tool call in a streamed delta.
LLMClient
A client for interacting with an LLM.
LLMRequest
A request to an LLM.
LLMResponse
A response from an LLM.
RemoteLLMClient
A client for interacting with a remote LLM.
StreamChoice
A choice in a streamed response.
StreamChunk
A chunk of a streamed response.
Usage
The usage statistics for an LLM response.
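For streamed responses, a loop over StreamChunk values would assemble the reply from each StreamChoice's Delta. The sketch below assumes a stream_complete method, a futures-compatible stream, and a content field on Delta; none of these names are confirmed by this page.

```rust
// Illustrative sketch only: `stream_complete` and the field names on the
// streaming types are assumptions, not confirmed API.
use crate::llm::{LLMClient, LLMRequest};
use futures::StreamExt; // assumes a futures-compatible stream is returned

async fn stream_answer(
    client: &LLMClient,
    request: LLMRequest,
) -> Result<String, Box<dyn std::error::Error>> {
    let mut full = String::new();

    // Hypothetical streaming call yielding StreamChunk items.
    let mut stream = client.stream_complete(request).await?;
    while let Some(chunk) = stream.next().await {
        let chunk = chunk?;
        // Each StreamChunk holds StreamChoice values whose Delta carries the
        // incremental text (field names assumed here).
        for choice in chunk.choices {
            if let Some(text) = choice.delta.content {
                full.push_str(&text);
            }
        }
    }
    Ok(full)
}
```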

Enums§

LLMProviderType
The type of LLM provider to use.

Traits§

LLMProvider
A trait for LLM providers.
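Since LLMProvider is the extension point for new backends, a custom provider would implement this trait. This page does not list the trait's required methods, so the single async complete method and the async_trait dependency below are purely illustrative assumptions.

```rust
// Illustrative sketch only: the LLMProvider trait's required methods are not
// documented here, so this `complete` signature is an assumption.
use async_trait::async_trait; // assumed; the crate may take a different approach to async traits
use crate::llm::{LLMProvider, LLMRequest, LLMResponse};

/// A stub backend that could stand in for a real model during tests.
struct EchoProvider;

#[async_trait]
impl LLMProvider for EchoProvider {
    // Hypothetical method name and signature.
    async fn complete(
        &self,
        request: LLMRequest,
    ) -> Result<LLMResponse, Box<dyn std::error::Error>> {
        // A real implementation would call a model; a stub might just wrap
        // `request.prompt` (field name assumed) in an LLMResponse.
        todo!("construct an LLMResponse from the request")
    }
}
```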