Module llm

Large Language Model (Serverless AI) APIs

Modules

llm
Provides access to the underlying WIT interface. You should not normally need to use this module directly: use the re-exports in this module instead. This is the WASI interface dedicated to performing inferencing for Large Language Models.

Structs

EmbeddingsResult
The result of generating embeddings.
EmbeddingsUsage
Usage related to an embeddings generation request.
InferencingParams
Inference request parameters.
InferencingResult
An inferencing result.
InferencingUsage
Usage information related to the inferencing result.
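The knobs on `InferencingParams` are easiest to see in a sketch. The struct below is a standalone mirror for illustration only: the field names follow common LLM sampling parameters, and the `Default` values are illustrative guesses, not the SDK's documented defaults.

```rust
// Standalone mirror of an inference-parameters struct, for illustration only:
// field names follow common LLM sampling knobs, and the `Default` values are
// illustrative guesses, not the SDK's documented defaults.
#[derive(Debug, Clone, PartialEq)]
pub struct InferencingParams {
    pub max_tokens: u32,     // cap on the number of generated tokens
    pub temperature: f32,    // higher values = more random sampling
    pub top_k: u32,          // sample only from the k most likely tokens
    pub top_p: f32,          // nucleus sampling: cumulative probability mass
    pub repeat_penalty: f32, // > 1.0 discourages repeating recent tokens
}

impl Default for InferencingParams {
    fn default() -> Self {
        Self {
            max_tokens: 100,
            temperature: 0.8,
            top_k: 40,
            top_p: 0.9,
            repeat_penalty: 1.1,
        }
    }
}

fn main() {
    // Override only the fields you care about; keep the rest at the defaults.
    let params = InferencingParams {
        temperature: 0.2, // low temperature for more deterministic output
        ..Default::default()
    };
    println!("{params:?}");
}
```

The struct-update syntax (`..Default::default()`) is the usual pattern for tweaking one or two sampling parameters while leaving the rest alone.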

Enums

EmbeddingModel
Model used for generating embeddings.
Error
The set of errors which may be raised by functions in this interface.
InferencingModel
The model used for inferencing.

Functions

generate_embeddings
Generate embeddings using the provided model and collection of text.
infer
Perform inferencing using the provided model and prompt.
infer_with_options
Perform inferencing using the provided model, prompt, and options.
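A minimal, self-contained sketch of how these functions fit together follows. The `llm` module here is a local stub standing in for `spin_sdk::llm`, so the example compiles and runs without a Spin/WASI host; the type shapes, variant names (e.g. `Llama2Chat`, `AllMiniLmL6V2`), and signatures are assumptions drawn from the item list above, not the SDK's verified API.

```rust
// NOTE: this `llm` module is a STUB standing in for `spin_sdk::llm`, so the
// sketch runs without a Spin host. Names and signatures are assumptions based
// on the item list above, not the SDK's verified API.
mod llm {
    #[allow(dead_code)]
    pub enum InferencingModel<'a> {
        Llama2Chat,      // assumed built-in model variant
        Other(&'a str),  // assumed escape hatch for custom model names
    }

    #[allow(dead_code)]
    pub enum EmbeddingModel {
        AllMiniLmL6V2, // assumed embedding-model variant
    }

    pub struct InferencingResult {
        pub text: String,
    }

    pub struct EmbeddingsResult {
        pub embeddings: Vec<Vec<f32>>,
    }

    #[derive(Debug)]
    pub struct Error;

    // Stub: a real host would run the model; here we just echo the prompt.
    pub fn infer(_model: InferencingModel, prompt: &str) -> Result<InferencingResult, Error> {
        Ok(InferencingResult { text: format!("echo: {prompt}") })
    }

    // Stub: returns one fixed unit vector per input string.
    pub fn generate_embeddings(
        _model: EmbeddingModel,
        texts: &[String],
    ) -> Result<EmbeddingsResult, Error> {
        Ok(EmbeddingsResult {
            embeddings: texts.iter().map(|_| vec![1.0, 0.0, 0.0]).collect(),
        })
    }
}

/// Cosine similarity: the usual way to compare embedding vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (na * nb)
}

fn main() {
    // Text generation: pick a model and pass a prompt.
    let answer = llm::infer(llm::InferencingModel::Llama2Chat, "Say hello")
        .expect("inference failed");
    println!("{}", answer.text);

    // Embeddings: one vector per input string, compared with cosine similarity.
    let texts = vec!["first doc".to_string(), "second doc".to_string()];
    let result = llm::generate_embeddings(llm::EmbeddingModel::AllMiniLmL6V2, &texts)
        .expect("embedding failed");
    let sim = cosine(&result.embeddings[0], &result.embeddings[1]);
    println!("similarity = {sim}");
}
```

In a real Spin component the calls look the same, but the results come from the host's configured models rather than these stubs, and errors surface through the `Error` enum listed above.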