Large Language Model (Serverless AI) APIs
Modules
- llm - Provides access to the underlying WIT interface (a WASI interface dedicated to performing inferencing for Large Language Models). You should not normally need to use this module: use the re-exports in this module instead.
Structs
- EmbeddingsResult - The result of generating embeddings.
- EmbeddingsUsage - Usage related to an embeddings generation request.
- InferencingParams - Inference request parameters (see the usage sketch after this list).
- InferencingResult - An inferencing result.
- InferencingUsage - Usage information related to the inferencing result.
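To illustrate how these structs fit together, here is a minimal sketch that builds an InferencingParams value, runs it through infer_with_options, and reads the generated text and token usage back out. The field names (max_tokens, temperature, top_k, top_p, repeat_penalty, repeat_penalty_last_n_token_count, prompt_token_count, generated_token_count) are assumed from the underlying WIT records and may differ between SDK versions; check the generated item docs for the exact set.

```rust
use spin_sdk::llm;

// Sketch only: field names are assumed to mirror the WIT `inferencing-params`
// and `inferencing-usage` records.
fn summarize(prompt: &str) -> Result<String, llm::Error> {
    let params = llm::InferencingParams {
        max_tokens: 128,
        temperature: 0.7,
        top_k: 40,
        top_p: 0.9,
        repeat_penalty: 1.1,
        repeat_penalty_last_n_token_count: 64,
    };

    // Run the prompt with these parameters instead of the host defaults.
    let result = llm::infer_with_options(llm::InferencingModel::Llama2Chat, prompt, params)?;

    // InferencingUsage carries token accounting for the request (assumed field names).
    eprintln!(
        "prompt tokens: {}, generated tokens: {}",
        result.usage.prompt_token_count, result.usage.generated_token_count
    );

    Ok(result.text)
}
```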
Enums
- EmbeddingModel - Model used for generating embeddings.
- Error - The set of errors which may be raised by functions in this interface.
- InferencingModel - The model used for inferencing (see the sketch after this list).
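The sketch below shows how the model enums and Error might be used together: a host-configured model is selected by name and a failed request is handled. The variant names (Llama2Chat, Other, ModelNotSupported) are assumptions drawn from the WIT definitions, not a guaranteed API.

```rust
use spin_sdk::llm;

// Variant names below are assumptions; check the generated enum items.
fn classify(prompt: &str) -> String {
    // `Other` selects a custom model configured on the host, by name.
    let model = llm::InferencingModel::Other("my-fine-tuned-model");

    match llm::infer(model, prompt) {
        Ok(result) => result.text,
        Err(llm::Error::ModelNotSupported) => "model not configured on this host".to_string(),
        Err(_) => "inference failed".to_string(),
    }
}
```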
Functions
- generate_embeddings - Generate embeddings using the provided model and collection of text (see the sketch after this list).
- infer - Perform inferencing using the provided model and prompt.
- infer_with_options - Perform inferencing using the provided model, prompt, and options.
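As a rough usage sketch for generate_embeddings, the example below embeds a batch of strings and reads the vectors and usage from the EmbeddingsResult. The model variant (AllMiniLmL6V2) and field names (embeddings, prompt_token_count) are assumptions based on the WIT interface.

```rust
use spin_sdk::llm;

// Sketch only: model variant and result field names are assumed.
fn embed_documents(docs: &[String]) -> Result<Vec<Vec<f32>>, llm::Error> {
    let result = llm::generate_embeddings(llm::EmbeddingModel::AllMiniLmL6V2, docs)?;

    // One embedding vector is returned per input string, in order.
    assert_eq!(result.embeddings.len(), docs.len());

    // EmbeddingsUsage reports how many prompt tokens were consumed (assumed field name).
    eprintln!("prompt tokens: {}", result.usage.prompt_token_count);

    Ok(result.embeddings)
}
```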