Crate llama_core
Llama Core, abbreviated as llama-core, defines a set of APIs. Developers can utilize these APIs to build applications based on large models, such as chatbots, RAG, and more.
Re-exports
pub use error::LlamaCoreError;
Modules
- Define APIs for chat completion.
- Define APIs for completions.
- Define APIs for computing embeddings.
- Error types for the Llama Core library.
- Define APIs for querying models.
- Define APIs for RAG operations.
- Define utility functions.
Structs
- Wrapper of the wasmedge_wasi_nn::Graph struct
- Model metadata
- Builder for the Metadata struct (see the sketch after this list)
- Version info of the wasi-nn_ggml plugin, including the build number and the commit id.
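A minimal sketch of configuring model metadata with the builder listed above. The constructor arguments and setter names (MetadataBuilder::new, with_ctx_size, with_n_gpu_layers, build), as well as the PromptTemplateType import from the companion chat-prompts crate, are assumptions and may differ between crate versions.

```rust
use chat_prompts::PromptTemplateType; // assumed companion crate providing prompt templates
use llama_core::{Metadata, MetadataBuilder};

// Build a Metadata value describing one chat model. The method names below are
// assumptions about the builder API, not verified signatures.
fn build_chat_metadata() -> Metadata {
    MetadataBuilder::new("llama-3-8b-instruct", "default", PromptTemplateType::Llama3Chat)
        .with_ctx_size(4096)    // assumed setter: context window size
        .with_n_gpu_layers(100) // assumed setter: layers offloaded to the GPU
        .build()
}
```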
Enums
- Running mode
Functions
- Get the plugin info.
- Initialize the core context (see the sketch after this list).
- Initialize the core context for RAG scenarios.
- Initialize the stable diffusion context.
- Return the current running mode.
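A hedged end-to-end sketch of the initialization flow suggested by the functions listed above: initialize the core context with some model metadata, then query the plugin info and the current running mode. The function names and signatures used here (init_core_context taking optional slices of Metadata, get_plugin_info, running_mode) are assumptions and should be checked against the crate version in use.

```rust
use llama_core::{get_plugin_info, init_core_context, running_mode, LlamaCoreError, Metadata};

fn bootstrap(chat_models: Vec<Metadata>) -> Result<(), LlamaCoreError> {
    // Initialize the core context with chat models only; the Option<&[Metadata]>
    // parameters for chat and embedding models are an assumed signature.
    init_core_context(Some(&chat_models), None)?;

    // Version info of the wasi-nn_ggml plugin (build number and commit id);
    // Debug formatting on PluginInfo is assumed.
    let plugin_info = get_plugin_info()?;
    println!("plugin: {:?}", plugin_info);

    // Current running mode (Debug formatting on RunningMode is assumed).
    let mode = running_mode()?;
    println!("running mode: {:?}", mode);

    Ok(())
}
```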