Crate llama_core
Llama Core, abbreviated as `llama-core`, defines a set of APIs. Developers can use these APIs to build applications based on large models, such as chatbots, RAG applications, and more.
Re-exports
pub use error::LlamaCoreError;
pub use graph::EngineType;
pub use graph::Graph;
pub use graph::GraphBuilder;
Modules
- Define APIs for audio generation, transcription, and translation.
- Define APIs for chat completion.
- Define APIs for completions.
- Define APIs for computing embeddings.
- Error types for the Llama Core library.
- Define `Graph` and `GraphBuilder` APIs for creating a new computation graph.
- Define APIs for image generation and editing.
- Define APIs for querying models.
- Define APIs for RAG operations.
- Define utility functions.
Structs
- Builder for creating audio metadata.
- Model metadata.
- Builder for the `Metadata` struct.
- Version info of the `wasi-nn_ggml` plugin, including the build number and the commit id.
Enums
- Running mode.
Functions
- Get the plugin info.
- Initialize the core context.
- Initialize the piper context.
- Initialize the core context for RAG scenarios.
- Initialize the stable diffusion context.
- Initialize the whisper context.
- Return the current running mode.