Download, embed, and run llama.cpp in your Rust projects.
Re-exports

- `pub use backend::Backend;`
Modules

- `backend`: Pick your preferred compute backends.
Structs

- `Build`: A specific build of `llama-server`.
- `Download`: The download state of a `Server`.
- `Instance`: An active `Server` running a specific language model.
- `Progress`: The progress of an HTTP download.
- `Server`: A server instance.
- `Settings`: The configurable options of a new `Instance`.