docs.rs failed to build ezllama-0.3.1
# ezllama

An opinionated, simple Rust interface for local LLMs, powered by llama-cpp-2.
## Features

- Simple API: Designed for ease of use with a clean, intuitive interface
- Text and Chat Completion: Support for both text and chat completion tasks
- Infinite token generation: Automatically manages the cache for infinite token generation
- Tracing Integration: Built-in logging via the tracing ecosystem
Right now it only supports the basics, but I might add more features in the future as I need them.
You can try out the chatbot example from this repo.
## Installation

Add ezllama to your `Cargo.toml`:

```toml
[dependencies]
ezllama = "*"
```
For GPU acceleration, enable the appropriate feature (if you are using CUDA, go run some errands while it compiles):

```toml
[dependencies]
ezllama = { version = "*", features = ["cuda"] }   # CUDA support
# or
ezllama = { version = "*", features = ["metal"] }  # Metal support (macOS)
# or
ezllama = { version = "*", features = ["vulkan"] } # Vulkan support
```
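If you prefer the command line, `cargo add` (available since Cargo 1.62) can write an equivalent dependency entry for you:

```shell
# Adds ezllama with the CUDA feature enabled to Cargo.toml
cargo add ezllama --features cuda
```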
## Quick Start

Note: make sure you grab a GGUF model from Hugging Face or elsewhere.

```rust
use ezllama::*;
use std::path::PathBuf;
```
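A minimal end-to-end flow might look like the sketch below. The `Model::new` constructor name and its arguments are assumptions on my part; the session methods come from the Advanced Usage sections further down. Check the crate documentation for the exact signatures.

```rust
use ezllama::*;
use std::path::PathBuf;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumed constructor: load a GGUF model from disk.
    let model = Model::new(PathBuf::from("path/to/model.gguf"))?;

    // `create_chat_session` and `prompt(...)?.join()` follow the
    // usage shown under Advanced Usage below.
    let mut chat = model.create_chat_session()?;
    let reply = chat.prompt("Hello! Who are you?")?.join();
    println!("{reply}");
    Ok(())
}
```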
## Advanced Usage
### Text completion

```rust
// Create a text session for text completion
let mut text_session = model.create_text_session()?;
let output = text_session.prompt("Once upon a time")?.join();

// Continue generating from the existing context
let more_output = text_session.prompt(" The hero then")?.join();
```
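Because the session keeps its context between calls (the cache management mentioned under Features), you can keep extending a completion in a loop. A sketch against the same session API as above; the method arguments here are illustrative, not the crate's exact signatures:

```rust
// Sketch only: extend a completion turn by turn. Each `prompt` call
// continues from the tokens already held in the session's cache.
let mut text_session = model.create_text_session()?;
let mut story = String::new();
for seed in ["Once upon a time", " Then,", " Finally,"] {
    story.push_str(&text_session.prompt(seed)?.join());
}
println!("{story}");
```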
### System Messages

```rust
// Create a chat session with a system message
let mut chat_session = model.create_chat_session_with_system("You are a helpful assistant.")?;

// Or add a system message to an existing session
let mut chat_session = model.create_chat_session()?;
chat_session.add_system_message("You are a helpful assistant.");

// One-shot completion with a system message
let response = model.chat_completion_with_system("You are a helpful assistant.", "Explain borrowing in Rust.")?.join();
```
### Custom Chat Templates

```rust
// Create a chat session with a custom template
let template = "{{0_role}}: {{0_content}}\n{{1_role}}: {{1_content}}";
let mut chat_session = model.create_chat_session_with_template(template)?;
```
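The placeholder scheme appears to index messages by position (`{{0_role}}` is the role of the first message, and so on). As a rough, self-contained illustration of how such a template could expand, here is a plain-Rust substitution; this is my interpretation for demonstration, not the crate's actual template renderer:

```rust
// Hypothetical illustration (not ezllama's renderer): expand
// "{{i_role}}" / "{{i_content}}" placeholders from a message list.
fn render_template(template: &str, messages: &[(&str, &str)]) -> String {
    let mut out = template.to_string();
    for (i, (role, content)) in messages.iter().enumerate() {
        // "{{{{{i}_role}}}}" renders to the literal "{{0_role}}" etc.
        out = out.replace(&format!("{{{{{i}_role}}}}"), role);
        out = out.replace(&format!("{{{{{i}_content}}}}"), content);
    }
    out
}

fn main() {
    let template = "{{0_role}}: {{0_content}}\n{{1_role}}: {{1_content}}";
    let rendered = render_template(template, &[("system", "Be brief."), ("user", "Hi!")]);
    println!("{rendered}");
}
```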
## License

Licensed under MIT.
## Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you shall be licensed as above, without any additional terms or conditions.