curtana 0.0.2

Simplified zero-cost wrapper over llama.cpp powered by llama-cpp-2.


A simplified, low-overhead wrapper over llama.cpp, powered by the llama-cpp-2 crate, supporting most .gguf-formatted "Chat" and "Embedding" models.
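To try the crate in your own project, you would typically declare it as a Cargo dependency. A minimal sketch (the version below matches this release; any other settings are up to your project):

```toml
# Cargo.toml — add curtana as a dependency (version taken from this release)
[dependencies]
curtana = "0.0.2"
```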

Build and Test

  1. Install CMake and Rust on your system.
  2. Download the GGUF models used during testing:
    • wget https://huggingface.co/bartowski/Llama-3.2-3B-Instruct-GGUF/resolve/main/Llama-3.2-3B-Instruct-Q6_K.gguf
    • wget https://huggingface.co/nomic-ai/nomic-embed-text-v1.5-GGUF/resolve/main/nomic-embed-text-v1.5.f16.gguf
  3. Run cargo test
  4. ???
  5. Profit.

License

Copyright © 2025 With Caer, LLC.

Licensed under the MIT license. Refer to the license file for more info.