llama-cpp-4 0.2.24

llama.cpp bindings for Rust

There is very little structured metadata to build this page from. Check the main library docs, the README, or Cargo.toml to see whether the author documented the features there.

This version has 14 feature flags, 1 of which is enabled by default.

default (enables openmp)
openmp (default)
blas
cuda
ggml (does not enable additional features)
hip
metal
mtmd
native
opencl
q1
rpc
vulkan
webgpu
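
These flags are toggled from the dependent crate's Cargo.toml like any other Cargo features. A minimal sketch, assuming the crate name and version shown on this page (whether a given backend actually builds depends on the local toolchain and SDKs):

```toml
[dependencies]
# Replace the default feature set (openmp) with the CUDA backend.
# Feature names are taken from the list above.
llama-cpp-4 = { version = "0.2.24", default-features = false, features = ["cuda"] }
```

The same effect from the command line: `cargo add llama-cpp-4 --no-default-features --features cuda`. The backend-style flags (blas, cuda, hip, metal, opencl, vulkan, webgpu) mirror llama.cpp build options, so enabling one typically requires the corresponding SDK or driver to be installed on the build machine.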