llama_cpp 0.3.2

High-level bindings to llama.cpp with a focus on just being really, really easy to use
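The build-time options listed below are all Cargo feature flags, so the quickest way to try the crate is to add it with its default feature set. A minimal manifest entry, assuming only the crate name and version shown above, looks like this:

```toml
[dependencies]
# Pulls in the seven default features listed below
# (compat, native, accel, avx, avx2, f16c, fma).
llama_cpp = "0.3.2"
```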
Feature flags

There is currently very little structured metadata to build this page from. Check the main library docs, readme, or Cargo.toml in case the author documented the features there.

This version has 22 feature flags, 7 of them enabled by default.

default

compat (default)

native (default)

accel (default)

avx (default)

avx2 (default)

f16c (default)

fma (default)

avx512

avx512_vmbi

avx512_vnni

blas

clblast

cuda

cuda_dmmv

cuda_f16

cuda_mmq

hipblas

metal

mpi

sys_verbosity

vulkan
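
All remaining flags are opt-in. The snippet below is a sketch of the usual Cargo patterns for enabling them; the feature names are taken verbatim from the list above, but whether a particular backend needs extra system dependencies (for example the CUDA toolkit, ROCm, or the Vulkan SDK), or whether a given combination of flags builds, is determined by the crate itself and is not documented on this page.

```toml
[dependencies]
# Keep the defaults and layer the CUDA backend on top.
llama_cpp = { version = "0.3.2", features = ["cuda"] }

# Or start from a clean slate and choose CPU flags explicitly,
# e.g. for a target without AVX2/FMA support:
# llama_cpp = { version = "0.3.2", default-features = false, features = ["native", "avx"] }
```

Cargo features are additive, so enabling cuda this way does not switch off the default CPU flags; set default-features = false when you want full control over the set.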