zeph-llm 0.12.0

LLM provider abstraction with Ollama, Claude, OpenAI, and Candle backends

There is currently very little structured metadata to build this page from. Check the main library docs, README, or Cargo.toml in case the author documented the features there.

This version has 5 feature flags; none of them are enabled by default.

- default: does not enable additional features
- candle
- cuda
- metal
- mock: does not enable additional features
- stt
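
The page lists the flag names but not how to use them. A minimal sketch of how a downstream crate might opt into them in its Cargo.toml — the crate name, version, and feature names come from this page; the dependency layout and chosen feature combinations are illustrative assumptions:

```toml
# Hypothetical downstream Cargo.toml. Feature names (candle, cuda, mock)
# are taken from this page; which ones you enable depends on your backend.
[dependencies]
zeph-llm = { version = "0.12.0", features = ["candle", "cuda"] }

[dev-dependencies]
# The mock flag is presumably useful for tests (an assumption).
zeph-llm = { version = "0.12.0", features = ["mock"] }
```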