hoosh 1.3.0

AI inference gateway — multi-provider LLM routing, local model serving, speech-to-text, and token budget management

There is currently very little structured metadata to build this page from. Check the main library docs, the README, or Cargo.toml in case the author documented the features there.

This version has 25 feature flags, 14 of them enabled by default.

default

all-providers (default)

hwaccel (default)

anthropic (default): enables no additional features

deepseek (default): enables no additional features

grok (default): enables no additional features

groq (default): enables no additional features

llamacpp (default): enables no additional features

lmstudio (default): enables no additional features

localai (default): enables no additional features

mistral (default): enables no additional features

ollama (default): enables no additional features

openai (default): enables no additional features

openrouter (default): enables no additional features

synapse (default): enables no additional features

dlp

otel

piper: enables no additional features

sentiment

tools

tools-audit

tools-discovery

tools-events

tools-full

tools-sandbox

whisper
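
As a sketch of how these flags are typically selected when hoosh is consumed as a Cargo dependency (the chosen feature combination below is illustrative, not a documented recommendation), a Cargo.toml entry might opt out of the defaults and enable only the flags it needs:

```toml
[dependencies]
# Disable the default set (the bundled providers plus hwaccel), then
# enable only the OpenAI provider and Whisper speech-to-text support.
hoosh = { version = "1.3.0", default-features = false, features = ["openai", "whisper"] }
```

With `default-features = false`, none of the 14 default flags are compiled in unless listed explicitly, which is the standard Cargo mechanism for trimming optional functionality from a build.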