inference-lab 0.5.0

High-performance LLM inference simulator for analyzing serving systems

Feature flags

The individual flags are not documented in the crate metadata; check the library docs, README, or Cargo.toml for what each one enables.

This version has 13 feature flags, 7 of them enabled by default.

The default feature enables:

    cli, clap, colored, env_logger, minijinja, tabled, tokenizers

Optional features (off by default):

    axum, serve, tokio, tokio-stream, tower-http, uuid
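
The CLI conveniences can be dropped for a library-only build. A minimal sketch of a downstream Cargo.toml, assuming the core simulator compiles with no features enabled (the metadata above does not confirm this):

```toml
[dependencies]
# Skip the seven default flags; pull in only the core simulator.
inference-lab = { version = "0.5.0", default-features = false }
```

The same result from the command line: `cargo add inference-lab --no-default-features`.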
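
Going the other way, the axum/tokio group reads like an HTTP serving stack. Whether `serve` transitively enables the other five flags is not stated here, so this sketch lists them all explicitly; if `serve` already implies them, the extras are harmless:

```toml
# Feature names come from the flag list above; whether `serve` already
# implies the other five is an assumption, not documented behavior.
[dependencies.inference-lab]
version = "0.5.0"
features = ["serve", "axum", "tokio", "tokio-stream", "tower-http", "uuid"]
```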