inference-lab 0.3.0
License: MIT
Links: Repository, crates.io, Source
Owner: fergusfinn
Dependencies
- clap ^4.5 (optional)
- colored ^2.1 (optional)
- console_error_panic_hook ^0.1
- env_logger ^0.11 (optional)
- getrandom ^0.2
- js-sys ^0.3
- log ^0.4
- minijinja ^2.0 (optional)
- ordered-float ^4.0
- rand ^0.8
- rand_distr ^0.4
- serde ^1.0
- serde-wasm-bindgen ^0.6
- serde_json ^1.0
- tabled ^0.16 (optional)
- tokenizers ^0.20 (optional)
- toml ^0.8
- wasm-bindgen ^0.2
- tempfile ^3.8 (dev)
50.78% of the crate is documented.
Platform: x86_64-unknown-linux-gnu
Module inference_lab::dataset
Structs
- BatchRequest: OpenAI Batch API format (JSONL entries).
- DatasetEntry: A processed dataset entry, ready for simulation.
- DatasetIterator: Iterator over dataset entries that parses JSON but does not tokenize; tokenization happens in batches on a background thread for performance.
- DatasetLoader: Dataset loader that provides lazy iteration over entries.
- Message
- RequestBody
- UnparsedEntry: An unparsed entry from the dataset (before tokenization).
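The DatasetIterator description implies a producer/worker pattern: parse entries lazily on one thread, ship fixed-size batches to a background thread for tokenization, and collect the token vectors in order. A minimal sketch of that pattern using only std channels is below; the names (tokenize_batch, tokenize_in_background) and the toy word-length "tokenizer" are illustrative assumptions, not the crate's actual API.

```rust
use std::sync::mpsc;
use std::thread;

// Toy stand-in for a real tokenizer: one fake id (the word's length) per word.
fn tokenize_batch(texts: &[String]) -> Vec<Vec<u32>> {
    texts
        .iter()
        .map(|t| t.split_whitespace().map(|w| w.len() as u32).collect())
        .collect()
}

// Send batches of entries to a background worker and collect the token
// vectors back in order. A single worker plus FIFO channels preserves order.
fn tokenize_in_background(entries: Vec<String>, batch_size: usize) -> Vec<Vec<u32>> {
    let (batch_tx, batch_rx) = mpsc::channel::<Vec<String>>();
    let (out_tx, out_rx) = mpsc::channel::<Vec<Vec<u32>>>();

    let worker = thread::spawn(move || {
        // Runs until the batch channel is closed (all senders dropped).
        for batch in batch_rx {
            out_tx.send(tokenize_batch(&batch)).unwrap();
        }
    });

    let mut n_batches = 0;
    for chunk in entries.chunks(batch_size) {
        batch_tx.send(chunk.to_vec()).unwrap();
        n_batches += 1;
    }
    drop(batch_tx); // close the channel so the worker loop exits

    let mut out = Vec::new();
    for _ in 0..n_batches {
        out.extend(out_rx.recv().unwrap());
    }
    worker.join().unwrap();
    out
}

fn main() {
    let entries = vec![
        "a bb".to_string(),
        "ccc dddd".to_string(),
        "e".to_string(),
    ];
    let ids = tokenize_in_background(entries, 2);
    println!("{:?}", ids);
}
```

Batching like this amortizes per-call overhead, which is the stated reason tokenization is deferred out of the iterator itself.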
Type Aliases
- BatchTokenizerFn: A batch tokenizer function that takes multiple message arrays and returns multiple token vectors. This is much faster than tokenizing one array at a time.
- TokenizerFn: A tokenizer function that takes messages and returns tokenized output, so that different implementations (tiktoken, transformers.js, etc.) can be passed in from the CLI or WASM interface. The tokenizer should apply the appropriate chat template.
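The two alias descriptions suggest function-trait-object shapes roughly like the sketch below. The Message fields, the boxed signatures, and the u32 token type are assumptions for illustration, not the crate's actual definitions.

```rust
// Hypothetical shapes implied by the alias descriptions; not the real API.

pub struct Message {
    pub role: String,
    pub content: String,
}

/// One message array in, one token vector out.
pub type TokenizerFn = Box<dyn Fn(&[Message]) -> Vec<u32> + Send>;

/// Many message arrays in, one token vector per array. Batching amortizes
/// per-call overhead, which is why it beats tokenizing one at a time.
pub type BatchTokenizerFn = Box<dyn Fn(&[Vec<Message>]) -> Vec<Vec<u32>> + Send>;

// Toy tokenizer for demonstration: one fake id (the word's length) per word.
pub fn toy_tokenize(messages: &[Message]) -> Vec<u32> {
    messages
        .iter()
        .flat_map(|m| m.content.split_whitespace().map(|w| w.len() as u32))
        .collect()
}

fn main() {
    let single: TokenizerFn = Box::new(|msgs| toy_tokenize(msgs));
    let batch: BatchTokenizerFn =
        Box::new(|arrays| arrays.iter().map(|msgs| toy_tokenize(msgs)).collect());

    let msgs = vec![Message {
        role: "user".to_string(),
        content: "hello world".to_string(),
    }];
    println!("{:?}", single(&msgs));
    println!("{:?}", batch(&[msgs]));
}
```

Boxed trait objects let callers swap in different backends (e.g. a tiktoken- or transformers.js-backed closure) behind one signature, matching the stated goal of accepting tokenizers from the CLI or WASM interface.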