# transcribe-cli
transcribe-cli is a Rust command-line transcription pipeline built on Whisper and CTranslate2.
It supports:
- CPU-optimized transcription
- optional NVIDIA CUDA execution
- automatic Whisper model download into `models/`
- local files or http/https audio URLs
- streaming transcription modes
- model cleanup commands
## Install
From crates.io:
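The command itself is missing here; a standard `cargo install` invocation (with `--locked`, as the Notes recommend) would be:

```shell
cargo install --locked transcribe-cli
```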
From a local checkout:
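The command is likewise not shown; for a local checkout, the usual form installs from the working directory:

```shell
# run from the repository root
cargo install --locked --path .
```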
With CUDA support, point the build at your CUDA Toolkit root via `CUDA_TOOLKIT_ROOT_DIR` (for example `/usr/local/cuda`):
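The full command is not shown in the source; combining the toolkit path with the `cuda` feature listed under Features, a sketch:

```shell
CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda \
  cargo install --locked --features cuda transcribe-cli
```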
With CUDA + cuDNN support, the same `CUDA_TOOLKIT_ROOT_DIR` is needed, and the cuDNN development files must be installed under that root (see Notes):
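Again a sketch, enabling `cudnn` on top of `cuda` per the feature list under Features:

```shell
CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda \
  cargo install --locked --features cuda,cudnn transcribe-cli
```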
## Usage
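The source does not document the command-line syntax, so the following is hypothetical: it assumes an audio source (local file or http/https URL, per the feature list above) is passed as a positional argument, and uses the `--models-dir` flag mentioned in the Notes:

```shell
# Hypothetical invocations: the positional-argument form is an assumption;
# only --models-dir is confirmed elsewhere in this README.
transcribe-cli recording.wav
transcribe-cli --models-dir ~/whisper-models https://example.com/talk.mp3
```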
## Features
- `cuda`: enable CUDA support with dynamic loading
- `cuda-static`: enable static CUDA support
- `cuda-dynamic-loading`: alias for the dynamic CUDA path
- `cudnn`: enable cuDNN on top of CUDA
## Notes
- Whisper models are downloaded automatically on first use.
- By default, models are stored in `models/` next to the executable unless `--models-dir` is set.
- Whisper decoding is handled in-project through a local wrapper around CTranslate2's `sys::Whisper` and Hugging Face `tokenizers`.
- `cuda` and `cudnn` build CTranslate2 from source through `ct2rs`, so the NVIDIA driver alone is not enough: a CUDA Toolkit installation is required. `cudnn` also requires the cuDNN development files.
- `ct2rs` looks for `cuda.h` under `$CUDA_TOOLKIT_ROOT_DIR/include` and for `cudnn.h` plus `libcudnn` under the same CUDA root.
- `--locked` is recommended for `cargo install` so published installs use the crate's resolved dependency set instead of newer patch releases.