Streaming (chunk-based) tokenizer.
StreamTokenizer processes input in arbitrary-sized chunks, carrying
state across chunk boundaries. This enables parsing large files or
network streams without loading the entire document into memory.
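The state-carrying pattern described above can be sketched as follows. This is an illustrative, self-contained example (the names `SketchTokenizer`, `feed`, and `finish` are assumptions, not this crate's actual API): a token that touches the end of a chunk is buffered and only emitted once a later chunk, or end of input, proves it complete.

```rust
// Minimal sketch of chunk-based tokenization with carried state.
// Splits on whitespace; a token spanning a chunk boundary is held
// in `partial` until the next chunk (or end of input) completes it.
struct SketchTokenizer {
    partial: String,
}

impl SketchTokenizer {
    fn new() -> Self {
        Self { partial: String::new() }
    }

    /// Feed one chunk of input; returns tokens completed within it.
    fn feed(&mut self, chunk: &str) -> Vec<String> {
        let mut tokens = Vec::new();
        for ch in chunk.chars() {
            if ch.is_whitespace() {
                if !self.partial.is_empty() {
                    // Whitespace terminates the buffered token.
                    tokens.push(std::mem::take(&mut self.partial));
                }
            } else {
                self.partial.push(ch);
            }
        }
        tokens
    }

    /// Signal end of input, flushing any trailing token.
    fn finish(&mut self) -> Option<String> {
        if self.partial.is_empty() {
            None
        } else {
            Some(std::mem::take(&mut self.partial))
        }
    }
}

fn main() {
    let mut tok = SketchTokenizer::new();
    // "hel" + "lo world": the first token spans the chunk boundary.
    let mut out = tok.feed("hel");
    out.extend(tok.feed("lo world"));
    out.extend(tok.finish());
    assert_eq!(out, vec!["hello", "world"]);
}
```

Because the only per-chunk state is the buffered partial token, the caller can feed chunks of any size (from a file reader or a network socket) and memory use stays proportional to the longest token, not the document.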
Structs
- StreamTokenizer — A streaming tokenizer that processes input chunk by chunk.