Tokenizers are in charge of chopping text into a stream of tokens ready for indexing. This is a separate crate from tantivy, so implementors don’t need to update for each new tantivy version.

To add support for a tokenizer, implement the Tokenizer trait. Check out the tantivy repo for some examples.
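As an illustration, here is a minimal whitespace tokenizer. To keep the sketch self-contained and compilable without the crate, simplified stand-ins for Token, TokenStream, and Tokenizer are inlined; the real trait definitions in this crate differ (notably, Token carries more fields and the traits have additional bounds), so treat the shapes below as assumptions, not the crate's exact API.

```rust
// Simplified stand-ins for the crate's types, inlined so this sketch
// compiles on its own. The real definitions in the crate differ.
#[derive(Clone, Debug, Default)]
struct Token {
    offset_from: usize, // byte offset of the token's start in the text
    offset_to: usize,   // byte offset one past the token's end
    position: usize,    // token index within the stream
    text: String,       // the token's text
}

trait TokenStream {
    fn advance(&mut self) -> bool;
    fn token(&self) -> &Token;
}

trait Tokenizer: Clone {
    type TokenStream<'a>: TokenStream
    where
        Self: 'a;
    fn token_stream<'a>(&'a mut self, text: &'a str) -> Self::TokenStream<'a>;
}

// A tokenizer that splits text on whitespace and records byte offsets.
#[derive(Clone)]
struct WhitespaceTokenizer;

struct WhitespaceTokenStream<'a> {
    text: &'a str,
    offset: usize, // byte offset where the next scan starts
    token: Token,
    position: usize,
}

impl Tokenizer for WhitespaceTokenizer {
    type TokenStream<'a> = WhitespaceTokenStream<'a> where Self: 'a;

    fn token_stream<'a>(&'a mut self, text: &'a str) -> WhitespaceTokenStream<'a> {
        WhitespaceTokenStream { text, offset: 0, token: Token::default(), position: 0 }
    }
}

impl<'a> TokenStream for WhitespaceTokenStream<'a> {
    fn advance(&mut self) -> bool {
        // Skip leading whitespace; if only whitespace remains, we are done.
        let rest = &self.text[self.offset..];
        let start = match rest.find(|c: char| !c.is_whitespace()) {
            Some(i) => self.offset + i,
            None => return false,
        };
        // The token runs until the next whitespace char or end of text.
        let end = self.text[start..]
            .find(char::is_whitespace)
            .map(|i| start + i)
            .unwrap_or(self.text.len());
        self.token = Token {
            offset_from: start,
            offset_to: end,
            position: self.position,
            text: self.text[start..end].to_string(),
        };
        self.position += 1;
        self.offset = end;
        true
    }

    fn token(&self) -> &Token {
        &self.token
    }
}

fn main() {
    let mut tokenizer = WhitespaceTokenizer;
    let mut stream = tokenizer.token_stream("hello tantivy world");
    while stream.advance() {
        println!("{:?}", stream.token());
    }
}
```

The stream pattern (advance, then inspect the current token) lets implementations reuse a single Token buffer instead of allocating one per token.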

Structs

BoxTokenFilter: Simple wrapper of Box<dyn TokenFilter + 'a>.
BoxTokenStream: Simple wrapper of Box<dyn TokenStream + 'a>.
Token

Traits

TokenFilter: Trait for the pluggable components of Tokenizers.
TokenStream: A TokenStream is the result of the tokenization.
Tokenizer: Tokenizers are in charge of splitting text into a stream of tokens before indexing.
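To show how a pluggable component in the spirit of TokenFilter composes with a token stream, here is a lowercasing filter that wraps any inner stream. As before, this is a self-contained sketch with simplified inlined stand-ins for Token and TokenStream (and a toy VecTokenStream source invented for the demo); the crate's actual TokenFilter trait and type signatures differ.

```rust
// Simplified stand-ins, inlined so the sketch compiles on its own.
// The real definitions in the crate differ.
#[derive(Clone, Debug, Default)]
struct Token {
    text: String,
}

trait TokenStream {
    fn advance(&mut self) -> bool;
    fn token(&self) -> &Token;
    fn token_mut(&mut self) -> &mut Token;
}

// A filter in the spirit of TokenFilter: wraps an inner stream and
// lowercases each token's text as it passes through.
struct LowerCaser<T: TokenStream> {
    inner: T,
}

impl<T: TokenStream> TokenStream for LowerCaser<T> {
    fn advance(&mut self) -> bool {
        if !self.inner.advance() {
            return false;
        }
        let lowered = self.inner.token().text.to_lowercase();
        self.inner.token_mut().text = lowered;
        true
    }
    fn token(&self) -> &Token {
        self.inner.token()
    }
    fn token_mut(&mut self) -> &mut Token {
        self.inner.token_mut()
    }
}

// A trivial source stream over a fixed list of tokens, for demonstration only.
struct VecTokenStream {
    tokens: Vec<Token>,
    pos: Option<usize>, // None = before the first token
}

impl TokenStream for VecTokenStream {
    fn advance(&mut self) -> bool {
        let next = self.pos.map_or(0, |p| p + 1);
        if next < self.tokens.len() {
            self.pos = Some(next);
            true
        } else {
            false
        }
    }
    fn token(&self) -> &Token {
        &self.tokens[self.pos.unwrap()]
    }
    fn token_mut(&mut self) -> &mut Token {
        &mut self.tokens[self.pos.unwrap()]
    }
}

fn main() {
    let tokens = vec!["Hello", "WORLD"]
        .into_iter()
        .map(|t| Token { text: t.to_string() })
        .collect();
    let mut stream = LowerCaser { inner: VecTokenStream { tokens, pos: None } };
    while stream.advance() {
        println!("{}", stream.token().text);
    }
}
```

Because the filter itself implements TokenStream, filters can be stacked in any order, which is what makes them pluggable.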