Module rust_tokenizers::preprocessing::tokenizer::base_tokenizer
Structs

BaseTokenizer
ConsolidatedTokenIterator
Offset | Offset information (in unicode points) to relate a token back to its original input string
Token | An owned token
TokenRef | A token that references the original text
TokenizedInput
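The `Offset` struct records positions in unicode code points rather than bytes, so a token can be related back to its original input string even for non-ASCII text. A minimal self-contained sketch of that idea (an illustrative mirror, not the crate's actual `Offset`/`TokenRef` definitions):

```rust
// Hypothetical mirror of the Offset idea: spans counted in unicode code
// points (chars), not bytes, so they remain meaningful for non-ASCII input.
#[derive(Debug, PartialEq)]
pub struct Offset {
    pub begin: u32,
    pub end: u32,
}

/// Split on whitespace, recording each token's char-offset span.
pub fn whitespace_tokens(text: &str) -> Vec<(&str, Offset)> {
    let mut tokens = Vec::new();
    let mut start: Option<u32> = None;
    let mut byte_start = 0usize;
    for (char_idx, (byte_idx, ch)) in text.char_indices().enumerate() {
        if ch.is_whitespace() {
            if let Some(begin) = start.take() {
                tokens.push((
                    &text[byte_start..byte_idx],
                    Offset { begin, end: char_idx as u32 },
                ));
            }
        } else if start.is_none() {
            start = Some(char_idx as u32);
            byte_start = byte_idx;
        }
    }
    if let Some(begin) = start {
        tokens.push((
            &text[byte_start..],
            Offset { begin, end: text.chars().count() as u32 },
        ));
    }
    tokens
}

fn main() {
    // 'é' is two bytes but one unicode point, so "café" spans points 6..10.
    let toks = whitespace_tokens("hello café");
    assert_eq!(toks[1].0, "café");
    assert_eq!(toks[1].1, Offset { begin: 6, end: 10 });
}
```

Counting in code points is what keeps an `Offset` stable regardless of how the underlying text is encoded; a byte-based span would shift as soon as a multi-byte character appears.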
Enums

Mask
TruncationStrategy
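`TruncationStrategy` controls how a (possibly paired) token sequence is cut down to a maximum length before encoding. The sketch below is a self-contained illustration of the usual variants (the names mirror this module, but the truncation logic here is an assumption, not the crate's implementation):

```rust
// Illustrative sketch only: variant names borrowed from this module,
// behavior is a plausible reading, not the crate's actual code.
#[derive(Clone, Copy)]
pub enum TruncationStrategy {
    LongestFirst,  // trim one token at a time from the longer sequence
    OnlyFirst,     // trim only the first sequence
    OnlySecond,    // trim only the second sequence
    DoNotTruncate, // leave both sequences untouched
}

pub fn truncate(
    mut a: Vec<u32>,
    mut b: Vec<u32>,
    max_len: usize,
    strategy: TruncationStrategy,
) -> (Vec<u32>, Vec<u32>) {
    while a.len() + b.len() > max_len {
        match strategy {
            TruncationStrategy::LongestFirst => {
                if a.len() >= b.len() { a.pop(); } else { b.pop(); }
            }
            TruncationStrategy::OnlyFirst => {
                if a.pop().is_none() { break; }
            }
            TruncationStrategy::OnlySecond => {
                if b.pop().is_none() { break; }
            }
            TruncationStrategy::DoNotTruncate => break,
        }
    }
    (a, b)
}

fn main() {
    // 7 tokens must fit in 5: LongestFirst removes two from the longer side.
    let (a, b) = truncate(vec![1, 2, 3, 4, 5], vec![6, 7], 5, TruncationStrategy::LongestFirst);
    assert_eq!((a.len(), b.len()), (3, 2));
}
```

`LongestFirst` is the usual default for sentence pairs because it degrades both sequences evenly instead of deleting one side entirely.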
Traits

ConsolidatableTokens
MultiThreadedTokenizer
TokenTrait
Tokenizer
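`MultiThreadedTokenizer` extends `Tokenizer` with parallel batch processing. A self-contained sketch of that trait relationship (names borrowed from this module; the real crate's method signatures differ, and it parallelizes with rayon rather than raw threads):

```rust
use std::thread;

// Sketch of the Tokenizer / MultiThreadedTokenizer relationship.
// Sync is required so &self can be shared across worker threads.
trait Tokenizer: Sync {
    fn tokenize(&self, text: &str) -> Vec<String>;

    // Default sequential batch method.
    fn tokenize_list(&self, texts: &[&str]) -> Vec<Vec<String>> {
        texts.iter().map(|t| self.tokenize(t)).collect()
    }
}

// Same batch API, fanned out over scoped threads (one per input here,
// purely for illustration).
trait MultiThreadedTokenizer: Tokenizer {
    fn tokenize_list_parallel(&self, texts: &[&str]) -> Vec<Vec<String>> {
        thread::scope(|s| {
            let handles: Vec<_> = texts
                .iter()
                .map(|t| s.spawn(move || self.tokenize(t)))
                .collect();
            handles.into_iter().map(|h| h.join().unwrap()).collect()
        })
    }
}

struct WhitespaceTokenizer;

impl Tokenizer for WhitespaceTokenizer {
    fn tokenize(&self, text: &str) -> Vec<String> {
        text.split_whitespace().map(str::to_owned).collect()
    }
}

// The parallel variant comes for free from the default method.
impl MultiThreadedTokenizer for WhitespaceTokenizer {}

fn main() {
    let tok = WhitespaceTokenizer;
    let out = tok.tokenize_list_parallel(&["a b", "c d e"]);
    assert_eq!(out, vec![vec!["a", "b"], vec!["c", "d", "e"]]);
}
```

Implementing the multi-threaded trait as a supertrait extension keeps the per-text `tokenize` logic in one place: any `Tokenizer` gains batch parallelism with an empty `impl`.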
Type Definitions

OffsetSize