Module tokenize

Tokenize and Detokenize API protocol types

These types mirror the SGLang Python implementation for compatibility. See: python/sglang/srt/entrypoints/openai/protocol.py

Structs

AddTokenizerRequest: Request schema for adding a tokenizer
AddTokenizerResponse: Response schema for adding a tokenizer (async)
DetokenizeRequest: Request schema for the /v1/detokenize endpoint
DetokenizeResponse: Response schema for the /v1/detokenize endpoint
ListTokenizersResponse: Response schema for listing tokenizers
RemoveTokenizerRequest: Request schema for removing a tokenizer
RemoveTokenizerResponse: Response schema for removing a tokenizer
TokenizeRequest: Request schema for the /v1/tokenize endpoint
TokenizeResponse: Response schema for the /v1/tokenize endpoint
TokenizerInfo: Information about a registered tokenizer
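To make the request/response pairing concrete, a hypothetical /v1/tokenize exchange might look like the following. Every field name and value here is an illustrative assumption chosen for the sketch, not taken from the actual schema:

```json
{
  "model": "example-model",
  "prompt": "Hello world"
}
```

with a response carrying the token IDs and their count:

```json
{
  "tokens": [101, 202],
  "count": 2
}
```

The /v1/detokenize endpoint would invert this mapping, accepting token IDs and returning text.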

Enums

CountResult: Count result, either single or batch
StringOrArray: String or array of strings (for flexible input)
TextResult: Text result, either single or batch
TokensInput: Token input, either a single sequence or a batch
TokensResult: Token IDs result, either single or batch
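The flexible-input enums above can be pictured as two-variant unions that normalize single and batch inputs into a common batch form. A minimal sketch, assuming variant names and helper methods that are not part of the actual crate:

```rust
// Hypothetical sketch of the flexible-input enums; only the type names
// come from the listing above, the variants and helpers are assumptions.

/// String or array of strings (for flexible input).
#[derive(Debug, Clone, PartialEq)]
enum StringOrArray {
    String(String),
    Array(Vec<String>),
}

impl StringOrArray {
    /// Normalize both forms into a batch of strings.
    fn into_vec(self) -> Vec<String> {
        match self {
            StringOrArray::String(s) => vec![s],
            StringOrArray::Array(v) => v,
        }
    }
}

/// Token input, either a single sequence or a batch of sequences.
#[derive(Debug, Clone, PartialEq)]
enum TokensInput {
    Single(Vec<u32>),
    Batch(Vec<Vec<u32>>),
}

impl TokensInput {
    /// Normalize both forms into a batch of token-ID sequences.
    fn into_batch(self) -> Vec<Vec<u32>> {
        match self {
            TokensInput::Single(ids) => vec![ids],
            TokensInput::Batch(b) => b,
        }
    }
}

fn main() {
    let single = StringOrArray::String("hello".to_string());
    assert_eq!(single.into_vec(), vec!["hello".to_string()]);

    let batch = TokensInput::Batch(vec![vec![1, 2], vec![3]]);
    assert_eq!(batch.into_batch().len(), 2);
    println!("ok");
}
```

Handlers can then treat every request as a batch and collapse the result back to a single value only when the input was single, which is what the paired `*Result` enums suggest.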