Tokenize and Detokenize API protocol types
These types mirror the SGLang Python implementation for compatibility. See: python/sglang/srt/entrypoints/openai/protocol.py
Structs
- AddTokenizerRequest - Request schema for adding a tokenizer
- AddTokenizerResponse - Response schema for adding a tokenizer (async)
- DetokenizeRequest - Request schema for the /v1/detokenize endpoint
- DetokenizeResponse - Response schema for the /v1/detokenize endpoint
- ListTokenizersResponse - Response schema for listing tokenizers
- RemoveTokenizerRequest - Request schema for removing a tokenizer
- RemoveTokenizerResponse - Response schema for removing a tokenizer
- TokenizeRequest - Request schema for the /v1/tokenize endpoint
- TokenizeResponse - Response schema for the /v1/tokenize endpoint
- TokenizerInfo - Information about a registered tokenizer
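To make the request/response pairing concrete, here is a minimal, dependency-free sketch of what the /v1/tokenize types might look like. The field names (model, prompt, add_special_tokens, tokens, count) and the toy whitespace tokenizer are assumptions for illustration only; the real schemas are defined in this crate and mirror the SGLang Python protocol file.

```rust
// Hypothetical sketch of the /v1/tokenize request/response pair.
// Field names are assumptions, not the crate's actual definitions.
#[derive(Debug, Clone, PartialEq)]
pub struct TokenizeRequest {
    pub model: String,            // assumed: which tokenizer/model to use
    pub prompt: String,           // assumed: single-text variant for simplicity
    pub add_special_tokens: bool, // assumed: whether to add BOS/EOS etc.
}

#[derive(Debug, Clone, PartialEq)]
pub struct TokenizeResponse {
    pub tokens: Vec<u32>, // assumed: token IDs for the prompt
    pub count: usize,     // assumed: number of tokens produced
}

/// Toy stand-in for a real tokenizer backend: one "token" per
/// whitespace-separated word, using the word length as a fake ID.
pub fn tokenize_stub(req: &TokenizeRequest) -> TokenizeResponse {
    let tokens: Vec<u32> = req
        .prompt
        .split_whitespace()
        .map(|w| w.len() as u32)
        .collect();
    let count = tokens.len();
    TokenizeResponse { tokens, count }
}

fn main() {
    let req = TokenizeRequest {
        model: "demo".to_string(),
        prompt: "hello tokenize world".to_string(),
        add_special_tokens: false,
    };
    let resp = tokenize_stub(&req);
    assert_eq!(resp.count, 3);
    println!("{:?}", resp.tokens); // prints [5, 8, 5]
}
```

In the actual crate these types would additionally derive serde's Serialize/Deserialize so they can round-trip through the HTTP endpoint's JSON bodies.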
Enums
- CountResult - Count result - either single or batch
- StringOrArray - String or array of strings (for flexible input)
- TextResult - Text result - either single or batch
- TokensInput - Token input - either single sequence or batch
- TokensResult - Token IDs result - either single or batch
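These enums exist so the endpoints can accept and return either a single item or a batch, mirroring the OpenAI-style "string or array" inputs. A minimal dependency-free sketch of the pattern (the variant names and the into_vec helper are assumptions; the crate's real enums would typically use serde's untagged representation for JSON):

```rust
// Hypothetical sketch of the single-vs-batch input pattern.
// Variant and method names are illustrative, not the crate's API.
#[derive(Debug, Clone, PartialEq)]
pub enum StringOrArray {
    String(String),      // a single text
    Array(Vec<String>),  // a batch of texts
}

impl StringOrArray {
    /// Normalize either variant into a batch, so downstream code
    /// only ever has to handle the Vec case.
    pub fn into_vec(self) -> Vec<String> {
        match self {
            StringOrArray::String(s) => vec![s],
            StringOrArray::Array(v) => v,
        }
    }
}

fn main() {
    let single = StringOrArray::String("hi".to_string());
    let batch = StringOrArray::Array(vec!["a".to_string(), "b".to_string()]);
    assert_eq!(single.into_vec(), vec!["hi".to_string()]);
    assert_eq!(batch.into_vec().len(), 2);
}
```

The Result-style enums (TokensResult, TextResult, CountResult) would follow the same shape on the response side, returning one value for a single input and a list for a batched one.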