
Function tokenize 

pub fn tokenize(source: &str) -> Vec<(Token, Span)>

Tokenize source code with location information

This function performs raw tokenization using the logos lexer, returning tokens paired with their source locations. This is the base tokenization step that converts source strings into token streams.

Pipelines and transformations should operate on the token stream this function produces rather than invoke it themselves. The caller (e.g., LexerRegistry implementations) is responsible for calling it and passing the result on to pipelines.