pub fn tokenize(input: &str) -> Result<Vec<Spanned>, LexError>
Tokenize source text into a sequence of spanned tokens.
This performs two passes:
- Raw tokenization via logos (skips whitespace within lines).
- Layout insertion (converts indentation to virtual tokens).
§Errors
Returns an error if the input contains an unrecognized token.
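The two-pass scheme above can be sketched as follows. This is an illustrative stand-alone sketch, not this crate's implementation: the `Tok` enum and `layout` function are hypothetical names, and the raw logos pass is replaced by a simple whitespace split so the example is self-contained.

```rust
// Illustrative sketch of the two-pass design: a raw per-line pass
// (stand-in for logos) followed by a layout pass that turns
// indentation changes into virtual Indent/Dedent tokens.
#[derive(Debug, PartialEq)]
enum Tok {
    Word(String),
    Indent,  // virtual: indentation increased
    Dedent,  // virtual: indentation decreased
    Newline, // virtual: end of a logical line
}

fn layout(input: &str) -> Vec<Tok> {
    let mut out = Vec::new();
    let mut stack = vec![0usize]; // open indentation levels, outermost first
    for line in input.lines() {
        if line.trim().is_empty() {
            continue; // blank lines carry no layout information
        }
        let indent = line.len() - line.trim_start().len();
        if indent > *stack.last().unwrap() {
            stack.push(indent);
            out.push(Tok::Indent);
        } else {
            while *stack.last().unwrap() > indent {
                stack.pop();
                out.push(Tok::Dedent);
            }
        }
        // Raw pass stand-in: split on whitespace within the line
        // (the real lexer uses logos here and records spans).
        for w in line.split_whitespace() {
            out.push(Tok::Word(w.to_string()));
        }
        out.push(Tok::Newline);
    }
    // Close any indentation levels still open at end of input.
    while stack.pop().is_some() && !stack.is_empty() {
        out.push(Tok::Dedent);
    }
    out
}

fn main() {
    println!("{:?}", layout("a\n  b\nc"));
}
```

For the input `"a\n  b\nc"`, the sketch emits an `Indent` before `b` and a matching `Dedent` before `c`, mirroring how the layout pass brackets indented regions with virtual tokens.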