
Function tokenize 

pub fn tokenize(input: &str) -> Result<Vec<Spanned>, LexError>

Tokenize source text into a sequence of spanned tokens.

This performs two passes:

  1. Raw tokenization via logos (skips whitespace within lines).
  2. Layout insertion (converts indentation to virtual tokens).
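The layout-insertion pass can be sketched as an indentation stack that emits virtual tokens when the leading-whitespace level changes. This is a minimal, self-contained illustration of the technique, not the crate's actual implementation; the `Tok`, `Indent`, and `Dedent` names are hypothetical.

```rust
// Hypothetical sketch of pass 2 (layout insertion): compare each line's
// indentation against a stack of open levels, emitting virtual
// Indent/Dedent tokens as blocks open and close.
#[derive(Debug, PartialEq)]
enum Tok {
    Word(String),
    Indent, // virtual: indentation increased
    Dedent, // virtual: indentation decreased
}

fn layout(lines: &[&str]) -> Vec<Tok> {
    let mut out = Vec::new();
    let mut stack = vec![0usize]; // open indentation levels; 0 is always open
    for line in lines {
        if line.trim().is_empty() {
            continue; // blank lines do not affect layout
        }
        let indent = line.len() - line.trim_start().len();
        if indent > *stack.last().unwrap() {
            stack.push(indent);
            out.push(Tok::Indent);
        } else {
            while indent < *stack.last().unwrap() {
                stack.pop();
                out.push(Tok::Dedent);
            }
        }
        for w in line.split_whitespace() {
            out.push(Tok::Word(w.to_string()));
        }
    }
    while stack.len() > 1 {
        stack.pop();
        out.push(Tok::Dedent); // close blocks still open at end of input
    }
    out
}

fn main() {
    // "y" is indented under "let x", then "z" returns to column 0.
    let toks = layout(&["let x", "  y", "z"]);
    println!("{:?}", toks);
}
```

A real implementation would also track source spans for each virtual token and reject inconsistent indentation (e.g. a dedent to a level never opened), which this sketch omits.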

§Errors

Returns an error if the input contains an unrecognized token.