pub fn lex(source: &str) -> Vec<Token>
Tokenizes `source` into a vector of span-based tokens.
Post-processes the Logos output:
- Coalesces consecutive lexer errors into single `Garbage` tokens
- Splits `StringLiteral` tokens into quote + content + quote
- Splits `RegexPredicateMatch`/`RegexPredicateNoMatch` into operator + whitespace + regex
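The first post-processing step can be sketched as follows. This is a minimal, self-contained illustration of coalescing consecutive error tokens, not the crate's actual code: the `Tok` enum and `coalesce_garbage` function are hypothetical stand-ins for the real `Token` type and lexer pipeline.

```rust
// Hypothetical simplified token kinds; the crate's real `Token` is span-based.
#[derive(Debug, Clone, PartialEq)]
enum Tok {
    Word(String),
    Garbage(String),
}

// Merge each run of consecutive Garbage tokens into a single Garbage token,
// mirroring the "coalesce consecutive lexer errors" step described above.
fn coalesce_garbage(tokens: Vec<Tok>) -> Vec<Tok> {
    let mut out: Vec<Tok> = Vec::new();
    for tok in tokens {
        if let Tok::Garbage(text) = &tok {
            // If the previous output token is also Garbage, extend it in place.
            if let Some(Tok::Garbage(acc)) = out.last_mut() {
                acc.push_str(text);
                continue;
            }
        }
        out.push(tok);
    }
    out
}

fn main() {
    let input = vec![
        Tok::Word("a".into()),
        Tok::Garbage("@".into()),
        Tok::Garbage("#".into()),
        Tok::Word("b".into()),
    ];
    let out = coalesce_garbage(input);
    assert_eq!(
        out,
        vec![
            Tok::Word("a".into()),
            Tok::Garbage("@#".into()),
            Tok::Word("b".into()),
        ]
    );
}
```

In the real lexer the merge would also extend the surviving token's span to cover the whole error run, so downstream diagnostics point at one contiguous region.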