Lexer (tokenizer) for the structprop format.
The lexer converts a raw &str into a flat sequence of [Token]s paired
with their 1-indexed source line numbers. Comments and insignificant
whitespace are stripped. The resulting token stream is consumed by
crate::parse().
§Token rules
| Input | Token produced |
|---|---|
| `=` | `Token::Eq` |
| `{` | `Token::Open` |
| `}` | `Token::Close` |
| `# … \n` | (discarded) |
| `"…"` | `Token::Term` with the quoted content |
| any other non-whitespace run | `Token::Term` |
| end of input | `Token::Eof` |
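The rules above can be sketched as a small hand-rolled lexer. This is an illustration, not the crate's implementation: the `Token` variants match the table, but the exact set of characters that terminate a bare term (`=`, `{`, `}`, `"`, `#`) is an assumption.

```rust
/// A single token, mirroring the variants in the table above.
#[derive(Debug, PartialEq)]
enum Token {
    Eq,
    Open,
    Close,
    Term(String),
    Eof,
}

/// Sketch of the tokenizer: returns tokens paired with 1-indexed line numbers.
fn lex(input: &str) -> Vec<(usize, Token)> {
    let mut tokens = Vec::new();
    let mut line = 1; // source lines are 1-indexed
    let mut chars = input.chars().peekable();
    while let Some(&c) = chars.peek() {
        match c {
            '\n' => { line += 1; chars.next(); }
            c if c.is_whitespace() => { chars.next(); }
            '=' => { tokens.push((line, Token::Eq)); chars.next(); }
            '{' => { tokens.push((line, Token::Open)); chars.next(); }
            '}' => { tokens.push((line, Token::Close)); chars.next(); }
            '#' => {
                // Comment: discard everything up to (not including) the newline.
                while let Some(&c) = chars.peek() {
                    if c == '\n' { break; }
                    chars.next();
                }
            }
            '"' => {
                // Quoted term: the content between the quotes, verbatim.
                let start = line;
                chars.next(); // opening quote
                let mut s = String::new();
                while let Some(&c) = chars.peek() {
                    if c == '"' { break; }
                    if c == '\n' { line += 1; }
                    s.push(c);
                    chars.next();
                }
                chars.next(); // closing quote
                tokens.push((start, Token::Term(s)));
            }
            _ => {
                // Bare term: a run of characters up to whitespace or a delimiter
                // (assumed delimiter set: = { } " #).
                let mut s = String::new();
                while let Some(&c) = chars.peek() {
                    if c.is_whitespace() || "={}\"#".contains(c) { break; }
                    s.push(c);
                    chars.next();
                }
                tokens.push((line, Token::Term(s)));
            }
        }
    }
    tokens.push((line, Token::Eof)); // always emitted at end of input
    tokens
}
```

For example, `lex("x = \"y\"\n# c\n{}")` yields `Term("x")`, `Eq`, and `Term("y")` on line 1, then `Open` and `Close` on line 3 (the comment line is discarded), followed by `Eof`.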
§Enums
- Token
- A single token produced by the structprop lexer.