Module lexer

Lexer (tokenizer) that converts raw structprop text into tokens.

The lexer converts a raw &str into a flat sequence of [Token]s paired with their 1-indexed source line numbers. Comments and insignificant whitespace are stripped. The resulting token stream is consumed by crate::parse().

§Token rules

Input                         Token produced
=                             Token::Eq
{                             Token::Open
}                             Token::Close
# … \n                        (discarded)
"…"                           Token::Term with the quoted content
any other non-whitespace run  Token::Term
end of input                  Token::Eof
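The rules above can be sketched as a small hand-rolled scanner. This is an illustrative sketch, not the crate's implementation: the Token variants mirror the documented enum, but the Term payload type (String here) and the exact delimiter handling are assumptions.

```rust
// Sketch of the structprop token rules; illustrative only.
#[derive(Debug, PartialEq)]
enum Token {
    Eq,
    Open,
    Close,
    Term(String),
    Eof,
}

// Returns tokens paired with 1-indexed source line numbers.
fn tokenize(input: &str) -> Vec<(Token, usize)> {
    let mut tokens = Vec::new();
    let mut line = 1usize;
    let mut chars = input.chars().peekable();
    while let Some(c) = chars.next() {
        match c {
            '\n' => line += 1,
            c if c.is_whitespace() => {} // insignificant whitespace is stripped
            '=' => tokens.push((Token::Eq, line)),
            '{' => tokens.push((Token::Open, line)),
            '}' => tokens.push((Token::Close, line)),
            '#' => {
                // Comment: discard up to (but not including) the newline,
                // so the main loop still counts the line break.
                while let Some(&c) = chars.peek() {
                    if c == '\n' { break; }
                    chars.next();
                }
            }
            '"' => {
                // Quoted term: everything up to the closing quote.
                let mut s = String::new();
                while let Some(c) = chars.next() {
                    if c == '"' { break; }
                    if c == '\n' { line += 1; }
                    s.push(c);
                }
                tokens.push((Token::Term(s), line));
            }
            c => {
                // Bare term: a run of non-whitespace, non-delimiter characters.
                let mut s = String::from(c);
                while let Some(&c) = chars.peek() {
                    if c.is_whitespace() || "={}#\"".contains(c) { break; }
                    s.push(chars.next().unwrap());
                }
                tokens.push((Token::Term(s), line));
            }
        }
    }
    tokens.push((Token::Eof, line));
    tokens
}

fn main() {
    for (tok, line) in tokenize("key = \"a b\" # note\n{ nested }") {
        println!("line {line}: {tok:?}");
    }
}
```

Note that the comment branch leaves the newline for the outer loop to consume, which keeps the line counter in one place.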

Enums§

Token
A single token produced by the structprop lexer.

Functions§

tokenize
Lex a structprop input string into a flat Vec of Tokens, each paired with its 1-indexed source line number.