Struct passerine::compiler::lex::Lexer
pub struct Lexer { /* fields omitted */ }
This represents a lexer object.
A lexer takes a source file and lexes it into tokens.
Note that this struct should not be controlled manually; use the `lex` function instead.
Implementations
Run the lexer, generating the entire token stream.
Helper function that returns the remaining source to be lexed as a `&str`.
Helper function that strips leading whitespace. Note that a newline is not leading whitespace; it's a separator token.
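The distinction above can be sketched in a few lines. This is a minimal illustration, not Passerine's actual implementation; `strip_whitespace` is a hypothetical name. The key point is that `'\n'` is excluded from the stripped set so it survives to be lexed as a separator token later.

```rust
/// Strip leading spaces and tabs, but deliberately leave newlines in
/// place so they can later be lexed as separator tokens.
/// (Hypothetical helper, not Passerine's actual API.)
fn strip_whitespace(source: &str) -> &str {
    // A closure implements `Pattern`, so we can trim any whitespace
    // character except '\n'.
    source.trim_start_matches(|c: char| c.is_whitespace() && c != '\n')
}
```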
Helper function that expects an exact literal.
Helper function that eats numeric digits, returning how many leading digits were consumed.
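Counting leading digits can be sketched as follows; `eat_digits` is a hypothetical name for illustration, not the crate's actual helper.

```rust
/// Count how many ASCII digits lead the remaining source.
/// (Hypothetical helper, not Passerine's actual API.)
fn eat_digits(source: &str) -> usize {
    source.chars().take_while(|c| c.is_ascii_digit()).count()
}
```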
Helper function that expects a literal, returning an error otherwise.
Parses a static token, like an operator or a keyword.
Parses a single-line comment, which ignores everything from `--` until the next newline.
Parses a nestable multi-line comment, which begins with `-{` and ends with `}-`.
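Nestable block comments are usually consumed with a depth counter: increment on each `-{`, decrement on each `}-`, and stop when the depth returns to zero. A minimal sketch under those assumptions (`skip_block_comment` is a hypothetical helper, not Passerine's actual API):

```rust
/// Consume a nestable `-{ ... }-` comment via a depth counter.
/// Returns the source after the comment, or `None` if the comment is
/// missing or unterminated. (Hypothetical helper, not Passerine's API.)
fn skip_block_comment(mut source: &str) -> Option<&str> {
    if !source.starts_with("-{") {
        return None;
    }
    let mut depth = 0usize;
    loop {
        if source.starts_with("-{") {
            depth += 1;
            source = &source[2..];
        } else if source.starts_with("}-") {
            depth -= 1;
            source = &source[2..];
            if depth == 0 {
                return Some(source);
            }
        } else {
            // Advance by one char (not byte) to stay on a UTF-8 boundary;
            // `None` here means we hit end-of-input mid-comment.
            let mut chars = source.chars();
            chars.next()?;
            source = chars.as_str();
        }
    }
}
```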
Classifies a symbol or a label.
A series of alphanumerics and certain ASCII punctuation (see `Lexer::is_alpha`).
Cannot start with a numeric character.
Classifies a symbol (i.e. variable name).
Classifies a label (i.e. data wrapper). Must start with an uppercase character.
Classifies a pseudokeyword, used in syntax macros.
Must start with a single quote `'`.
Matches a number with a decimal point.
Matches a string, converting escapes.
Matches a separator. Note that separators are special, as they're mostly ignored. They're used to denote lines in function blocks. A separator is either a newline or a semicolon. Separators are grouped, so something like `;\n` is only one separator. Although the parser makes no assumptions, there should be at most one separator between any two non-separator tokens.
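The grouping rule above can be sketched as follows. The token type and helper are hypothetical (Passerine's actual token enum is richer); the point is that a run of `;` and `\n` collapses into a single separator.

```rust
/// A hypothetical, stripped-down token type for illustration only.
#[derive(Debug, PartialEq)]
enum Tok {
    Sep,
    Word(String),
}

/// Lex words and separators, collapsing runs of ';' and '\n' into one
/// `Tok::Sep`. (Sketch, not Passerine's actual lexer.)
fn lex_seps(source: &str) -> Vec<Tok> {
    let mut tokens: Vec<Tok> = Vec::new();
    let mut word = String::new();
    for c in source.chars() {
        if c == ';' || c == '\n' {
            if !word.is_empty() {
                tokens.push(Tok::Word(std::mem::take(&mut word)));
            }
            // Group runs: only emit a separator if the last token isn't one.
            if tokens.last() != Some(&Tok::Sep) {
                tokens.push(Tok::Sep);
            }
        } else if !c.is_whitespace() {
            word.push(c);
        } else if !word.is_empty() {
            tokens.push(Tok::Word(std::mem::take(&mut word)));
        }
    }
    if !word.is_empty() {
        tokens.push(Tok::Word(word));
    }
    tokens
}
```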