Crate lexington

Modules

util

Structs

Any
A matcher which matches any item from a fixed list of items.
Eof
A scanner which matches the end-of-input and injects a token to represent this.
Lexer
The top-level lexer, which applies a Scanner to an input sequence to produce Tokens.
Many
A matcher which matches one or more occurrences of a given item.
Match
A scanner which matches a single item with a given token. This is one of the fundamental building blocks for most scanners.
OneOrMore
A matcher which matches one or more occurrences of a given item.
Or
A Matcher which combines two Matchers together, such that it matches if either matches.
ShiftApplyRule
ShiftCloseRule
The given token indicates the end of a compound element. As such, the current stack element is popped off the stack and either reduced into the element below or returned. For example, when parsing an S-expression (i.e. Lisp), the token ) signifies the end of a list.
ShiftFirstRule
ShiftOpenRule
The given token indicates the start of a new compound element. As such, a new default element is pushed onto the stack. For example, when parsing an S-expression (i.e. Lisp), upon encountering a ( we start a new empty list (see the shift-rule sketch below).
ShiftReduceParser
A parser which processes input one token at a time (currently without backtracking), transforming tokens into terms using shift rules which operate over a stack of (partially complete) terms.
ShiftSkipRule
The given token should be ignored, for example because it represents whitespace or a comment.
ShiftTerminalRule
Handle tokens directly using a given parser. For example, a number 123 can be parsed directly using parse().
ShiftUpdateAsRule
ShiftUpdateRule
ShiftUpdateWithRule
Then
Token
A token comprises a label and an identifying region within the original sequence.
TokenStr
A token comprises a label and an identifying region within the original sequence.
Within
A matcher which matches any item within a given range.
ZeroOrMore
A matcher which matches zero or more occurrences of a given item (see the matcher sketch below).
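
The matcher structs above (Any, Or, Within, Many, OneOrMore, ZeroOrMore) form a small combinator language for describing patterns over an item stream. The following sketch does not use the crate's actual types or signatures; it is a minimal, self-contained illustration of how such combinators might compose, assuming a hypothetical Matcher trait whose matches method reports how many leading items a pattern consumes.

```rust
/// Hypothetical matcher interface (an illustration only, not the crate's
/// actual `Matcher` trait): report how many leading items of `input` the
/// pattern consumes, or `None` if it does not match.
trait Matcher {
    fn matches(&self, input: &[char]) -> Option<usize>;
}

/// Matches any single item within a given range (cf. `Within`).
struct Within(std::ops::RangeInclusive<char>);

impl Matcher for Within {
    fn matches(&self, input: &[char]) -> Option<usize> {
        match input.first() {
            Some(c) if self.0.contains(c) => Some(1),
            _ => None,
        }
    }
}

/// Matches one or more occurrences of an inner matcher (cf. `OneOrMore`).
struct OneOrMore<M>(M);

impl<M: Matcher> Matcher for OneOrMore<M> {
    fn matches(&self, input: &[char]) -> Option<usize> {
        let mut n = self.0.matches(input)?;
        while let Some(k) = self.0.matches(&input[n..]) {
            if k == 0 {
                break; // avoid looping on matchers that consume nothing
            }
            n += k;
        }
        Some(n)
    }
}

/// Matches if either of two matchers matches (cf. `Or`).
struct Or<A, B>(A, B);

impl<A: Matcher, B: Matcher> Matcher for Or<A, B> {
    fn matches(&self, input: &[char]) -> Option<usize> {
        self.0.matches(input).or_else(|| self.1.matches(input))
    }
}

fn main() {
    // A number is one or more digits; otherwise accept a single lowercase letter.
    let pattern = Or(OneOrMore(Within('0'..='9')), Within('a'..='z'));
    let input: Vec<char> = "123)".chars().collect();
    assert_eq!(pattern.matches(&input), Some(3)); // consumes "123", stops at ')'
}
```

Composite matchers here simply own their children, so larger patterns are built by nesting plain values; the crate's combinators of the same names presumably play an analogous role.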
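
The Shift*Rule structs above describe how individual tokens drive a stack of partially complete terms. The following standalone shift-rule sketch walks through that idea for S-expressions: ( opens a new list (cf. ShiftOpenRule), ) pops the current list and reduces it into the one below (cf. ShiftCloseRule), whitespace is skipped (cf. ShiftSkipRule), and anything else is handled directly as an atom (cf. ShiftTerminalRule). The SExpr type and parse_sexpr function are hypothetical and are not part of the crate's API.

```rust
/// A term being built up by the parser (hypothetical, not a crate type).
#[derive(Debug, PartialEq)]
enum SExpr {
    Atom(String),
    List(Vec<SExpr>),
}

/// A minimal shift-reduce loop over a token stream. Each token either
/// opens a new compound element, closes (and reduces) the current one,
/// is skipped, or is handled directly as a terminal.
fn parse_sexpr(tokens: &[&str]) -> Result<SExpr, String> {
    // Stack of partially complete lists; the bottom element collects results.
    let mut stack: Vec<Vec<SExpr>> = vec![Vec::new()];
    for &tok in tokens {
        match tok {
            // Open: push a new empty list (cf. ShiftOpenRule).
            "(" => stack.push(Vec::new()),
            // Close: pop the current list and reduce it into the one below
            // (cf. ShiftCloseRule).
            ")" => {
                let done = stack.pop().ok_or("unbalanced ')'")?;
                let below = stack.last_mut().ok_or("unbalanced ')'")?;
                below.push(SExpr::List(done));
            }
            // Skip whitespace tokens (cf. ShiftSkipRule).
            t if t.trim().is_empty() => {}
            // Terminal: handle the token directly as an atom (cf. ShiftTerminalRule).
            t => stack
                .last_mut()
                .ok_or("empty stack")?
                .push(SExpr::Atom(t.to_string())),
        }
    }
    // Exactly one element should remain at the bottom of the stack.
    match stack.pop() {
        Some(mut top) if stack.is_empty() && top.len() == 1 => Ok(top.remove(0)),
        _ => Err("malformed input".to_string()),
    }
}

fn main() {
    let tokens = ["(", "add", " ", "1", "2", ")"];
    let parsed = parse_sexpr(&tokens).unwrap();
    assert_eq!(
        parsed,
        SExpr::List(vec![
            SExpr::Atom("add".into()),
            SExpr::Atom("1".into()),
            SExpr::Atom("2".into()),
        ])
    );
}
```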

Traits

Matcher
Responsible for matching a certain pattern against a data stream (e.g. a character stream). This can be used, for example, for lexing an input stream into tokens.
Scanner
Responsible for converting an input sequence (typically from a string slice) into a token. A simple example which might form part of an S-expression parser is sketched after this listing.
ShiftReduceRule
A shift-reduce parser processes input one token at a time (currently without backtracking). Tokens are transformed into terms using two rules: shift and reduce. These operate over a stack of (partially complete) terms.
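
As a stand-in for the S-expression example promised in the Scanner summary above, here is a standalone sketch of the idea, assuming hypothetical types rather than the crate's Scanner, Match, Eof or Token items. A scanner inspects the input at the current position and, if it recognises its pattern, produces a token labelling that region; an end-of-input scanner injects a token representing the end of the sequence. All names below (Kind, Token, Scanner, MatchChar, AtEof) are hypothetical.

```rust
/// Token labels for a tiny S-expression lexer (hypothetical, not the
/// crate's actual token representation).
#[derive(Debug, Clone, Copy, PartialEq)]
enum Kind {
    LeftBrace,
    RightBrace,
    EndOfInput,
}

/// A token pairs a label with the region of input it identifies,
/// recorded here as a byte offset and length.
#[derive(Debug, PartialEq)]
struct Token {
    kind: Kind,
    start: usize,
    len: usize,
}

/// Hypothetical scanner interface: try to produce a token for the input
/// starting at `pos`.
trait Scanner {
    fn scan(&self, input: &str, pos: usize) -> Option<Token>;
}

/// Scans a single expected character into a token of a given kind
/// (conceptually similar to `Match`).
struct MatchChar(char, Kind);

impl Scanner for MatchChar {
    fn scan(&self, input: &str, pos: usize) -> Option<Token> {
        if input[pos..].starts_with(self.0) {
            Some(Token { kind: self.1, start: pos, len: self.0.len_utf8() })
        } else {
            None
        }
    }
}

/// Injects a token representing the end of input (conceptually similar to `Eof`).
struct AtEof;

impl Scanner for AtEof {
    fn scan(&self, input: &str, pos: usize) -> Option<Token> {
        (pos == input.len()).then(|| Token { kind: Kind::EndOfInput, start: pos, len: 0 })
    }
}

fn main() {
    let input = "()";
    let scanners: Vec<Box<dyn Scanner>> = vec![
        Box::new(MatchChar('(', Kind::LeftBrace)),
        Box::new(MatchChar(')', Kind::RightBrace)),
        Box::new(AtEof),
    ];
    // Repeatedly apply the first scanner which matches at the current position.
    let (mut pos, mut tokens) = (0, Vec::new());
    loop {
        let tok = scanners
            .iter()
            .find_map(|s| s.scan(input, pos))
            .expect("no scanner matched");
        let done = tok.kind == Kind::EndOfInput;
        pos += tok.len;
        tokens.push(tok);
        if done {
            break;
        }
    }
    assert_eq!(tokens.len(), 3); // '(' then ')' then end-of-input
    assert_eq!(tokens[1].start, 1); // ')' identifies the region starting at offset 1
}
```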