Struct rcalc::Lexer
pub struct Lexer<'a> { /* fields omitted */ }
A lexer (tokenizer) for a mathematical expression.
The Lexer programmatically tokenizes the bytes of a program one Token at a time, with safety checks to signal when the program has concluded (EOF). Each Token is of little use on its own until further processed; for instance, by the Parser.
As an example, an abstract representation of a program's tokenization is shown:
in:  123 + 281 ./ 3
     ^~~ ^ ^~~ ^~ ^

out (over 5 iterations):
  1. INTEGER(123)
  2. PLUS
  3. INTEGER(281)
  4. DIVIDEINT
  5. INTEGER(3)
We expect the Lexer to be agnostic to the significance and relationships of the bytes in a program, concerned only with grouping them into typed Tokens. As such, the Lexer refrains from analyzing anything beyond the values of the bytes themselves; interpreting the Tokens is left to the Parser.
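The driving loop for such a lexer can be sketched as below. This is a self-contained toy, not rcalc's implementation: the `Token` variants, the `Eof` signaling, and the `String` error type here are stand-ins (rcalc's `next_token` returns `Result<Token, ProgramError>`, and its EOF handling may differ).

```rust
// Minimal stand-in types; rcalc's real Lexer and Token differ in detail.
#[derive(Debug, PartialEq)]
enum Token {
    Integer(i64),
    Plus,
    Eof,
}

struct Lexer<'a> {
    rest: &'a str,
}

impl<'a> Lexer<'a> {
    fn from(text: &'a str) -> Lexer<'a> {
        Lexer { rest: text }
    }

    // Tokenize the next one or more characters; Eof once input is exhausted.
    fn next_token(&mut self) -> Result<Token, String> {
        self.rest = self.rest.trim_start();
        match self.rest.chars().next() {
            None => Ok(Token::Eof),
            Some('+') => {
                self.rest = &self.rest[1..];
                Ok(Token::Plus)
            }
            Some(c) if c.is_ascii_digit() => {
                let end = self
                    .rest
                    .find(|c: char| !c.is_ascii_digit())
                    .unwrap_or(self.rest.len());
                let (num, rest) = self.rest.split_at(end);
                self.rest = rest;
                Ok(Token::Integer(num.parse().unwrap()))
            }
            Some(c) => Err(format!("unexpected byte: {}", c)),
        }
    }
}

fn main() {
    let mut lexer = Lexer::from("123 + 281");
    let mut tokens = Vec::new();
    // Drain the lexer until it reports EOF.
    loop {
        match lexer.next_token() {
            Ok(Token::Eof) => break,
            Ok(tok) => tokens.push(tok),
            Err(e) => panic!("lex error: {}", e),
        }
    }
    assert_eq!(
        tokens,
        vec![Token::Integer(123), Token::Plus, Token::Integer(281)]
    );
    println!("{:?}", tokens);
}
```

Note that the loop never inspects how the Tokens relate to one another; it only classifies byte values, matching the division of labor described above.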
Methods
impl<'a> Lexer<'a>
fn from(text: &str) -> Lexer
Creates a Lexer from a text program.
fn skip(&mut self, amt: usize) -> &mut Lexer<'a>
Skips amt Tokens and returns the current Lexer. Primarily intended for use in method chaining:

let mut lexer = Lexer::from("2 + 3");
let result = lexer.skip(1).next_token().unwrap(); // => Token::PLUS
fn next_token(&mut self) -> Result<Token, ProgramError>
Tokenizes the next one or more characters in the program. If no valid Token can be made, an Err(ProgramError) is returned.

let mut lexer = Lexer::from("2 + 3");
let result = lexer.next_token().unwrap(); // => Token::NUMBER(2)