pub struct LexerBuilder<'r, 't, T: 't> { /* private fields */ }

Builder struct for Lexer.

Implementations

new: Create a new LexerBuilder.

token: Add a new token that matches the regular expression re. This uses the same syntax as the regex crate.

If re gives the longest match, then f is called on the matched string.

  • If f returns Some(tok), emit the token tok.
  • Otherwise, skip this token and emit nothing.
#[derive(Debug, PartialEq, Eq)]
enum Token {
    Num(usize),
    // ...
}

let lexer = regex_lexer::LexerBuilder::new()
    .token(r"[0-9]*", |num| Some(Token::Num(num.parse().unwrap())))
    .token(r"\s+", |_| None) // skip whitespace
    // ...
    .build()?;

assert_eq!(
    lexer.tokens("1 2 3").collect::<Vec<_>>(),
    vec![Token::Num(1), Token::Num(2), Token::Num(3)],
);

If multiple regexes all have the same longest match, then whichever is defined last is given priority.

#[derive(Debug, PartialEq, Eq)]
enum Token<'t> {
    Ident(&'t str),
    Then,
    // ...
}

let lexer = regex_lexer::LexerBuilder::new()
    .token(r"[a-zA-Z_][a-zA-Z0-9_]*", |id| Some(Token::Ident(id)))
    .token(r"then", |_| Some(Token::Then))
    // ...
    .build()?;

assert_eq!(lexer.tokens("then").next(), Some(Token::Then));
assert_eq!(lexer.tokens("then_perish").next(), Some(Token::Ident("then_perish")));

build: Construct a Lexer which matches these tokens.

Errors

If a regex cannot be compiled, a regex::Error is returned.
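
For example, a sketch of the failure path (the unbalanced pattern and the Token::LParen variant below are purely illustrative): an invalid regular expression causes build to return an Err instead of a Lexer.

enum Token {
    LParen,
}

// `"("` is not a valid regular expression (unclosed group), so the pattern
// fails to compile and `build` returns the underlying `regex::Error`.
let result = regex_lexer::LexerBuilder::new()
    .token(r"(", |_| Some(Token::LParen))
    .build();
assert!(result.is_err());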

Trait Implementations

Debug: Shows the matched regexes.

Default: Returns the “default value” for a type.

Auto Trait Implementations

Blanket Implementations

Any::type_id: Gets the TypeId of self.

Borrow::borrow: Immutably borrows from an owned value.

BorrowMut::borrow_mut: Mutably borrows from an owned value.

From::from: Returns the argument unchanged.

Into::into: Calls U::from(self). That is, this conversion is whatever the implementation of From<T> for U chooses to do.

TryFrom::Error: The type returned in the event of a conversion error.

TryFrom::try_from: Performs the conversion.

TryInto::Error: The type returned in the event of a conversion error.

TryInto::try_into: Performs the conversion.