pub enum TokenRule {
    IgnoreLiteral(String),
    IgnoreRegExp(String),
    Literal(String, String),
    RegExp(String, String),
}
Definition of a token rule used by the tokenizer.
On each tokenization step, the tokenizer tries to match the input against the token rules; the longest token matched becomes the next token produced.
If no rule matches, one character is skipped and added to the list of unmatchable tokens.
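A rough, hand-rolled sketch of the matching loop described above is shown below. It handles only Literal rules (the regular-expression variants would need a regex engine), it is not the crate's own tokenizer, and it assumes that the first String of Literal is the token name and the second is the text to match, which this page does not state.

fn longest_literal_match<'a>(input: &str, rules: &'a [TokenRule]) -> Option<(&'a str, usize)> {
    // Keep the longest literal that matches at the start of the input.
    let mut best: Option<(&'a str, usize)> = None;
    for rule in rules {
        // Field order (name, text) is an assumption made for this sketch.
        if let TokenRule::Literal(name, text) = rule {
            if input.starts_with(text.as_str())
                && best.map_or(true, |(_, len)| text.len() > len)
            {
                best = Some((name.as_str(), text.len()));
            }
        }
    }
    best
}

fn main() {
    let rules = vec![
        TokenRule::Literal("EQ".to_string(), "=".to_string()),
        TokenRule::Literal("EQEQ".to_string(), "==".to_string()),
    ];
    // "==" is preferred over "=" because the longest match wins.
    assert_eq!(longest_literal_match("==1", &rules), Some(("EQEQ", 2)));
}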
Variants
IgnoreLiteral(String)
Ignore rule matching a literal
IgnoreRegExp(String)
Ignore rule matching a regular expression
Literal(String, String)
Token rule matching a literal
RegExp(String, String)
Token rule matching a regular expression
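For orientation, the four variants above might be combined into a rule set as follows. The field order in Literal and RegExp (token name first, then literal text or pattern) is assumed for illustration and is not stated on this page; check the tokenizer's constructor for the actual convention.

fn main() {
    // Hypothetical rule set mixing ignore rules and token rules.
    let rules = vec![
        TokenRule::IgnoreRegExp(r"\s+".to_string()),                    // drop whitespace
        TokenRule::IgnoreLiteral(";".to_string()),                      // drop a literal separator
        TokenRule::Literal("PLUS".to_string(), "+".to_string()),        // token name assumed first
        TokenRule::RegExp("NUMBER".to_string(), r"[0-9]+".to_string()), // pattern assumed second
    ];
    println!("{} rules defined", rules.len());
}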
Trait Implementations
Auto Trait Implementations
impl Freeze for TokenRule
impl RefUnwindSafe for TokenRule
impl Send for TokenRule
impl Sync for TokenRule
impl Unpin for TokenRule
impl UnwindSafe for TokenRule
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.