Trait lrlex::LexerDef

pub trait LexerDef<StorageT> {
    fn from_str(s: &str) -> LexBuildResult<Self>
    where
        Self: Sized;
    fn get_rule(&self, idx: usize) -> Option<&Rule<StorageT>>;
    fn get_rule_by_id(&self, tok_id: StorageT) -> &Rule<StorageT>;
    fn get_rule_by_name(&self, n: &str) -> Option<&Rule<StorageT>>;
    fn set_rule_ids<'a>(
        &'a mut self,
        rule_ids_map: &HashMap<&'a str, StorageT>,
    ) -> (Option<HashSet<&'a str>>, Option<HashSet<&'a str>>);
    fn iter_rules(&self) -> Iter<'_, Rule<StorageT>>;
}

Methods which all lexer definitions must implement.

Required methods

fn from_str(s: &str) -> LexBuildResult<Self> where
    Self: Sized

Instantiate a lexer from a string (e.g. representing a .l file).
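As an illustrative sketch only (this is not lrlex's actual parser), a `from_str`-style constructor might split a Lex-like input into per-rule regex/name pairs. The `RuleSketch` type, the `from_str_sketch` function, and the simplified `<regex> <NAME>` line format below are all hypothetical stand-ins for lrlex's `Rule` and `.l` grammar:

```rust
// Hypothetical stand-in for lrlex's Rule type.
#[derive(Debug)]
pub struct RuleSketch {
    pub re_str: String, // the rule's regex, as written in the input
    pub name: String,   // the token name the rule produces
}

// Parse lines of the form "<regex> <NAME>" appearing after a "%%"
// separator, mimicking the overall shape (not the grammar) of a .l file.
pub fn from_str_sketch(s: &str) -> Result<Vec<RuleSketch>, String> {
    let mut in_rules = false;
    let mut rules = Vec::new();
    for line in s.lines() {
        let line = line.trim();
        if line == "%%" {
            in_rules = true;
        } else if in_rules && !line.is_empty() {
            // Split at the last space so the regex itself may contain spaces.
            let (re_str, name) = line
                .rsplit_once(' ')
                .ok_or_else(|| format!("malformed rule: {line}"))?;
            rules.push(RuleSketch {
                re_str: re_str.trim().to_owned(),
                name: name.to_owned(),
            });
        }
    }
    Ok(rules)
}
```

In lrlex itself the real entry point is this trait method on an implementor such as `LRNonStreamingLexerDef`, which performs full `.l`-file parsing and returns a `LexBuildResult`.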

fn get_rule(&self, idx: usize) -> Option<&Rule<StorageT>>

Get the Rule at index idx.

fn get_rule_by_id(&self, tok_id: StorageT) -> &Rule<StorageT>

Get the Rule instance associated with a particular lexeme ID. Panics if no such rule exists.

fn get_rule_by_name(&self, n: &str) -> Option<&Rule<StorageT>>

Get the Rule instance associated with a particular name.

fn set_rule_ids<'a>(
    &'a mut self,
    rule_ids_map: &HashMap<&'a str, StorageT>
) -> (Option<HashSet<&'a str>>, Option<HashSet<&'a str>>)

Set the id attribute on rules to the corresponding value in rule_ids_map. This is typically used to synchronise a parser's notion of lexeme IDs with the lexer's. While doing this, it keeps track of which lexemes:

  1. are defined in the lexer but not referenced by the parser;
  2. are referenced by the parser but not defined in the lexer;

and returns them as a tuple (Option<HashSet<&str>>, Option<HashSet<&str>>) in the order (defined_in_lexer_missing_from_parser, referenced_in_parser_missing_from_lexer). Since in most cases both sets are expected to be empty, None is returned instead of an empty set to avoid a HashSet allocation.

Lexing and parsing can continue even if either set is non-empty, so it is up to the caller to decide what action to take if either returned set is non-empty. A non-empty set #1 is often benign: some lexers deliberately define tokens which are not used (e.g. reserving future keywords). A non-empty set #2 is more likely to be an error, since it means there are parts of the grammar where nothing the user inputs will be parseable.
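The bookkeeping behind the returned tuple can be sketched with plain std collections. The `missing_sets` helper below is illustrative only (it computes the two difference sets from a name set and an ID map, rather than mutating rules as `set_rule_ids` does):

```rust
use std::collections::{HashMap, HashSet};

// Given the names the lexer defines and the parser's name -> ID map,
// compute (defined_in_lexer_missing_from_parser,
//          referenced_in_parser_missing_from_lexer), returning None
// instead of allocating an empty HashSet in the common case.
pub fn missing_sets<'a>(
    lexer_names: &HashSet<&'a str>,
    rule_ids_map: &HashMap<&'a str, u32>,
) -> (Option<HashSet<&'a str>>, Option<HashSet<&'a str>>) {
    // Set #1: defined in the lexer, never referenced by the parser.
    let missing_from_parser: HashSet<&str> = lexer_names
        .iter()
        .filter(|n| !rule_ids_map.contains_key(*n))
        .copied()
        .collect();
    // Set #2: referenced by the parser, never defined in the lexer.
    let missing_from_lexer: HashSet<&str> = rule_ids_map
        .keys()
        .filter(|n| !lexer_names.contains(*n))
        .copied()
        .collect();
    let wrap = |s: HashSet<&'a str>| if s.is_empty() { None } else { Some(s) };
    (wrap(missing_from_parser), wrap(missing_from_lexer))
}
```

A lexer defining {INT, ID, RESERVED} checked against a parser map over {INT, ID, FLOAT} would yield `(Some({"RESERVED"}), Some({"FLOAT"}))`; when the two name sets coincide, the result is `(None, None)`.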

fn iter_rules(&self) -> Iter<'_, Rule<StorageT>>

Returns an iterator over all rules in this AST.


Implementors

impl<StorageT: Copy + Eq + Hash + PrimInt + TryFrom<usize> + Unsigned> LexerDef<StorageT> for LRNonStreamingLexerDef<StorageT>
