Struct lrlex::LexerBuilder

pub struct LexerBuilder<StorageT = u32> { /* fields omitted */ }

A LexerBuilder allows one to specify the criteria for building a statically generated lexer.

Methods

impl<StorageT> LexerBuilder<StorageT> where
    StorageT: Copy + Debug + Eq + Hash + PrimInt + TryFrom<usize> + TypeName + Unsigned

pub fn new() -> Self

Create a new LexerBuilder.

StorageT must be an unsigned integer type (e.g. u8, u16) which is big enough to index all the tokens and rules in the lexer and less than or equal in size to usize (e.g. on a 64-bit machine u128 would be too big). If you are lexing large files, the additional storage requirements of larger integer types can be noticeable, and in such cases it can be worth specifying a smaller type. StorageT defaults to u32 if unspecified.

Examples

LexerBuilder::<u8>::new()
    .process_file_in_src("grm.l")
    .unwrap();

pub fn rule_ids_map(self, rule_ids_map: HashMap<String, StorageT>) -> Self

Set this lexer builder's map of rule IDs to rule_ids_map. By default, lexing rules have arbitrary, but distinct, IDs. Setting the map of rule IDs (from rule names to StorageT) allows users to synchronise a lexer and parser and to check that all rules are used by both parts.
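As a sketch (the token names and IDs below are purely illustrative, not part of lrlex's API), a caller might pin the lexer's rule IDs to IDs obtained elsewhere, e.g. from a parser generator:

```rust
use std::collections::HashMap;

use lrlex::LexerBuilder;

// Hypothetical token IDs, e.g. as reported by a parser generator; the
// names and values here are illustrative.
let mut rule_ids: HashMap<String, u8> = HashMap::new();
rule_ids.insert("INT".to_owned(), 0);
rule_ids.insert("PLUS".to_owned(), 1);

// Force the lexer's rule IDs to match, so lexer and parser agree.
LexerBuilder::<u8>::new()
    .rule_ids_map(rule_ids)
    .process_file_in_src("grm.l")
    .unwrap();
```

The returned pair of optional HashSets (from the process_file_* methods) then reports any names missing from either side.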

pub fn process_file_in_src(
    self,
    srcp: &str
) -> Result<(Option<HashSet<String>>, Option<HashSet<String>>), Box<dyn Error>>

Given the filename a/b.l as input, statically compile the file src/a/b.l into a Rust module which can then be imported using lrlex_mod!("a/b.l"). This is a convenience function around process_file which makes it easier to compile .l files stored in a project's src/ directory.
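This method is typically called from a crate's build.rs; a minimal sketch, assuming a lexer file at src/calc.l (the file name is illustrative):

```rust
// build.rs (sketch): compiles src/calc.l into a Rust module which the
// crate can later import with lrlex_mod!("calc.l").
use lrlex::LexerBuilder;

fn main() {
    LexerBuilder::<u32>::new()
        .process_file_in_src("calc.l")
        .unwrap();
}
```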

pub fn process_file<P, Q>(
    self,
    inp: P,
    outp: Q
) -> Result<(Option<HashSet<String>>, Option<HashSet<String>>), Box<dyn Error>> where
    P: AsRef<Path>,
    Q: AsRef<Path>, 

Statically compile the .l file inp into Rust, placing the output into the file outp. The latter defines a module with a function lexerdef(), which returns a LexerDef that can then be used as normal.
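For example, mirroring what a build script might do (the paths are illustrative):

```rust
use std::env;
use std::path::Path;

use lrlex::LexerBuilder;

// Compile src/grm.l, writing the generated module into Cargo's OUT_DIR.
let out_dir = env::var("OUT_DIR").unwrap();
LexerBuilder::<u32>::new()
    .process_file(Path::new("src/grm.l"), Path::new(&out_dir).join("grm.l.rs"))
    .unwrap();
```

The generated file defines a module whose lexerdef() function returns the LexerDef.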

pub fn allow_missing_terms_in_lexer(self, allow: bool) -> Self

If passed false, tokens used in the grammar but not defined in the lexer will cause a panic at lexer generation time. Defaults to false.

pub fn allow_missing_tokens_in_parser(self, allow: bool) -> Self

If passed false, tokens defined in the lexer but not used in the grammar will cause a panic at lexer generation time. Defaults to true (since lexers sometimes define tokens such as reserved words, which are intentionally not in the grammar).
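Both checks can be relaxed together when the lexer and grammar are deliberately out of step; a sketch:

```rust
use lrlex::LexerBuilder;

// Tolerate grammar terms missing from the lexer and lexer tokens
// unused by the grammar, instead of panicking at generation time.
LexerBuilder::<u32>::new()
    .allow_missing_terms_in_lexer(true)
    .allow_missing_tokens_in_parser(true)
    .process_file_in_src("grm.l")
    .unwrap();
```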

Auto Trait Implementations

impl<StorageT> RefUnwindSafe for LexerBuilder<StorageT> where
    StorageT: RefUnwindSafe

impl<StorageT> Send for LexerBuilder<StorageT> where
    StorageT: Send

impl<StorageT> Sync for LexerBuilder<StorageT> where
    StorageT: Sync

impl<StorageT> Unpin for LexerBuilder<StorageT> where
    StorageT: Unpin

impl<StorageT> UnwindSafe for LexerBuilder<StorageT> where
    StorageT: UnwindSafe

Blanket Implementations

impl<T> Any for T where
    T: 'static + ?Sized

impl<T> Borrow<T> for T where
    T: ?Sized

impl<T> BorrowMut<T> for T where
    T: ?Sized

impl<T> From<T> for T

impl<T, U> Into<U> for T where
    U: From<T>, 

impl<T, U> TryFrom<U> for T where
    U: Into<T>, 

type Error = Infallible

The type returned in the event of a conversion error.

impl<T, U> TryInto<U> for T where
    U: TryFrom<T>, 

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.