Crate lrpar

lrpar provides a Yacc-compatible parser (where grammars can be generated at compile-time or run-time). It takes in traditional .y files and converts them into idiomatic Rust parsers. More details can be found in the grmtools book; the quickstart guide is a good place to start.

Example

Let's assume we want to statically generate a parser for a simple calculator language (and let's also assume we are able to use lrlex for the lexer). We need to add a build.rs file to our project which statically compiles both the lexer and the parser:

use cfgrammar::yacc::YaccKind;
use lrlex::LexerBuilder;
use lrpar::CTParserBuilder;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let lex_rule_ids_map = CTParserBuilder::new()
        .yacckind(YaccKind::Grmtools)
        .process_file_in_src("calc.y")?;
    LexerBuilder::new()
        .rule_ids_map(lex_rule_ids_map)
        .process_file_in_src("calc.l")?;
    Ok(())
}

where src/calc.l is as follows:

%%
[0-9]+ "INT"
\+ "+"
\* "*"
\( "("
\) ")"
[\t ]+ ;

and src/calc.y is as follows:

%start Expr
%avoid_insert "INT"
%%
Expr -> Result<u64, ()>:
      Term '+' Expr { Ok($1? + $3?) }
    | Term { $1 }
    ;

Term -> Result<u64, ()>:
      Factor '*' Term { Ok($1? * $3?) }
    | Factor { $1 }
    ;

Factor -> Result<u64, ()>:
      '(' Expr ')' { $2 }
    | 'INT'
      {
          let v = $1.map_err(|_| ())?;
          parse_int($lexer.lexeme_str(&v))
      }
    ;
%%
// Any functions here are in scope for all the grammar actions above.

fn parse_int(s: &str) -> Result<u64, ()> {
    match s.parse::<u64>() {
        Ok(val) => Ok(val),
        Err(_) => {
            eprintln!("{} cannot be represented as a u64", s);
            Err(())
        }
    }
}

Because we specified that our Yacc file is in Grmtools format, each rule has a separate Rust type to which all its functions conform (in this case, all the rules have the same type, but that's not a requirement).
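For instance, nothing stops different rules from having entirely different types. A hypothetical Exprs rule (not part of calc.y above) could collect several expressions into a Vec while the existing Expr rule keeps its Result type:

Exprs -> Vec<u64>:
      Exprs Expr { let mut v = $1; if let Ok(e) = $2 { v.push(e); } v }
    | Expr { $1.map(|e| vec![e]).unwrap_or_default() }
    ;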

A simple src/main.rs is as follows:

use std::io::{self, BufRead, Write};

use lrlex::lrlex_mod;
use lrpar::lrpar_mod;

// Using `lrlex_mod!` brings the lexer for `calc.l` into scope.
lrlex_mod!(calc_l);
// Using `lrpar_mod!` brings the parser for `calc.y` into scope.
lrpar_mod!(calc_y);

fn main() {
    // We need to get a `LexerDef` for the `calc` language so that we can lex input.
    let lexerdef = calc_l::lexerdef();
    let stdin = io::stdin();
    loop {
        print!(">>> ");
        io::stdout().flush().ok();
        match stdin.lock().lines().next() {
            Some(Ok(ref l)) => {
                if l.trim().is_empty() {
                    continue;
                }
                // Now we create a lexer with the `lexer` method, which we can then use to lex this line of input.
                let mut lexer = lexerdef.lexer(l);
                // Pass the lexer to the parser and lex and parse the input.
                let (res, errs) = calc_y::parse(&mut lexer);
                for e in errs {
                    println!("{}", e.pp(&lexer, &calc_y::token_epp));
                }
                match res {
                    Some(Ok(r)) => println!("Result: {}", r),
                    _ => eprintln!("Unable to evaluate expression.")
                }
            }
            _ => break
        }
    }
}

We can now cargo run our project and evaluate simple expressions:

>>> 2 + 3
Result: 5
>>> 2 + 3 * 4
Result: 14
>>> (2 + 3) * 4
Result: 20

lrpar also comes with advanced error recovery built in:

>>> 2 + + 3
Parsing error at line 1 column 5. Repair sequences found:
   1: Delete +
   2: Insert INT
Result: 5
>>> 2 + 3 3
Parsing error at line 1 column 7. Repair sequences found:
   1: Insert *
   2: Insert +
   3: Delete 3
Result: 11
>>> 2 + 3 4 5
Parsing error at line 1 column 7. Repair sequences found:
   1: Insert *, Delete 4
   2: Insert +, Delete 4
   3: Delete 4, Delete 5
   4: Insert +, Shift 4, Delete 5
   5: Insert +, Shift 4, Insert +
   6: Insert *, Shift 4, Delete 5
   7: Insert *, Shift 4, Insert *
   8: Insert *, Shift 4, Insert +
   9: Insert +, Shift 4, Insert *
Result: 17
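
Note that tokens inserted by error recovery have no contents of their own, which is why the 'INT' action in calc.y above calls map_err on $1 before asking the lexer for the lexeme's text. Written out as an explicit match, that action is roughly equivalent to the following sketch:

    | 'INT'
      {
          match $1 {
              // A real lexeme from the user's input: parse its text.
              Ok(v) => parse_int($lexer.lexeme_str(&v)),
              // A lexeme inserted by error recovery: there is no text to parse.
              Err(_) => Err(())
          }
      }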

Macros

lrpar_mod

A convenience macro for including statically compiled .y files. A file src/x.y which is statically compiled by lrpar can then be used in a crate with lrpar_mod!(x).
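
For example, the calculator project above compiles src/calc.y in its build.rs and then brings the generated module into scope with:

use lrpar::lrpar_mod;

// Makes the parser statically generated from `src/calc.y` available as the
// module `calc_y`, which (as in main.rs above) exposes `parse` and `token_epp`.
lrpar_mod!(calc_y);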

Structs

CTParserBuilder

A CTParserBuilder allows one to specify the criteria for building a statically generated parser.

LexError

A lexing error.

Lexeme

A Lexeme represents a segment of the user's input that conforms to a known type. All lexemes have a starting position in the user's input; lexemes that result from error recovery, however, do not have a length (or, therefore, an end). This allows us to distinguish lexemes that are always of zero length (which some grammars require) from lexemes that result from error recovery (where an error recovery algorithm can know the type a lexeme should have had, but not what its contents should have been).
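
A minimal sketch of how this distinction might be inspected, assuming start and len accessors where len yields nothing for a lexeme created by error recovery (the accessor names here are illustrative assumptions, not a definitive API):

// `lx` is assumed to be a `Lexeme` obtained during lexing or error recovery.
match lx.len() {
    // A length means the lexeme covers a real segment of the user's input.
    Some(len) => println!("lexeme of {} bytes starting at byte {}", len, lx.start()),
    // No length means the lexeme was inserted by error recovery.
    None => println!("inserted lexeme at byte {}", lx.start()),
}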

ParseError

Records a single parse error.

RTParserBuilder

A run-time parser builder.

Enums

LexParseError

A lexing or parsing error. Although the two are quite distinct in terms of what can be reported to users, both can (at least conceptually) occur at any point of the intertwined lexing/parsing process.
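
As a sketch (assuming a variant named LexError for the lexing case), the error loop in main.rs above could treat the two kinds of error differently:

// Requires `use lrpar::LexParseError;`.
for e in errs {
    // Report lexing errors on stderr and parsing errors on stdout.
    if let LexParseError::LexError(..) = e {
        eprintln!("{}", e.pp(&lexer, &calc_y::token_epp));
    } else {
        println!("{}", e.pp(&lexer, &calc_y::token_epp));
    }
}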

Node

A generic parse tree.

ParseRepair

After a parse error is encountered, the parser attempts to find a way of recovering. Each entry in the sequence of repairs is represented by a ParseRepair.
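
For example, a caller inspecting a repair sequence (assuming variant names matching the Insert, Delete and Shift labels in the output above) might summarise it as follows:

// Requires `use lrpar::ParseRepair;`. `repairs` is assumed to be one repair
// sequence taken from a `ParseError`.
for r in &repairs {
    match r {
        ParseRepair::Insert(..) => println!("insert a token"),
        ParseRepair::Delete(..) => println!("delete a lexeme"),
        ParseRepair::Shift(..) => println!("keep (shift) a lexeme"),
        // Any other repair kinds (if present in this lrpar version) are ignored here.
        _ => {}
    }
}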

RecoveryKind

What recovery algorithm should be used when a syntax error is encountered?

Traits

Lexer

Roughly speaking, a Lexer is an iterator which produces Lexemes; it also records the newlines it encounters so that it can later (optionally) answer queries of the form "what's the line and column number of lexeme L?".