Crate lexer_generator

§lexer-generator

This crate is a small-scale lexer package whose tokenization rules are parsed from JSON

§Example: Basic Tokenizing

Potential code one might use to lex tokens for a calculator

key.json:

{
    "literals": {
        "number": "[0-9]*(\\.[0-9]*){0, 1}",
        "subtract": "-",
        "add": "\\+",
        "divide": "/",
        "multiply": "\\*" 
    },
    "whitespace": "\n| |\r|\t"
}
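
The double escapes above are a consequence of JSON string syntax: "\\+" in key.json becomes the regex \+ once the file is parsed. As a rough check, assuming the ruleset patterns follow standard Rust regex syntax (as implemented by the regex crate, which is an assumption, not something this crate documents), they can be compiled and tested on their own:

use regex::Regex; // requires the `regex` crate as a separate dependency

fn main() {
    // "\\+" and "\\*" in key.json arrive here as the regexes \+ and \*.
    let number = Regex::new(r"[0-9]+(\.[0-9]*)?").unwrap();
    let multiply = Regex::new(r"\*").unwrap();
    let whitespace = Regex::new(r"\n| |\r|\t").unwrap();

    assert!(number.is_match("3.14"));
    assert!(multiply.is_match("*"));
    assert!(whitespace.is_match("\t"));
}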

main.rs:

use lexer_generator::Lexer;

let json: String = std::fs::read_to_string("key.json").unwrap();
let source: String = String::from("123 + 456 * 789");

let mut lexer = Lexer::from(json, source);
// parsing, runtime, whatever one would want to do with their tokens
// "123 + 456 * 789" -> Token("number", "123"), Token("add", "+"), Token("number", "456"),
//                      Token("multiply", "*"), Token("number", "789")
// (ignoring line position and the incremental nature of the lexer)

Structs§

Lexer
Lexes tokens from source code based on a JSON-parsed ruleset
Token
Tokens are parsed from source code; their types are defined by the Lexer’s ruleset

Enums§

ParsingError