pub fn lexer(path: &str) -> Vec<Token>

Constructs a vector of tokens. This straightforward lexical scanner is preset to support over 75 tokens. The full list of tokens can be found on the crate's GitHub page. This is the main entry point for generating tokens: pass in the file path of the document you want to perform a lexical scan on.

Example

use lexical_scanner::*;
use lexical_scanner::enums::*; // brings the Token type into scope

let path = "./test/test.txt";
let token_list = lexical_scanner::lexer(path);
println!("{:?}", token_list);
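The returned Vec<Token> can also be walked entry by entry rather than printed all at once. A minimal sketch, assuming Token implements Debug (as the example above implies) and that ./test/test.txt exists:

use lexical_scanner::lexer;

fn main() {
    let path = "./test/test.txt";
    let tokens = lexer(path);

    // Print each token on its own line with its index in the stream.
    for (i, token) in tokens.iter().enumerate() {
        println!("{:>4}: {:?}", i, token);
    }
}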