The lexical_scanner processes the user’s input and converts it into a vector of tokens drawn from 115+ token variants. Lexical_scanner is built on Rust’s Iterator trait: by using Peekable, the library can safely look ahead and quickly identify character patterns without using regex.
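Below is a minimal, illustrative sketch of the peek-ahead pattern described above. It is not the crate’s internal code, only an example of how a Peekable iterator allows one-character lookahead (here, distinguishing `=` from `==`) without regex.

```rust
// Illustrative only: one-character lookahead with Peekable, no regex.
fn scan_equals(input: &str) -> Vec<String> {
    let mut tokens = Vec::new();
    let mut chars = input.chars().peekable();
    while let Some(c) = chars.next() {
        if c == '=' {
            // Peek to decide between `=` and `==` without consuming.
            if chars.peek() == Some(&'=') {
                chars.next();
                tokens.push("==".to_string());
            } else {
                tokens.push("=".to_string());
            }
        }
    }
    tokens
}

fn main() {
    assert_eq!(scan_equals("a == b = c"), vec!["==", "="]);
}
```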
Modules§
- enums
- The lexical_scanner supports over 115 token variants. All major punctuation patterns are supported.
- lexer
Functions§
- lexer
- Converts a file’s contents to a vector of tokens
Input -> path: &str
Return -> Vec<Token>
Typically this is the main entry point for generating tokens: pass in the path of the file you want to lexically scan.
Example
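A minimal usage sketch, assuming `lexer` is exported at the crate root and that the returned token type implements `Debug`; `./sample.txt` is a hypothetical path.

```rust
use lexical_scanner::lexer;

fn main() {
    // Hypothetical file path; replace with the document you want to scan.
    let tokens = lexer("./sample.txt");
    // Assumes the token type implements Debug.
    for token in &tokens {
        println!("{:?}", token);
    }
}
```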
- lexer_as_str
- Converts a string to tokens
Input -> text: &str
Return -> Vec<Token>
This is commonly used for debugging and testing.
Example
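A small sketch for debugging or testing with `lexer_as_str`, under the same assumptions about the crate-root export and a `Debug` token type.

```rust
use lexical_scanner::lexer_as_str;

fn main() {
    let text = "let x = 5;";
    let tokens = lexer_as_str(text);
    // Handy in tests: inspect how many tokens were produced and what they are.
    println!("{} tokens: {:?}", tokens.len(), tokens);
}
```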
- lexer_with_user_keywords
- Converts a string to tokens
Input -> text: &str, user_keywords: Vec<&str>
Return -> Vec<Token>
This allows the user to have lexical_scanner create custom tokens, which makes the subsequent parsing and/or AST stage more manageable.
Example
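A sketch of passing custom keywords, assuming the signature listed above; the keywords and input string are hypothetical.

```rust
use lexical_scanner::lexer_with_user_keywords;

fn main() {
    // Hypothetical custom keywords for a domain-specific language.
    let user_keywords = vec!["widget", "render"];
    let tokens = lexer_with_user_keywords("widget main { render }", user_keywords);
    // Assumes the token type implements Debug.
    for token in &tokens {
        println!("{:?}", token);
    }
}
```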