Function tokenize 

pub fn tokenize(g: &mut Grammar, s: &str) -> Result<Vec<Token>, TokError>

Scans the given string and produces a stream of Tokens.

Next step: ast::parse.

The scanner performs maximal munch across all of the grammar's string literals and r``r terminals. It also skips whitespace and comments, as defined in the grammar. The produced tokens use string interning and can be bracket-paired; see the crate root for more details.
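As a rough illustration of the maximal-munch rule described above, the sketch below picks the longest literal that matches at the start of the input. This is a self-contained toy, not the crate's implementation: the `munch` function and its signature are hypothetical, and the real `tokenize` additionally handles regex terminals, interning, and bracket pairing.

```rust
/// Toy maximal munch: among the given literals, return the longest one
/// that is a prefix of `s`, if any. (Illustrative only; not the crate's API.)
fn munch<'a>(literals: &[&str], s: &'a str) -> Option<&'a str> {
    literals
        .iter()
        .filter(|lit| s.starts_with(**lit))
        .max_by_key(|lit| lit.len()) // longest matching literal wins
        .map(|lit| &s[..lit.len()])
}

fn main() {
    let lits = ["=", "==", "=>"];
    // "==" beats "=" because maximal munch prefers the longest match.
    assert_eq!(munch(&lits, "==x"), Some("=="));
    assert_eq!(munch(&lits, "=>y"), Some("=>"));
    assert_eq!(munch(&lits, "abc"), None);
    println!("ok");
}
```

A full scanner repeats this step, advancing past each match (and past whitespace and comments) until the input is exhausted.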