Function tokenize

pub fn tokenize(src: &str) -> Result<Vec<Token>, LexError>

Tokenizes the entire input and returns a vector of tokens. Lexing fails with an error on an unterminated string or escape sequence, an invalid escape, a malformed number, or an unexpected character.
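The error cases above can be sketched with a minimal, self-contained tokenizer. The `Token` and `LexError` variants below are illustrative assumptions, not the crate's actual definitions; only the function signature comes from the documentation.

```rust
// Illustrative sketch: the real crate's Token/LexError variants may differ.
#[derive(Debug, PartialEq)]
enum Token {
    Ident(String),
    Number(f64),
    Str(String),
}

#[derive(Debug, PartialEq)]
enum LexError {
    UnterminatedString,      // input ends inside a string or escape
    InvalidEscape(char),     // e.g. "\q"
    InvalidNumber(String),   // e.g. 1.2.3
    UnexpectedChar(char),    // any character outside the grammar
}

pub fn tokenize(src: &str) -> Result<Vec<Token>, LexError> {
    let mut tokens = Vec::new();
    let mut chars = src.chars().peekable();
    while let Some(&c) = chars.peek() {
        if c.is_whitespace() {
            chars.next();
        } else if c.is_ascii_alphabetic() || c == '_' {
            // Identifier: [A-Za-z_][A-Za-z0-9_]*
            let mut ident = String::new();
            while let Some(&c) = chars.peek() {
                if c.is_ascii_alphanumeric() || c == '_' {
                    ident.push(c);
                    chars.next();
                } else {
                    break;
                }
            }
            tokens.push(Token::Ident(ident));
        } else if c.is_ascii_digit() {
            // Number: greedily collect digits and dots, then validate via parse.
            let mut num = String::new();
            while let Some(&c) = chars.peek() {
                if c.is_ascii_digit() || c == '.' {
                    num.push(c);
                    chars.next();
                } else {
                    break;
                }
            }
            let value = num.parse().map_err(|_| LexError::InvalidNumber(num))?;
            tokens.push(Token::Number(value));
        } else if c == '"' {
            chars.next(); // consume opening quote
            let mut s = String::new();
            loop {
                match chars.next() {
                    None => return Err(LexError::UnterminatedString),
                    Some('"') => break,
                    Some('\\') => match chars.next() {
                        None => return Err(LexError::UnterminatedString),
                        Some('n') => s.push('\n'),
                        Some('t') => s.push('\t'),
                        Some('\\') => s.push('\\'),
                        Some('"') => s.push('"'),
                        Some(other) => return Err(LexError::InvalidEscape(other)),
                    },
                    Some(other) => s.push(other),
                }
            }
            tokens.push(Token::Str(s));
        } else {
            return Err(LexError::UnexpectedChar(c));
        }
    }
    Ok(tokens)
}
```

A caller typically matches on the `Result`, e.g. `tokenize("foo 42")` yields the tokens, while `tokenize("\"abc")` surfaces the unterminated-string error.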