Crate lua_tokenizer

let source = " <source code here> ";

let tokenizer = lua_tokenizer::Tokenizer::new(source);
// tokenizer itself is a lazy iterator.
for token in tokenizer {
    match token {
        Ok(token) => {
            // do something with token
        }
        Err(e) => {
            // report errors on stderr, with a trailing newline
            eprintln!("Tokenize Error: {}", e);
        }
    }
}
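Because the tokenizer yields `Result`s lazily, the standard `collect::<Result<Vec<_>, _>>()` idiom can gather all tokens at once, short-circuiting at the first error. A minimal sketch of that pattern, using stand-in types rather than the crate's actual `Token` and error types:

```rust
// Stand-in token type; the real crate yields its own Token and error types.
#[derive(Debug, PartialEq)]
struct Token(&'static str);

fn main() {
    // Pretend this iterator is `lua_tokenizer::Tokenizer::new(source)`.
    let tokens: Vec<Result<Token, String>> =
        vec![Ok(Token("local")), Ok(Token("x"))];

    // Collecting into Result<Vec<_>, _> stops at the first Err, if any.
    let collected: Result<Vec<Token>, String> = tokens.into_iter().collect();
    assert!(collected.is_ok());
    println!("{} tokens", collected.unwrap().len()); // prints "2 tokens"
}
```

This avoids the explicit `match` loop when you only care about the first failure.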

Structs§

  • Range of a token in the source code.
  • Token classification and metadata.
  • Tokenizer — the lazy tokenizing iterator.

Enums§

Type Aliases§