A lexer that reads Lua code and produces tokens. The crate provides two different lexers:
FastLexer: skips all the whitespace tokens
FullLexer: produces every token, including whitespace
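The difference between the two lexers can be sketched as follows. This is an illustrative model only; the names and API below are assumptions, not the crate's actual interface.

```rust
// Hypothetical token shape: either a whitespace run or a non-whitespace run.
#[derive(Debug, PartialEq)]
enum Token<'a> {
    Whitespace(&'a str),
    Word(&'a str),
}

// FullLexer behaviour: produce every token, whitespace included.
fn full_lex(input: &str) -> Vec<Token<'_>> {
    let mut tokens = Vec::new();
    let mut rest = input;
    while !rest.is_empty() {
        let is_ws = rest.chars().next().unwrap().is_whitespace();
        // End of the current run: first char whose "whitespace-ness" differs.
        let end = rest
            .find(|c: char| c.is_whitespace() != is_ws)
            .unwrap_or(rest.len());
        let (piece, tail) = rest.split_at(end);
        tokens.push(if is_ws {
            Token::Whitespace(piece)
        } else {
            Token::Word(piece)
        });
        rest = tail;
    }
    tokens
}

// FastLexer behaviour: the same stream with whitespace tokens filtered out.
fn fast_lex(input: &str) -> Vec<Token<'_>> {
    full_lex(input)
        .into_iter()
        .filter(|t| !matches!(t, Token::Whitespace(_)))
        .collect()
}
```

For `"local x"`, `full_lex` yields three tokens (word, whitespace, word) while `fast_lex` yields two.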
This struct wraps the tokenizer to offer a simple interface to parse a string.
A small struct that contains minimal information about a slice of the input. The token does not own the part of the code that it represents; it only keeps a reference to the original input.
Errors that can happen while tokenizing some input.
The different types of tokens that will be produced by the lexer.
A lexer that skips all the information about whitespace.
A lexer that keeps the information about whitespace.
When the lexer encounters an error, it returns the error type together with the remainder of the input that was not parsed.
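This error-plus-remainder shape can be sketched with an ordinary `Result`. The error variant and the digit-lexing helper below are assumptions chosen to illustrate the pattern, not the crate's actual types.

```rust
// Hypothetical error kind.
#[derive(Debug, PartialEq)]
enum LexError {
    UnexpectedChar(char),
}

// On failure, hand back both the error and the unparsed rest of the input,
// so the caller can report where lexing stopped or attempt recovery.
fn lex_digits(input: &str) -> Result<(&str, &str), (LexError, &str)> {
    let end = input
        .find(|c: char| !c.is_ascii_digit())
        .unwrap_or(input.len());
    if end == 0 {
        let c = input.chars().next().unwrap_or('\0');
        // Nothing consumed: return the error and the whole remaining input.
        return Err((LexError::UnexpectedChar(c), input));
    }
    // On success, return the matched slice and the remaining input.
    Ok(input.split_at(end))
}
```

Returning the remainder alongside the error lets a caller compute the failure offset (for diagnostics) by comparing it against the original input.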