Module rustpython_parser::lexer
This module takes care of lexing Python source text.
This means source code is scanned and translated into separate tokens. The rules governing what is and is not a valid token are defined in the Python reference guide section on Lexical analysis.
The primary function in this module is `lex`, which takes a string slice and returns an iterator over the tokens in the source code. The tokens are currently returned as a `Result<Spanned, LexicalError>`, where `Spanned` is a tuple containing the start and end `TextSize` and a `Tok` denoting the token.
Example
```rust
use rustpython_parser::{lexer::lex, Tok, Mode, StringKind};

let source = "x = 'RustPython'";
let tokens = lex(source, Mode::Module)
    .map(|tok| tok.expect("Failed to lex"))
    .collect::<Vec<_>>();

for (token, range) in tokens {
    println!("{token:?}@{range:?}");
}
```
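To make the `(token, range)` shape concrete without depending on the crate, here is a minimal self-contained sketch using only the standard library. The names `ToyTok`, `ToyLexicalError`, and `toy_lex` are hypothetical illustrations of the `Result<Spanned, LexicalError>` iterator shape described above, not rustpython_parser's actual API:

```rust
// Hypothetical sketch: NOT rustpython_parser's implementation.
// Illustrates returning tokens as Result<(token, range), error>.
use std::ops::Range;

#[derive(Debug, PartialEq)]
enum ToyTok {
    Name(String),
    Equal,
    String(String),
}

#[derive(Debug)]
struct ToyLexicalError {
    offset: usize, // byte offset where lexing failed
}

fn toy_lex(source: &str) -> Vec<Result<(ToyTok, Range<usize>), ToyLexicalError>> {
    let bytes = source.as_bytes();
    let mut out = Vec::new();
    let mut i = 0;
    while i < bytes.len() {
        let start = i;
        match bytes[i] {
            b' ' => i += 1, // skip whitespace
            b'=' => {
                i += 1;
                out.push(Ok((ToyTok::Equal, start..i)));
            }
            b'\'' => {
                // scan to the closing quote; unterminated strings are errors
                i += 1;
                while i < bytes.len() && bytes[i] != b'\'' {
                    i += 1;
                }
                if i == bytes.len() {
                    out.push(Err(ToyLexicalError { offset: start }));
                    break;
                }
                i += 1; // consume closing quote
                out.push(Ok((
                    ToyTok::String(source[start + 1..i - 1].to_string()),
                    start..i,
                )));
            }
            c if c.is_ascii_alphabetic() => {
                while i < bytes.len() && bytes[i].is_ascii_alphanumeric() {
                    i += 1;
                }
                out.push(Ok((ToyTok::Name(source[start..i].to_string()), start..i)));
            }
            _ => {
                out.push(Err(ToyLexicalError { offset: start }));
                i += 1;
            }
        }
    }
    out
}

fn main() {
    // Mirrors the consumption pattern from the example above.
    for spanned in toy_lex("x = 'RustPython'") {
        let (token, range) = spanned.expect("Failed to lex");
        println!("{token:?}@{range:?}");
    }
}
```

The real lexer differs in many ways (it tracks indentation, uses `TextSize` offsets, and yields tokens lazily from an iterator), but the consumer-facing contract is the same: each item is either a token paired with its source range, or a lexical error.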
Structs
- A lexer for Python source code.
- Represents an error that occurs during lexing and is returned by the `parse_*` functions in the lexer implementation's iterator.
Enums
- Represents the different types of errors that can occur during lexing.
Statics
- A map of keywords to their tokens.
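A keyword map of this kind is conceptually a static lookup from reserved words to token kinds, consulted after an identifier has been scanned. A minimal standalone sketch of that idea, where `keywords` and `classify` are illustrative names rather than the crate's API, might look like:

```rust
// Illustrative only: a static keyword-to-token lookup, analogous in
// spirit to the crate's map of keywords to their tokens.
use std::collections::HashMap;
use std::sync::OnceLock;

fn keywords() -> &'static HashMap<&'static str, &'static str> {
    static KEYWORDS: OnceLock<HashMap<&'static str, &'static str>> = OnceLock::new();
    KEYWORDS.get_or_init(|| {
        HashMap::from([
            ("def", "Def"),
            ("return", "Return"),
            ("if", "If"),
            ("else", "Else"),
        ])
    })
}

// Classify a scanned word: a keyword token if reserved, otherwise a Name.
fn classify(word: &str) -> String {
    match keywords().get(word) {
        Some(tok) => (*tok).to_string(),
        None => format!("Name({word})"),
    }
}

fn main() {
    println!("{}", classify("def"));  // a reserved word
    println!("{}", classify("spam")); // an ordinary identifier
}
```

Doing the lookup after scanning keeps the scanner simple: keywords and identifiers share the same lexical shape, and only this final table lookup distinguishes them.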
Functions
- Create a new lexer from a source string.
- Create a new lexer from a source string, starting at a given location. You probably want to use `lex` instead.
Type Aliases
- The result of lexing a token.
- Contains a Token along with its range.