ress: A crate for parsing raw JS into a token stream.

The primary interfaces are the function `tokenize` and the struct `Scanner`. The `Scanner` struct impls `Iterator`, and the `tokenize` function is just a wrapper around `Scanner::collect()`.
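The tokenize-versus-Scanner relationship is the standard iterator/collect pattern. A minimal self-contained sketch of that pattern, using a toy whitespace scanner rather than the real `ress` types (all names here are illustrative, not the crate's API):

```rust
// Toy illustration of the tokenize/Scanner relationship.
// `ToyScanner` and `toy_tokenize` are hypothetical stand-ins,
// NOT the real ress API: the scanner yields one "token" per
// whitespace-separated word.
struct ToyScanner<'a> {
    rest: &'a str,
}

impl<'a> Iterator for ToyScanner<'a> {
    type Item = &'a str;
    fn next(&mut self) -> Option<Self::Item> {
        let trimmed = self.rest.trim_start();
        if trimmed.is_empty() {
            return None;
        }
        // A "token" ends at the next whitespace character (or EOF).
        let end = trimmed.find(char::is_whitespace).unwrap_or(trimmed.len());
        let (tok, rest) = trimmed.split_at(end);
        self.rest = rest;
        Some(tok)
    }
}

// Like ress's `tokenize`, this is just a wrapper around `collect()`.
fn toy_tokenize(text: &str) -> Vec<&str> {
    ToyScanner { rest: text }.collect()
}

fn main() {
    let tokens = toy_tokenize("var x = 1 ;");
    assert_eq!(tokens, vec!["var", "x", "=", "1", ";"]);
    println!("{:?}", tokens);
}
```

The same choice applies when using `ress` itself: iterate the scanner when you want to stop early or process tokens lazily, or collect when you want the whole stream up front.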
The `Scanner` provides a stream of `Item`s. An `Item` has 3 properties: a `Token`, a `Span`, and a `SourceLocation`. The `Span` is a representation of where the `Item` exists in the original source, while the `Token` provides details about what JavaScript token it represents.
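The three pieces of metadata can be pictured roughly like this. Field names and derives are illustrative assumptions, not the exact `ress` definitions; the point is only the split between byte offsets (`Span`) and line/column information (`SourceLocation`):

```rust
// Illustrative shapes only; consult the ress docs for the real
// definitions. `Span` is in byte offsets into the source text,
// `SourceLocation` is in human-readable line/column terms.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Span {
    start: usize, // byte index of the token's first byte
    end: usize,   // byte index one past the token's last byte
}

#[derive(Debug, Clone, Copy, PartialEq)]
struct Position {
    line: usize,
    column: usize,
}

#[derive(Debug, Clone, Copy, PartialEq)]
struct SourceLocation {
    start: Position,
    end: Position,
}

#[derive(Debug)]
struct Item<Token> {
    token: Token,
    span: Span,
    location: SourceLocation,
}

fn main() {
    // The keyword `var` at the very start of a file:
    let item = Item {
        token: "var",
        span: Span { start: 0, end: 3 },
        location: SourceLocation {
            start: Position { line: 1, column: 1 },
            end: Position { line: 1, column: 4 },
        },
    };
    // The byte span and the token text agree in length.
    assert_eq!(item.span.end - item.span.start, item.token.len());
    println!("{:?}", item);
}
```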
Structs

- A single token with additional metadata
- All of the important state for the scanner, used to cache and reset a `Scanner`
- A single character position in the file including the line/column number
- The primary interface of this crate used to tokenize any JS text into a stream of `Item`s
- The start and end position of a token including the line/column number
- The start and end of a token as the byte index in the original text
- This structure will perform the low-level tokenization before the `Scanner` provides additional context
Enums

- For keeping track of the nesting of templates and blocks
Functions

- A convenience function for collecting a scanner into a `Vec<Token>`