ress: A crate for parsing raw JS into a token stream.
The primary interfaces are the function tokenize and the struct Scanner. The Scanner struct impls Iterator, and the tokenize function is just a wrapper around Scanner::collect().
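As a quick illustration, here is a minimal sketch of both entry points. The exact item and return types (for example, whether they are wrapped in a Result) are not spelled out in this description and can differ between versions, so the sketch only relies on Debug printing:

```rust
// A minimal sketch of the two entry points; the concrete item/return types
// are assumed only to be Debug-printable.
use ress::{tokenize, Scanner};

fn main() {
    let js = "function add(a, b) { return a + b; }";

    // Scanner impls Iterator, so it can be driven directly with a for loop.
    for item in Scanner::new(js) {
        println!("{:?}", item);
    }

    // tokenize is the convenience wrapper that collects the scanner for you.
    let tokens = tokenize(js);
    println!("{:?}", tokens);
}
```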
The Scanner will provide a stream of Items. An Item has 3 properties: a Token, a Span, and a SourceLocation. The Span is a representation of where the Item exists in the original source, the Token provides details about what JavaScript token it represents, and the SourceLocation gives the token's start and end as line/column numbers.
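The relationship between these pieces can be pictured roughly as below. The struct and field names are illustrative assumptions drawn from the descriptions on this page, not the crate's actual definitions:

```rust
// Illustrative shapes only; field names are assumptions based on the
// descriptions above, not copied from the crate.
#[allow(dead_code)]
struct Position { line: usize, column: usize }            // a single line/column point
#[allow(dead_code)]
struct SourceLocation { start: Position, end: Position }  // line/column of the token's start and end
#[allow(dead_code)]
struct Span { start: usize, end: usize }                  // byte offsets into the original text
#[allow(dead_code)]
struct Item<Token> {
    token: Token,              // which JavaScript token this is
    span: Span,                // where it sits in the raw bytes
    location: SourceLocation,  // where it sits in lines and columns
}

fn main() {}
```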
Structs§
- Item - A single token with additional metadata
- JSBuffer
- ManualScanner
- ManualState - All of the important state for the scanner, used to cache and reset a Scanner
- Position - A single character position in the file including the line/column number
- Scanner - The primary interface of this crate used to tokenize any JS text into a stream of Items
- ScannerState - All of the important state for the scanner, used to cache and reset a Scanner
- SourceLocation - The start and end position of a token including the line/column number
- Span - The start and end of a token as the byte index in the original text
- Tokenizer - This structure will perform the low level tokenization before the Scanner provides additional context
Enums§
- OpenCurlyKind - For keeping track of the nested-ness of templates and blocks
Functions§
- tokenize - a convenience function for collecting a scanner into a Vec<Token>