Crate ress

A crate for parsing raw JS into a token stream.

The primary interfaces are the tokenize function and the Scanner struct. Scanner implements Iterator, and tokenize is just a wrapper around Scanner::collect().
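
A minimal sketch of both entry points; the exact return types vary between releases (recent versions wrap results in a Result), so the Debug printing below is used to keep the example version-agnostic:

```rust
use ress::{tokenize, Scanner};

fn main() {
    let js = "function helloWorld() { alert('Hello world'); }";

    // Eager: collect every token up front via the convenience function.
    // Depending on the release this may be a Vec<Token> or a Result wrapping one.
    let tokens = tokenize(js);
    println!("{:?}", tokens);

    // Lazy: stream Items one at a time through the Scanner iterator.
    for item in Scanner::new(js) {
        println!("{:?}", item);
    }
}
```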

The Scanner provides a stream of Items; each Item has three properties: a Token, a Span, and a SourceLocation. The Token describes which JavaScript token it represents, the Span records where the Item sits in the original source as byte indices, and the SourceLocation records the same region as line/column numbers.
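
A sketch of reading those three pieces of metadata from each Item; the field names (token, span, location) and the Result-wrapped iterator items are assumptions based on recent releases and may differ in older versions:

```rust
use ress::Scanner;

fn main() {
    let js = "let answer = 42;";
    for res in Scanner::new(js) {
        // Recent releases yield Result<Item, Error>; unwrap for brevity.
        let item = res.expect("tokenization failed");
        // Span is a pair of byte offsets into the original source text.
        let raw = &js[item.span.start..item.span.end];
        // SourceLocation carries the human-readable line/column positions.
        let start = item.location.start;
        println!(
            "{:?} -> {:?} at line {}, column {}",
            item.token, raw, start.line, start.column
        );
    }
}
```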

Modules

error
prelude
tokens

Structs

Item

A single token with additional metadata

Position

A single character position in the file including the line/column number

Scanner

The primary interface of this crate, used to tokenize any JS text into a stream of Items.

ScannerState

All of the important state for the scanner, used to cache and reset a Scanner

SourceLocation

The start and end position of a token including the line/column number

Span

The start and end of a token as the byte index in the original text

Tokenizer

This structure performs the low-level tokenization before the Scanner provides additional context

Enums

OpenCurlyKind

For keeping track of the nesting of templates and blocks

Functions

tokenize

A convenience function for collecting a Scanner into a Vec<Token>