pub struct Tokenizer<T: Grinder> { /* private fields */ }

A grinder that combines character bundles into lexical tokens. This is the last stage of lexical analysis.

Implementations

Create a new tokenizer.

Trait Implementations

Auto Trait Implementations

Blanket Implementations

impl<T> Any for T — fn type_id(&self) -> TypeId: Gets the TypeId of self.

impl<T> Borrow<T> for T — fn borrow(&self) -> &T: Immutably borrows from an owned value.

impl<T> BorrowMut<T> for T — fn borrow_mut(&mut self) -> &mut T: Mutably borrows from an owned value.

impl<T> From<T> for T — fn from(t: T) -> T: Performs the conversion.

impl<T, U> Into<U> for T where U: From<T> — fn into(self) -> U: Performs the conversion.

impl<T, U> TryFrom<U> for T where U: Into<T> — type Error: The type returned in the event of a conversion error. fn try_from(value: U) -> Result<T, Self::Error>: Performs the conversion.

impl<T, U> TryInto<U> for T where U: TryFrom<T> — type Error: The type returned in the event of a conversion error. fn try_into(self) -> Result<U, Self::Error>: Performs the conversion.