Module moore_vhdl_syntax::lexer::tokenizer

Structs

Tokenizer

A grinder that combines character bundles into lexical tokens. This is the last stage of lexical analysis.
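The description places the Tokenizer as the final stage in a pull-based pipeline of "grinders", each stage consuming the output of the previous one. The sketch below is a conceptual illustration of that staging only; the trait and type names (`Grind`, `Bundle`, `Tokenize`, `VecSource`) are assumptions made for the example and are not the crate's actual API.

```rust
// Conceptual sketch of a staged lexer: an upstream stage groups characters
// into bundles, and the tokenizer (the last stage) folds bundles into tokens.
// All names are illustrative, not taken from moore_vhdl_syntax.

#[derive(Debug)]
enum Bundle {
    Letters(String),
    Digits(String),
    Other(char),
}

#[derive(Debug)]
enum Token {
    Ident(String),
    Number(String),
    Symbol(char),
}

/// Every pipeline stage exposes the same pull interface.
trait Grind {
    type Item;
    fn next(&mut self) -> Option<Self::Item>;
}

/// A trivial source stage backed by a vector of bundles.
struct VecSource(std::vec::IntoIter<Bundle>);

impl Grind for VecSource {
    type Item = Bundle;
    fn next(&mut self) -> Option<Bundle> {
        self.0.next()
    }
}

/// Final stage: combines character bundles into lexical tokens.
struct Tokenize<G: Grind<Item = Bundle>> {
    inner: G,
}

impl<G: Grind<Item = Bundle>> Grind for Tokenize<G> {
    type Item = Token;
    fn next(&mut self) -> Option<Token> {
        match self.inner.next()? {
            Bundle::Letters(s) => Some(Token::Ident(s)),
            Bundle::Digits(s) => Some(Token::Number(s)),
            Bundle::Other(c) => Some(Token::Symbol(c)),
        }
    }
}

fn main() {
    let source = VecSource(
        vec![
            Bundle::Letters("entity".into()),
            Bundle::Other(' '),
            Bundle::Digits("42".into()),
        ]
        .into_iter(),
    );
    let mut tokenizer = Tokenize { inner: source };
    while let Some(token) = tokenizer.next() {
        println!("{:?}", token);
    }
}
```

In this arrangement the tokenizer never touches raw characters; it only pulls already-bundled input from the stage below it, which is what makes it the last stage of lexical analysis.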