Crate ferric_lexer

Lexer Module

The lexer’s purpose is to tokenize the input. Tokenization is the process of converting a sequence of characters into a sequence of tokens. These tokens are then used by the parser to generate an Abstract Syntax Tree (AST).

The lexer uses a combination of enumerations and macros to identify and categorize the different parts of the input.

Path: src/lexer/mod.rs
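To illustrate the idea, here is a minimal standalone sketch of tokenization. The `Token` variants and the `tokenize` function below are hypothetical simplifications for illustration only; the crate's actual `Token`, `Keyword`, `Operator`, and `Lexer` items may differ.

```rust
// Hypothetical sketch: variant names and API are illustrative,
// not the crate's real definitions.
#[derive(Debug, PartialEq)]
enum Token {
    Keyword(String),
    Identifier(String),
    Operator(char),
    Number(i64),
}

fn tokenize(input: &str) -> Vec<Token> {
    let mut tokens = Vec::new();
    let mut chars = input.chars().peekable();
    while let Some(&c) = chars.peek() {
        if c.is_whitespace() {
            // Skip whitespace between tokens.
            chars.next();
        } else if c.is_ascii_digit() {
            // Accumulate consecutive digits into a single number token.
            let mut n = 0i64;
            while let Some(d) = chars.peek().and_then(|ch| ch.to_digit(10)) {
                n = n * 10 + d as i64;
                chars.next();
            }
            tokens.push(Token::Number(n));
        } else if c.is_alphabetic() {
            // Read an identifier, then check it against reserved words.
            let mut word = String::new();
            while let Some(&a) = chars.peek() {
                if a.is_alphanumeric() { word.push(a); chars.next(); } else { break; }
            }
            if word == "let" || word == "fn" {
                tokens.push(Token::Keyword(word));
            } else {
                tokens.push(Token::Identifier(word));
            }
        } else {
            // Everything else is treated as a single-character operator.
            tokens.push(Token::Operator(c));
            chars.next();
        }
    }
    tokens
}

fn main() {
    // "let x = 42" → Keyword("let"), Identifier("x"), Operator('='), Number(42)
    println!("{:?}", tokenize("let x = 42"));
}
```

The resulting token sequence is what a parser would consume to build the AST.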

Structs

Lexer
The Lexer struct holds the input string and provides the functionality to tokenize it.

Enums

BuiltInFunction
Represents built-in functions provided by the language.
Keyword
Represents reserved keywords in the language.
Operator
Represents operators in the language.
Token
Represents the different types of tokens that can be identified by the lexer.