Crate ferric_lexer


Lexer Module

The lexer’s purpose is to tokenize the input. Tokenization is the process of converting a sequence of characters into a sequence of tokens. These tokens are then used by the parser to generate an Abstract Syntax Tree (AST).
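
As a concrete illustration of that pipeline, the sketch below turns the character sequence "1 + 23" into a token sequence. The `Token` enum and `tokenize` function here are hypothetical stand-ins for illustration, not this crate's API:

```rust
// Hypothetical token type and tokenizer, for illustration only; the
// crate's actual token enums are listed under "Enums" below.
#[derive(Debug, PartialEq)]
enum Token {
    Number(i64),
    Plus,
}

fn tokenize(input: &str) -> Vec<Token> {
    let mut tokens = Vec::new();
    let mut chars = input.chars().peekable();
    while let Some(&c) = chars.peek() {
        if c.is_whitespace() {
            chars.next(); // skip whitespace between tokens
        } else if c.is_ascii_digit() {
            // Accumulate consecutive digits into one Number token.
            let mut n = 0i64;
            while let Some(d) = chars.peek().and_then(|c| c.to_digit(10)) {
                n = n * 10 + i64::from(d);
                chars.next();
            }
            tokens.push(Token::Number(n));
        } else if c == '+' {
            chars.next();
            tokens.push(Token::Plus);
        } else {
            chars.next(); // this sketch ignores anything else
        }
    }
    tokens
}

fn main() {
    // The character sequence "1 + 23" becomes a token sequence
    // that the parser can consume.
    assert_eq!(
        tokenize("1 + 23"),
        vec![Token::Number(1), Token::Plus, Token::Number(23)]
    );
}
```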

The lexer uses a combination of enumerations and macros to identify and categorize the different parts of the input.
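
One common form of that enum-plus-macro pattern is sketched below, assuming a hypothetical `Keyword` enum and a `keywords!` macro that generates the string-to-variant lookup; the crate's actual enums and macros may differ:

```rust
// Hypothetical keyword enum; the crate's real keyword enum is only
// described, not named, on this page.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Keyword {
    Let,
    Fn,
    Return,
}

// A macro that generates the string-to-keyword lookup, so each new
// keyword only has to be declared once.
macro_rules! keywords {
    ($($text:literal => $variant:ident),* $(,)?) => {
        fn lookup_keyword(word: &str) -> Option<Keyword> {
            match word {
                $($text => Some(Keyword::$variant),)*
                _ => None,
            }
        }
    };
}

keywords! {
    "let" => Let,
    "fn" => Fn,
    "return" => Return,
}

fn main() {
    assert_eq!(lookup_keyword("let"), Some(Keyword::Let));
    assert_eq!(lookup_keyword("foo"), None);
}
```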

Path: src/lexer/mod.rs

Structs

  • Lexer: Holds the input string and provides the functionality to tokenize it (see the sketch below).
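
A struct of that description might be shaped roughly as follows; the field names and methods here are assumptions, not the crate's real definition:

```rust
// A hypothetical shape for a string-holding lexer; the crate's actual
// fields are implementation details and may differ.
struct Lexer {
    input: Vec<char>, // the source text, as characters
    position: usize,  // index of the next character to read
}

impl Lexer {
    fn new(input: &str) -> Self {
        Lexer { input: input.chars().collect(), position: 0 }
    }

    // Consume and return the next character, or None at end of input.
    fn advance(&mut self) -> Option<char> {
        let c = self.input.get(self.position).copied();
        if c.is_some() {
            self.position += 1;
        }
        c
    }
}

fn main() {
    let mut lx = Lexer::new("ab");
    assert_eq!(lx.advance(), Some('a'));
    assert_eq!(lx.advance(), Some('b'));
    assert_eq!(lx.advance(), None);
}
```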

Enums

  • Represents built-in functions provided by the language.
  • Represents reserved keywords in the language.
  • Represents operators in the language.
  • Represents the different types of tokens that can be identified by the lexer (a combined sketch of these categories follows this list).
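
Taken together, categories like these typically compose into a single top-level token type. The enum names below (`BuiltIn`, `Keyword`, `Operator`, `TokenType`) are hypothetical placeholders, since the actual item names are not reproduced on this page:

```rust
#![allow(dead_code)]

// Hypothetical names for illustration; the crate's real enums may differ.
#[derive(Debug, PartialEq)]
enum BuiltIn { Print, Len }

#[derive(Debug, PartialEq)]
enum Keyword { Let, Fn, Return }

#[derive(Debug, PartialEq)]
enum Operator { Plus, Minus, Assign }

// The top-level token type ties the categories together, so the parser
// can match on a single enum.
#[derive(Debug, PartialEq)]
enum TokenType {
    BuiltIn(BuiltIn),
    Keyword(Keyword),
    Operator(Operator),
    Identifier(String),
    Number(i64),
    Eof,
}

fn main() {
    // e.g. the source text "let" would lex to a keyword token.
    let tok = TokenType::Keyword(Keyword::Let);
    assert_eq!(tok, TokenType::Keyword(Keyword::Let));
}
```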