pub struct StackMaximaLexer {
pub decimal_sep: char,
pub list_sep: char,
pub end_tokens: Vec<char>,
pub pm: bool,
pub case_isensitive_keywords: bool,
pub localised_keywords: HashMap<String, String>,
pub lisp_ids: bool,
/* private fields */
}
The actual lexer object.
Fields

decimal_sep: char
Decimal separator. Typically '.'.

list_sep: char
List separator. Typically ','.

end_tokens: Vec<char>
Supported end tokens. Typically vec![';', '$'].

pm: bool
Do we support the #pm#-operator?

case_isensitive_keywords: bool
Are keywords case insensitive? Do we convert "tRuE" to "true"?

localised_keywords: HashMap<String, String>
Are there mappings from keywords in other languages to the canonical ones?

lisp_ids: bool
Should LISP identifiers be recognised as tokens? Set to false if you work with questions, true if you work with the STACK backend.
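Since all of these fields are public, a lexer can be reconfigured after construction. A minimal sketch, assuming the crate exposing `StackMaximaLexer` is in scope (the input string and the localisation choices here are illustrative, not defaults mandated by the crate):

```rust
use std::collections::HashMap;

// Hypothetical input; ';' and '$' end statements by default.
let mut lexer = StackMaximaLexer::new("a:1.5; b:[1,2]$".to_string());

// Localise the separators, e.g. for locales that write decimals
// with a comma. Note that once ',' is the decimal separator, a
// different list separator is needed to keep lists unambiguous.
lexer.decimal_sep = ',';
lexer.list_sep = ';';
lexer.end_tokens = vec!['$'];

// Map a localised keyword onto the canonical one (example mapping).
let mut kw = HashMap::new();
kw.insert("wahr".to_string(), "true".to_string());
lexer.localised_keywords = kw;

// Working with questions rather than the STACK backend.
lexer.lisp_ids = false;
```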
Implementations

impl StackMaximaLexer
pub fn new(input: String) -> StackMaximaLexer
Creates a new lexer over the given input string.
pub fn set_source(&mut self, input: String)
Resets the lexer's token, position, and char buffers and sets the input string.
pub fn next_token(&mut self) -> Option<StackMaximaToken>
Tries to get the next token: either from the char buffer, or from the token buffer when tokens have been returned or a previous action generated multiple tokens.
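A typical driver loop simply drains the lexer until it yields None. A sketch, assuming the crate is in scope and `StackMaximaToken` implements `Debug` (an assumption, not stated above):

```rust
let mut lexer = StackMaximaLexer::new("x^2 + 1;".to_string());

// next_token() returns None once the input is exhausted.
while let Some(token) = lexer.next_token() {
    println!("{:?}", token);
}
```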
pub fn return_token(&mut self, token: StackMaximaToken)
Allows returning a token to the lexer, e.g., when the parser does its own insertions of virtual tokens.
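Because returned tokens are handed out again by next_token, this also supports a simple one-token lookahead. A sketch of such a helper, assuming `StackMaximaToken` implements `Clone` (an assumption; the hypothetical `peek` function is not part of the crate):

```rust
// Take the next token, hand a copy back to the lexer, and return it.
// The following next_token() call will yield the same token again.
fn peek(lexer: &mut StackMaximaLexer) -> Option<StackMaximaToken> {
    let token = lexer.next_token()?;
    lexer.return_token(token.clone());
    Some(token)
}
```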