WAGon Lexers
Provides lexers for the WAGon DSL, as well as helper iterators which can switch between lexers on the fly.
Most likely, all you will care about are LexerBridge and Tokens.
§Example
let s = r#"
meta: "data";
============
S -> A;
"#;
use wagon_lexer::{Tokens, LexerBridge, LexResult};
use wagon_ident::Ident;
let lexer = LexerBridge::new(s);
let tokens: Vec<LexResult> = lexer.collect();
assert_eq!(tokens, vec![
Ok(Tokens::MetadataToken(Metadata::Identifier("meta".into()))),
Ok(Tokens::MetadataToken(Metadata::Colon)),
Ok(Tokens::MathToken(Math::LitString("data".to_string()))),
Ok(Tokens::MathToken(Math::Semi)),
Ok(Tokens::MetadataToken(Metadata::Delim)),
Ok(Tokens::ProductionToken(Productions::Identifier(Ident::Unknown("S".to_string())))),
Ok(Tokens::ProductionToken(Productions::Produce)),
Ok(Tokens::ProductionToken(Productions::Identifier(Ident::Unknown("A".to_string())))),
Ok(Tokens::ProductionToken(Productions::Semi))
]);
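If you do not want to collect everything up front, the bridge can also be consumed one token at a time; each item is a LexResult, so errors are handled like any other Result. A minimal sketch of that pattern (the printed output is purely illustrative):
use wagon_lexer::LexerBridge;

let s = r#"
meta: "data";
============
S -> A;
"#;
// Walk the token stream, stopping at the first lexing error.
for result in LexerBridge::new(s) {
    match result {
        Ok(token) => println!("lexed: {:?}", token),
        Err(error) => {
            eprintln!("lexing failed: {:?}", error);
            break;
        }
    }
}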
Modules§
- math - The lexer for the Math DSL
- metadata - The lexer for the metadata section
- productions - The lexer for the Grammar DSL
Structs§
- LexerBridge - A struct which automatically switches between the different lexers based on context.
Enums§
- LexingError - An enum for any errors that may occur during lexing.
- Tokens - An enum that holds the different types of tokens for the different lexers.
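Each Tokens variant wraps a token from one of the sub-lexers, so downstream code typically matches through both layers at once. A minimal sketch that extracts the grammar identifiers from the earlier input, assuming Productions is the token type exposed by the productions module:
use wagon_lexer::{LexerBridge, Tokens, productions::Productions};
use wagon_ident::Ident;

let s = r#"
meta: "data";
============
S -> A;
"#;
// Keep only the identifiers from the grammar part of the input, ignoring everything else.
let idents: Vec<Ident> = LexerBridge::new(s)
    .filter_map(|result| match result {
        Ok(Tokens::ProductionToken(Productions::Identifier(ident))) => Some(ident),
        _ => None,
    })
    .collect();
assert_eq!(idents, vec![Ident::Unknown("S".to_string()), Ident::Unknown("A".to_string())]);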
Type Aliases§
- LexResult - The result of each lex step is either a token or an error.
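Because each step yields a plain Result, a whole lex can also be short-circuited into a single value by collecting, rather than handling every item individually. A minimal sketch, assuming LexResult expands to Result<Tokens, LexingError>:
use wagon_lexer::{LexerBridge, LexingError, Tokens};

let s = r#"
meta: "data";
============
S -> A;
"#;
// Collecting an iterator of Results stops at the first Err.
let tokens: Result<Vec<Tokens>, LexingError> = LexerBridge::new(s).collect();
match tokens {
    Ok(tokens) => println!("lexed {} tokens", tokens.len()),
    Err(error) => eprintln!("lexing failed: {:?}", error),
}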