parser_kit!() { /* proc-macro */ }
Generates a complete parser infrastructure from a token specification.
This is the primary macro for creating a synkit parser. It generates:
- Token enum with a Logos lexer (`Tok`)
- Span and Spanned types (`span::Span`, `span::Spanned<T>`)
- Token stream with full synkit trait implementations (`TokenStream`)
- Delimiter types for bracket matching
- Trait implementations: `Parse`, `Peek`, `ToTokens`, etc.
§Syntax
```rust
parser_kit! {
    // Required: error type for parse failures
    error: MyParseError,

    // Optional: tokens to skip during parsing (usually whitespace)
    skip_tokens: [Whitespace, Comment],

    // Optional: Logos attributes applied to the token enum
    #[logos(skip r"[ \t]+")]

    // Required: token definitions
    tokens: {
        // Literal tokens
        Plus => "+",
        Minus => "-",

        // Regex tokens
        Number => r"[0-9]+",
        Ident => r"[a-zA-Z_][a-zA-Z0-9_]*",
    },

    // Optional: delimiter pairs for bracket matching
    delimiters: {
        Paren => (LParen, RParen),
        Bracket => (LBracket, RBracket),
    },

    // Optional: custom derives for span types
    span_derives: [serde::Serialize, serde::Deserialize],

    // Optional: custom derives for token types
    token_derives: [serde::Serialize],
}
```

§Generated Modules and Types
§span module
- `RawSpan`: Simple start/end byte offsets
- `Span`: Enum with `CallSite` and `Known(RawSpan)` variants
- `Spanned<T>`: Value with an associated span
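As a rough mental model, the span types described above have shapes along these lines. This is a hypothetical standalone sketch for illustration only; the macro's actual output may differ in field names, derives, and helper methods:

```rust
// Hypothetical sketch of the generated span types' shapes; not the
// macro's exact output.

/// Simple start/end byte offsets into the source.
#[derive(Debug, Clone, Copy, PartialEq)]
struct RawSpan {
    start: usize,
    end: usize,
}

/// Either a synthetic call-site span or a known source range.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Span {
    CallSite,
    Known(RawSpan),
}

/// A value paired with the span it was parsed from.
#[derive(Debug, Clone, PartialEq)]
struct Spanned<T> {
    value: T,
    span: Span,
}

fn main() {
    let n = Spanned {
        value: 42u32,
        span: Span::Known(RawSpan { start: 0, end: 2 }),
    };
    assert_eq!(n.value, 42);
    if let Span::Known(raw) = n.span {
        // The span covers two bytes of source text.
        assert_eq!(raw.end - raw.start, 2);
    }
}
```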
§tokens module
- `Tok`: Main token enum with the Logos derive
- `SpannedTok`: Alias for `Spanned<Tok>`
§stream module
- `TokenStream`: Main parsing stream implementing `synkit::TokenStream`
§traits module
Re-exports of synkit traits for convenience:
`Parse`, `Peek`, `ToTokens`, `Printer`, `SpanLike`, `SpannedLike`, `TokenStream`
§Token Stream Methods
The generated TokenStream provides:
- `new(source: &str)` - Create from a source string
- `peek_token()` / `next()` - Read tokens (skipping configured `skip_tokens`)
- `peek::<T>()` - Check if the next token matches a type
- `parse::<T>()` - Parse a value implementing `Parse`
- `fork()` - Create a lookahead copy
- `rewind(pos)` - Reset to a previous position (clamped to the valid range)
- `cursor_span()` / `last_span()` - Get the current/last token spans
- `ensure_consumed()` - Verify no tokens remain
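The `fork()`/`rewind()` pair enables speculative lookahead: explore on a copy, and only commit on the real stream. The following is a minimal standalone analogue of that pattern (`MiniStream` is an invented type for illustration, not the generated `TokenStream`):

```rust
// Standalone analogue of fork/rewind semantics; the generated
// TokenStream exposes the same idea, not this exact code.
#[derive(Debug, Clone)]
struct MiniStream {
    tokens: Vec<&'static str>,
    pos: usize,
}

impl MiniStream {
    /// Consume and return the next token, if any.
    fn next(&mut self) -> Option<&'static str> {
        let t = self.tokens.get(self.pos).copied();
        if t.is_some() {
            self.pos += 1;
        }
        t
    }

    /// An independent copy for speculative lookahead.
    fn fork(&self) -> MiniStream {
        self.clone()
    }

    /// Reset to an earlier position, clamped to the valid range.
    fn rewind(&mut self, pos: usize) {
        self.pos = pos.min(self.tokens.len());
    }
}

fn main() {
    let mut stream = MiniStream { tokens: vec!["1", "+", "2"], pos: 0 };

    // Speculate on a fork; the original stream is untouched.
    let mut ahead = stream.fork();
    assert_eq!(ahead.next(), Some("1"));
    assert_eq!(stream.pos, 0);

    // Consume on the real stream, then rewind to the start.
    stream.next();
    stream.rewind(0);
    assert_eq!(stream.next(), Some("1"));

    // Rewinding past the end clamps instead of panicking.
    stream.rewind(99);
    assert_eq!(stream.next(), None);
}
```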
§Example
```rust
parser_kit! {
    error: CalcError,
    skip_tokens: [Whitespace],
    tokens: {
        Whitespace => r"[ \t\n]+",
        Number => r"[0-9]+",
        Plus => "+",
        Minus => "-",
    },
}

fn parse_expr(input: &str) -> Result<i64, CalcError> {
    let mut stream = stream::TokenStream::new(input);

    // Parse the first number
    let num: Number = stream.parse()?;
    let mut result = num.value.parse::<i64>().unwrap();

    // Parse operations
    while stream.peek::<Plus>() || stream.peek::<Minus>() {
        if stream.peek::<Plus>() {
            let _: Plus = stream.parse()?;
            let num: Number = stream.parse()?;
            result += num.value.parse::<i64>().unwrap();
        } else {
            let _: Minus = stream.parse()?;
            let num: Number = stream.parse()?;
            result -= num.value.parse::<i64>().unwrap();
        }
    }

    stream.ensure_consumed()?;
    Ok(result)
}
```

§Delimiter Matching
When `delimiters` is specified, the macro generates types that track
matched pairs of brackets:
```rust
parser_kit! {
    error: ParseError,
    tokens: {
        LParen => "(",
        RParen => ")",
    },
    delimiters: {
        Paren => (LParen, RParen),
    },
}

// Use in a parser:
let (open, inner, close) = stream.parse::<Paren<Expr>>()?;
```