# synkit

Generate `syn`-like parsing infrastructure from token definitions. Built on `logos`.

Define tokens once and get a lexer, typed token structs, whitespace-skipping streams, `Parse`/`Peek`/`ToTokens` traits, span tracking, and round-trip formatting.
## When to Use
| Use Case | synkit | Alternative |
|---|---|---|
| Custom DSL with formatting | Yes | - |
| Config file parser | Yes | serde + format crate |
| Code transformation | Yes | - |
| Rust source parsing | No | syn |
| Simple pattern matching | No | logos alone |
## Installation

```toml
[dependencies]
synkit = "0.1"
logos = "0.16"
= "2"
```

Features: `tokio`, `futures`, `serde`, `std` (default).
## Example

```rust
parser_kit! {
    // token definitions go here
}

// Generated: Token enum, EqToken/IdentToken/NumberToken structs,
// TokenStream, Tok![] macro, Parse/Peek/ToTokens/Diagnostic traits

let mut stream = lex(source)?;
let name: IdentToken = stream.parse()?;
let eq: EqToken = stream.parse()?;
let value: NumberToken = stream.parse()?;
```
## Generated Infrastructure

| Module | Contents |
|---|---|
| `tokens` | `Token` enum, `*Token` structs, `Tok![]` macro |
| `stream` | `TokenStream` with fork/rewind, whitespace skipping |
| `span` | `Span`, `Spanned<T>` wrappers |
| `traits` | `Parse`, `Peek`, `ToTokens`, `Diagnostic` |
| `printer` | Round-trip formatting |
| `delimiters` | `Bracket`, `Brace`, `Paren` extractors |
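
The fork/rewind support in the `stream` module is what makes speculative parsing possible: try a branch on a fork, commit only on success. The pattern itself can be sketched with a plain cursor — this is not synkit's generated API; `Cursor`, `eat`, and `speculate` are illustrative names:

```rust
// Minimal sketch of the fork/rewind pattern over a token slice.
#[derive(Clone)]
struct Cursor<'a> {
    tokens: &'a [&'a str],
    pos: usize,
}

impl<'a> Cursor<'a> {
    fn fork(&self) -> Cursor<'a> {
        self.clone() // trial parses happen on the fork, not the original
    }
    fn eat(&mut self, expected: &str) -> bool {
        if self.tokens.get(self.pos) == Some(&expected) {
            self.pos += 1;
            true
        } else {
            false
        }
    }
    /// Run `f` on a fork; commit its progress only if it succeeds.
    /// On failure the original position is untouched ("rewind").
    fn speculate(&mut self, f: impl Fn(&mut Cursor<'a>) -> bool) -> bool {
        let mut fork = self.fork();
        if f(&mut fork) {
            self.pos = fork.pos;
            true
        } else {
            false
        }
    }
}

fn main() {
    let toks = ["x", "=", "1"];
    let mut cur = Cursor { tokens: &toks, pos: 0 };
    // A failed trial parse of `ident (` leaves the cursor where it was...
    assert!(!cur.speculate(|c| c.eat("x") && c.eat("(")));
    assert_eq!(cur.pos, 0);
    // ...while a successful parse of `ident =` commits the fork's progress.
    assert!(cur.speculate(|c| c.eat("x") && c.eat("=")));
    assert_eq!(cur.pos, 2);
}
```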
## Async Streaming

Incremental parsing for network data and large files:

```rust
use synkit::streaming::*; // import path elided in source

// Tokens flow through channels, AST nodes emit as parsed
let mut parser = StreamParser::new(token_rx, ast_tx); // type name illustrative
parser.run().await?;
```
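
The channel-based pipeline described above can be sketched in a self-contained way with `std` threads standing in for an async runtime. This is not synkit's API — `Node`, `parse_pipeline`, and the three-token statement shape are illustrative assumptions; the point is that AST nodes are emitted as soon as they are complete, before the token source is exhausted:

```rust
use std::sync::mpsc;
use std::thread;

#[derive(Debug, PartialEq)]
enum Node {
    Assign(String, i64), // e.g. `x = 1`
}

fn parse_pipeline(tokens: Vec<String>) -> Vec<Node> {
    let (tok_tx, tok_rx) = mpsc::channel::<String>();
    let (ast_tx, ast_rx) = mpsc::channel::<Node>();

    // Producer: tokens arrive incrementally (network reads, file chunks, ...).
    let producer = thread::spawn(move || {
        for t in tokens {
            tok_tx.send(t).unwrap();
        }
        // tok_tx drops here, closing the token channel.
    });

    // Parser: emits an AST node as soon as one statement is complete.
    let parser = thread::spawn(move || {
        let mut buf: Vec<String> = Vec::new();
        for tok in tok_rx {
            buf.push(tok);
            if let [name, eq, value] = buf.as_slice() {
                if eq == "=" {
                    let node = Node::Assign(name.clone(), value.parse().unwrap());
                    ast_tx.send(node).unwrap();
                }
                buf.clear();
            }
        }
        // ast_tx drops here, ending the consumer's iteration.
    });

    let nodes: Vec<Node> = ast_rx.into_iter().collect();
    producer.join().unwrap();
    parser.join().unwrap();
    nodes
}

fn main() {
    let toks: Vec<String> = vec!["x", "=", "1", "y", "=", "2"]
        .into_iter()
        .map(String::from)
        .collect();
    let nodes = parse_pipeline(toks);
    assert_eq!(nodes.len(), 2);
    println!("{:?}", nodes);
}
```

In synkit's version the channels would carry its generated `Token` values and the consumer would be an async task rather than a thread, but the data flow is the same.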