Crate laps

Lexer and parser collections.

With laps, you can build lexers/parsers by just defining tokens/ASTs and deriving the Tokenize/Parse traits for them.


Implement a lexer for S-expressions:

use laps::prelude::*;

#[derive(Debug, Tokenize)]
enum TokenKind {
  // This token will be skipped.
  #[skip(r"\s+")]
  _Skip,
  /// Parentheses.
  #[regex(r"[()]")]
  Paren(char),
  /// Atom.
  #[regex(r"[^\s()]+")]
  Atom(String),
  /// End-of-file.
  #[eof]
  Eof,
}
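For intuition, the lexer that Tokenize derives behaves roughly like the following hand-written tokenizer. This is a self-contained sketch that does not use laps; the Tok type and the tokenize function are illustrative names of our own, not part of the crate:

```rust
// Hand-written tokenizer for S-expressions (illustrative sketch only;
// laps generates its lexer from the regex attributes instead).
#[derive(Debug, PartialEq)]
enum Tok {
    Paren(char),
    Atom(String),
    Eof,
}

fn tokenize(src: &str) -> Vec<Tok> {
    let mut toks = Vec::new();
    let mut chars = src.chars().peekable();
    while let Some(&c) = chars.peek() {
        if c.is_whitespace() {
            // Consumed without producing a token, like the skipped token above.
            chars.next();
        } else if c == '(' || c == ')' {
            chars.next();
            toks.push(Tok::Paren(c));
        } else {
            // Atom: a maximal run of non-space, non-parenthesis characters.
            let mut s = String::new();
            while let Some(&c) = chars.peek() {
                if c.is_whitespace() || c == '(' || c == ')' {
                    break;
                }
                s.push(c);
                chars.next();
            }
            toks.push(Tok::Atom(s));
        }
    }
    toks.push(Tok::Eof);
    toks
}

fn main() {
    assert_eq!(
        tokenize("(+ 1 2)"),
        vec![
            Tok::Paren('('),
            Tok::Atom("+".into()),
            Tok::Atom("1".into()),
            Tok::Atom("2".into()),
            Tok::Paren(')'),
            Tok::Eof,
        ]
    );
}
```

Note how whitespace never reaches the token stream and an end-of-file token is appended once the input is exhausted, mirroring the three kinds of rules in the enum above.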

And the parser and ASTs (or actually CSTs):

type Token = laps::token::Token<TokenKind>;

token_ast! {
  macro Token<TokenKind> {
    [atom] => { kind: TokenKind::Atom(_), prompt: "atom" },
    [lpr] => { kind: TokenKind::Paren('(') },
    [rpr] => { kind: TokenKind::Paren(')') },
    [eof] => { kind: TokenKind::Eof },
  }
}

#[derive(Parse)]
#[token(Token)]
enum Statement {
  Elem(Elem),
  End(Token![eof]),
}

#[derive(Parse)]
#[token(Token)]
struct SExp(Token![lpr], Vec<Elem>, Token![rpr]);

#[derive(Parse)]
#[token(Token)]
enum Elem {
  Atom(Token![atom]),
  SExp(SExp),
}

The above implementation is very close in form to the corresponding EBNF representation of the S-expression:

Statement ::= Elem | EOF;
SExp      ::= "(" {Elem} ")";
Elem      ::= ATOM | SExp;
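As a point of comparison, the grammar above can also be parsed by a hand-written recursive-descent parser. The sketch below is self-contained and does not use laps; Tok, Node, and the parse functions are illustrative names of our own, showing roughly the kind of code the Parse derive saves you from writing:

```rust
// Hand-written recursive-descent parser for the same EBNF
// (illustrative sketch; not the code laps actually generates).
#[derive(Debug, PartialEq, Clone)]
enum Tok {
    LPr,
    RPr,
    Atom(String),
    Eof,
}

#[derive(Debug, PartialEq)]
enum Node {
    Atom(String),
    List(Vec<Node>),
}

// SExp ::= "(" {Elem} ")";
fn parse_sexp(toks: &[Tok], pos: &mut usize) -> Result<Vec<Node>, String> {
    expect(toks, pos, &Tok::LPr)?;
    let mut elems = Vec::new();
    while toks.get(*pos) != Some(&Tok::RPr) {
        elems.push(parse_elem(toks, pos)?);
    }
    expect(toks, pos, &Tok::RPr)?;
    Ok(elems)
}

// Elem ::= ATOM | SExp;
fn parse_elem(toks: &[Tok], pos: &mut usize) -> Result<Node, String> {
    match toks.get(*pos) {
        Some(Tok::Atom(s)) => {
            *pos += 1;
            Ok(Node::Atom(s.clone()))
        }
        Some(Tok::LPr) => Ok(Node::List(parse_sexp(toks, pos)?)),
        t => Err(format!("expected atom or '(', found {t:?}")),
    }
}

fn expect(toks: &[Tok], pos: &mut usize, want: &Tok) -> Result<(), String> {
    if toks.get(*pos) == Some(want) {
        *pos += 1;
        Ok(())
    } else {
        Err(format!("expected {want:?} at position {pos}"))
    }
}

fn main() {
    // Tokens for "(add 1 (mul 2))".
    let toks = vec![
        Tok::LPr, Tok::Atom("add".into()), Tok::Atom("1".into()),
        Tok::LPr, Tok::Atom("mul".into()), Tok::Atom("2".into()), Tok::RPr,
        Tok::RPr, Tok::Eof,
    ];
    let mut pos = 0;
    let elems = parse_sexp(&toks, &mut pos).expect("parse failed");
    assert_eq!(elems.len(), 3);
    println!("{elems:?}");
}
```

Each EBNF rule maps to one function, and the {Elem} repetition becomes the while loop in parse_sexp; with laps, that mapping falls out of the field and variant types of the derived structs/enums instead.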

More Examples

See the examples directory, which contains the following examples:

  • sexp: an S-expression parser.
  • calc: a simple expression calculator.
  • json: a simple JSON parser.
  • clike: an interpreter for a C-like programming language.

Accelerating Code Completion for IDEs

By default, Cargo does not enable optimizations for procedural macros, which may result in slower code completion if you are using laps to generate lexers. To avoid this, you can add the following configuration to Cargo.toml:

[profile.dev.build-override]
opt-level = 3

You can also try to manually enable/disable parallelization for lexer generation by adding:

#[derive(Tokenize)]
#[enable_par(true)] // or #[enable_par(false)]
enum TokenKind {
  // ...
}

Modules
  • Some common predefined AST structures that can be used in parsers.
  • Utilities for constructing lexers.
  • Implementations for constructing lexers.
  • Implementations for constructing parsers.
  • A prelude of some common traits and macros in laps (if the macros feature is enabled).
  • Reader-related implementations for lexers.
  • Span (Span) and error (Error) related implementations.
  • Token (Token) related implementations, including the tokenizer (Tokenizer) and token stream (TokenStream).