§rsedn - Rust EDN parser
rsedn is a parser for the EDN data format written in Rust.
§Example
fn main() {
    use std::collections::LinkedList;
    use rsedn::{
        lexer::{source::Source, token::Token},
        parser::{self, form::FormKind},
    };

    // A Source can be created from a &str
    let mut source: Source = "(defn add [a b] (+ a b))".into();
    // Lex the source into Vec<Lexeme>
    let lexemes = source.lex();
    // Parse the lexemes into a LinkedList<Token>
    let tokens = lexemes
        .into_iter()
        .map(|lexeme| Token::parse(&source, lexeme)) // Parse the lexeme into a Token
        .map(|token| token.unwrap()) // Unwrap the Result<Token, ParsingError>
        .collect::<LinkedList<_>>();
    let mut token_stream = tokens.iter(); // Create a TokenStream from the LinkedList
    let form = parser::parse_form(&mut token_stream).unwrap().unwrap(); // Parse the tokens into a Form
    assert!(matches!(form.kind, FormKind::List(_)));
}

§Usage
- Take your source code as a &str
- Create a Source from the &str using rsedn::Source::from
- Lex the Source using Source::lex; this produces a Vec<Lexeme>
- Parse each Lexeme into a Token using Token::parse
- Collect the Tokens into a LinkedList<Token>
- Create a TokenStream from the LinkedList<Token> using LinkedList::iter
- Consume the TokenStream using consume_token_stream to produce a Result<Option<Form>, ParsingError> (a sketch of these steps with explicit error handling follows this list)
- Use the Source and the Lexeme to get the span of a given Lexeme
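
The steps above can also be written without unwrap(). The sketch below is a minimal variant of the example, using only the calls shown there; the one assumption beyond the example is that ParsingError implements Debug so errors can be printed with {:?}.

use std::collections::LinkedList;
use rsedn::{
    lexer::{source::Source, token::Token},
    parser::{self, form::FormKind},
};

fn main() {
    let mut source: Source = "(defn add [a b] (+ a b))".into();
    let lexemes = source.lex();
    // Collecting into Result<LinkedList<_>, _> stops at the first lexeme
    // that fails to parse into a Token.
    let tokens: Result<LinkedList<_>, _> = lexemes
        .into_iter()
        .map(|lexeme| Token::parse(&source, lexeme))
        .collect();
    match tokens {
        // Errors are printed via Debug formatting (assumed to be available for ParsingError)
        Err(err) => eprintln!("failed to parse a token: {err:?}"),
        Ok(tokens) => {
            let mut token_stream = tokens.iter();
            // Parse the tokens into a Form, handling both a parse error and empty input
            match parser::parse_form(&mut token_stream) {
                Err(err) => eprintln!("failed to parse a form: {err:?}"),
                Ok(None) => println!("no form found in the source"),
                Ok(Some(form)) => assert!(matches!(form.kind, FormKind::List(_))),
            }
        }
    }
}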
Modules§
- lexer - This is the lexer module. Source code is read as a Chars iterator and tokenized into Tokens.
- parser - Takes tokens and builds an AST of Forms (see the sketch after this list).
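
The AST the parser produces can be inspected by matching on a Form's kind field. The sketch below assumes only what the example above shows: that Form lives alongside FormKind in rsedn::parser::form and that FormKind has a List variant; every other variant falls through to the wildcard arm.

// Form is assumed to live alongside FormKind in rsedn::parser::form
use rsedn::parser::form::{Form, FormKind};

// Describe a parsed Form; only FormKind::List is confirmed by the example
// above, so everything else is handled by the wildcard arm.
fn describe(form: &Form) -> &'static str {
    match &form.kind {
        FormKind::List(_) => "a list form, e.g. (defn add [a b] (+ a b))",
        _ => "some other kind of form",
    }
}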
Functions§
- consume_token_stream - Consumes a TokenStream to produce a Result<Option<Form>, ParsingError>. The final step of the parsing process.
- lex_source - Lexes a Source into a Vec<Lexeme>. The second step of the parsing process.
- parse_lexeme - Parses a Lexeme into a Token, using the Source to get the span. The third step of the parsing process.
- source_from_str - Produces a Source from a &str. The first step of the parsing process (a sketch composing all four functions follows this list).
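
The free functions above mirror the method-based example. The sketch below composes them in the documented order (first to final step); the exact parameter order, mutability, and their availability at the crate root are assumptions inferred from the example and the one-line descriptions above, so check each function's own page before relying on it.

use std::collections::LinkedList;
use rsedn::parser::form::FormKind;

fn main() {
    // First step: produce a Source from a &str
    let mut source = rsedn::source_from_str("(+ 1 2)");
    // Second step: lex the Source into a Vec<Lexeme>
    // (mutable borrow is assumed, to match the Source::lex method above)
    let lexemes = rsedn::lex_source(&mut source);
    // Third step: parse each Lexeme into a Token, using the Source for spans
    // (argument order is assumed to match Token::parse above)
    let tokens: LinkedList<_> = lexemes
        .into_iter()
        .map(|lexeme| rsedn::parse_lexeme(&source, lexeme).unwrap())
        .collect();
    // Final step: consume the TokenStream to produce a Form
    let form = rsedn::consume_token_stream(&mut tokens.iter())
        .unwrap()
        .unwrap();
    assert!(matches!(form.kind, FormKind::List(_)));
}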