
Function tokenize 

pub fn tokenize(input: &str) -> Result<Vec<(Token, u32)>>

Lex a structprop input string into a flat Vec of Tokens, each paired with its 1-indexed source line number.

Comments (# … \n) and insignificant whitespace (spaces, tabs, carriage returns, and newlines) are discarded. The returned vector always ends with Token::Eof.

§Errors

Returns crate::Error::Parse if the input contains u32::MAX or more newlines (i.e. the file exceeds u32::MAX lines).
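The overflow guard this error describes can be sketched as a checked increment on the 1-indexed line counter. This is illustrative only, not the crate's actual code; `bump_line` and its error type are hypothetical names.

```rust
// Hypothetical sketch of the line-overflow guard: advancing past a newline
// fails once the 1-indexed line count would exceed u32::MAX.
fn bump_line(line: u32) -> Result<u32, String> {
    line.checked_add(1)
        .ok_or_else(|| "input exceeds u32::MAX lines".to_string())
}

fn main() {
    assert_eq!(bump_line(1), Ok(2));
    assert!(bump_line(u32::MAX).is_err());
    println!("ok");
}
```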

§Examples

use serde_structprop::lexer::{tokenize, Token};

let tokens = tokenize("key = value").unwrap();
assert_eq!(tokens, vec![
    (Token::Term("key".into()), 1),
    (Token::Eq, 1),
    (Token::Term("value".into()), 1),
    (Token::Eof, 1),
]);
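The behavior documented above (comments and insignificant whitespace discarded, each token tagged with its 1-indexed line, a trailing `Token::Eof`) can be sketched as a minimal self-contained lexer. This is an assumption-laden illustration, not the crate's implementation: the real `tokenize` returns a `Result`, and its `Token` type and term rules may differ.

```rust
// Illustrative-only token type mirroring the variants used in the example.
#[derive(Debug, PartialEq)]
enum Token {
    Term(String),
    Eq,
    Eof,
}

// Sketch of the documented behavior: skip comments and whitespace,
// track the 1-indexed line, and terminate with Token::Eof.
fn tokenize(input: &str) -> Vec<(Token, u32)> {
    let mut tokens = Vec::new();
    let mut line: u32 = 1;
    let mut chars = input.chars().peekable();
    while let Some(&c) = chars.peek() {
        match c {
            // Newlines advance the line counter and are otherwise discarded.
            '\n' => {
                line += 1;
                chars.next();
            }
            // Insignificant whitespace is discarded.
            ' ' | '\t' | '\r' => {
                chars.next();
            }
            // Comments run from '#' to the end of the line.
            '#' => {
                while let Some(&c) = chars.peek() {
                    if c == '\n' {
                        break;
                    }
                    chars.next();
                }
            }
            '=' => {
                tokens.push((Token::Eq, line));
                chars.next();
            }
            // Anything else is collected into a bare term.
            _ => {
                let mut term = String::new();
                while let Some(&c) = chars.peek() {
                    if c.is_whitespace() || c == '=' || c == '#' {
                        break;
                    }
                    term.push(c);
                    chars.next();
                }
                tokens.push((Token::Term(term), line));
            }
        }
    }
    tokens.push((Token::Eof, line));
    tokens
}

fn main() {
    // A comment on line 1 is skipped; the tokens on line 2 carry line 2.
    let tokens = tokenize("# comment\nkey = value");
    assert_eq!(tokens[0], (Token::Term("key".into()), 2));
    assert_eq!(tokens[3], (Token::Eof, 2));
    println!("{:?}", tokens);
}
```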