Function tokenize 

pub fn tokenize<I, L>(iter: I, lexer: L) -> Tokenized<Peekable<I>, L>
where
    I: Iterator<Item = char>,

Split a stream of characters into tokens separated by whitespace. Comments are ignored.
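The behavior described above can be sketched with a minimal, self-contained tokenizer. This is an illustrative stand-in, not the crate's actual implementation: the `Lexer` parameter and the `Tokenized` iterator are omitted, the `simple_tokenize` helper is hypothetical, and `#` as the line-comment marker is an assumption.

```rust
/// Hypothetical sketch of whitespace tokenization with comment skipping.
/// Assumes `#` begins a comment that runs to the end of the line.
fn simple_tokenize(input: &str) -> Vec<String> {
    let mut tokens = Vec::new();
    let mut chars = input.chars().peekable();
    let mut current = String::new();
    while let Some(c) = chars.next() {
        if c == '#' {
            // Skip the rest of the line; the newline itself is left in
            // place and handled as whitespace on the next iteration.
            while let Some(&next) = chars.peek() {
                if next == '\n' {
                    break;
                }
                chars.next();
            }
        } else if c.is_whitespace() {
            // Whitespace ends the current token, if any.
            if !current.is_empty() {
                tokens.push(std::mem::take(&mut current));
            }
        } else {
            current.push(c);
        }
    }
    if !current.is_empty() {
        tokens.push(current);
    }
    tokens
}

fn main() {
    let toks = simple_tokenize("let x = 1 # comment\nprint x");
    println!("{:?}", toks); // ["let", "x", "=", "1", "print", "x"]
}
```

Note the use of `Peekable` to look ahead one character without consuming it, mirroring the `Peekable<I>` in `tokenize`'s return type.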