pub fn tokenize<I, L>(iter: I, lexer: L) -> Tokenized<Peekable<I>, L>
where
    I: Iterator<Item = char>,
Split a stream of characters into tokens separated by whitespace. Comments are ignored.
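The sketch below is an illustration of the described behavior only, not this crate's API or implementation: it shows how whitespace-separated tokens can be pulled out of a character iterator via `Peekable`, with comment handling omitted. The function name `split_on_whitespace` and its return type are hypothetical; for real use, call `tokenize` with a lexer value satisfying the (elided) bound on `L`.

```rust
use std::iter::Peekable;

/// Illustration only: collect whitespace-separated tokens from a character
/// iterator, mirroring the behavior described above (comments not handled).
fn split_on_whitespace<I: Iterator<Item = char>>(iter: I) -> Vec<String> {
    let mut iter: Peekable<I> = iter.peekable();
    let mut tokens = Vec::new();
    while iter.peek().is_some() {
        // Skip the run of whitespace separating tokens.
        while iter.peek().map_or(false, |c| c.is_whitespace()) {
            iter.next();
        }
        // Accumulate characters until the next whitespace or end of input.
        let mut token = String::new();
        while let Some(&c) = iter.peek() {
            if c.is_whitespace() {
                break;
            }
            token.push(c);
            iter.next();
        }
        if !token.is_empty() {
            tokens.push(token);
        }
    }
    tokens
}

fn main() {
    let input = "let x = 42;".chars();
    assert_eq!(split_on_whitespace(input), vec!["let", "x", "=", "42;"]);
}
```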