Splitting the input stream into a sequence of tokens
cmdparse's parsers do not work on the input string directly. Instead, they operate on the
token stream: an iterator-like sequence of tokens, each representing either a text payload or an
attribute name (a token preceded by `--`). The token stream does not include whitespace or
comments (the substring beginning at the first octothorp, `#`, in the input). A token may
include whitespace characters if its content is surrounded by quotation marks (either `'` or
`"`).
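To make the splitting rules concrete, here is a deliberately simplified, self-contained sketch of them. It is not the crate's implementation: the real `TokenStream` is lazy and immutable, and the `tokenize` function and `Tok` enum below are illustrative names only.

```rust
// Simplified sketch of the splitting rules: whitespace separates tokens,
// a `--` prefix marks an attribute, quotes allow embedded whitespace, and
// a comment (here recognised only at a token boundary) runs to the end of
// the input. Not the crate's actual implementation.
#[derive(Debug, PartialEq)]
enum Tok<'a> {
    Text(&'a str),
    Attribute(&'a str),
}

fn tokenize(input: &str) -> Vec<Tok<'_>> {
    let mut tokens = Vec::new();
    let bytes = input.as_bytes();
    let mut i = 0;
    while i < bytes.len() {
        let c = bytes[i] as char;
        if c.is_whitespace() {
            i += 1; // whitespace between tokens is discarded
        } else if c == '#' {
            break; // the rest of the input is a comment
        } else if c == '"' || c == '\'' {
            // a quoted payload may contain whitespace; keep the quotes
            // and honour backslash escapes
            let quote = c;
            let start = i;
            i += 1;
            while i < bytes.len() {
                let ch = bytes[i] as char;
                i += 1;
                if ch == '\\' {
                    if i < bytes.len() {
                        i += 1; // skip the escaped character
                    }
                } else if ch == quote {
                    break;
                }
            }
            tokens.push(Tok::Text(&input[start..i]));
        } else {
            let start = i;
            while i < bytes.len() && !(bytes[i] as char).is_whitespace() {
                i += 1;
            }
            let word = &input[start..i];
            match word.strip_prefix("--") {
                Some(name) => tokens.push(Tok::Attribute(name)),
                None => tokens.push(Tok::Text(word)),
            }
        }
    }
    tokens
}
```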
For example, consider the following string and the sequence of tokens it will be parsed into:
```rust
let input = r#"send-message --to user@example.com --subject "Hello, \"world\"" # sending an email"#;
let mut token_stream = TokenStream::new(input);
let mut tokens = Vec::new();
while let Some(result) = token_stream.take() {
    let (token, stream) = result?;
    token_stream = stream;
    tokens.push(token);
}

let expected = vec![
    Token::Text(RawLexeme::new("send-message")),
    Token::Attribute(RawLexeme::new("to")),
    Token::Text(RawLexeme::new("user@example.com")),
    Token::Attribute(RawLexeme::new("subject")),
    Token::Text(RawLexeme::new(r#""Hello, \"world\"""#)),
];
assert_eq!(tokens, expected);
```

Note the following:
- The token stream is represented by an instance of `TokenStream`. It is immutable (the `take` method returns another instance representing the remainder of the input stream).
- Each `Token` is either a `Text` or an `Attribute`. All whitespace and comments are discarded from the stream.
- The contents of a token is a `RawLexeme`, a thin wrapper around a slice of the input string. Each `RawLexeme` can be parsed into its intended representation:
```rust
let lexeme = RawLexeme::new(r#""Hello, \"world\"""#);
assert_eq!(&lexeme.parse_string(), r#"Hello, "world""#);
```

Structs§
- `RawLexeme`: A wrapper type for a slice of the input string corresponding to a single lexeme.
- `TokenStream`: Representation of the input as a sequence of tokens.
- `UnbalancedParenthesis`: An error indicating that a parenthesis was encountered when trying to take a token from the token stream.
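The `Option<Result<…>>` contract that `take` follows (seen in the loop above) can be modelled with a toy, self-contained function. The names `next_token` and this simplified `UnbalancedParenthesis` are illustrative assumptions, not the crate's real implementation: like `TokenStream::take`, the sketch returns `None` at end of input and an error when a stray parenthesis is found.

```rust
// Toy model of the `take` error contract: `None` means the input is
// exhausted, `Some(Err(..))` reports a stray parenthesis, and `Some(Ok(..))`
// yields a token together with the remainder of the input.
#[derive(Debug, PartialEq)]
struct UnbalancedParenthesis;

fn next_token(rest: &str) -> Option<Result<(&str, &str), UnbalancedParenthesis>> {
    let rest = rest.trim_start();
    if rest.is_empty() {
        return None; // end of the stream
    }
    if rest.starts_with(')') {
        return Some(Err(UnbalancedParenthesis));
    }
    // take everything up to the next whitespace character
    let end = rest.find(char::is_whitespace).unwrap_or(rest.len());
    Some(Ok((&rest[..end], &rest[end..])))
}
```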
Enums§
- `Token`: An item of the token stream.
Functions§
- `complete_variants`: Computes all possible completions for a `token` from the set of possible complete tokens `variants`.
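The exact signature of `complete_variants` is not shown here, but the idea can be sketched as plain prefix matching. The `completions` function below is a hypothetical illustration under that assumption, not the crate's API:

```rust
// Hedged sketch of completion in the spirit of `complete_variants`:
// keep every variant that the partial token is a prefix of. The real
// function's signature and return type may differ.
fn completions<'a>(token: &str, variants: &[&'a str]) -> Vec<&'a str> {
    variants
        .iter()
        .filter(|v| v.starts_with(token))
        .copied()
        .collect()
}
```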