This crate implements a queryable, streaming, JSON pull parser.
- Pull parser? The parser is implemented as an iterator that emits tokens (a minimal sketch follows this list).
- Streaming? The JSON document being parsed is never fully loaded into memory. It is read & validated byte by byte, which makes it ideal for dealing with large JSON documents.
- Queryable? You can configure the parser to only emit & allocate tokens for the parts of the input you are interested in.
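For example, here is a minimal sketch of pulling every token from an in-memory document. It assumes that, with no mask configured, every token is emitted; the document and the counting are purely illustrative.

use jsn::TokenReader;
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
    let data = r#"{ "a": [1, 2] }"#;
    // The parser is just an iterator: each call to `next()` pulls one token,
    // validating the input byte by byte as it goes.
    let mut tokens = TokenReader::new(data.as_bytes()).into_iter();
    let mut count = 0;
    while let Some(token) = tokens.next() {
        let _token = token?; // each item is a parse result
        count += 1;
    }
    println!("pulled {count} tokens");
    Ok(())
}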
JSON is expected to conform to RFC 8259. However, newline-delimited JSON and concatenated JSON formats are also supported.
Input can be anything that implements the Read trait (e.g. a file, byte slice, network socket, etc.).
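For instance, a hedged sketch of reading straight from a file: "people.json" is a placeholder path, and leaving .with_format unset is assumed here to select the default single-document RFC 8259 format.

use jsn::{TokenReader, mask::key};
use std::{error::Error, fs::File, io::BufReader};

fn main() -> Result<(), Box<dyn Error>> {
    // Any `Read` implementor works as input; wrapping the file in a `BufReader`
    // keeps the byte-by-byte reads cheap.
    let file = BufReader::new(File::open("people.json")?);
    let iter = TokenReader::new(file)
        .with_mask(key("name")) // only "name" values are emitted & allocated
        .into_iter();
    for token in iter {
        println!("{:?}", token?);
    }
    Ok(())
}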
Basic Usage
use jsn::{TokenReader, mask::*, Format};
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
    let data = r#"
        {
            "name": "John Doe",
            "age": 43,
            "nicknames": [ "joe" ],
            "phone": {
                "carrier": "Verizon",
                "numbers": [ "+44 1234567", "+44 2345678" ]
            }
        }
        {
            "name": "Jane Doe",
            "age": 32,
            "nicknames": [ "J" ],
            "phone": {
                "carrier": "AT&T",
                "numbers": ["+33 38339"]
            }
        }
    "#;

    let mask = key("numbers").and(index(0))
        .or(key("name"))
        .or(key("age"));

    let mut iter = TokenReader::new(data.as_bytes())
        .with_mask(mask)
        .with_format(Format::Concatenated)
        .into_iter();

    assert_eq!(iter.next().unwrap()?, "John Doe");
    assert_eq!(iter.next().unwrap()?, 43);
    assert_eq!(iter.next().unwrap()?, "+44 1234567");
    assert_eq!(iter.next().unwrap()?, "Jane Doe");
    assert_eq!(iter.next().unwrap()?, 32);
    assert_eq!(iter.next().unwrap()?, "+33 38339");
    assert_eq!(iter.next(), None);
    Ok(())
}
A few things to notice and whet your appetite:
- This is concatenated JSON: two top-level documents appear back to back, which is why the reader is configured with Format::Concatenated.
- We only pay to heap-allocate the tokens we extracted (i.e. the first index in the “numbers” array and the “name” and “age” values).
- You can compare tokens to native Rust types.
- Token masks match anywhere in the JSON: even though the top-level value was an object, we used the index mask to match the token at index 0 of an array nested inside the “phone” object (see the sketch below).
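To make that last point concrete, here is a short sketch reusing only the calls shown above: the same key mask finds the “carrier” values even though they are nested inside the “phone” objects; the abbreviated data is illustrative.

use jsn::{TokenReader, mask::key, Format};
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
    let data = r#"
        { "phone": { "carrier": "Verizon" } }
        { "phone": { "carrier": "AT&T" } }
    "#;
    // `key("carrier")` matches wherever a "carrier" key appears, regardless of
    // how deeply it is nested in each document.
    let mut iter = TokenReader::new(data.as_bytes())
        .with_mask(key("carrier"))
        .with_format(Format::Concatenated)
        .into_iter();
    assert_eq!(iter.next().unwrap()?, "Verizon");
    assert_eq!(iter.next().unwrap()?, "AT&T");
    assert_eq!(iter.next(), None);
    Ok(())
}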
Modules
- Token masks are to JSON tokens what bitmasks are to bits.
Structs
- Result of calling Tokens::dry_run
- Bundles the reason & location of a parsing error
- Current position in the JSON input
- A streaming JSON token pull parser
- An iterator over JSON Tokens.
Enums
- Format of the JSON input
- All the possible reasons parsing JSON might fail
- A JSON token. You can convert a token to a Rust type using Token::get().
Traits
- A trait for types that can be created from a JSON token