§qjsonrs
A quick JSON tokenizer.
This crate is intended to be used to quickly tokenize a stream of JSON data. It merely emits tokens; it does not parse the JSON into larger structures.
This is useful for extracting data from massive arrays, or for quickly scanning JSON objects where you only care about certain keys.
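To illustrate that key-scanning pattern, here is a minimal self-contained sketch. Note that `Token` and `value_for_key` are simplified stand-ins invented for this example, not part of the qjsonrs API; the real crate yields `JsonToken`s one at a time through `JsonTokenIterator::next` rather than from a slice.

```rust
// Illustrative sketch: scan a flat token stream for one key's value.
// `Token` is a simplified stand-in for qjsonrs's JsonToken.
#[derive(Debug, PartialEq)]
enum Token<'a> {
    StartObject,
    EndObject,
    JsKey(&'a str),
    JsNumber(&'a str),
}

// Return the number token that immediately follows `key`, if any.
fn value_for_key<'a>(tokens: &[Token<'a>], key: &str) -> Option<&'a str> {
    let mut iter = tokens.iter();
    while let Some(tok) = iter.next() {
        if let Token::JsKey(k) = tok {
            if *k == key {
                if let Some(Token::JsNumber(n)) = iter.next() {
                    return Some(*n);
                }
            }
        }
    }
    None
}

fn main() {
    // Token sequence for {"a": 1, "b": 2}.
    let tokens = vec![
        Token::StartObject,
        Token::JsKey("a"),
        Token::JsNumber("1"),
        Token::JsKey("b"),
        Token::JsNumber("2"),
        Token::EndObject,
    ];
    assert_eq!(value_for_key(&tokens, "b"), Some("2"));
    println!("b = {}", value_for_key(&tokens, "b").unwrap());
}
```

Because only the matched key's value is pulled out, every other token can be dropped as soon as it is seen, which is what keeps token-level scanning cheap on large inputs.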
§Examples:
§Simple usage:
```rust
use qjsonrs::{
    JsonStream,
    JsonToken::{
        StartObject,
        EndObject,
        StartArray,
        EndArray,
        JsKey,
        JsNumber,
    },
    JsonTokenIterator,
};

let mut stream = JsonStream::from_read(&b"{\"test\": 1, \"arr\": []}"[..], 256)?;
assert_eq!(stream.next()?.unwrap(), StartObject);
assert_eq!(stream.next()?.unwrap(), JsKey("test".into()));
assert_eq!(stream.next()?.unwrap(), JsNumber("1"));
assert_eq!(stream.next()?.unwrap(), JsKey("arr".into()));
assert_eq!(stream.next()?.unwrap(), StartArray);
assert_eq!(stream.next()?.unwrap(), EndArray);
assert_eq!(stream.next()?.unwrap(), EndObject);
assert_eq!(stream.next()?, None);
```
§Count size of JSON array:
```rust
fn array_size(stream: &mut impl JsonTokenIterator) -> Result<usize, Error> {
    assert_eq!(stream.next()?.unwrap(), StartArray);
    let mut size = 0;
    let mut depth = 0;
    loop {
        match stream.next()? {
            Some(StartObject) => { if depth == 0 { size += 1; } depth += 1; },
            Some(EndObject) => { assert!(depth > 0); depth -= 1; },
            Some(StartArray) => { if depth == 0 { size += 1; } depth += 1; },
            Some(EndArray) => { if depth == 0 { break; } else { depth -= 1; } },
            Some(_) => { if depth == 0 { size += 1; } },
            None => { panic!("Early termination"); },
        }
    }
    Ok(size)
}

let mut stream = JsonStream::from_read(&b"[1, [2], 3, {\"a\": [4]}, 5, 6]"[..], 256)?;
assert_eq!(array_size(&mut stream)?, 6);
assert_eq!(stream.next()?, None);
```
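The same depth-counting idea generalizes to skipping over any single value, which is handy when you only care about some keys. The sketch below is self-contained: `Token` and `skip_value` are simplified stand-ins invented for illustration, not the crate's `JsonToken` or any qjsonrs API.

```rust
// Illustrative sketch of the depth-counting idea as a reusable helper:
// consume exactly one complete JSON value from a token stream.
// `Token` is a simplified stand-in for qjsonrs's JsonToken.
enum Token {
    StartObject,
    EndObject,
    StartArray,
    EndArray,
    Scalar, // any string/number/bool/null token
}

// Advance `iter` past one whole value, returning how many tokens were consumed.
fn skip_value<I: Iterator<Item = Token>>(iter: &mut I) -> usize {
    let mut consumed = 0;
    let mut depth = 0;
    while let Some(tok) = iter.next() {
        consumed += 1;
        match tok {
            Token::StartObject | Token::StartArray => depth += 1,
            Token::EndObject | Token::EndArray => depth -= 1,
            Token::Scalar => {}
        }
        // A value is complete once all opened containers are closed
        // (a bare scalar completes immediately at depth 0).
        if depth == 0 {
            break;
        }
    }
    consumed
}

fn main() {
    // Token sequence for the object {"a": 1}: skipping it should
    // consume all three tokens.
    let tokens = vec![Token::StartObject, Token::Scalar, Token::EndObject];
    let mut it = tokens.into_iter();
    assert_eq!(skip_value(&mut it), 3);
    println!("ok");
}
```

With a helper like this, a caller can match on `JsKey` tokens and skip the following value whenever the key is not one it cares about.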
Structs§
- `JsonStream`: A stream of JSON tokens.
- `JsString`: A raw JSON string (with escapes).
Enums§
- `Error`: The error type for this crate.
- `JsonToken`: A token from a stream of JSON.
Traits§
- `JsonTokenIterator`: Trait for an iterator over JsonTokens.