MessagePacker - a no-std msgpack implementation
The protocol specification can be found [here](https://github.com/msgpack/msgpack/blob/master/spec.md).
This crate targets simplicity and performance. No dependencies are used, just the standard Rust library.
It implements `Packable` and `Unpackable` for Rust atomic types. The traits can also be implemented manually.
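For instance, a round trip through a byte buffer with one of those atomic types looks like this (a minimal sketch, mirroring the `pack`/`unpack` calls used in the example further down; the value is arbitrary):

```rust
use msgpacker::prelude::*;

// pack a primitive; `pack` returns the number of bytes written
let mut buf = Vec::new();
let n = 1987u64.pack(&mut buf);

// unpack it back, recovering the number of bytes read and the value
let (m, value) = u64::unpack(&buf).unwrap();
assert_eq!(n, m);
assert_eq!(value, 1987);
```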
Features
- `alloc`: Implements the functionality for `Vec` and `String`, and unlocks custom extensions.
- `derive`: Enables the `MsgPacker` derive convenience macro.
- `strict`: Panics if there is a protocol violation of the size of a buffer; the maximum allowed size is `u32::MAX`.
- `std`: Implements `Packable` and `Unpackable` for `std` collections.
- `serde`: Adds support for serde.
Non-uniform collections
MessagePack is a language-agnostic format. Dynamically typed languages like Python, JavaScript, and Ruby naturally allow mixed-type collections — for instance, a Python list [0, 1694166331209.0] containing both an integer and a float is perfectly valid. When these values are serialized into MessagePack, the resulting byte stream encodes each element with its own type tag (u64, f64, etc.), producing an array whose elements have heterogeneous types.
Rust's type system does not directly support such collections: a Vec<T> requires a single concrete T. As noted in #18, the native Packable/Unpackable traits cannot deserialize these non-uniform arrays because they rely on a statically known element type at compile time.
The serde feature provides a workaround: deserialize the MessagePack bytes into serde_json::Value, which is a dynamically typed enum that can represent any JSON-compatible value. This will incur performance overhead compared to the native traits, since serde uses a visitor pattern that involves runtime type dispatch and heap allocations for every element.
```rust
use msgpacker::from_slice;
use serde_json::Value;

// MessagePack bytes encoding a 2-element array: [0_u64, 1694166331209.0_f64]
// This kind of payload is common when receiving data from Python, JS, or other
// dynamically typed languages that don't distinguish collection element types.
let bytes: &[u8] = &[
    0x92, // fixarray with 2 elements
    0x00, // positive fixint: 0
    0xcb, 0x42, 0x78, 0xa7, 0x42, 0xea, 0xf4, 0x90, 0x00, // float 64: 1694166331209.0
];

// Deserialize into a dynamic Value — works for any valid MessagePack payload
let value: Value = from_slice(bytes).unwrap();
let items = value.as_array().unwrap();

// Each element retains its original type
assert!(items[0].is_u64());
assert!(items[1].is_f64());
```
If your use case involves only uniform collections (e.g. Vec<u64>), prefer the native Packable/Unpackable traits for zero-overhead deserialization.
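As a minimal sketch of that uniform path (reusing the same `pack`/`unpack` API shown in the example below, with an illustrative `Vec<u64>` payload):

```rust
use msgpacker::prelude::*;

// a uniform collection has a statically known element type,
// so no dynamic `Value` is needed in between
let values: Vec<u64> = vec![1, 2, 3];

let mut buf = Vec::new();
values.pack(&mut buf);

let (_, decoded) = Vec::<u64>::unpack(&buf).unwrap();
assert_eq!(values, decoded);
```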
Example
```rust
use msgpacker::prelude::*;
use std::collections::HashMap;

// boilerplate derives - those aren't required
#[derive(Debug, PartialEq)]
// this convenience derive macro will implement `Packable` and `Unpackable`
#[derive(MsgPacker)]
pub struct City {
    name: String,
    inhabitants_per_street: HashMap<String, u64>,
    zones: Vec<String>,
}

// create an instance of a city.
let city = City {
    name: "Kuala Lumpur".to_string(),
    inhabitants_per_street: HashMap::from([("Street 1".to_string(), 10)]),
    zones: vec!["Zone A".to_string()],
};

// serialize the city into bytes
let mut buf = Vec::new();
let n = city.pack(&mut buf);
println!("packed {} bytes", n);

// deserialize the city and assert correctness
let (m, deserialized) = City::unpack(&buf).unwrap();
println!("unpacked {} bytes", m);
assert_eq!(city, deserialized);
```
Serde
Version 0.5.0 introduces serde support.
```rust
use msgpacker::{from_slice, to_vec};
use serde_json::{json, Value};

// an arbitrary example payload
let val = json!({ "name": "msgpacker", "versions": [4, 5] });

let ser = to_vec(&val);
let des: Value = from_slice(&ser).unwrap();
assert_eq!(val, des);
```
Serialization through serde is notably slower than the native traits, primarily because serde drives a visitor pattern at runtime instead of relying solely on the static structure of the declarations. However, serde is broadly used, and enabling this feature automatically makes msgpacker compatible with the plethora of libraries built on top of it.
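For example, any type that already derives serde's `Serialize`/`Deserialize` can go through the same helpers. The `Reading` struct below is a hypothetical stand-in for a type owned by some other serde-based library, and the sketch assumes the `to_vec`/`from_slice` functions shown above:

```rust
use msgpacker::{from_slice, to_vec};
use serde::{Deserialize, Serialize};

// a hypothetical type that only knows about serde, not msgpacker
#[derive(Debug, PartialEq, Serialize, Deserialize)]
struct Reading {
    sensor: String,
    celsius: f64,
}

let reading = Reading { sensor: "outdoor".into(), celsius: 21.5 };

// round-trip through MessagePack using the serde-based helpers
let bytes = to_vec(&reading);
let back: Reading = from_slice(&bytes).unwrap();
assert_eq!(reading, back);
```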
For more information, refer to Benchmarks.
Benchmarks
Results obtained with an AMD EPYC 7402P 24-Core Processor.
| bench | msgpacker | msgpacker serde | rmps |
|---|---|---|---|
| pack 1 | 814.26 ns | 146.61 µs | 191.35 µs |
| pack 10 | 7.5772 µs | 905.06 µs | 1.2513 ms |
| pack 1000 | 4.8276 ms | 88.910 ms | 116.45 ms |
| unpack 1 | 1.0011 µs | 164.48 µs | 270.51 µs |
| unpack 10 | 11.017 µs | 1.1243 ms | 1.6636 ms |
| unpack 1000 | 14.546 ms | 112.81 ms | 164.89 ms |
To run the benchmarks:
```sh
cargo bench
```
Note on GitHub
Although GitHub offers exceptional CI and hosting services virtually for free, its approach to user sovereignty and privacy is questionable. Consequently, I chose to disengage from its infrastructure.
For more information, check [Give Up GitHub!](https://sfconservancy.org/GiveUpGitHub/)