# Crate rdkafka
## rust-rdkafka

Kafka client library for Rust based on librdkafka.

The library rust-rdkafka provides a safe Rust interface to librdkafka. It is currently based on librdkafka 0.9.4.
## Features

The main features provided at the moment are:
- Support for Kafka 0.8.x, 0.9.x and 0.10.x (timestamp support coming soon). For more information about broker compatibility options, check the librdkafka documentation.
- Consume from single or multiple topics.
- Automatic consumer rebalancing.
- Customizable rebalance, with pre and post rebalance callbacks.
- Offset commit.
- Message production.
- Access to cluster metadata (list of topic-partitions, replicas, active brokers, etc.).
- Access to group metadata (list groups, list members of groups, hostnames, etc.).
- Access to producer and consumer metrics and statistics.
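As a rough sketch of the cluster metadata access mentioned above — identifiers here follow later rust-rdkafka releases than the 0.9 documented on this page (in particular, older versions take the timeout as an integer of milliseconds rather than a `Duration`), and `localhost:9092` and the topic iteration are illustrative assumptions:

```rust
use std::time::Duration;

use rdkafka::config::ClientConfig;
use rdkafka::consumer::{BaseConsumer, Consumer};

fn print_metadata() {
    // Any client can fetch metadata; a BaseConsumer is used here.
    let consumer: BaseConsumer = ClientConfig::new()
        .set("bootstrap.servers", "localhost:9092") // assumed local broker
        .create()
        .expect("consumer creation failed");

    // `None` requests metadata for all topics; the second argument is a timeout.
    let metadata = consumer
        .fetch_metadata(None, Duration::from_secs(5))
        .expect("failed to fetch metadata");

    for topic in metadata.topics() {
        println!(
            "topic {} has {} partitions",
            topic.name(),
            topic.partitions().len()
        );
    }
}
```

This requires a reachable broker, so it is an integration sketch rather than something runnable in isolation.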
## Client types

rust-rdkafka provides low level and high level consumers and producers.

Low level:

- `BaseConsumer`: a simple wrapper around the librdkafka consumer. It needs to be `poll()`ed periodically in order to execute callbacks, handle rebalances, and receive messages.
- `BaseProducer`: a simple wrapper around the librdkafka producer. As in the consumer case, the user must call `poll()` periodically to execute delivery callbacks.
High level:

- `StreamConsumer`: returns a `stream` of messages and takes care of polling the consumer internally.
- `FutureProducer`: returns a `future` that will be completed once the message is delivered to Kafka (or the delivery fails).
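A version-dependent sketch of the `FutureProducer` flow described above. The record-building API in particular has changed across rust-rdkafka releases (`FutureRecord` and `async`/`await` come from later versions than the 0.9 documented here, which used futures 0.1), and the topic, key, and payload are illustrative assumptions:

```rust
use std::time::Duration;

use rdkafka::config::ClientConfig;
use rdkafka::producer::{FutureProducer, FutureRecord};

async fn produce_one() {
    let producer: FutureProducer = ClientConfig::new()
        .set("bootstrap.servers", "localhost:9092") // assumed local broker
        .create()
        .expect("producer creation failed");

    // send() returns a future that completes once the message is
    // delivered to Kafka (or delivery fails).
    let delivery = producer
        .send(
            FutureRecord::to("my_topic").payload("hello").key("key"),
            Duration::from_secs(5), // how long to retry on a full queue
        )
        .await;

    match delivery {
        Ok((partition, offset)) => {
            println!("delivered to partition {} at offset {}", partition, offset)
        }
        Err((e, _unsent_message)) => eprintln!("delivery failed: {}", e),
    }
}
```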
Warning: the library is under active development and the APIs are likely to change.
## Asynchronous data processing with tokio-rs

tokio-rs is a platform for fast processing of asynchronous events in Rust. The interfaces exposed by the `StreamConsumer` and the `FutureProducer` allow rust-rdkafka users to easily integrate Kafka consumers and producers within the tokio-rs platform, and to write asynchronous message processing code. Note that rust-rdkafka can be used without tokio-rs.

To see rust-rdkafka in action with tokio-rs, check out the asynchronous processing example in the examples folder.
## Installation

Add this to your `Cargo.toml`:

```toml
[dependencies]
rdkafka = "^0.9.0"
```
This crate will compile librdkafka from sources and link it statically to your executable. To compile librdkafka you'll need:

- the GNU toolchain
- GNU `make`
- `pthreads`
- `zlib`
- `libssl-dev`: optional, not included by default (feature: `ssl`).
- `libsasl2-dev`: optional, not included by default (feature: `sasl`).
To enable ssl and sasl, use the `features` field in `Cargo.toml`. Example:

```toml
[dependencies.rdkafka]
version = "^0.9.0"
features = ["ssl", "sasl"]
```
## Compiling from sources

To compile from sources, you'll have to update the submodule containing librdkafka:

```bash
git submodule update --init
```

and then compile using `cargo`, selecting the features that you want. Example:

```bash
cargo build --features "ssl sasl"
```
## Examples

You can find examples in the `examples` folder. To run them:

```bash
cargo run --example <example_name> -- <example_args>
```
## Tests

The unit tests can run without a Kafka broker present:

```bash
cargo test --lib
```

To run the full suite:

```bash
cargo test
```

In this case, a broker is expected to be running on `localhost:9092`. Travis currently only runs the unit tests.
## Reexports

- `pub use message::Message;`

## Modules

| Module | Description |
| --- | --- |
| `client` | Common client functionalities. |
| `config` | Configuration to create a Consumer or Producer. |
| `consumer` | Base trait and common functionality for all consumers. |
| `error` | Error manipulations. |
| `groups` | Group membership API. |
| `message` | Store and manipulate Kafka messages. |
| `metadata` | Cluster metadata. |
| `producer` | Producer implementations. |
| `statistics` | |
| `topic_partition_list` | A data structure representing topic, partitions and offsets, compatible with the |
| `types` | This module contains type aliases for types defined in the auto-generated bindings. |
| `util` | Utility functions. |