# 🦀 Krafka
A pure Rust, async-native Apache Kafka client designed for high performance, safety, and ease of use.
## ✨ Features
- 🦀 **Pure Rust**: No librdkafka or C dependencies
- ⚡ **Async-native**: Built on Tokio for true async I/O
- 🔒 **Zero unsafe**: Safe Rust by default
- 🚀 **High performance**: Zero-copy buffers, inline hot paths, efficient batching, concurrent batch flushing
- 📦 **Full protocol support**: Kafka protocol with all compression codecs
- 🔄 **Incremental fetch sessions**: KIP-227 fetch sessions for bandwidth-efficient multi-partition consumers
- 🔐 **TLS/SSL encryption**: Using rustls for secure connections
- 🔑 **SASL authentication**: PLAIN, SCRAM-SHA-256/512, OAUTHBEARER mechanisms
- 🎯 **Transactions**: Exactly-once semantics with transactional producer
- ☁️ **Cloud-native**: First-class AWS MSK support including IAM auth
- 🛡️ **Security hardened**: Secret zeroization, constant-time auth (`subtle`), decompression bomb protection, decode loop bounds (`MAX_DECODE_ARRAY_LEN`)
- 🔁 **Built-in retry**: Exponential backoff with metadata refresh on leader changes
- 📊 **Metrics**: Lock-free counters/gauges/latency wired into all hot paths
- 🧪 **Fuzz tested**: cargo-fuzz targets for protocol arrays, record batches, and response decoders
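The built-in retry listed above uses exponential backoff. As a rough illustration of that policy (the function name, constants, and cap below are illustrative, not Krafka's actual implementation):

```rust
use std::time::Duration;

/// Illustrative exponential backoff with an upper cap; not Krafka's
/// actual retry code, just the general idea.
fn backoff(attempt: u32, base: Duration, max: Duration) -> Duration {
    // Delay doubles each attempt: base * 2^attempt, capped at `max`.
    let delay = base.saturating_mul(2u32.saturating_pow(attempt));
    delay.min(max)
}

fn main() {
    let base = Duration::from_millis(100);
    let max = Duration::from_secs(10);
    for attempt in 0..6 {
        println!("attempt {attempt}: wait {:?}", backoff(attempt, base, max));
    }
}
```

On a leader change, a real client would also refresh metadata before the next attempt, as the retry feature above describes.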
**Minimum Broker Version:** Krafka requires Apache Kafka 3.9+. Protocol versions older than the Kafka 3.9 baseline have been removed.
## 🚀 Quick Start

Add Krafka to your `Cargo.toml`:

```toml
[dependencies]
krafka = "0.5"
tokio = { version = "1", features = ["full"] }

# For AWS MSK IAM authentication with full SDK support:
# krafka = { version = "0.5", features = ["aws-msk"] }
```
### Producer

```rust
use krafka::producer::Producer; // module path reconstructed; see the Producer Guide
use krafka::Result;

#[tokio::main]
async fn main() -> Result<()> {
    let producer = Producer::builder()
        .bootstrap_servers("localhost:9092")
        .build()
        .await?;
    // Send records here; see the Producer Guide for the send API.
    Ok(())
}
```
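As background on the producer's key-based partitioning: the record key is hashed to pick a partition, so records with the same key stay ordered on one partition. The sketch below is stdlib-only and illustrative; Kafka's default partitioner uses murmur2, and Krafka's actual implementation may differ:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Illustrative key -> partition mapping; Kafka's default partitioner
/// uses murmur2, not Rust's DefaultHasher.
fn partition_for(key: &[u8], num_partitions: u32) -> u32 {
    let mut hasher = DefaultHasher::new();
    key.hash(&mut hasher);
    (hasher.finish() % num_partitions as u64) as u32
}

fn main() {
    // The same key always maps to the same partition,
    // which is what preserves per-key ordering.
    assert_eq!(partition_for(b"user-42", 12), partition_for(b"user-42", 12));
    println!("partition: {}", partition_for(b"user-42", 12));
}
```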
### Consumer

```rust
use krafka::consumer::Consumer; // module path reconstructed; see the Consumer Guide
use krafka::Result;
use std::time::Duration; // used for poll timeouts

#[tokio::main]
async fn main() -> Result<()> {
    let consumer = Consumer::builder()
        .bootstrap_servers("localhost:9092")
        .group_id("my-group")
        .build()
        .await?;
    // Subscribe and poll/recv records here; see the Consumer Guide.
    Ok(())
}
```
### Admin Client

```rust
use krafka::admin::AdminClient; // module path reconstructed; see the Admin Client guide
use krafka::Result;
use std::time::Duration; // used for operation timeouts

#[tokio::main]
async fn main() -> Result<()> {
    let admin = AdminClient::builder()
        .bootstrap_servers("localhost:9092")
        .build()
        .await?;
    // Create topics, inspect groups, manage ACLs, etc.; see the Admin Client guide.
    Ok(())
}
```
### Transactional Producer

For exactly-once semantics across multiple partitions:

```rust
use krafka::producer::TransactionalProducer; // module path reconstructed
use krafka::Result;

#[tokio::main]
async fn main() -> Result<()> {
    let producer = TransactionalProducer::builder()
        .bootstrap_servers("localhost:9092")
        .build()
        .await?;
    // Begin a transaction, send records, then commit or abort;
    // see the Producer Guide for the transactional API.
    Ok(())
}
```
### Authentication

Connect to secured Kafka clusters with SASL/PLAIN, SCRAM, OAUTHBEARER, or AWS MSK IAM, available on all client types:

```rust
use krafka::admin::AdminClient;
use krafka::consumer::Consumer;
use krafka::producer::Producer;

// Method arguments below are illustrative; see the Authentication guide.

// Producer with SASL/SCRAM-SHA-256
let producer = Producer::builder()
    .bootstrap_servers("broker:9096")
    .sasl_scram_sha256("user", "password")
    .build()
    .await?;

// Consumer with SASL/PLAIN
let consumer = Consumer::builder()
    .bootstrap_servers("broker:9094")
    .group_id("my-group")
    .sasl_plain("user", "password")
    .build()
    .await?;

// Producer with SASL/OAUTHBEARER
let producer = Producer::builder()
    .bootstrap_servers("broker:9095")
    .sasl_oauthbearer(token_provider)
    .build()
    .await?;

// Admin with AWS MSK IAM
use krafka::auth::AuthConfig;
let auth = AuthConfig::aws_msk_iam("eu-central-1");
let admin = AdminClient::builder()
    .bootstrap_servers("broker:9098")
    .auth(auth)
    .build()
    .await?;
```
## 📦 Modules
| Module | Description |
|---|---|
| `producer` | High-throughput message production with batching and compression |
| `consumer` | Consumer groups with rebalancing, offset management, and static membership |
| `admin` | Cluster administration (topics, groups, records, configuration, ACLs) |
| `interceptor` | Producer and consumer interceptor hooks for observability |
| `protocol` | Kafka wire protocol implementation |
| `auth` | Authentication (SASL/PLAIN, SASL/SCRAM, SASL/OAUTHBEARER, AWS MSK IAM) |
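To illustrate the role of the `interceptor` module, here is a minimal sketch of a producer-side observability hook. The trait and method names are hypothetical, not Krafka's actual API; see the interceptor module docs for the real signatures:

```rust
/// Hypothetical interceptor trait; Krafka's actual trait and method
/// signatures may differ -- see the interceptor module docs.
trait ProducerInterceptor {
    /// Called before a record is handed to the batching layer.
    fn on_send(&mut self, topic: &str, value: &[u8]);
}

/// A simple observability interceptor that counts outgoing records.
struct CountingInterceptor {
    sent: u64,
}

impl ProducerInterceptor for CountingInterceptor {
    fn on_send(&mut self, _topic: &str, _value: &[u8]) {
        self.sent += 1;
    }
}

fn main() {
    let mut interceptor = CountingInterceptor { sent: 0 };
    interceptor.on_send("events", b"hello");
    interceptor.on_send("events", b"world");
    assert_eq!(interceptor.sent, 2);
    println!("records observed: {}", interceptor.sent);
}
```

The same pattern applies on the consumer side, where a hook observes records as they are delivered.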
## 🗜️ Compression
Krafka supports all Kafka compression codecs, individually feature-gated:
```rust
use krafka::producer::{Compression, Producer}; // paths reconstructed

let producer = Producer::builder()
    .bootstrap_servers("localhost:9092")
    .compression(Compression::Lz4) // fast compression
    .build()
    .await?;
```
| Codec | Cargo Feature | Crate | Characteristics |
|---|---|---|---|
| `Compression::Gzip` | `gzip` | `flate2` | Best ratio, slower |
| `Compression::Snappy` | `snappy` | `snap` | Good balance |
| `Compression::Lz4` | `lz4` | `lz4_flex` | Fastest |
| `Compression::Zstd` | `zstd` | `zstd` | Best modern choice (requires C toolchain) |
All codecs are enabled by default via the `compression` feature. To select only what you need:

```toml
[dependencies]
krafka = { version = "0.5", default-features = false, features = ["lz4", "snappy"] }
```
## ⚡ Performance Tuning

### High Throughput Producer
```rust
use krafka::producer::{Acks, Compression, Producer}; // paths and enum names reconstructed
use std::time::Duration;

let producer = Producer::builder()
    .bootstrap_servers("localhost:9092")
    .acks(Acks::Leader)                // illustrative acks setting
    .compression(Compression::Lz4)
    .batch_size(1024 * 1024)           // 1 MB batches
    .linger(Duration::from_millis(10)) // allow batching
    .build()
    .await?;
```
### Low Latency Consumer
```rust
use krafka::consumer::Consumer; // path reconstructed
use std::time::Duration;

let consumer = Consumer::builder()
    .bootstrap_servers("localhost:9092")
    .group_id("low-latency")
    .fetch_min_bytes(1)                       // return as soon as any data is available
    .fetch_max_wait(Duration::from_millis(5)) // illustrative wait cap
    .build()
    .await?;
```
## 📚 Documentation

Full documentation is available at [hupe1980.github.io/krafka](https://hupe1980.github.io/krafka):
- Getting Started
- Producer Guide
- Consumer Guide
- Admin Client
- Configuration Reference
- Performance Tuning
- Architecture Overview
- Metrics & Observability
- Error Handling
- Interceptors
- Authentication
## 🎮 Examples

Run the examples with `cargo run --example <name>`:

- Producer example
- Consumer example
- Advanced consumer example (pause/resume, seek, manual commits)
- Admin client example
- Transactional producer example
- Authentication examples (SASL, SCRAM, MSK IAM)
## 📊 Status
Krafka is feature-complete and production-ready.
Features:

- ✅ Protocol layer (all message types, compression, ACL messages, transactions, unified versioned encode/decode dispatch, wire-format validation)
- ✅ Network layer (async connections, pooling, TLS/SSL, IPv6 support)
- ✅ Producer (batching with linger timer, partitioning, compression, built-in retry with exponential backoff, metadata refresh on failure, max-in-flight enforcement via semaphore, buffer backpressure via `ProducerConfig::max_block`, interceptor hooks, zero-copy `Bytes` pipeline)
- ✅ Consumer (polling, streaming `recv()` with error propagation, offset management, auto-commit timer, seek, pause/resume, configurable partition assignment strategy, rebalance listeners, cooperative sticky assignor, static group membership (KIP-345), interceptor hooks, log compaction awareness, batched offset resolution, per-partition retry backoff)
- ✅ Admin Client (topic CRUD, partitions, configuration, ACL management, consumer groups, record deletion, leader epoch queries, automatic API version negotiation)
- ✅ Authentication (SASL/PLAIN, SASL/SCRAM-SHA-256/512, SASL/OAUTHBEARER, AWS MSK IAM with SDK support)
- ✅ TLS/SSL encryption (rustls, mTLS support)
- ✅ Transactions (exactly-once semantics with transactional producer, full PID/epoch/sequence tracking)
- ✅ Metrics (counters, gauges, latency tracking, all wired into producer/consumer hot paths)
- ✅ Tracing (OpenTelemetry-compatible spans with properly declared fields)
- ✅ Security hardening (secret zeroization, constant-time comparison, PBKDF2 validation, decompression limits, decode loop bounds)
- ✅ Fuzz testing (cargo-fuzz targets for protocol decode paths)
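As background on the constant-time comparison listed above: comparing secrets byte-by-byte with an early exit leaks how many leading bytes matched through timing. A stdlib-only sketch of the fix (Krafka itself uses the `subtle` crate for this, not the code below):

```rust
/// Compare two byte slices without short-circuiting on the first
/// mismatch, so timing does not reveal how many leading bytes match.
/// Illustrative only; Krafka uses the `subtle` crate.
fn ct_eq(a: &[u8], b: &[u8]) -> bool {
    if a.len() != b.len() {
        return false;
    }
    let mut diff: u8 = 0;
    for (x, y) in a.iter().zip(b.iter()) {
        // Accumulate differences instead of returning early.
        diff |= x ^ y;
    }
    diff == 0
}

fn main() {
    assert!(ct_eq(b"server-signature", b"server-signature"));
    assert!(!ct_eq(b"server-signature", b"server-signatureX"));
    assert!(!ct_eq(b"aaaa", b"aaab"));
    println!("ok");
}
```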
## 🤝 Contributing
Contributions are welcome!
## 📄 License
Licensed under the MIT License.