Streaming decompression API for memory-efficient extraction.
This module provides a streaming API for extracting 7z archives with bounded memory usage. It’s designed for processing large archives that shouldn’t be fully loaded into memory.
§Overview
The streaming API provides:
- StreamingArchive: High-level streaming archive reader
- StreamingConfig: Configuration for memory bounds and behavior
- StreamingEntry: Individual entry with streaming data access
- EntryIterator: Iterator over archive entries
- MemoryTracker: Memory usage monitoring and limits
- RandomAccessReader: Random access for non-solid archives
- DecoderPool: Stream pooling for efficient solid archive access
- Various sinks: For extracting to different Write implementations
§Example
use zesven::streaming::{StreamingArchive, StreamingConfig};

let file = std::fs::File::open("archive.7z")?;
let password = None; // or Some(...) for an encrypted archive

// Open with custom configuration
let config = StreamingConfig::new()
    .max_memory_buffer(32 * 1024 * 1024) // 32 MiB
    .verify_crc(true);

let mut archive = StreamingArchive::open_with_config(file, password, config)?;

// Process entries one at a time
for entry_result in archive.entries()? {
    let mut entry = entry_result?;

    if entry.is_directory() {
        std::fs::create_dir_all(entry.name())?;
        continue;
    }

    let mut file = std::fs::File::create(entry.name())?;
    entry.extract_to(&mut file)?;
}

§Solid Archives
7z archives can be “solid”, meaning multiple files are compressed together as a single stream. This achieves better compression but requires sequential decompression.
For solid archives (see the sketch below):

- Use StreamingArchive::entries() for sequential access
- Entries must be processed in order
- Skipping an entry still decompresses it (but discards the data)
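
A minimal sketch of selective extraction from a solid archive, assuming archive is an open StreamingArchive as in the example above: unwanted entries are skipped with continue, which still reads past their data in the solid stream.

// Extract only ".txt" files; everything else is skipped
// (the solid stream is still decompressed past the skipped data).
for entry_result in archive.entries()? {
    let mut entry = entry_result?;
    if entry.is_directory() || !entry.name().ends_with(".txt") {
        continue;
    }
    let mut out = std::fs::File::create(entry.name())?;
    entry.extract_to(&mut out)?;
}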
For non-solid archives:

- RandomAccessReader enables direct access to specific entries
- Entries can be accessed in any order
§Memory Management
Use StreamingConfig to control memory usage:
// Low memory configuration
let config = StreamingConfig::low_memory();

// High performance configuration (more memory, better throughput)
let config = StreamingConfig::high_performance();

// Custom configuration
let config = StreamingConfig::new()
    .max_memory_buffer(16 * 1024 * 1024)
    .read_buffer_size(32 * 1024);

Use MemoryTracker for fine-grained allocation control:
let tracker = MemoryTracker::new(64 * 1024 * 1024);

// Allocate with RAII guard
let guard = tracker.allocate(1024)?;

// Memory automatically released when guard drops

§Write Sinks
The sink module provides various Write implementations:
- BoundedVecSink: In-memory buffer with size limit
- Crc32Sink: Computes CRC while discarding data
- NullSink: Discards data (for skipping)
- CountingSink: Wraps another writer and counts bytes
- ProgressSink: Wraps another writer with progress callbacks
- TeeSink: Writes to two destinations simultaneously
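
A hedged sketch of verifying an entry without writing it to disk by extracting into a Crc32Sink. The Crc32Sink::new() constructor and the crc32() accessor used here are assumed names for illustration, not confirmed signatures; entry is a StreamingEntry from the iterator shown earlier.

use zesven::streaming::Crc32Sink;

// Compute the CRC-32 of the entry's data while discarding the bytes.
let mut sink = Crc32Sink::new();   // assumed constructor
entry.extract_to(&mut sink)?;      // sinks implement Write
let checksum = sink.crc32();       // assumed accessor name
println!("crc32 = {checksum:08x}");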
Structs§
- BoundedReader - A bounded reader that limits the number of bytes read.
- BoundedVecSink - In-memory Vec sink with size limit.
- CancellationToken - Request cancellation of ongoing parallel extraction.
- ChainedReader - A chained reader that reads from multiple readers in sequence.
- CountingSink - Counting sink that wraps another Write and counts bytes.
- Crc32Sink - Streaming hash computation sink.
- DecoderPool - Stream pool for caching decompression streams.
- EntryIterator - Iterator that yields archive entries one at a time with streaming decompression.
- EntryLocation - Location of an entry within a solid block.
- EntryLocator - Entry locator for finding entries by various criteria.
- ExtractAllResult - Result of extracting all entries.
- MemoryEstimate - Memory usage estimate for an operation.
- MemoryGuard - RAII guard that releases memory when dropped.
- MemoryTracker - Memory usage tracker for streaming operations.
- NullSink - Null sink for skip operations.
- ParallelExtractionOptions - Options for parallel extraction.
- ParallelExtractionResult - Result of parallel extraction.
- ParallelFolderExtractor - Parallel folder extractor for non-solid archives.
- PoolStats - Statistics for pool usage.
- PooledDecoder - A decoder borrowed from the pool.
- ProgressSink - Progress-reporting sink that calls a callback periodically.
- ProgressiveReader - A reader wrapper that tracks read progress.
- ProgressiveReaderWithCallback - A progressive reader with a callback for progress updates.
- RandomAccessReader - Random access reader for non-solid archives.
- RandomEntryReader - Reader for a single entry accessed randomly.
- SkippedEntry - Information about an entry that was skipped during archive parsing.
- SolidBlockInfo - Information about a solid block.
- SolidBlockStreamReader - Sequential reader for solid archive blocks.
- SolidEntryLocator - Helper to determine entry positions within solid blocks.
- StreamingArchive - High-level streaming archive reader.
- StreamingConfig - Configuration for streaming decompression with memory bounds.
- StreamingEntry - Represents a single entry during streaming iteration.
- SystemMemoryInfo - Information about system memory.
- TeeSink - Tee sink that writes to two writers simultaneously.
- TrackedBuffer - A tracked allocation that owns a Vec<u8>.
Enums§
- CompressionMethod - Compression method identifier for memory estimation.
- SkipReason - The reason why an archive entry was skipped during parsing.
Traits§
- ExtractToSink - Trait for extracting entries to Write sinks.