Struct cdchunking::Chunker
Chunker object, wraps the rolling hash into a stream-splitting object.
Implementations
impl<I: ChunkerImpl> Chunker<I>
pub fn new(inner: I) -> Chunker<I>
Create a Chunker from a specific way of finding chunk boundaries.
pub fn whole_chunks<R: Read>(self, reader: R) -> WholeChunks<R, I>
(WholeChunks<R, I> implements Iterator with Item = Result<Vec<u8>>.)
Iterates on whole chunks from a file, read into new vectors.
pub fn all_chunks<R: Read>(self, reader: R) -> Result<Vec<Vec<u8>>>
Reads all the chunks at once, in a vector of chunks (also vectors).
This is similar to .whole_chunks().collect(), but takes care of the I/O errors, returning an error if any of the chunks failed to read.
pub fn stream<R: Read>(self, reader: R) -> ChunkStream<R, I>
Reads chunks with zero allocations.
This streaming iterator provides chunks from an internal buffer that gets reused, instead of allocating new memory to hold each chunk. This is very memory efficient, even when reading large chunks from a large file (you will get chunks in multiple parts). Unfortunately, because the buffer is reused, you have to use a while loop; Iterator cannot be implemented.
Example:

```rust
let mut chunk_iterator = chunker.stream(reader);
while let Some(chunk) = chunk_iterator.read() {
    let chunk = chunk.unwrap();
    match chunk {
        ChunkInput::Data(d) => {
            print!("{:?}, ", d);
        }
        ChunkInput::End => println!(" end of chunk"),
    }
}
```
pub fn chunks<R: Read>(self, reader: R) -> ChunkInfoStream<R, I>
(ChunkInfoStream<R, I> implements Iterator with Item = Result<ChunkInfo>.)
Describes the chunks (doesn't return the data).
This iterator gives you the offset and size of the chunks, but not the
data in them. If you want to iterate on the data in the chunks in an
easy way, use the whole_chunks()
method.
pub fn slices(self, buffer: &[u8]) -> Slices<'_, I>
Iterate on chunks in an in-memory buffer as slices.
If your data is already in memory, you can use this method instead of
whole_chunks()
to get slices referencing the buffer rather than
copying it to new vectors.
pub fn max_size(self, max: usize) -> Chunker<SizeLimited<I>>
Returns a new Chunker
object that will not go over a size limit.
Note that the inner chunking method IS reset when a chunk boundary is emitted because of the size limit. That means that a size limit does not only add new boundaries inside blocks that are too big; it can also cause the boundary that would have followed such a block to no longer occur.
Auto Trait Implementations
impl<I> RefUnwindSafe for Chunker<I> where I: RefUnwindSafe
impl<I> Send for Chunker<I> where I: Send
impl<I> Sync for Chunker<I> where I: Sync
impl<I> Unpin for Chunker<I> where I: Unpin
impl<I> UnwindSafe for Chunker<I> where I: UnwindSafe
Blanket Implementations
impl<T> Any for T where T: 'static + ?Sized
impl<T> Borrow<T> for T where T: ?Sized
impl<T> BorrowMut<T> for T where T: ?Sized
    fn borrow_mut(&mut self) -> &mut T
impl<T> From<T> for T
impl<T, U> Into<U> for T where U: From<T>
impl<T, U> TryFrom<U> for T where U: Into<T>
    type Error = Infallible
    The type returned in the event of a conversion error.
    fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>
impl<T, U> TryInto<U> for T where U: TryFrom<T>