Crate duplicate_destroyer

Duplicate Destroyer Library

This library provides functionality to find duplicate files and folders.

To search for duplicates in a set of directories, call the get_duplicates function. It recursively traverses every directory in its input, finds all duplicate files and directories, and returns only the topmost directories and files for which at least one duplicate exists.

Example usage

Suppose we have directory structure:

tests/fixtures/
├── A
│   ├── a.txt
│   └── b
│       ├── alpha.txt
│       └── beta.txt
├── B
│   └── A
│       ├── a.txt
│       └── b
│           ├── alpha.txt
│           └── beta.txt
└── C
    ├── a.txt
    ├── b
    │   ├── alpha.txt
    │   └── beta.txt
    └── diff.txt

The get_duplicates function then returns the directories {"tests/fixtures/A", "tests/fixtures/B/A"} as one duplicate group:

use std::collections::HashSet;
use std::ffi::OsString;

use duplicate_destroyer::*;

// Create DuDe configuration
let mut config: Config = Default::default();
config.set_minimum_size(0); // Use non-default minimum size (see Config structure for details)

// Create vector of paths to search for duplicates
let input_dirs = vec![OsString::from("tests/fixtures")];

// Get duplicates
let duplicates = duplicate_destroyer::get_duplicates(input_dirs, &config).unwrap();

let expected_paths = [OsString::from("tests/fixtures/A"),
                      OsString::from("tests/fixtures/B/A")];
let expected_output = DuplicateObject::new(8235, HashSet::from(expected_paths));
assert_eq!(duplicates[0], expected_output);
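
Each element of the returned vector is one DuplicateObject describing a group of mutually duplicate paths. As a rough sketch of inspecting the results (assuming DuplicateObject exposes its total size and its set of paths as public size and duplicates fields; consult the DuplicateObject docs for the actual API), the groups could be printed like this:

// NOTE: the field names `size` and `duplicates` are assumptions here,
// inferred from DuplicateObject::new(size, paths) in the example above.
for group in &duplicates {
    println!("Duplicate group ({} bytes):", group.size);
    for path in &group.duplicates {
        println!("  {}", path.to_string_lossy());
    }
}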

Structs

Config
Stores all configuration of Duplicate Destroyer
DuplicateObject
Holds data of duplicate groups that are returned by DuDe.
NoProgressIndicator
Implements ProgressIndicator without displaying anything
NoProgressMultiline
Implements ProgressMultiline without displaying anything

Enums

HashAlgorithm
Supported hash algorithm types

Traits

ProgressIndicator
Simple progress indicator trait
ProgressMultiline
Multiline progress indicator trait

Functions

get_duplicates
Find the largest duplicate directories or files