| Crates.io | duplicate_file_finder |
| lib.rs | duplicate_file_finder |
| version | 0.1.5 |
| created_at | 2025-07-07 21:55:32.733153+00 |
| updated_at | 2025-07-11 23:08:08.061172+00 |
| description | Finds duplicate files. |
| homepage | https://github.com/Andrewsimsd/duplicate-file-finder |
| repository | https://github.com/Andrewsimsd/duplicate-file-finder |
| max_upload_size | |
| id | 1741823 |
| size | 1,941,226 |
A fast, parallelized CLI tool and library for detecting duplicate files by content. Designed for efficiency, usability, and cross-platform compatibility.
Features:
- Content hashing with twox-hash
- Parallelized with rayon for high performance

Add to your project:
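Content-based duplicate detection generally works by grouping files by size first, then hashing only the files that share a size, so most files never need to be read in full. Below is a minimal dependency-free sketch of that idea; it uses the standard library's `DefaultHasher` in place of twox-hash, and `group_duplicates`/`hash_file` are illustrative helpers, not this crate's API:

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::fs;
use std::hash::{Hash, Hasher};
use std::io;
use std::path::{Path, PathBuf};

// Hash a file's entire contents. The crate itself uses twox-hash for
// speed; std's DefaultHasher keeps this sketch dependency-free.
fn hash_file(path: &Path) -> io::Result<u64> {
    let bytes = fs::read(path)?;
    let mut hasher = DefaultHasher::new();
    bytes.hash(&mut hasher);
    Ok(hasher.finish())
}

// Group files by size, then by content hash within each size bucket.
// Only files sharing a size are ever hashed.
fn group_duplicates(paths: &[PathBuf]) -> io::Result<Vec<Vec<PathBuf>>> {
    let mut by_size: HashMap<u64, Vec<PathBuf>> = HashMap::new();
    for p in paths {
        by_size.entry(fs::metadata(p)?.len()).or_default().push(p.clone());
    }
    let mut groups = Vec::new();
    for candidates in by_size.into_values() {
        if candidates.len() < 2 {
            continue; // a unique size cannot have a duplicate
        }
        let mut by_hash: HashMap<u64, Vec<PathBuf>> = HashMap::new();
        for p in candidates {
            by_hash.entry(hash_file(&p)?).or_default().push(p);
        }
        groups.extend(by_hash.into_values().filter(|g| g.len() > 1));
    }
    Ok(groups)
}
```

The size pre-filter is what makes this approach fast on large trees: hashing is the expensive step, and it is skipped for every file whose size is unique.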
[dependencies]
duplicate_file_finder = "0.1"
Or install the CLI binary:
cargo install duplicate_file_finder
duplicate_file_finder [--output <file_or_directory>]
duplicate_file_finder <directory> [--output <file_or_directory>]
duplicate_file_finder --directories <dir1> <dir2> ... [--output <file_or_directory>]
duplicate_file_finder ~/Documents --output reports/
This scans ~/Documents and writes a human-readable report to reports/duplicate_file_report.txt.
Running duplicate_file_finder with no arguments scans the directory it is executed from and saves duplicate_file_report.txt in that same directory.
| Option | Description |
|---|---|
| -h, --help | Show help message |
| --output <path> | Specify output file or directory for the report |
| -d, --directories <DIR> | Scan multiple directories as a single pool |
If the output path is a directory, the report is saved as duplicate_file_report.txt within that directory.
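The rule above (directory → default file name inside it, otherwise use the path as-is) can be sketched as follows; `resolve_report_path` is a hypothetical helper for illustration, not part of the crate's public API:

```rust
use std::path::{Path, PathBuf};

// If the user-supplied output path is an existing directory, place the
// default report name inside it; otherwise treat it as the report file.
fn resolve_report_path(output: &Path) -> PathBuf {
    if output.is_dir() {
        output.join("duplicate_file_report.txt")
    } else {
        output.to_path_buf()
    }
}
```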
Example report:

Duplicate File Finder Report
Generated by: alice
Start Time: 20250707 15:00:00
End Time: 20250707 15:00:42
Base Directory: /home/alice/Documents
Total Potential Space Savings: 1.43 GB
Size: 143.21 MB
/home/alice/Documents/archive/copy1.iso
/home/alice/Documents/archive/copy2.iso
You can also integrate the crate into your own Rust projects:
use duplicate_file_finder::{find_duplicates, write_output, setup_logger};
use std::path::Path;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    setup_logger()?;
    let base_dir = Path::new("/some/path");
    let duplicates = find_duplicates(base_dir);
    write_output(duplicates, "report.txt", "20250707 15:00:00", &[base_dir.to_path_buf()])?;
    Ok(())
}
Logs are written to duplicate_finder.log and include timestamps and severity levels.
The tool is optimized for performance using rayon for parallelism.

Run the test suite:
cargo test

Build an optimized release binary:
cargo build --release
This project is licensed under the MIT License. See LICENSE for details.
Contributions, issues, and feature requests are welcome!
- Create a feature branch (git checkout -b feature/awesome)
- Commit your changes (git commit -am 'Add awesome feature')
- Push the branch (git push origin feature/awesome)

Made with ❤️ by Andrew Sims