
# Dupes

Find duplicate files by SHA256.
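
To illustrate the idea, here is a minimal sketch of the general approach (not the program's actual implementation): walk the tree with `walkdir`, group files by size, and compute SHA-256 digests with `sha2` only for files whose size occurs more than once. Error handling and the extra options below are omitted.

```rust
// Illustrative sketch only: group files by size, then hash candidate duplicates.
use sha2::{Digest, Sha256};
use std::collections::HashMap;
use std::fs::File;
use std::io::{self, Read};
use std::path::{Path, PathBuf};
use walkdir::WalkDir;

/// Compute the SHA-256 digest of a file as a lowercase hex string.
fn sha256_of(path: &Path) -> io::Result<String> {
    let mut file = File::open(path)?;
    let mut hasher = Sha256::new();
    let mut buf = [0u8; 8192];
    loop {
        let n = file.read(&mut buf)?;
        if n == 0 {
            break;
        }
        hasher.update(&buf[..n]);
    }
    Ok(hasher
        .finalize()
        .iter()
        .map(|b| format!("{:02x}", b))
        .collect())
}

fn main() -> io::Result<()> {
    let base = std::env::args().nth(1).unwrap_or_else(|| ".".into());

    // Group by file size first: only files of equal size can be duplicates.
    let mut by_size: HashMap<u64, Vec<PathBuf>> = HashMap::new();
    for entry in WalkDir::new(&base).into_iter().filter_map(Result::ok) {
        if entry.file_type().is_file() {
            let len = entry.metadata().map(|m| m.len()).unwrap_or(0);
            by_size.entry(len).or_default().push(entry.into_path());
        }
    }

    // Hash only the size groups with more than one member.
    let mut by_hash: HashMap<String, Vec<PathBuf>> = HashMap::new();
    for paths in by_size.values().filter(|v| v.len() > 1) {
        for path in paths {
            if let Ok(hash) = sha256_of(path) {
                by_hash.entry(hash).or_default().push(path.clone());
            }
        }
    }

    // Report every hash that occurs for more than one file.
    for (hash, paths) in by_hash.iter().filter(|(_, v)| v.len() > 1) {
        println!("{}", hash);
        for p in paths {
            println!("  {}", p.display());
        }
    }
    Ok(())
}
```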

## Dependencies

This program depends on the following crates: `clap`, `regex`, `sha2`, `walkdir`, `libc`, `serde`, `serde_json`.

## Usage

```text
Dupes 0.2.1
fnordomat <GPG:46D46D1246803312401472B5A7427E237B7908CA>
Finds duplicate files (according to SHA256)

USAGE:
    dupes [FLAGS] [OPTIONS]

FLAGS:
    -A, --always-hash            Always include the hash, even if there is only one file of that size (implies -a 0)
    -j, --emit-json              Output in JSON format
    -h, --help                   Prints help information
    -S, --show-non-duplicates    List also files that are unique (automatically true if -D is used)
    -V, --version                Prints version information

OPTIONS:
    -D, --anti_dir <anti_dir>...
            NEGATIVE directory (multiple instances possible) - don't list files that are present in one of the -D
            entries. Use this to find the difference between two sets of files (implies -A and -S)
    -a, --avoid-compare-if-larger <avoid_compare_if_larger_than>
            Compare files of size >= X by size only. Default or -a 0 = unlimited

    -d, --dir <dir>...                                              Base directory (multiple instances possible)
    -e, --exclude-path <exclude_path>...
            Exclude part of path (glob); applies to both -d and -D

    -i, --ignore-smaller-than <ignore_smaller_than>
            Ignore all files smaller than given size (bytes). Default 0
```
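
A few illustrative invocations (the paths are made up; the flags are the ones documented above):

```sh
# Find duplicates under two photo directories and emit the result as JSON
dupes -d ~/photos -d /mnt/backup/photos -j

# List files under ~/new that are NOT already present in ~/archive
dupes -d ~/new -D ~/archive

# Ignore files smaller than 4096 bytes and also list unique files
dupes -d ~/data -i 4096 -S
```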