104. appsdir.FileDedup — Find duplicate files
This pyFormex app finds duplicate files on your file system(s). When run, the app shows a FileDialog to let you select a directory path. It then finds all duplicate files under that directory tree. The sets of identical files are presented with the option to delete one or more of the duplicates.
The functions in the app's module can also be used separately for other use cases. This can conveniently be done as follows:
from pyformex.appsdir.FileDedup import identical_files

for resolved, size, files in identical_files(dir1, dir2, dir3):
    if resolved:
        do_something_with_duplicate_files(files)
This module uses some ideas from https://discuss.python.org/t/identifying-duplicate-files-where-speed-is-a-concern/44534/15
104.1. Functions defined in module appsdir.FileDedup
- appsdir.FileDedup.classify_by_size(files)
Classify files by their size
- Parameters:
files (list of path_like) – The list of file paths to be classified
- Returns:
dict – A dict with file size as key and a list of files with that size as value.
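As an illustration, a minimal sketch of such a classification, assuming it only groups paths by the size reported by the file system (the actual implementation may differ):

import os
from collections import defaultdict

def classify_by_size_sketch(files):
    """Group file paths by their size in bytes."""
    bysize = defaultdict(list)
    for f in files:
        bysize[os.path.getsize(f)].append(f)
    return dict(bysize)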
- appsdir.FileDedup.refine_chunk3(size, files)
Refine lists of equal-sized files by comparing small chunks
Generate subsets of files that are candidates for being identical.
- Parameters:
size (int) – The common size of all the input files.
files (list of path_like) – A list of files that all have the given size, typically one of the lists returned by classify_by_size().
- Returns:
resolved (bool) – If True, the returned files are definitely identical.
size (int) – The file size
files (list of path_like) – A subset of the input files. All files in the subset have the exact same bytes at some small chunks. These files are thus candidates for being identical.
Notes
This is a generator function. It can be iterated until all input files have been processed.
The current implementation reads three chunks of at most CHUNKSIZE bytes, at the start, middle and end of the file. For files not larger than three times CHUNKSIZE, these chunks cover the full contents, so files with equal chunks are necessarily identical.
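To make the idea concrete, here is a hedged sketch of such a chunk-based refinement; CHUNKSIZE, the helper read_chunks3() and the exact seek positions are assumptions, not the pyFormex code:

import os
from collections import defaultdict

CHUNKSIZE = 4096  # assumed value; the real constant may differ

def read_chunks3(path, size):
    """Read up to CHUNKSIZE bytes at the start, middle and end of a file."""
    with open(path, 'rb') as f:
        start = f.read(CHUNKSIZE)
        f.seek(max(0, size // 2 - CHUNKSIZE // 2))
        middle = f.read(CHUNKSIZE)
        f.seek(max(0, size - CHUNKSIZE))
        end = f.read(CHUNKSIZE)
    return start, middle, end

def refine_chunk3_sketch(size, files):
    """Yield (resolved, size, files) for groups with equal chunks."""
    groups = defaultdict(list)
    for f in files:
        groups[read_chunks3(f, size)].append(f)
    # If the three chunks cover the whole file, equal chunks
    # imply identical files.
    resolved = size <= 3 * CHUNKSIZE
    for group in groups.values():
        if len(group) > 1:
            yield resolved, size, group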
- appsdir.FileDedup.refine_hash(size, files, digest='sha256')
Generate subsets of equally sized files that are identical.
- Parameters:
size (int) – The common size of all the input files.
files (list of path_like) – A list of equally sized files, typically a candidate subset generated by refine_chunk3().
digest (str) – The name of the hash algorithm to use. Default is 'sha256'.
- Returns:
resolved (bool) – Always True.
size (int) – The file size
files (list of path_like) – A subset of the input files that have the same hash for their full contents, and can safely be considered identical.
Notes
This is a generator function. It can be iterated until all input files have been processed.
The current implementation uses the 'SHA256' hash, which has practically zero chance of collisions. The number of different hashes is 2**256, or more than 10**77. Having two different files with the same hash is, while theoretically possible, extremely improbable.
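A minimal sketch of such a hash refinement, assuming Python's hashlib is used (details like the read buffer size are assumptions):

import hashlib
from collections import defaultdict

def refine_hash_sketch(size, files, digest='sha256'):
    """Yield (True, size, files) for groups with equal full-content hashes."""
    groups = defaultdict(list)
    for f in files:
        h = hashlib.new(digest)
        with open(f, 'rb') as fil:
            # Hash the file in chunks to limit memory use
            for chunk in iter(lambda: fil.read(1 << 20), b''):
                h.update(chunk)
        groups[h.hexdigest()].append(f)
    for group in groups.values():
        if len(group) > 1:
            yield True, size, group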
- appsdir.FileDedup.identical_files(files, quick=False, verbose=False)
Generate lists of identical files
- Parameters:
files (list of path_like) – The list of files to search for identical ones.
quick (bool) – If True, skip the hash refinement. This will run a lot faster but not all results will be confirmed as identical. Many may just be candidates for being identical files. The default (False) resolves all identical files.
- Returns:
resolved (bool) – If True, the returned files are definitely identical.
size (int) – The file size
files (list of path_like) – A subset of the input files. If resolved is True, these files are identical; otherwise they merely have the same bytes at some small chunks and are candidates for being identical.
Notes
This is a generator function. It can be iterated until all input files have been processed.
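Based on the documented behavior of the three functions above, the pipeline plausibly composes as in this sketch (verbose handling omitted; the composition is an assumption, not the actual source):

from pyformex.appsdir.FileDedup import (
    classify_by_size, refine_chunk3, refine_hash)

def identical_files_sketch(files, quick=False):
    """Size classes -> chunk refinement -> optional hash confirmation."""
    for size, group in classify_by_size(files).items():
        if len(group) < 2:
            continue  # a unique size can not have duplicates
        for resolved, size, cand in refine_chunk3(size, group):
            if resolved or quick:
                yield resolved, size, cand
            else:
                yield from refine_hash(size, cand)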
- appsdir.FileDedup.listDuplicates(files)
Print the sets of identical files to stdout
- Parameters:
files (list of path_like) – The list of file paths to search for duplicates
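For example, a hypothetical call printing the duplicate sets among all files under the current directory (symlinks excluded, as elsewhere in this module):

from pathlib import Path
from pyformex.appsdir.FileDedup import listDuplicates

files = [p for p in Path('.').rglob('*')
         if p.is_file() and not p.is_symlink()]
listDuplicates(files)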
- appsdir.FileDedup.deduplicate(*paths, quick=False)
Deduplicate identical files.
Searches for identical files and offers an option to delete duplicates.
- Parameters:
paths (list of path_like) – One or more paths to collect files for deduplication. If a path is a directory, all files below it are added to the list; if it is a file, it is added as such. Symlinks are not followed.
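A hypothetical invocation with placeholder directory names; quick=True trades certainty for speed, as described for identical_files():

from pyformex.appsdir.FileDedup import deduplicate

# With quick=True some reported groups may only be candidates
# rather than confirmed duplicates.
deduplicate('/path/to/photos', '/path/to/backup', quick=True)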