(base) PS X:\github\ComfyUI\models> X:\github\ComfyUI\models\finddupes.ps1
================================================================================
ComfyUI Model Duplicate Finder and Manager
================================================================================
STEP 1: Scanning for files with identical sizes...
Found 73 model files to check
Checking: clip_l.safetensors (246144152 bytes / 234.74 MB) / File 1 of 73
Checking: flux1-dev.1.t5xxl_fp16..safetensors (9787841024 bytes / 9334.41 MB) / File 2 of 73
Checking: flux1-dev.2.t5xxl_fp8_e4m3fn.safetensors (4893934904 bytes / 4667.22 MB) / File 3 of 73
Checking: flux1-dev.clip_l.safetensors (246144152 bytes / 234.74 MB) / File 4 of 73
Checking: t5xxl_fp16.safetensors (9787841024 bytes / 9334.41 MB) / File 5 of 73
Checking: model.safetensors (1710540580 bytes / 1631.3 MB) / File 6 of 73
Checking: anything_v3.yaml (2006 bytes / 0 MB) / File 7 of 73
Checking: v1-inference.yaml (1943 bytes / 0 MB) / File 8 of 73
Checking: v1-inference_clip_skip_2.yaml (2006 bytes / 0 MB) / File 9 of 73
Checking: v1-inference_clip_skip_2_fp16.yaml (2030 bytes / 0 MB) / File 10 of 73
Checking: v1-inference_fp16.yaml (1967 bytes / 0 MB) / File 11 of 73
Checking: v1-inpainting-inference.yaml (2063 bytes / 0 MB) / File 12 of 73
Checking: v2-inference-v.yaml (1883 bytes / 0 MB) / File 13 of 73
Checking: v2-inference-v_fp32.yaml (1884 bytes / 0 MB) / File 14 of 73
Checking: v2-inference.yaml (1856 bytes / 0 MB) / File 15 of 73
Checking: v2-inference_fp32.yaml (1857 bytes / 0 MB) / File 16 of 73
Checking: v2-inpainting-inference.yaml (4608 bytes / 0 MB) / File 17 of 73
Checking: flux1-dev-kontext_fp8_scaled.safetensors (11904640136 bytes / 11353.15 MB) / File 18 of 73
Checking: flux1-fill-dev.safetensors (23804922408 bytes / 22702.14 MB) / File 19 of 73
Checking: flux1-kontext-dev.safetensors (23802947360 bytes / 22700.26 MB) / File 20 of 73
Checking: detection_Resnet50_Final.pth (109497761 bytes / 104.43 MB) / File 21 of 73
Checking: parsing_parsenet.pth (85331193 bytes / 81.38 MB) / File 22 of 73
Checking: codeformer-v0.1.0.pth (376637898 bytes / 359.19 MB) / File 23 of 73
Checking: GFPGANv1.3.pth (348632874 bytes / 332.48 MB) / File 24 of 73
Checking: GFPGANv1.4.pth (348632874 bytes / 332.48 MB) / File 25 of 73
Checking: GPEN-BFR-1024.onnx (285101993 bytes / 271.89 MB) / File 26 of 73
Checking: GPEN-BFR-2048.onnx (285469146 bytes / 272.24 MB) / File 27 of 73
Checking: GPEN-BFR-512.onnx (284244491 bytes / 271.08 MB) / File 28 of 73
Checking: inswapper_128.onnx (555302286 bytes / 529.58 MB) / File 29 of 73
Checking: inswapper_128_fp16.onnx (277680638 bytes / 264.82 MB) / File 30 of 73
Checking: 1k3d68.onnx (143607619 bytes / 136.95 MB) / File 31 of 73
Checking: 2d106det.onnx (5030888 bytes / 4.8 MB) / File 32 of 73
Checking: det_10g.onnx (16923827 bytes / 16.14 MB) / File 33 of 73
Checking: genderage.onnx (1322532 bytes / 1.26 MB) / File 34 of 73
Checking: w600k_r50.onnx (174383860 bytes / 166.31 MB) / File 35 of 73
Checking: Flux-uncensored-v2.safetensors (687476088 bytes / 655.63 MB) / File 36 of 73
Checking: Flux-uncensored.safetensors (687476088 bytes / 655.63 MB) / File 37 of 73
Checking: petra-fyed.safetensors (39757152 bytes / 37.92 MB) / File 38 of 73
Checking: model.safetensors (344391328 bytes / 328.44 MB) / File 39 of 73
Checking: inswapper_128.onnx (555302286 bytes / 529.58 MB) / File 40 of 73
Checking: inswapper_128_fp16.onnx (277680638 bytes / 264.82 MB) / File 41 of 73
Checking: sam_vit_b_01ec64.pth (375042383 bytes / 357.67 MB) / File 42 of 73
Checking: black-forest-labs-FLUX.1-Krea-dev.model.safetensors (246144352 bytes / 234.74 MB) / File 43 of 73
Checking: flux-kontext.safetensors (246144352 bytes / 234.74 MB) / File 44 of 73
Checking: llava_llama3_fp8_scaled.safetensors (9091392483 bytes / 8670.23 MB) / File 45 of 73
Checking: t5xxl_fp8_e4m3fn_scaled.safetensors (5157348688 bytes / 4918.43 MB) / File 46 of 73
Checking: face_yolov8m.pt (52026019 bytes / 49.62 MB) / File 47 of 73
Checking: hand_yolov8s.pt (22507707 bytes / 21.47 MB) / File 48 of 73
Checking: person_yolov8m-seg.pt (54827683 bytes / 52.29 MB) / File 49 of 73
Checking: flux1-dev.safetensors (23802932552 bytes / 22700.25 MB) / File 50 of 73
Checking: hunyuanVideoSafetensors_comfyDiffusionFP8.safetensors (12821122144 bytes / 12227.17 MB) / File 51 of 73
Checking: flux1-krea-dev.safetensors (23802958224 bytes / 22700.27 MB) / File 52 of 73
Checking: 4x_NMKD-Siax_200k.pth (66957746 bytes / 63.86 MB) / File 53 of 73
Checking: RealESRGAN_x4plus.pth (67040989 bytes / 63.94 MB) / File 54 of 73
Checking: ae.safetensors (335304388 bytes / 319.77 MB) / File 55 of 73
Checking: ae_1.sft (335304388 bytes / 319.77 MB) / File 56 of 73
Checking: black-forest-labs-FLUX.1-Krea-dev.ae.safetensors (335304388 bytes / 319.77 MB) / File 57 of 73
Checking: black-forest-labs-FLUX.1-Krea-dev.diffusion_pytorch_model.safetensors (167666902 bytes / 159.9 MB) / File 58 of 73
Checking: flux-kontext-dev.diffusion_pytorch_model.safetensors (167666902 bytes / 159.9 MB) / File 59 of 73
Checking: flux1-dev.vae.safetensors (167666902 bytes / 159.9 MB) / File 60 of 73
Checking: hunyuan_video_vae_bf16.safetensors (492986478 bytes / 470.15 MB) / File 61 of 73
Checking: LTX_vae.safetensors (838420506 bytes / 799.58 MB) / File 62 of 73
Checking: taef1_decoder.pth (4941047 bytes / 4.71 MB) / File 63 of 73
Checking: taef1_encoder.pth (4940983 bytes / 4.71 MB) / File 64 of 73
Checking: taesd3_decoder.pth (4940671 bytes / 4.71 MB) / File 65 of 73
Checking: taesd3_encoder.pth (4940607 bytes / 4.71 MB) / File 66 of 73
Checking: taesdxl_decoder.pth (4913092 bytes / 4.69 MB) / File 67 of 73
Checking: taesdxl_encoder.pth (4912964 bytes / 4.69 MB) / File 68 of 73
Checking: taesd_decoder.pth (4912954 bytes / 4.69 MB) / File 69 of 73
Checking: taesd_encoder.pth (4912826 bytes / 4.69 MB) / File 70 of 73
Checking: flux-canny-controlnet-v3.safetensors (1487623552 bytes / 1418.71 MB) / File 71 of 73
Checking: flux-depth-controlnet-v3.safetensors (1487623552 bytes / 1418.71 MB) / File 72 of 73
Checking: flux-hed-controlnet-v3.safetensors (1487623552 bytes / 1418.71 MB) / File 73 of 73
Found 11 sets of potential duplicates
================================================================================
STEP 2: Verifying potential duplicates by hash...
================================================================================
Verifying Set #1 (2 files @ 9787841024 bytes)...
Hashing: flux1-dev.1.t5xxl_fp16..safetensors (9787841024 bytes / 9334.41 MB) / File 1 of 2
Hash: 6E480B09FAE049A72D2A8C5FBCCB8D3E92FEBEB233BBE9DFE7256958A9167635
Hashing: t5xxl_fp16.safetensors (9787841024 bytes / 9334.41 MB) / File 2 of 2
Hash: 6E480B09FAE049A72D2A8C5FBCCB8D3E92FEBEB233BBE9DFE7256958A9167635
CONFIRMED DUPLICATES (Hash: 6E480B09FAE049A72D2A8C5FBCCB8D3E92FEBEB233BBE9DFE7256958A9167635)
Found 2 identical copies:
[1] X:\github\ComfyUI\models\clip\flux1-dev.1.t5xxl_fp16..safetensors
[2] X:\github\ComfyUI\models\clip\t5xxl_fp16.safetensors
These files are in the SAME FOLDER
Rename duplicates to .dupe? Keep file [1] and rename the rest? (y/n): y
Renamed to backup: X:\github\ComfyUI\models\clip\t5xxl_fp16.safetensors.dupe
Note: Original files renamed to .dupe - you can delete them manually if everything works
Verifying Set #2 (3 files @ 1487623552 bytes)...
Hashing: flux-canny-controlnet-v3.safetensors (1487623552 bytes / 1418.71 MB) / File 1 of 3
Hash: 6546F29049796101A6370DB0A43D2671D0294287C36B2B4E8792CF9E68F0EAF0
Hashing: flux-depth-controlnet-v3.safetensors (1487623552 bytes / 1418.71 MB) / File 2 of 3
Hash: D52EEAF8072DE89D72B1EE79E3BDC79B5D795ED0A6881A029BBE13D833DF7E5F
Hashing: flux-hed-controlnet-v3.safetensors (1487623552 bytes / 1418.71 MB) / File 3 of 3
Hash: 31C110AE4557C19F1C2D74F824C20EA4C340161106DF25CF7CAF298E2F22F441
Verifying Set #3 (2 files @ 687476088 bytes)...
Hashing: Flux-uncensored-v2.safetensors (687476088 bytes / 655.63 MB) / File 1 of 2
Hash: 5266ABDF521187EFBB37F93F3C0F583E576056DAF9BC689D9E49E8CF76F28068
Hashing: Flux-uncensored.safetensors (687476088 bytes / 655.63 MB) / File 2 of 2
Hash: 5AD714D27DEA0BFDDCE0599AB860D249C0356F7E097F9E68EF7160ED535EF93B
Verifying Set #4 (2 files @ 555302286 bytes)...
Hashing: inswapper_128.onnx (555302286 bytes / 529.58 MB) / File 1 of 2
Hash: 0FA95F167682B4F61EDF24F8D66C46B4AB130E8BE058F00C8150E6D0170CA72F
Hashing: inswapper_128.onnx (555302286 bytes / 529.58 MB) / File 2 of 2
Hash: 0FA95F167682B4F61EDF24F8D66C46B4AB130E8BE058F00C8150E6D0170CA72F
CONFIRMED DUPLICATES (Hash: 0FA95F167682B4F61EDF24F8D66C46B4AB130E8BE058F00C8150E6D0170CA72F)
Found 2 identical copies:
[1] X:\github\ComfyUI\models\insightface\inswapper_128.onnx
[2] X:\github\ComfyUI\models\roop\inswapper_128.onnx
These files are in DIFFERENT FOLDERS
Which file should be the MASTER (keep original)?
Enter number (1-2) or 's' to skip: 1
Master file: X:\github\ComfyUI\models\insightface\inswapper_128.onnx
Replace other files with hard links to master? (y/n): y
Renamed original to backup: X:\github\ComfyUI\models\roop\inswapper_128.onnx.dupe
Created hard link: X:\github\ComfyUI\models\roop\inswapper_128.onnx -> (master: X:\github\ComfyUI\models\insightface\inswapper_128.onnx)
Info file created: X:\github\ComfyUI\models\roop\inswapper_128.onnx.link_info.txt
Original backed up to: X:\github\ComfyUI\models\roop\inswapper_128.onnx.dupe
Note: Original files backed up as .dupe - delete them manually once you verify links work
Verifying Set #5 (2 files @ 348632874 bytes)...
Hashing: GFPGANv1.3.pth (348632874 bytes / 332.48 MB) / File 1 of 2
Hash: C953A88F2727C85C3D9AE72E2BD4846BBAF59FE6972AD94130E23E7017524A70
Hashing: GFPGANv1.4.pth (348632874 bytes / 332.48 MB) / File 2 of 2
Hash: E2CD4703AB14F4D01FD1383A8A8B266F9A5833DACEE8E6A79D3BF21A1B6BE5AD
Verifying Set #6 (3 files @ 335304388 bytes)...
Hashing: ae.safetensors (335304388 bytes / 319.77 MB) / File 1 of 3
Hash: AFC8E28272CD15DB3919BACDB6918CE9C1ED22E96CB12C4D5ED0FBA823529E38
Hashing: ae_1.sft (335304388 bytes / 319.77 MB) / File 2 of 3
Hash: AFC8E28272CD15DB3919BACDB6918CE9C1ED22E96CB12C4D5ED0FBA823529E38
Hashing: black-forest-labs-FLUX.1-Krea-dev.ae.safetensors (335304388 bytes / 319.77 MB) / File 3 of 3
Hash: AFC8E28272CD15DB3919BACDB6918CE9C1ED22E96CB12C4D5ED0FBA823529E38
CONFIRMED DUPLICATES (Hash: AFC8E28272CD15DB3919BACDB6918CE9C1ED22E96CB12C4D5ED0FBA823529E38)
Found 3 identical copies:
[1] X:\github\ComfyUI\models\vae\ae.safetensors
[2] X:\github\ComfyUI\models\vae\ae_1.sft
[3] X:\github\ComfyUI\models\vae\black-forest-labs-FLUX.1-Krea-dev.ae.safetensors
These files are in the SAME FOLDER
Rename duplicates to .dupe? Keep file [1] and rename the rest? (y/n): y
Renamed to backup: X:\github\ComfyUI\models\vae\ae_1.sft.dupe
Renamed to backup: X:\github\ComfyUI\models\vae\black-forest-labs-FLUX.1-Krea-dev.ae.safetensors.dupe
Note: Original files renamed to .dupe - you can delete them manually if everything works
Verifying Set #7 (2 files @ 277680638 bytes)...
Hashing: inswapper_128_fp16.onnx (277680638 bytes / 264.82 MB) / File 1 of 2
Hash: 6D51A9278A1F650CFFEFC18BA53F38BF2769BF4BBFF89267822CF72945F8A38B
Hashing: inswapper_128_fp16.onnx (277680638 bytes / 264.82 MB) / File 2 of 2
Hash: 6D51A9278A1F650CFFEFC18BA53F38BF2769BF4BBFF89267822CF72945F8A38B
CONFIRMED DUPLICATES (Hash: 6D51A9278A1F650CFFEFC18BA53F38BF2769BF4BBFF89267822CF72945F8A38B)
Found 2 identical copies:
[1] X:\github\ComfyUI\models\insightface\inswapper_128_fp16.onnx
[2] X:\github\ComfyUI\models\roop\inswapper_128_fp16.onnx
These files are in DIFFERENT FOLDERS
Which file should be the MASTER (keep original)?
Enter number (1-2) or 's' to skip: 1
Master file: X:\github\ComfyUI\models\insightface\inswapper_128_fp16.onnx
Replace other files with hard links to master? (y/n): y
Renamed original to backup: X:\github\ComfyUI\models\roop\inswapper_128_fp16.onnx.dupe
Created hard link: X:\github\ComfyUI\models\roop\inswapper_128_fp16.onnx -> (master: X:\github\ComfyUI\models\insightface\inswapper_128_fp16.onnx)
Info file created: X:\github\ComfyUI\models\roop\inswapper_128_fp16.onnx.link_info.txt
Original backed up to: X:\github\ComfyUI\models\roop\inswapper_128_fp16.onnx.dupe
Note: Original files backed up as .dupe - delete them manually once you verify links work
Verifying Set #8 (2 files @ 246144352 bytes)...
Hashing: black-forest-labs-FLUX.1-Krea-dev.model.safetensors (246144352 bytes / 234.74 MB) / File 1 of 2
Hash: 893D67A23F4693ED42CDAB4CBAD7FE3E727CF59609C40DA28A46B5470F9ED082
Hashing: flux-kontext.safetensors (246144352 bytes / 234.74 MB) / File 2 of 2
Hash: 893D67A23F4693ED42CDAB4CBAD7FE3E727CF59609C40DA28A46B5470F9ED082
CONFIRMED DUPLICATES (Hash: 893D67A23F4693ED42CDAB4CBAD7FE3E727CF59609C40DA28A46B5470F9ED082)
Found 2 identical copies:
[1] X:\github\ComfyUI\models\text_encoders\black-forest-labs-FLUX.1-Krea-dev.model.safetensors
[2] X:\github\ComfyUI\models\text_encoders\flux-kontext.safetensors
These files are in the SAME FOLDER
Rename duplicates to .dupe? Keep file [1] and rename the rest? (y/n): y
Renamed to backup: X:\github\ComfyUI\models\text_encoders\flux-kontext.safetensors.dupe
Note: Original files renamed to .dupe - you can delete them manually if everything works
Verifying Set #9 (2 files @ 246144152 bytes)...
Hashing: clip_l.safetensors (246144152 bytes / 234.74 MB) / File 1 of 2
Hash: 660C6F5B1ABAE9DC498AC2D21E1347D2ABDB0CF6C0C0C8576CD796491D9A6CDD
Hashing: flux1-dev.clip_l.safetensors (246144152 bytes / 234.74 MB) / File 2 of 2
Hash: 660C6F5B1ABAE9DC498AC2D21E1347D2ABDB0CF6C0C0C8576CD796491D9A6CDD
CONFIRMED DUPLICATES (Hash: 660C6F5B1ABAE9DC498AC2D21E1347D2ABDB0CF6C0C0C8576CD796491D9A6CDD)
Found 2 identical copies:
[1] X:\github\ComfyUI\models\clip\clip_l.safetensors
[2] X:\github\ComfyUI\models\clip\flux1-dev.clip_l.safetensors
These files are in the SAME FOLDER
Rename duplicates to .dupe? Keep file [1] and rename the rest? (y/n): y
Renamed to backup: X:\github\ComfyUI\models\clip\flux1-dev.clip_l.safetensors.dupe
Note: Original files renamed to .dupe - you can delete them manually if everything works
Verifying Set #10 (3 files @ 167666902 bytes)...
Hashing: black-forest-labs-FLUX.1-Krea-dev.diffusion_pytorch_model.safetensors (167666902 bytes / 159.9 MB) / File 1 of 3
Hash: F5B59A26851551B67AE1FE58D32E76486E1E812DEF4696A4BEA97F16604D40A3
Hashing: flux-kontext-dev.diffusion_pytorch_model.safetensors (167666902 bytes / 159.9 MB) / File 2 of 3
Hash: F5B59A26851551B67AE1FE58D32E76486E1E812DEF4696A4BEA97F16604D40A3
Hashing: flux1-dev.vae.safetensors (167666902 bytes / 159.9 MB) / File 3 of 3
Hash: F5B59A26851551B67AE1FE58D32E76486E1E812DEF4696A4BEA97F16604D40A3
CONFIRMED DUPLICATES (Hash: F5B59A26851551B67AE1FE58D32E76486E1E812DEF4696A4BEA97F16604D40A3)
Found 3 identical copies:
[1] X:\github\ComfyUI\models\vae\black-forest-labs-FLUX.1-Krea-dev.diffusion_pytorch_model.safetensors
[2] X:\github\ComfyUI\models\vae\flux-kontext-dev.diffusion_pytorch_model.safetensors
[3] X:\github\ComfyUI\models\vae\flux1-dev.vae.safetensors
These files are in the SAME FOLDER
Rename duplicates to .dupe? Keep file [1] and rename the rest? (y/n): y
Renamed to backup: X:\github\ComfyUI\models\vae\flux-kontext-dev.diffusion_pytorch_model.safetensors.dupe
Renamed to backup: X:\github\ComfyUI\models\vae\flux1-dev.vae.safetensors.dupe
Note: Original files renamed to .dupe - you can delete them manually if everything works
Verifying Set #11 (2 files @ 2006 bytes)...
Hashing: anything_v3.yaml (2006 bytes / 0 MB) / File 1 of 2
Hash: 9CEB1178472945273CFE1750B145505B184CAA619CE1622F4276AF416B43C9BD
Hashing: v1-inference_clip_skip_2.yaml (2006 bytes / 0 MB) / File 2 of 2
Hash: 9CEB1178472945273CFE1750B145505B184CAA619CE1622F4276AF416B43C9BD
CONFIRMED DUPLICATES (Hash: 9CEB1178472945273CFE1750B145505B184CAA619CE1622F4276AF416B43C9BD)
Found 2 identical copies:
[1] X:\github\ComfyUI\models\configs\anything_v3.yaml
[2] X:\github\ComfyUI\models\configs\v1-inference_clip_skip_2.yaml
These files are in the SAME FOLDER
Rename duplicates to .dupe? Keep file [1] and rename the rest? (y/n): y
Renamed to backup: X:\github\ComfyUI\models\configs\v1-inference_clip_skip_2.yaml.dupe
Note: Original files renamed to .dupe - you can delete them manually if everything works
================================================================================
COMPLETE!
================================================================================
Summary:
Confirmed duplicates: 8 sets
False positives: 3 sets
Actions taken: 10
Space that can be freed: 12119057038 bytes (11.287 GB)
Report saved to: X:\github\ComfyUI\models\confirmed_duplicates_report.txt
Action log saved to: X:\github\ComfyUI\models\duplicate_actions_log.txt
(base) PS X:\github\ComfyUI\models>
Last active: December 28, 2025 12:15
Gist: cyberofficial/58c13068b07fa88f0ed4144494d6930a
Dupe Finder: finds duplicate files in a directory. Written with LLM model files in mind, but it can be applied to other uses.
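The script automates a two-step check: group files by size, then confirm candidates by SHA-256. The same check can be run by hand on any suspect pair with the built-in Get-FileHash cmdlet; the paths below are placeholders:

```powershell
# Manual spot-check of two suspected duplicates (placeholder paths): compare
# sizes first (cheap), then SHA-256 hashes (slow but conclusive).
$a = Get-Item 'X:\path\to\fileA.safetensors'
$b = Get-Item 'X:\path\to\fileB.safetensors'
if ($a.Length -eq $b.Length -and
    (Get-FileHash $a.FullName -Algorithm SHA256).Hash -eq
    (Get-FileHash $b.FullName -Algorithm SHA256).Hash) {
    Write-Host 'Byte-identical'
} else {
    Write-Host 'Different'
}
```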
# Complete Duplicate Finder and Manager
# This script finds duplicates by size, verifies them by hash, and manages them.
# Change $modelsPath to the main folder to search in.
# OS: Windows only
# Features: finds duplicate files, creates hard links between files, and shows
# which files can be removed once the hard links are confirmed to work.
# You can also undo the actions if something doesn't work quite right, but you
# should keep the .dupe files until you are 100% sure you can remove them.
$modelsPath = "X:\path\to\folder"
$sizeReportFile = "X:\path\to\folder\model_duplicates_report.txt"
$confirmedReportFile = "X:\path\to\folder\confirmed_duplicates_report.txt"
$actionLogFile = "X:\path\to\folder\duplicate_actions_log.txt"

# Note: "=" * 80 must be parenthesized here. In argument mode, Write-Host
# treats "=", *, and 80 as separate arguments and prints "= * 80" literally.
Write-Host ("=" * 80) -ForegroundColor Cyan
Write-Host "ComfyUI Model Duplicate Finder and Manager" -ForegroundColor Cyan
Write-Host ("=" * 80) -ForegroundColor Cyan
Write-Host ""
# Check if previous actions exist
if (Test-Path $actionLogFile) {
    Write-Host "Previous action log found!" -ForegroundColor Yellow
    Write-Host ""
    Write-Host "Options:" -ForegroundColor Cyan
    Write-Host " [1] Undo previous actions (restore .dupe files, remove hard links)" -ForegroundColor White
    Write-Host " [2] Start fresh (find and manage duplicates)" -ForegroundColor White
    Write-Host " [3] Exit" -ForegroundColor White
    Write-Host ""
    $choice = Read-Host "Enter your choice (1/2/3)"
    if ($choice -eq "1") {
        Write-Host ""
        Write-Host ("=" * 80) -ForegroundColor Yellow
        Write-Host "UNDOING PREVIOUS ACTIONS" -ForegroundColor Yellow
        Write-Host ("=" * 80) -ForegroundColor Yellow
        Write-Host ""
        # Parse the action log
        $logContent = Get-Content -Path $actionLogFile
        $undoActions = @()
        foreach ($line in $logContent) {
            # Restore renamed .dupe files
            if ($line -match '^Renamed to \.dupe: (.+)$') {
                $dupeFile = $matches[1]
                if (Test-Path $dupeFile) {
                    $originalFile = $dupeFile -replace '\.dupe$', ''
                    try {
                        Rename-Item -Path $dupeFile -NewName $originalFile -Force
                        Write-Host "Restored: $originalFile" -ForegroundColor Green
                        $undoActions += "Restored: $originalFile"
                    }
                    catch {
                        Write-Host "ERROR restoring $dupeFile : $_" -ForegroundColor Red
                    }
                } else {
                    Write-Host "Skipped (not found): $dupeFile" -ForegroundColor Yellow
                }
            }
            # Remove hard links and restore backups
            if ($line -match '^Hard Link: (.+) -> \(master: (.+)\)$') {
                $linkFile = $matches[1]
                $dupeBackup = "$linkFile.dupe"
                $infoFile = "$linkFile.link_info.txt"
                # Remove the hard link
                if (Test-Path $linkFile) {
                    try {
                        Remove-Item -Path $linkFile -Force
                        Write-Host "Removed hard link: $linkFile" -ForegroundColor Yellow
                        $undoActions += "Removed link: $linkFile"
                    }
                    catch {
                        Write-Host "ERROR removing link $linkFile : $_" -ForegroundColor Red
                    }
                }
                # Restore the backup
                if (Test-Path $dupeBackup) {
                    try {
                        Rename-Item -Path $dupeBackup -NewName $linkFile -Force
                        Write-Host "Restored backup: $linkFile" -ForegroundColor Green
                        $undoActions += "Restored: $linkFile"
                    }
                    catch {
                        Write-Host "ERROR restoring backup $dupeBackup : $_" -ForegroundColor Red
                    }
                }
                # Remove info file
                if (Test-Path $infoFile) {
                    try {
                        Remove-Item -Path $infoFile -Force
                        Write-Host "Removed info file: $infoFile" -ForegroundColor Cyan
                    }
                    catch {
                        Write-Host "ERROR removing info file $infoFile : $_" -ForegroundColor Red
                    }
                }
            }
        }
        Write-Host ""
        Write-Host ("=" * 80) -ForegroundColor Green
        Write-Host "UNDO COMPLETE" -ForegroundColor Green
        Write-Host ("=" * 80) -ForegroundColor Green
        Write-Host ""
        Write-Host "Actions undone: $($undoActions.Count)" -ForegroundColor Green
        Write-Host ""
        # Archive the old log
        $timestamp = Get-Date -Format 'yyyyMMdd_HHmmss'
        $archiveLog = "$actionLogFile.$timestamp.archived"
        Move-Item -Path $actionLogFile -Destination $archiveLog -Force
        Write-Host "Previous log archived to: $archiveLog" -ForegroundColor Cyan
        exit
    }
    elseif ($choice -eq "3") {
        Write-Host "Exiting..." -ForegroundColor Yellow
        exit
    }
    Write-Host ""
    Write-Host "Starting fresh duplicate scan..." -ForegroundColor Green
    Write-Host ""
}
# STEP 1: Find potential duplicates by file size
Write-Host "STEP 1: Scanning for files with identical sizes..." -ForegroundColor Yellow
Write-Host ""
$files = Get-ChildItem -Path $modelsPath -Recurse -File | Where-Object {
    $_.Extension -match '\.(safetensors|pth|onnx|ckpt|pt|bin|sft|yaml)$' -and $_.Name -notmatch '^(put_|\.git)'
}
Write-Host "Found $($files.Count) model files to check" -ForegroundColor Green
Write-Host ""
$sizeTable = @{}
$counter = 0
foreach ($file in $files) {
    $counter++
    $sizeBytes = $file.Length
    $sizeMB = [math]::Round($sizeBytes / 1MB, 2)
    Write-Host "Checking: $($file.Name) ($sizeBytes bytes / $sizeMB MB) / File $counter of $($files.Count)" -ForegroundColor Cyan
    $fileInfo = [PSCustomObject]@{
        FileName  = $file.Name
        FullPath  = $file.FullName
        SizeBytes = $sizeBytes
        SizeMB    = $sizeMB
        SizeGB    = [math]::Round($sizeBytes / 1GB, 3)
    }
    if ($sizeTable.ContainsKey($sizeBytes)) {
        $sizeTable[$sizeBytes] += $fileInfo
    } else {
        $sizeTable[$sizeBytes] = @($fileInfo)
    }
}
$potentialDuplicates = $sizeTable.GetEnumerator() | Where-Object { $_.Value.Count -gt 1 } | Sort-Object { $_.Value[0].SizeBytes } -Descending
Write-Host ""
Write-Host "Found $($potentialDuplicates.Count) sets of potential duplicates" -ForegroundColor Green
Write-Host ""
if ($potentialDuplicates.Count -eq 0) {
    Write-Host "No duplicates found - all files have unique sizes!" -ForegroundColor Green
    exit
}
# STEP 2: Verify duplicates by hashing
Write-Host ("=" * 80) -ForegroundColor Yellow
Write-Host "STEP 2: Verifying potential duplicates by hash..." -ForegroundColor Yellow
Write-Host ("=" * 80) -ForegroundColor Yellow
Write-Host ""
$report = @()
$report += "=" * 80
$report += "CONFIRMED DUPLICATES REPORT"
$report += "Generated: $(Get-Date -Format 'yyyy-MM-dd HH:mm:ss')"
$report += "=" * 80
$report += ""
$actionLog = @()
$actionLog += "=" * 80
$actionLog += "DUPLICATE MANAGEMENT ACTION LOG"
$actionLog += "Generated: $(Get-Date -Format 'yyyy-MM-dd HH:mm:ss')"
$actionLog += "=" * 80
$actionLog += ""
$confirmedDuplicates = 0
$falsePositives = 0
$totalSpaceSavings = [long]0
$actionsTaken = @()
$setNumber = 0
foreach ($potentialSet in $potentialDuplicates) {
    $setNumber++
    $sizeBytes = $potentialSet.Key
    $filesInSet = $potentialSet.Value
    Write-Host "Verifying Set #$setNumber ($($filesInSet.Count) files @ $sizeBytes bytes)..." -ForegroundColor Cyan
    $hashTable = @{}
    $fileCounter = 0
    foreach ($fileInfo in $filesInSet) {
        $fileCounter++
        $sizeMB = [math]::Round($sizeBytes / 1MB, 2)
        Write-Host " Hashing: $($fileInfo.FileName) ($sizeBytes bytes / $sizeMB MB) / File $fileCounter of $($filesInSet.Count)" -ForegroundColor Yellow
        try {
            if (Test-Path $fileInfo.FullPath) {
                $hash = Get-FileHash -Path $fileInfo.FullPath -Algorithm SHA256
                Write-Host " Hash: $($hash.Hash)" -ForegroundColor DarkGray
                if ($hashTable.ContainsKey($hash.Hash)) {
                    $hashTable[$hash.Hash] += $fileInfo.FullPath
                } else {
                    $hashTable[$hash.Hash] = @($fileInfo.FullPath)
                }
            }
        }
        catch {
            Write-Host " ERROR hashing $($fileInfo.FullPath) : $_" -ForegroundColor Red
        }
    }
    $actualDuplicates = $hashTable.GetEnumerator() | Where-Object { $_.Value.Count -gt 1 }
    if ($actualDuplicates.Count -gt 0) {
        $confirmedDuplicates++
        $report += "CONFIRMED DUPLICATE SET #$setNumber"
        $report += "-" * 80
        $report += "File size: $sizeBytes bytes ($([math]::Round($sizeBytes / 1MB, 2)) MB / $([math]::Round($sizeBytes / 1GB, 3)) GB)"
        $report += "Status: IDENTICAL FILES (same hash)"
        $report += ""
        foreach ($dup in $actualDuplicates) {
            $report += "Hash: $($dup.Key)"
            $report += "Number of identical copies: $($dup.Value.Count)"
            $potentialSavings = $sizeBytes * ($dup.Value.Count - 1)
            $totalSpaceSavings += $potentialSavings
            $report += "Space that can be freed: $potentialSavings bytes ($([math]::Round($potentialSavings / 1MB, 2)) MB / $([math]::Round($potentialSavings / 1GB, 3)) GB)"
            $report += ""
            $report += "Files:"
            foreach ($file in $dup.Value) {
                $report += " - $file"
            }
            $report += ""
            $folders = $dup.Value | ForEach-Object { Split-Path $_ -Parent } | Select-Object -Unique
            if ($folders.Count -eq 1) {
                $report += "RECOMMENDATION: Same folder duplicates - can safely rename extras to .dupe"
            } else {
                $report += "RECOMMENDATION: Cross-folder duplicates - consider hard links"
            }
            $report += ""
            # Ask user what to do
            Write-Host ""
            Write-Host " CONFIRMED DUPLICATES (Hash: $($dup.Key))" -ForegroundColor Green
            Write-Host " Found $($dup.Value.Count) identical copies:" -ForegroundColor Green
            for ($i = 0; $i -lt $dup.Value.Count; $i++) {
                Write-Host " [$($i+1)] $($dup.Value[$i])" -ForegroundColor White
            }
            Write-Host ""
            if ($folders.Count -eq 1) {
                Write-Host " These files are in the SAME FOLDER" -ForegroundColor Yellow
                $response = Read-Host " Rename duplicates to .dupe? Keep file [1] and rename the rest? (y/n)"
                if ($response -eq 'y') {
                    $keepFile = $dup.Value[0]
                    $report += "ACTION: Keeping file [1], renaming others to .dupe"
                    $actionLog += "Duplicate Set #$setNumber - Same Folder - Renamed to .dupe"
                    $actionLog += "Kept: $keepFile"
                    for ($i = 1; $i -lt $dup.Value.Count; $i++) {
                        try {
                            $dupeBackup = "$($dup.Value[$i]).dupe"
                            Rename-Item -Path $dup.Value[$i] -NewName "$($dup.Value[$i]).dupe" -Force
                            Write-Host " Renamed to backup: $dupeBackup" -ForegroundColor Green
                            $report += " RENAMED TO BACKUP: $dupeBackup"
                            $actionLog += "Renamed to .dupe: $dupeBackup"
                            $actionsTaken += "Renamed: $($dup.Value[$i]) -> $dupeBackup"
                        }
                        catch {
                            Write-Host " ERROR renaming: $($dup.Value[$i]) - $_" -ForegroundColor Red
                            $report += " ERROR RENAMING: $($dup.Value[$i])"
                        }
                    }
                    $actionLog += ""
                    Write-Host " Note: Original files renamed to .dupe - you can delete them manually if everything works" -ForegroundColor Yellow
                } else {
                    $report += "ACTION: User chose to skip"
                    Write-Host " Skipped" -ForegroundColor Yellow
                }
            } else {
                Write-Host " These files are in DIFFERENT FOLDERS" -ForegroundColor Cyan
                Write-Host " Which file should be the MASTER (keep original)?" -ForegroundColor Yellow
                $masterChoice = Read-Host " Enter number (1-$($dup.Value.Count)) or 's' to skip"
                if ($masterChoice -match '^\d+$' -and [int]$masterChoice -ge 1 -and [int]$masterChoice -le $dup.Value.Count) {
                    $masterFile = $dup.Value[[int]$masterChoice - 1]
                    Write-Host " Master file: $masterFile" -ForegroundColor Green
                    Write-Host ""
                    $response = Read-Host " Replace other files with hard links to master? (y/n)"
                    if ($response -eq 'y') {
                        $report += "ACTION: Created hard links to master file [${masterChoice}]"
                        $report += "Master: $masterFile"
                        $actionLog += "Duplicate Set #$setNumber - Hard Link Creation"
                        $actionLog += "Master: $masterFile"
                        for ($i = 0; $i -lt $dup.Value.Count; $i++) {
                            if ($i -ne ([int]$masterChoice - 1)) {
                                $linkFile = $dup.Value[$i]
                                $dupeBackup = "$linkFile.dupe"
                                try {
                                    Rename-Item -Path $linkFile -NewName "$linkFile.dupe" -Force
                                    Write-Host " Renamed original to backup: $dupeBackup" -ForegroundColor Yellow
                                    New-Item -ItemType HardLink -Path $linkFile -Value $masterFile -Force | Out-Null
                                    Write-Host " Created hard link: $linkFile -> (master: $masterFile)" -ForegroundColor Green
                                    $linkInfoFile = "$linkFile.link_info.txt"
                                    $linkInfo = "================================================================================" + "`r`n"
                                    $linkInfo += "HARD LINK INFORMATION" + "`r`n"
                                    $linkInfo += "================================================================================" + "`r`n"
                                    $linkInfo += "Created: $(Get-Date -Format 'yyyy-MM-dd HH:mm:ss')" + "`r`n"
                                    $linkInfo += "Link Location: $linkFile" + "`r`n"
                                    $linkInfo += "Master File: $masterFile" + "`r`n"
                                    $linkInfo += "Original Backup: $dupeBackup" + "`r`n"
                                    $linkInfo += "Hash: $($dup.Key)" + "`r`n"
                                    $linkInfo += "File Size: $sizeBytes bytes ($([math]::Round($sizeBytes / 1MB, 2)) MB)" + "`r`n"
                                    $linkInfo += "" + "`r`n"
                                    $linkInfo += "IMPORTANT:" + "`r`n"
                                    $linkInfo += "- This file is a hard link, not a copy" + "`r`n"
                                    $linkInfo += "- Both paths point to the same data on disk" + "`r`n"
                                    $linkInfo += "- Original file renamed to: $dupeBackup" + "`r`n"
                                    $linkInfo += "- You can delete the .dupe file once you verify the link works" + "`r`n"
                                    $linkInfo += "- Deleting the link does NOT delete the master" + "`r`n"
                                    $linkInfo += "- Modifying one modifies both (they are the same file)" + "`r`n"
                                    $linkInfo += "- To restore original: delete this link and rename .dupe file back" + "`r`n"
                                    $linkInfo += "================================================================================" + "`r`n"
                                    $linkInfo | Out-File -FilePath $linkInfoFile -Encoding UTF8
                                    Write-Host " Info file created: $linkInfoFile" -ForegroundColor Cyan
                                    Write-Host " Original backed up to: $dupeBackup" -ForegroundColor Cyan
                                    $report += " HARD LINK: $linkFile -> (master: $masterFile)"
                                    $report += " INFO FILE: $linkInfoFile"
                                    $report += " BACKUP: $dupeBackup"
                                    $actionLog += "Hard Link: $linkFile -> (master: $masterFile)"
                                    $actionLog += "Info File: $linkInfoFile"
                                    $actionLog += "Backup: $dupeBackup"
                                    $actionsTaken += "Hard Linked: $linkFile"
                                }
                                catch {
                                    Write-Host " ERROR creating hard link: $linkFile - $_" -ForegroundColor Red
                                    $report += " ERROR CREATING HARD LINK: $linkFile - $_"
                                    $actionLog += "ERROR: Failed to create hard link for $linkFile - $_"
                                    if (Test-Path $dupeBackup) {
                                        try {
                                            Rename-Item -Path $dupeBackup -NewName $linkFile -Force
                                            Write-Host " Restored backup after failure" -ForegroundColor Yellow
                                        }
                                        catch {
                                            Write-Host " ERROR: Could not restore backup! Manual intervention needed!" -ForegroundColor Red
                                            $actionLog += "CRITICAL: Failed to restore $dupeBackup after link failure"
                                        }
                                    }
                                }
                            }
                        }
                        Write-Host ""
                        Write-Host " Note: Original files backed up as .dupe - delete them manually once you verify links work" -ForegroundColor Yellow
                    } else {
                        $report += "ACTION: User chose to skip hard linking"
                        Write-Host " Skipped" -ForegroundColor Yellow
                    }
                } else {
                    $report += "ACTION: User chose to skip"
                    Write-Host " Skipped" -ForegroundColor Yellow
                }
            }
            $report += ""
        }
    } else {
        $falsePositives++
        $report += "FALSE POSITIVE - Set #$setNumber"
        $report += "-" * 80
        $report += "File size: $sizeBytes bytes ($([math]::Round($sizeBytes / 1MB, 2)) MB)"
        $report += "Status: Different files (different hashes)"
        $report += ""
    }
    Write-Host ""
}
| # Summary | |
| $report += "" | |
| $report += "=" * 80 | |
| $report += "SUMMARY" | |
| $report += "=" * 80 | |
| $report += "Potential duplicate sets checked: $setNumber" | |
| $report += "Confirmed duplicate sets: $confirmedDuplicates" | |
| $report += "False positives: $falsePositives" | |
| $report += "Total space that can be freed: $totalSpaceSavings bytes ($([math]::Round($totalSpaceSavings / 1MB, 2)) MB / $([math]::Round($totalSpaceSavings / 1GB, 3)) GB)" | |
| $report += "Actions taken: $($actionsTaken.Count)" | |
| $report += "" | |
| $report | Out-File -FilePath $confirmedReportFile -Encoding UTF8 | |
| $actionLog | Out-File -FilePath $actionLogFile -Encoding UTF8 | |
Write-Host ("=" * 80) -ForegroundColor Green  # parentheses required: in argument mode, Write-Host "=" * 80 prints the literal text "= * 80" (as seen in the console header)
Write-Host "COMPLETE!" -ForegroundColor Green
Write-Host ("=" * 80) -ForegroundColor Green
Write-Host ""
Write-Host "Summary:" -ForegroundColor Cyan
Write-Host "  Confirmed duplicates: $confirmedDuplicates sets" -ForegroundColor Green
Write-Host "  False positives: $falsePositives sets" -ForegroundColor Yellow
Write-Host "  Actions taken: $($actionsTaken.Count)" -ForegroundColor Green
Write-Host "  Space that can be freed: $totalSpaceSavings bytes ($([math]::Round($totalSpaceSavings / 1GB, 3)) GB)" -ForegroundColor Green
Write-Host ""
Write-Host "Report saved to: $confirmedReportFile" -ForegroundColor Green
Write-Host "Action log saved to: $actionLogFile" -ForegroundColor Green
================================================================================
CONFIRMED DUPLICATES REPORT
Generated: 2025-12-28 06:54:17
================================================================================
CONFIRMED DUPLICATE SET #1
--------------------------------------------------------------------------------
File size: 9787841024 bytes (9334.41 MB / 9.116 GB)
Status: IDENTICAL FILES (same hash)
Hash: 6E480B09FAE049A72D2A8C5FBCCB8D3E92FEBEB233BBE9DFE7256958A9167635
Number of identical copies: 2
Space that can be freed: 9787841024 bytes (9334.41 MB / 9.116 GB)
Files:
- X:\github\ComfyUI\models\clip\flux1-dev.1.t5xxl_fp16..safetensors
- X:\github\ComfyUI\models\clip\t5xxl_fp16.safetensors
RECOMMENDATION: Same folder duplicates - can safely rename extras to .dupe
ACTION: Keeping file [1], renaming others to .dupe
RENAMED TO BACKUP: X:\github\ComfyUI\models\clip\t5xxl_fp16.safetensors.dupe
FALSE POSITIVE - Set #2
--------------------------------------------------------------------------------
File size: 1487623552 bytes (1418.71 MB)
Status: Different files (different hashes)
FALSE POSITIVE - Set #3
--------------------------------------------------------------------------------
File size: 687476088 bytes (655.63 MB)
Status: Different files (different hashes)
CONFIRMED DUPLICATE SET #4
--------------------------------------------------------------------------------
File size: 555302286 bytes (529.58 MB / 0.517 GB)
Status: IDENTICAL FILES (same hash)
Hash: 0FA95F167682B4F61EDF24F8D66C46B4AB130E8BE058F00C8150E6D0170CA72F
Number of identical copies: 2
Space that can be freed: 555302286 bytes (529.58 MB / 0.517 GB)
Files:
- X:\github\ComfyUI\models\insightface\inswapper_128.onnx
- X:\github\ComfyUI\models\roop\inswapper_128.onnx
RECOMMENDATION: Cross-folder duplicates - consider hard links
ACTION: Created hard links to master file [1]
Master: X:\github\ComfyUI\models\insightface\inswapper_128.onnx
HARD LINK: X:\github\ComfyUI\models\roop\inswapper_128.onnx -> (master: X:\github\ComfyUI\models\insightface\inswapper_128.onnx)
INFO FILE: X:\github\ComfyUI\models\roop\inswapper_128.onnx.link_info.txt
BACKUP: X:\github\ComfyUI\models\roop\inswapper_128.onnx.dupe
FALSE POSITIVE - Set #5
--------------------------------------------------------------------------------
File size: 348632874 bytes (332.48 MB)
Status: Different files (different hashes)
CONFIRMED DUPLICATE SET #6
--------------------------------------------------------------------------------
File size: 335304388 bytes (319.77 MB / 0.312 GB)
Status: IDENTICAL FILES (same hash)
Hash: AFC8E28272CD15DB3919BACDB6918CE9C1ED22E96CB12C4D5ED0FBA823529E38
Number of identical copies: 3
Space that can be freed: 670608776 bytes (639.54 MB / 0.625 GB)
Files:
- X:\github\ComfyUI\models\vae\ae.safetensors
- X:\github\ComfyUI\models\vae\ae_1.sft
- X:\github\ComfyUI\models\vae\black-forest-labs-FLUX.1-Krea-dev.ae.safetensors
RECOMMENDATION: Same folder duplicates - can safely rename extras to .dupe
ACTION: Keeping file [1], renaming others to .dupe
RENAMED TO BACKUP: X:\github\ComfyUI\models\vae\ae_1.sft.dupe
RENAMED TO BACKUP: X:\github\ComfyUI\models\vae\black-forest-labs-FLUX.1-Krea-dev.ae.safetensors.dupe
CONFIRMED DUPLICATE SET #7
--------------------------------------------------------------------------------
File size: 277680638 bytes (264.82 MB / 0.259 GB)
Status: IDENTICAL FILES (same hash)
Hash: 6D51A9278A1F650CFFEFC18BA53F38BF2769BF4BBFF89267822CF72945F8A38B
Number of identical copies: 2
Space that can be freed: 277680638 bytes (264.82 MB / 0.259 GB)
Files:
- X:\github\ComfyUI\models\insightface\inswapper_128_fp16.onnx
- X:\github\ComfyUI\models\roop\inswapper_128_fp16.onnx
RECOMMENDATION: Cross-folder duplicates - consider hard links
ACTION: Created hard links to master file [1]
Master: X:\github\ComfyUI\models\insightface\inswapper_128_fp16.onnx
HARD LINK: X:\github\ComfyUI\models\roop\inswapper_128_fp16.onnx -> (master: X:\github\ComfyUI\models\insightface\inswapper_128_fp16.onnx)
INFO FILE: X:\github\ComfyUI\models\roop\inswapper_128_fp16.onnx.link_info.txt
BACKUP: X:\github\ComfyUI\models\roop\inswapper_128_fp16.onnx.dupe
CONFIRMED DUPLICATE SET #8
--------------------------------------------------------------------------------
File size: 246144352 bytes (234.74 MB / 0.229 GB)
Status: IDENTICAL FILES (same hash)
Hash: 893D67A23F4693ED42CDAB4CBAD7FE3E727CF59609C40DA28A46B5470F9ED082
Number of identical copies: 2
Space that can be freed: 246144352 bytes (234.74 MB / 0.229 GB)
Files:
- X:\github\ComfyUI\models\text_encoders\black-forest-labs-FLUX.1-Krea-dev.model.safetensors
- X:\github\ComfyUI\models\text_encoders\flux-kontext.safetensors
RECOMMENDATION: Same folder duplicates - can safely rename extras to .dupe
ACTION: Keeping file [1], renaming others to .dupe
RENAMED TO BACKUP: X:\github\ComfyUI\models\text_encoders\flux-kontext.safetensors.dupe
CONFIRMED DUPLICATE SET #9
--------------------------------------------------------------------------------
File size: 246144152 bytes (234.74 MB / 0.229 GB)
Status: IDENTICAL FILES (same hash)
Hash: 660C6F5B1ABAE9DC498AC2D21E1347D2ABDB0CF6C0C0C8576CD796491D9A6CDD
Number of identical copies: 2
Space that can be freed: 246144152 bytes (234.74 MB / 0.229 GB)
Files:
- X:\github\ComfyUI\models\clip\clip_l.safetensors
- X:\github\ComfyUI\models\clip\flux1-dev.clip_l.safetensors
RECOMMENDATION: Same folder duplicates - can safely rename extras to .dupe
ACTION: Keeping file [1], renaming others to .dupe
RENAMED TO BACKUP: X:\github\ComfyUI\models\clip\flux1-dev.clip_l.safetensors.dupe
CONFIRMED DUPLICATE SET #10
--------------------------------------------------------------------------------
File size: 167666902 bytes (159.9 MB / 0.156 GB)
Status: IDENTICAL FILES (same hash)
Hash: F5B59A26851551B67AE1FE58D32E76486E1E812DEF4696A4BEA97F16604D40A3
Number of identical copies: 3
Space that can be freed: 335333804 bytes (319.8 MB / 0.312 GB)
Files:
- X:\github\ComfyUI\models\vae\black-forest-labs-FLUX.1-Krea-dev.diffusion_pytorch_model.safetensors
- X:\github\ComfyUI\models\vae\flux-kontext-dev.diffusion_pytorch_model.safetensors
- X:\github\ComfyUI\models\vae\flux1-dev.vae.safetensors
RECOMMENDATION: Same folder duplicates - can safely rename extras to .dupe
ACTION: Keeping file [1], renaming others to .dupe
RENAMED TO BACKUP: X:\github\ComfyUI\models\vae\flux-kontext-dev.diffusion_pytorch_model.safetensors.dupe
RENAMED TO BACKUP: X:\github\ComfyUI\models\vae\flux1-dev.vae.safetensors.dupe
CONFIRMED DUPLICATE SET #11
--------------------------------------------------------------------------------
File size: 2006 bytes (0 MB / 0 GB)
Status: IDENTICAL FILES (same hash)
Hash: 9CEB1178472945273CFE1750B145505B184CAA619CE1622F4276AF416B43C9BD
Number of identical copies: 2
Space that can be freed: 2006 bytes (0 MB / 0 GB)
Files:
- X:\github\ComfyUI\models\configs\anything_v3.yaml
- X:\github\ComfyUI\models\configs\v1-inference_clip_skip_2.yaml
RECOMMENDATION: Same folder duplicates - can safely rename extras to .dupe
ACTION: Keeping file [1], renaming others to .dupe
RENAMED TO BACKUP: X:\github\ComfyUI\models\configs\v1-inference_clip_skip_2.yaml.dupe
================================================================================
SUMMARY
================================================================================
Potential duplicate sets checked: 11
Confirmed duplicate sets: 8
False positives: 3
Total space that can be freed: 12119057038 bytes (11557.63 MB / 11.287 GB)
Actions taken: 10