# Scan for Duplicate Files

From the filesystem-organiser plugin:

```bash
npx claudepluginhub danielrosehill/claude-code-plugins --plugin filesystem-organiser
```

Identify and handle duplicate files on the Google Drive mount.

## Instructions

### Step 1: Verify mount
```bash
findmnt {mount_path}
```
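A minimal guard sketch, assuming `{mount_path}` has been substituted with the real mount point (the `$HOME/GoogleDrive` default below is an assumption, not plugin config):

```bash
# Abort early if the Google Drive mount is not active (sketch; adjust the path)
MOUNT_PATH="${MOUNT_PATH:-$HOME/GoogleDrive}"
if ! findmnt --mountpoint "$MOUNT_PATH" > /dev/null; then
  echo "No filesystem mounted at $MOUNT_PATH - start the rclone mount first." >&2
  exit 1
fi
```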
### Step 2: Determine scan scope

Ask user:

1. **Full drive scan** - Check the entire drive (may take a long time)
2. **Specific folder** - Focus on one folder tree
3. **Quick scan** - Only check likely duplicate locations (Inbox, Downloads, etc.)
### Step 3: Choose detection method

Ask user preference:

1. **By name** - Files with identical names
2. **By size** - Files with identical sizes (fast, less accurate)
3. **By hash** - MD5/SHA hash comparison (slow, most accurate)
### Step 4: Run the scan

By name (same filename in different locations):

```bash
find {mount_path} -type f -exec basename {} \; | sort | uniq -d > /tmp/dup_names.txt
# Then find full paths for each duplicate name (sketched below)
```
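A sketch of that follow-up, assuming `/tmp/dup_names.txt` was produced by the command above; names containing glob characters (`*`, `?`, `[`) may match too broadly with `-name`:

```bash
# Print every full path for each duplicated basename (sketch)
while IFS= read -r name; do
  echo "== $name =="
  find {mount_path} -type f -name "$name"
done < /tmp/dup_names.txt
```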
By size:

```bash
# Zero-pad the size so uniq's fixed-width comparison stays within the size column
find {mount_path} -type f -printf '%015s %p\n' | sort -n | uniq -D -w 15
```
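To feed the size matches into the hash check below, the path column can be split back out; `/tmp/dup_candidates.txt` is a hypothetical scratch file, not something the plugin defines:

```bash
# Keep only the paths of size-matched files for later hashing (sketch)
find {mount_path} -type f -printf '%015s %p\n' | sort -n | uniq -D -w 15 \
  | cut -d' ' -f2- > /tmp/dup_candidates.txt
```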
By hash (for smaller sets of potential duplicates):

```bash
md5sum "{file1}" "{file2}"
```
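To hash a whole candidate list at once, a sketch along these lines works; `/tmp/dup_candidates.txt` is the hypothetical list from the size scan above, and the `-w 32` comparison relies on an MD5 digest being 32 hex characters:

```bash
# Hash every candidate and keep only lines whose digests repeat (sketch)
xargs -d '\n' md5sum < /tmp/dup_candidates.txt | sort | uniq -D -w 32
```

Bear in mind this still reads each candidate's full contents through the mount, so keep the candidate list small.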
Note: Avoid running hashes across the entire drive via the rclone mount - computing each hash forces rclone to download the full file.
Using rclone dedupe (dry-run first!):

```bash
rclone dedupe --dry-run {remote}: --dedupe-mode list
```
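If the dry-run listing looks right, one of rclone's non-interactive dedupe modes can resolve each group; `newest` below is just an example choice:

```bash
# Keep the newest copy in each duplicate group and remove the rest
# (run only after reviewing the dry-run output)
rclone dedupe --dedupe-mode newest {remote}:
```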
## Duplicate Scan Results
Scan type: {method}
Scope: {folder or full drive}
Time taken: {duration}
### Duplicate Groups Found: {count}
#### Group 1: {filename}
Total size wasted: {size}
| Location | Size | Modified | Keep? |
|----------|------|----------|-------|
| /path/to/file1.jpg | 2.5 MB | 2024-01-15 | ✓ |
| /path/to/file2.jpg | 2.5 MB | 2024-01-10 | |
| /path/to/file3.jpg | 2.5 MB | 2023-12-01 | |
#### Group 2: ...
For each duplicate group, offer options per the CLAUDE.md duplicates.strategy setting. Duplicates that are not deleted outright are moved to the 99-System/Duplicates/ holding folder.

For each confirmed action:
```bash
# Move to duplicates folder (safer than delete)
mv "{duplicate_path}" "{mount_path}/99-System/Duplicates/"

# Or delete (if user confirms)
rm "{duplicate_path}"
```
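A slightly safer variant of the move, sketched here: it creates the holding folder if needed and uses GNU mv's numbered backups so an existing file with the same name is kept rather than overwritten:

```bash
# Ensure the holding folder exists, then move without clobbering earlier duplicates (sketch)
mkdir -p "{mount_path}/99-System/Duplicates"
mv --backup=numbered "{duplicate_path}" "{mount_path}/99-System/Duplicates/"
```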
## Duplicate Cleanup Summary
### Actions Taken
- Duplicate groups processed: {X}
- Files removed/moved: {X}
- Space recovered: {X} GB
- Files kept: {X}
### Skipped
- Groups skipped by user: {X}
- Files with errors: {X}
### Duplicates Folder
New files in 99-System/Duplicates/: {X}
(Review and delete manually when ready)
Write the report to `logs/operations/duplicates_{date}.md`.
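A sketch of that final write, assuming the logs/operations/ directory sits at the working root (its exact location is not specified above) and using hypothetical shell variables for the counts:

```bash
# Create the log directory and stamp the report with today's date (sketch)
mkdir -p logs/operations
report="logs/operations/duplicates_$(date +%Y-%m-%d).md"
{
  echo "## Duplicate Cleanup Summary"
  echo "- Duplicate groups processed: ${groups_processed:-0}"
  echo "- Space recovered: ${space_recovered:-0} GB"
} > "$report"
```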