On 01/05/2011 06:47 AM, Andre Robatino wrote:
> S Mathias <smathias1972 <at> yahoo.com> writes:
>
>> find duplicate filenames in a folder
>> find | perl -ne 's!([^/]+)$!lc $1!e; print if 1 == $seen{$_}++'
>>
>> find duplicate filenames in a folder recursively
>> ? how?
>
> I haven't used it yet, but fdupes is probably what you want (yum install fdupes).
>
> Description :
> FDUPES is a program for identifying duplicate files residing within specified
> directories.

On a related note, I use rsnapshot to perform backups, and it uses hard links
to prevent duplicates. That is all well and good, but in some cases a backup
failed for some reason, so my directory hourly.0 no longer has files hard
linked to the other directories. What I would like to do is run a script or
program to compare two directories (say hourly.0 and daily.0 in rsnapshot
parlance), and when it finds a duplicate file, remove one copy and replace it
with a hard link. An example:

-rw-r--r--  1 root root 970 Dec 20 09:29 hosts   #file that broke the links (hourly.0)
-rw-r--r-- 15 root root 970 Dec 20 09:29 hosts   #original file that backed up correctly (daily.1)

The link count of 1 on the first file indicates that a previous backup
probably did not run correctly, so rsync (which rsnapshot runs with the
--link-dest parameter) created a new copy instead of a hard link. In this
scenario I have two identical copies of hosts, but the one in hourly.0 was not
properly hard linked because of the earlier failure. Fdupes does a good job of
identifying true duplicates, but I would like it to go one step further. I
could possibly just write a script.

--
Jerry Feldman <gaf@xxxxxxx>
Boston Linux and Unix
PGP key id: 537C5846
PGP Key fingerprint: 3D1B 8377 A3C0 A5F2 ECBB CA3B 4607 4319 537C 5846
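
PS: Something like the sketch below might work as a starting point. It is only
a rough, untested sketch, not anything rsnapshot itself provides: the argument
names, the link-count test, and the cmp-based comparison are my own assumptions
about what such a script should do. The "hardlink" package in Fedora may also
do something similar, but I have not checked it.

#!/bin/bash
# Rough sketch: restore hard links between two rsnapshot snapshot directories.
# Usage: relink.sh hourly.0 daily.0   (arguments are examples, not fixed names)

src="$1"   # snapshot that lost its hard links (e.g. hourly.0)
ref="$2"   # snapshot that still holds the properly linked copies (e.g. daily.0)

# Files with a link count of 1 in $src are the ones that "broke the links".
find "$src" -type f -links 1 -print0 | while IFS= read -r -d '' f; do
    rel="${f#"$src"/}"          # path relative to the snapshot root
    other="$ref/$rel"           # candidate copy in the reference snapshot
    # Only act when the other copy exists, is a different inode,
    # and the contents are byte-for-byte identical.
    if [ -f "$other" ] && ! [ "$f" -ef "$other" ] && cmp -s "$f" "$other"; then
        ln -f "$other" "$f"     # replace the duplicate with a hard link
    fi
done

I would try it on a scratch copy of the snapshot tree first; a mistake here
would be painful to undo on a real backup set.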