On Tue, 2005-10-04 at 18:19 +1000, Nick Bishop wrote:
> This won't thread properly, coz I'm replying from a
> digest.
>
> > Is there any way I can know which files are
> > duplicated in some directories?
>
> One approach is to use find, then use md5sum, then use
> sort on the output of md5sum, then look for duplicate
> md5's.
>
> $ find /home -type f -print0 \
>   | xargs -0 md5sum \
>   | sort ... \
>   | less

Not a bad idea. The only thing this good idea needs is a bit more scripting to actually pick out the duplicate md5sums. A huge directory will also take a long time to hash every file.

--
Ow Mun Heng
Gentoo/Linux on DELL D600 1.4Ghz 1.5GB RAM
98% Microsoft(tm) Free!!
Neuromancer 09:54:41 up 2:26, 4 users, load average: 0.08, 0.20, 0.51
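One way to sketch the missing "duplicate picking" step, assuming GNU coreutils: md5sum prints "HASH  PATH" with a 32-character hex hash, so `uniq -w32` can compare lines on the hash alone and `--all-repeated=separate` prints only the groups of files that share a hash:

```shell
# Sketch of the fuller script suggested above (assumes GNU md5sum/uniq).
# sort groups identical hashes together; uniq -w32 compares only the first
# 32 characters (the MD5 hash); --all-repeated=separate prints each group
# of duplicate files, with a blank line between groups.
find /home -type f -print0 \
  | xargs -0 md5sum \
  | sort \
  | uniq -w32 --all-repeated=separate
```

For the huge-directory case, hashing could be limited to files whose size matches some other file's size first (a file with a unique size cannot have a duplicate), which avoids reading most of the tree.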