queeg Posted March 16, 2010
At some point I was testing copy speeds and created duplicate files (actually a lot of them) on different drives. So many that I need to search and fix them a little at a time, or my syslog gets filled with messages and my system locks up. Can someone help me with a Linux command that will search for duplicates but stop searching after finding a specified number of them? That way I can fix some and rerun the search.
unraided Posted March 16, 2010
Hi. The links below should help you locate duplicate files from a Linux CLI:
http://ajayfromiiit.wordpress.com/2009/10/16/one-liner-to-find-and-remove-duplicate-files-in-linux/
http://www.commandlinefu.com/commands/view/3555/find-duplicate-files-based-on-size-first-then-md5-hash
There is also a script you could run on your server to locate duplicate files (caution: untested under unRAID):
http://fslint.googlecode.com/svn/trunk/fslint/findup
Hope this helps.
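For reference, an untested sketch of the "size first, then md5" idea from the second link: only files that share a byte size get hashed, which saves a lot of time on large shares. The function name and the example path are placeholders; it assumes GNU find, xargs, and coreutils.

```shell
#!/bin/sh
# Sketch (untested under unRAID): list duplicate files by first grouping
# on file size, then hashing only the size-collision candidates.
# find_dupes_by_size DIR
find_dupes_by_size() {
    dir="$1"
    find "$dir" -not -empty -type f -printf '%s\n' \
      | sort -n | uniq -d \
      | xargs -I{} find "$dir" -type f -size {}c -print0 2>/dev/null \
      | xargs -0 -r md5sum \
      | sort \
      | uniq -w32 --all-repeated=separate   # group lines with identical md5
}

# Example (illustrative path): find_dupes_by_size /mnt/disk1
```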
Joe L. Posted March 16, 2010
(quoting unraided's reply above) It works, but it uses "perl", which does not exist on unRAID.
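Since perl is out, here is an untested sketch aimed at the original question: list duplicates but stop after a given number of duplicate groups, so they can be fixed a batch at a time. It uses only tools unRAID ships (find, md5sum, sort, awk); the function name, example path, and default limit of 10 groups are assumptions.

```shell
#!/bin/sh
# Sketch (untested under unRAID): print duplicate files grouped by md5,
# stopping after MAXGROUPS duplicate groups. No perl required.
# find_dupes DIR MAXGROUPS
find_dupes() {
    find "$1" -type f -exec md5sum {} + 2>/dev/null \
      | sort \
      | awk -v max="${2:-10}" '
          { hash = substr($0, 1, 32) }      # md5 is the first 32 characters
          hash == prev {                    # same hash as the previous file
              if (!printed) { print prevline; groups++ }
              print
              printed = 1
          }
          hash != prev {
              if (groups >= max) exit       # stop after max duplicate groups
              printed = 0
          }
          { prev = hash; prevline = $0 }
      '
}

# Example (illustrative path): find_dupes /mnt/disk1 5
```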