For some reason (this happened before I started working on the project), my client's website has two duplicates of every single file, effectively tripling the size of the site.
The files look like this:
wp-comments-post.php | 3,982 bytes
wp-comments-post (john smith's conflicted copy 2012-01-12).php | 3,982 bytes
wp-comments-post (JohnSmith's conflicted copy 2012-01-14).php | 3,982 bytes
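Those names follow Dropbox's "conflicted copy" naming pattern, so the duplicates could also be identified by name alone. A minimal sketch in Python (the regex and function name are mine, not part of any tool mentioned here):

```python
import re

# Dropbox names conflicted copies like:
#   "name (Owner's conflicted copy YYYY-MM-DD).ext"
CONFLICT_RE = re.compile(
    r".+ \(.+'s conflicted copy \d{4}-\d{2}-\d{2}\)(\.[^.]+)?$"
)

def is_conflicted_copy(filename):
    """Return True if the filename matches Dropbox's conflicted-copy pattern."""
    return CONFLICT_RE.fullmatch(filename) is not None
```

Matching by name is faster than comparing contents, but it trusts that every "conflicted copy" really is a byte-for-byte duplicate, which is worth verifying first.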
The host the website is on provides no bash or SSH access.
In your opinion, what would be the easiest and fastest way to delete these duplicate files?
Best Answer
I wrote a duplicate-finder script in PowerShell using the WinSCP .NET assembly.
An up-to-date and enhanced version of this script is now available as the WinSCP extension
Find duplicate files in SFTP/FTP server.
The script first walks the remote directory tree looking for files of the same size. When it finds any, by default it downloads the files and compares them locally.
If you know that the server supports a protocol extension for calculating checksums, you can improve the script's efficiency by adding the -remoteChecksumAlg switch, which makes the script ask the server for the checksums, sparing the file downloads.
(I'm the author of WinSCP)
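The two-pass idea the answer describes (group files by size first, then confirm matches by comparing content) can be sketched in plain Python over a local mirror of the site. This is an illustration of the approach, not the WinSCP script itself:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    """Return lists of files under `root` with identical content.

    Pass 1 groups files by size (cheap metadata check); pass 2 hashes
    only files whose size collides (expensive content check), mirroring
    the script's download-and-compare step.
    """
    by_size = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            by_size[path.stat().st_size].append(path)

    by_hash = defaultdict(list)
    for paths in by_size.values():
        if len(paths) < 2:
            continue  # a unique size cannot be a duplicate
        for path in paths:
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)

    return [paths for paths in by_hash.values() if len(paths) > 1]
```

The size pre-filter is what keeps this practical: content is only ever read for files that already share a size with another file.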