R – Synchronizing Files Across Multiple Servers


In a farm environment, is there a preferred technique for keeping files created/updated/deleted on one server in sync with all of the other servers in the farm? For example, if a file is created by a user on Server A and that file is requested by a user on Server B, what is the best way to ensure that the file is accessible on both servers simultaneously? Does the same answer apply to many (1,000+) servers in a farm?

Although my particular question applies mainly to Windows servers, I would prefer a technique where the platform makes little to no difference.

Best Answer

You could use Distributed File System (DFS), which is built into the Server OS. I've done this to accomplish a similar goal.

Essentially, you configure DFS to create a root, which is really just a virtual UNC path. You might create \\DOMAIN\SHARE, which looks like an ordinary share even though it is virtual; DFS leverages the domain's DNS and Active Directory to present it as a valid location. Within the root, you create links, which are just paths to physical file shares on any number of servers; these are the equivalent of subdirectories under your root. Finally, for each link, you can create multiple targets. In your example, each target would be a share on one of the machines. DFS then replicates the files in those shares across all paths listed as targets, using the File Replication Service.
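To make the root/link/target structure concrete, here is a minimal Python sketch of the mapping DFS maintains. This is not DFS itself, just an illustration; the server, share, and link names are made up for the example.

```python
# Toy model of a DFS namespace: one virtual root, links as virtual
# subdirectories, and each link backed by one or more physical targets.
# All names below are hypothetical.

DFS_NAMESPACE = {
    "root": r"\\DOMAIN\SHARE",
    "links": {
        # link name -> list of replicated targets
        "uploads": [r"\\ServerA\uploads", r"\\ServerB\uploads"],
        "reports": [r"\\ServerA\reports", r"\\ServerB\reports"],
    },
}

def resolve(virtual_path: str) -> list[str]:
    """Return the physical targets behind a virtual DFS-style path."""
    root = DFS_NAMESPACE["root"]
    if not virtual_path.lower().startswith(root.lower()):
        raise ValueError(f"{virtual_path} is not under {root}")
    remainder = virtual_path[len(root):].lstrip("\\")
    link, _, rest = remainder.partition("\\")
    targets = DFS_NAMESPACE["links"].get(link)
    if targets is None:
        raise KeyError(f"no DFS link named {link!r}")
    # A client is referred to one of these; replication keeps them in sync.
    return [t + ("\\" + rest if rest else "") for t in targets]

if __name__ == "__main__":
    print(resolve(r"\\DOMAIN\SHARE\uploads\report.pdf"))
    # ['\\\\ServerA\\uploads\\report.pdf', '\\\\ServerB\\uploads\\report.pdf']
```

The point of the sketch is that clients only ever see the constant virtual path under the root, while the targets behind each link can live on, and be replicated across, any number of servers.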

It works very well for the two servers I have it spanning, but I don't know how well it would scale when replicating to 1,000+ servers. It's an enterprise-level solution, though I'm not sure that many machines would be administratively viable. That said, because the namespace spans machines, you probably wouldn't need to replicate to every server at that scale; instead, treat DFS as a service and rely on the abstraction it provides: the path stays constant regardless of which server holds the files.

Other caveats: you have to have the File Replication Service installed. I think you'd also need a domain environment to really make this work.
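If you prefer to script the setup rather than click through the DFS management tools, the sketch below shells out to the DFS Namespaces (DFSN) and DFS Replication (DFSR) PowerShell cmdlets. Note this assumes a newer Windows Server where replication is handled by DFS Replication rather than the legacy File Replication Service mentioned above, and all server, share, folder, and group names are placeholders.

```python
# Sketch only: automate a two-server DFS namespace plus replication,
# assuming a domain environment and the DFSN/DFSR PowerShell modules.
# Every name and path below is a placeholder, not a real environment.
import subprocess

def ps(command: str) -> None:
    """Run a PowerShell command and fail loudly if it errors."""
    subprocess.run(["powershell", "-NoProfile", "-Command", command], check=True)

# 1. Create the domain-based root (the virtual \\DOMAIN\Files path).
ps(r'New-DfsnRoot -Path "\\DOMAIN\Files" -TargetPath "\\ServerA\Files" -Type DomainV2')

# 2. Create a link (virtual subdirectory) with targets on both servers.
ps(r'New-DfsnFolder -Path "\\DOMAIN\Files\Uploads" -TargetPath "\\ServerA\Uploads"')
ps(r'New-DfsnFolderTarget -Path "\\DOMAIN\Files\Uploads" -TargetPath "\\ServerB\Uploads"')

# 3. Replicate the two targets so both servers hold the same files.
ps('New-DfsReplicationGroup -GroupName "UploadsRG"')
ps('New-DfsReplicatedFolder -GroupName "UploadsRG" -FolderName "Uploads"')
ps('Add-DfsrMember -GroupName "UploadsRG" -ComputerName "ServerA","ServerB"')
ps('Add-DfsrConnection -GroupName "UploadsRG" -SourceComputerName "ServerA" -DestinationComputerName "ServerB"')
ps(r'Set-DfsrMembership -GroupName "UploadsRG" -FolderName "Uploads" -ComputerName "ServerA" -ContentPath "D:\Uploads" -PrimaryMember $true')
ps(r'Set-DfsrMembership -GroupName "UploadsRG" -FolderName "Uploads" -ComputerName "ServerB" -ContentPath "D:\Uploads"')
```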
