Best Practices for a Network File Share


So we have a file share that was started 10 or so years ago, and it started off with the best intentions. But now it's gotten bloated: there are files in there that nobody knows who put there, it's hard to find information, etc. You probably know the problem. So what I'm wondering is: what do people do in this situation? Does anyone know of a decent program that can go through a file share and find files that nobody has touched? Duplicate files? Any other suggestions on cleaning this mess up?
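For the duplicate-file part of the question, here's a rough sketch of the standard approach: group files by size first (files of different sizes can't be duplicates), then hash only the size collisions. The share path you'd pass in is whatever your mount point is; everything here is illustrative, not a specific product.

```python
#!/usr/bin/env python3
"""Rough duplicate-file finder: bucket files by size, then
hash only the size collisions to confirm real duplicates."""
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    # Pass 1: bucket every file by its size.
    by_size = defaultdict(list)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                by_size[os.path.getsize(path)].append(path)
            except OSError:
                pass  # skip unreadable files and broken links

    # Pass 2: hash only files whose size collides with another file.
    by_hash = defaultdict(list)
    for size, paths in by_size.items():
        if len(paths) < 2:
            continue  # unique size => cannot be a duplicate
        for path in paths:
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            by_hash[h.hexdigest()].append(path)

    # Keep only hashes shared by two or more files.
    return {k: v for k, v in by_hash.items() if len(v) > 1}
```

On a multi-terabyte share the size-bucketing pass is what makes this tractable, since only a small fraction of files ever need to be read and hashed.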


Well, the file share is Windows-based and it's about 3TB. Is there a utility out there that can do some reporting for me? We like the idea of being able to find anything older than 6 months and moving it to archive; the only problem is that with a file share this big, that could be really hard to do by hand.
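A minimal "not touched in 6 months" report can be built from file modification times. This is a sketch, not a specific tool: the share root and the CSV output path are placeholders, and it should be run from a machine that has the share mounted.

```python
#!/usr/bin/env python3
"""Walk a tree and report files whose last-modified time is older
than a cutoff, writing a CSV suitable for an archiving review."""
import csv
import os
import time

CUTOFF_DAYS = 180  # roughly six months

def stale_files(root, cutoff_days=CUTOFF_DAYS):
    """Yield (path, size_bytes, mtime) for files older than the cutoff."""
    cutoff = time.time() - cutoff_days * 86400
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # permissions problems, broken links, etc.
            if st.st_mtime < cutoff:
                yield path, st.st_size, st.st_mtime

def write_report(root, out_csv):
    with open(out_csv, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["path", "bytes", "last_modified"])
        for path, size, mtime in stale_files(root):
            stamp = time.strftime("%Y-%m-%d", time.localtime(mtime))
            w.writerow([path, size, stamp])
```

One caveat: `mtime` only tells you when a file was last written, not last read. Last-access times exist on NTFS but are often disabled for performance, so "untouched" reports like this usually mean "unmodified."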

Best Answer

Oftentimes we counsel Customers to "scorch the earth" and start fresh.

I have yet to see a good solution that works that doesn't have non-IT stakeholders involved. The best scenario I've seen yet is a Customer whose management identified "stewards" of various data areas and delegated control of the AD groups that govern access to those shared areas to those "stewards". That has worked really, really well, but has required some training on the part of the "stewards".

Here's what I know doesn't work:

  • Naming individual users in permissions. Use groups. Always. Every time. Without fail. Even if it's a group of one user, use a group. Job roles change, turnover happens.
  • Letting non-IT users alter permissions. You'll end up with "computer Vietnam" (the parties involved have "good" intentions, nobody can get out, and everybody loses).
  • Having too-grandiose ideas about permissions. "We want users to be able to write files here but not modify files they've already written", etc. Keep things simple.

Things that I've seen work (some well, others not-so-well):

  • Publish a "map" indicating where various data types are to be stored, typically by functional area. This is a good place to do interviews with various departments and learn how they use file shares.
  • Consider "back billing" for space usage or, at the very least, regularly publishing a "leader board" of the departmental space users.
  • Did I mention naming groups exclusively in permissions?
  • Develop a plan for data areas that "grow without bounds" to take old data "offline" or to "nearline" storage. If you allow data to grow forever it will, taking your backups with it to infinity.
  • Plan on some kind of trending for space usage and folder growth. You can use commercial tools (someone mentioned TreeSize Professional or SpaceObServer from JAM Software) or you can code up something reasonably effective yourself with a "du" program and some scripting "glue".
  • Segment file shares based on "SLA". You might consider having both a "business-critical" share that crosses departmental lines, and a "nice to have running but not critical" share. The idea is to keep the "business-critical" share segregated for backup/restore/maintenance purposes. Having to take down business to restore 2TB of files from backup, when all that was really needed to go about business was about 2GB of files, is a little silly (and I've seen it happen).
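The "du plus scripting glue" idea above can be sketched in a few lines: take a dated snapshot of per-folder totals on a schedule, append it to a CSV, and diff runs over time to see which departments are growing. The function and file names here are illustrative, assuming top-level folders map to departments.

```python
#!/usr/bin/env python3
"""Minimal "du"-style per-folder size snapshot for trend tracking:
append dated totals to a CSV, then compare runs over time."""
import csv
import os
from datetime import date

def folder_sizes(root, depth=1):
    """Total bytes under each directory up to `depth` levels below root."""
    totals = {}
    root = os.path.abspath(root)
    for dirpath, _dirs, files in os.walk(root):
        size = 0
        for name in files:
            try:
                size += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # skip unreadable files
        # Charge this directory's bytes to its depth-level ancestor.
        rel = os.path.relpath(dirpath, root)
        parts = [] if rel == "." else rel.split(os.sep)
        key = os.sep.join(parts[:depth]) or "."
        totals[key] = totals.get(key, 0) + size
    return totals

def append_snapshot(root, csv_path):
    """Append today's totals so the CSV accumulates a time series."""
    today = date.today().isoformat()
    with open(csv_path, "a", newline="") as f:
        w = csv.writer(f)
        for folder, total in sorted(folder_sizes(root).items()):
            w.writerow([today, folder, total])
```

Run `append_snapshot` from a nightly scheduled task and the CSV becomes the raw data for a growth chart or the "leader board" mentioned above.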