ChUIMonster
Guest
IMHO archiving, per se, is a fool's errand that is almost always a hold-over from the days when storage constraints were very different than they are today. You either need the data or you don't. Your data retention policies (which should be largely driven by legal requirements) should define how much data to keep and for how long. "All of it, forever" is hardly ever the right policy and can lead to serious legal problems, as well as the expense of having to search it or make it available in the event of legal action.

So you probably need a well-defined purge policy and process a lot more than you need an archive strategy. If you think you have performance problems due to volume of data, it is much more likely that you actually have performance problems due to poor indexing or poorly written queries.

In a few cases you could legitimately argue that the "working set" of data is distinct from what you would archive, and that some operations, like backup and restore, take substantially longer because you are carrying the "dead weight" of that old data. But you mention that you have workgroup databases, so I am skeptical that that situation applies here.

Aside from the above, one huge problem that always occurs when people archive old data is that they fail to keep the old code that knew how to operate on that data. They assume either that the code will never change or that the schema will never change, and then, very quickly, whatever code they are (probably not actually) using to access the archived data can no longer make sense of it. They find this out years down the road, when that unexpected legal demand for 10-year-old data is served.

In my experience, people who successfully "archive" old data generally do it by exporting the important bits to an external system (such as a data warehouse) and aggressively pruning the live working set.
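For what it's worth, a purge process in this spirit often amounts to nothing more than a scheduled batched delete driven by the retention cutoff. A minimal T-SQL sketch, assuming a hypothetical dbo.Orders table with an OrderDate column and an illustrative seven-year retention rule (the table, column, batch size, and retention period are all assumptions for the example, not anything from the post above):

```sql
-- Hypothetical purge job: delete everything past the retention window in
-- small batches so locks and transaction-log growth stay manageable.
-- Table, column, retention period, and batch size are illustrative only.
DECLARE @Cutoff datetime2 = DATEADD(YEAR, -7, SYSDATETIME());
DECLARE @Rows int = 1;

WHILE @Rows > 0
BEGIN
    DELETE TOP (5000)
    FROM dbo.Orders
    WHERE OrderDate < @Cutoff;

    -- @@ROWCOUNT is the number of rows the last DELETE removed;
    -- the loop ends once a pass deletes nothing.
    SET @Rows = @@ROWCOUNT;
END;
```

Run it from a SQL Server Agent job on whatever schedule the retention policy dictates; the point is that the cutoff falls straight out of the policy rather than out of storage pressure.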