
Think small to avert big disasters

Jon Toigo says there are several approaches to dealing with the prospect of the loss of a single critical file, and outlines three of them here.

There are several approaches to dealing with the prospect of the loss of a single critical file. Here are three:

Open file backup

First, you can guard against the possibility of a non-restorable copy of an open file by tricking the backup process. The folks at St. Bernard Software have become experts at this one. Load their package, appropriately called Open File, and whenever you launch a backup process, the product works behind the scenes to take a picture of all files in their quiesced (that is, not open) state. When the backup process goes to grab files for inclusion in the backup stream, it takes the quiesced data rather than any files that may happen to be open at that moment.

We have St. Bernard's product in our test labs right now, and we are impressed with its performance in connection with third-party tape backup and disk-to-disk-to-tape solutions like Breece Hill's iSTORa appliance. To be honest, we hadn't heard of Open File until relatively recently. That is largely because various iterations of the product have been embedded anonymously in products from Veritas and Legato. Brian Bonert, Director of OEM and Direct Sales, tells us that more than 300,000 licenses have been sold since 2001, albeit often under other branding.

The latest generation of the product waits for a three-second "write inactivity window" and takes a coherent data snapshot that is presented to data movers (backup software) while updates to files are being cached. That little bit of sleight of hand may be all you need to make sure that your files are retrievable when they are needed.
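The "write inactivity window" idea can be sketched in a few lines. This is not St. Bernard's implementation, just a simplified illustration: poll a file's modification time and copy it only once it has sat unwritten for the length of the window. The function name and parameters here are hypothetical.

```python
import shutil
import time
from pathlib import Path

def copy_when_quiet(src: Path, dst: Path, quiet_secs: float = 3.0,
                    poll: float = 0.5, timeout: float = 30.0) -> bool:
    """Copy src to dst only after no writes have hit it for quiet_secs.

    Polls the file's mtime; copies once the file has been stable for the
    whole window. Returns False if timeout expires before that happens.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        age = time.time() - src.stat().st_mtime
        if age >= quiet_secs:
            shutil.copy2(src, dst)  # metadata-preserving copy
            return True
        time.sleep(poll)
    return False
```

A real open-file agent works at the filesystem driver level and caches in-flight writes rather than polling, but the goal is the same: hand the data mover a copy taken in a moment of write quiescence.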

Incremental snapshots

A second approach to file protection comes from Columbia Data Products (CDP) and goes by the name Persistent Storage Manager (PSM). Like St. Bernard's offering, PSM is sold 85-90% through OEMs. Microsoft supplied it with its server operating systems and NAS software developer's tool kit until last year, when it released its own VSS Shadow Copy, which, according to CDP Vice President of Marketing Warren Miller, is in some respects less capable than CDP's product.

PSM is for folks who just can't afford to lose data. Essentially, the software creates a cache on the server or workstation where it is installed and takes snapshots of changed data on an ongoing basis per a user-defined schedule. Typically, users allocate up to 30% of their disk space for the snapshot cache. In operation, the product quiesces applications for a fraction of a second to create its snapshot.
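The core mechanic described above, snapshotting only changed data into a dedicated cache on a schedule, can be sketched as follows. This is a rough stand-in for what PSM does, not its actual design (PSM snapshots at the block level; this toy works per file). All names here are hypothetical, and changes are detected by content hash.

```python
import hashlib
import shutil
import time
from pathlib import Path

def take_snapshot(source: Path, cache: Path, index: dict) -> list:
    """Copy into the cache only files that changed since the last snapshot.

    index maps relative path -> last seen content hash; changed files land
    in a timestamped snapshot directory inside the cache. Returns the list
    of relative paths captured in this pass.
    """
    snap_dir = cache / time.strftime("%Y%m%d-%H%M%S")
    changed = []
    for f in source.rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(source)
        digest = hashlib.sha256(f.read_bytes()).hexdigest()
        if index.get(rel) != digest:  # new or modified since last pass
            dest = snap_dir / rel
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, dest)
            index[rel] = digest
            changed.append(rel)
    return changed
```

Run on a schedule, unchanged files cost nothing after the first pass, which is why a cache of roughly 30% of disk space can hold many snapshots' worth of history.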

Miller said Microsoft does the same thing, which it calls "freeze and thaw" but only with applications that are VSS compliant.

The PSM product frees up space more efficiently than does VSS, which requires the user to delete older images serially. These advantages explain why folks who use Server 2003 are still lining up to buy PSM to augment their data protection, Miller said.

He attributes the competitive advantage of his wares over Redmond's to a lengthier pedigree: "We started with a snapshot engine, but added more and more end user-focused functionality. You can restore overwritten and deleted files, and you can select a file and quickly obtain a list of other versions that are available."
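The "list of other versions" capability Miller describes is straightforward once snapshots are retained. As a rough illustration only (PSM's own mechanism is proprietary), assume snapshots live in timestamped directories under a cache; finding every stored version of a file is then a walk over those directories:

```python
from pathlib import Path

def list_versions(cache: Path, rel: str) -> list:
    """Return (snapshot_name, path) pairs for every stored copy of rel,
    newest first, assuming cache holds timestamped snapshot directories
    whose names sort chronologically (e.g. 20240102-000000)."""
    hits = []
    for snap in sorted(cache.iterdir(), reverse=True):
        candidate = snap / rel
        if candidate.is_file():
            hits.append((snap.name, candidate))
    return hits
```

Restoring an overwritten or deleted file is then just copying the chosen version back into place.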

Removable disk

A third approach to data protection that is rapidly gaining ground departs from the norm by using removable disk rather than tape. Announced just last week, Spectra Logic's RXT Portable Disk Technology is well on its way to setting a new standard. Imagine being able to use your tape library's robotics to handle removable disk too: we like this idea and the implementation that Spectra Logic has just announced.

We will delve more into the RXT philosophy in a later column. For now, all of the above products are worth a look.

Bottom line: the folks who gave us the contemporary file system were working within engineering constraints (limited disk capacity, slow processors, narrow bus bandwidth) that guided them to select a file data recording scheme that is self-destructive. Every time we save a file, we overwrite the last valid copy that existed. It is there, in the smallest of data sets, that the potential for real disaster lurks. We had best do something today to protect the single file.
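The self-destructive save described above is easy to defuse at the application level. As a minimal sketch (the function name is hypothetical): write the new data to a temporary file, shunt the last valid copy aside as a .bak, and only then rename the new data into place, so no step ever leaves you without a good copy.

```python
import os
from pathlib import Path

def safe_save(path: Path, data: bytes) -> None:
    """Save data without destroying the last valid copy: write a temp
    file, keep the previous version as path.bak, rename into place."""
    tmp = path.with_suffix(path.suffix + ".tmp")
    tmp.write_bytes(data)
    if path.exists():
        # Preserve the last valid copy before it can be overwritten.
        os.replace(path, path.with_suffix(path.suffix + ".bak"))
    os.replace(tmp, path)  # atomic on POSIX within one filesystem
```

The products surveyed above generalize this idea, keeping not just one prior copy but a schedule's worth of them.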

For more information:

Tip: No 'one-size-fits-all' for data protection

Tip: 10 data protection recommendations

Tip: Nine rules for better backups

About the author: Jon William Toigo has authored hundreds of articles on storage and technology, along with his monthly "Toigo's Take on Storage" expert column and backup/recovery feature. He is also a frequent site contributor on the subjects of storage management, disaster recovery and enterprise storage. Toigo has authored a number of storage books, including Disaster recovery planning: Preparing for the unthinkable, 3/e. For detailed information on the nine parts of a full-fledged DR plan, see Jon's web site.
