Adherents see cloud backup services as a lower-cost alternative to home-grown data protection strategies, an option that "empowers" users of PCs, laptops and smart clients responsible for safeguarding their own data, and a more convenient way to manage the protection of distributed data, whether located in branch offices or on mobile devices.
The exact number of cloud backup users today is unknown, and estimates of the current storage capacity of the many cloud backup providers in the market vary widely -- in part because it is easy to confuse them with simple online storage services such as Dropbox.
The potential drawbacks of online backup are generally well understood. They include:
- Slow speed with the first "full" backup; subsequent backups usually include only changed or new data, a substantially lower volume to copy. Typically, low-capacity Internet links are used to move the data (it may be useful to keep in mind that transferring 10 TB over a T-1 line, even at its full 1.544 Mbps rate, takes well over a year).
- Slow speed on data restore, made worse in situations where an interruption event has a broad geographical footprint (think hurricane) and may impact many customers. Bandwidth constraints become apparent when all consumers ask for their data back at the same time.
- Lack of control over data while in the hands of the service provider (accessibility and other service-level metrics are frequently compromised).
- Security concerns regarding data stored in a public service accessed by a public network.
- Lack of control over data protection, when it is left to the whims of users.
The successful and strategic use of cloud backup services requires that these issues be confronted in a sensible way and that the service be utilized as part of a broader defense-in-depth strategy, rather than as a one-size-fits-all panacea for data protection.
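The transfer-time arithmetic behind the first drawback is easy to check. A minimal sketch (the 10 TB and T-1 figures are from the discussion above; running at the full nominal line rate is an optimistic assumption -- real-world throughput is usually lower):

```python
# Rough backup transfer-time estimate at a link's full nominal rate.

def transfer_days(data_bytes: float, link_bits_per_sec: float) -> float:
    """Days needed to move data_bytes over a link of the given speed."""
    seconds = (data_bytes * 8) / link_bits_per_sec
    return seconds / 86_400

T1 = 1.544e6      # T-1 line: 1.544 Mbit/s
TEN_TB = 10e12    # 10 TB (decimal terabytes)

print(f"{transfer_days(TEN_TB, T1):.0f} days")   # roughly 600 days
```

The same function answers the restore-side question as well: plug in the repository size and your actual WAN bandwidth to see whether a full restore over the wire is even feasible.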
Using cloud backup services as part of "defense in depth"
Step 1: Know your data
Before backing up anything to a cloud, know what is being backed up. As much as 70% of the data currently occupying the hard disk of the average PC, notebook or small server consists of files that are rarely, if ever, accessed. These can be identified with storage resource management tools by running reports on file metadata such as DATE LAST ACCESSED/DATE LAST MODIFIED. On average, about 40% of the total (a bit more than half of the inert data) is important but archival -- meaning it needs to be retained, but it is never accessed. Segregating that data for less-frequent backup, and practicing common-sense data hygiene to minimize the remaining 30% (mostly duplicates and dreck), can go a long way toward reducing the amount of data to be backed up to a more manageable size. That is important, given the comparatively slow speeds of the networks connecting PCs and mobile computing devices to the Internet.
You may also want to do some analysis to see what security requirements might be associated with the data you are backing up. In addition to federal regulatory mandates in certain verticals (healthcare, finance and the like), many states also require encryption of any consumer data that moves outside the corporate premises, and recognizing this requirement is key to using cloud backup services correctly. While the vendor brochure may suggest that the service provider is merely an extension or augmentation of your existing IT environment, the truth is that the provider is an external agent and does not share the legal liability that your firm shoulders in terms of regulatory and legal requirements for data protection, preservation and privacy.
Step 2: Know your service provider
Cloud backup services have proliferated in the past couple of years, in part because of their appeal to individual consumers and to small businesses (small office/home office, or SOHO). Unfortunately, there is little oversight regarding vendors' claims about their facilities, operations or service levels.
It is important that users discover everything they can about a service provider: the physical location of the service, its technical capabilities and its quality of service from the perspective of current users. Many cloud backup services claim that they operate out of "tier one" data centers, that they have multiple physical sites and that they protect customer data by replicating it to additional locations. All of these claims need to be examined before signing up or entrusting data to the provider.
At a minimum, ensure that the vendor has some means -- via tape or portable disk -- to copy and return your data to you if you need it. Depending on how much data you have stored in the online repository, a full restoration across a WAN link within an acceptable timeframe may be impossible.
Step 3: Monitor, validate, test
Key to any data replication scheme is the ability to monitor copy operations, validate data copies and test data restore. It is one thing to "empower" the user to perform his or her own backups with a service, but it is another thing to believe that doing so gets IT or continuity planners "off the hook" for data protection. In most organizations, someone needs to be responsible for overseeing the process: for ensuring that the right data is being copied, and that user devices are properly configured to automate the process to the greatest possible degree. Beyond that, someone needs to validate that copies are being made successfully. Recent changes in the Windows operating system enable changed files to be copied to a remote drive or cloud, but these transfers are postponed automatically if the system is busy when transfers are scheduled. Oversight is required to ensure that important data is not sitting in a queue waiting to be replicated to a service target volume.
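That oversight can be partly automated: comparing file modification times against the timestamp of the last successful backup run flags data that is still waiting to be replicated. A minimal sketch; in practice `last_backup_ts` would come from the backup service's logs or dashboard, and here it is an illustrative placeholder:

```python
# Sketch: flag files modified since the last successful backup run.
import os
import time

def pending_since(root: str, last_backup_ts: float):
    """Yield paths of files modified after the last successful backup."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.stat(path).st_mtime > last_backup_ts:
                    yield path
            except OSError:
                continue  # skip files that vanished or are unreadable

# Illustrative check: anything changed since a backup 24 hours ago?
overdue = list(pending_since(".", time.time() - 86_400))
print(f"{len(overdue)} files changed since last backup")
```

A report like this, run on a schedule, gives the person responsible for oversight a concrete signal that data is sitting in the queue rather than safely copied.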
It is also critical to test the data restore mechanisms the service provider offers. Such a restore test should be performed at least once every month or two. Restore a disk, a folder and individual files to ensure their usability.
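A low-cost way to confirm that a restored copy is usable is to compare checksums of the original and restored files. A minimal sketch using only the standard library (the function names are illustrative):

```python
# Verify a restored file against its original by streaming SHA-256 hashes.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large files never load fully into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_restore(original: str, restored: str) -> bool:
    """True if the restored copy is byte-identical to the original."""
    return sha256_of(original) == sha256_of(restored)
```

Checksum equality proves byte-level integrity; for application data such as databases, it is still worth opening the restored files in the owning application to confirm they are usable.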
Cloud backup services, properly managed, can be a great benefit not only to the small shop, but also to larger firms with many branch offices, with a large mobile or work-at-home workforce, or with a need to host protected data assets at greater distances from primary IT environments than budgets for private replication schemes and costly long-haul WANs will permit. An enterprise-class service will provide a dashboard for monitoring and validation, and may offer the option to fail over directly to the hosted backup data should a disaster impact local storage.
Interestingly, tape is making new inroads into remote cloud data hosting, archive and backup. Offerings like FujiFilm's Permivault use tape media to store cloud backup data, whether as output from backup software or in raw file form, presented as a file share via the Linear Tape File System (LTFS). This is an important technology option to watch, since the economies of scale possible with tape improve the service provider's chances of long-term economic success.
About the author:
Jon Toigo is CEO and managing principal of Toigo Partners International.
This was first published in May 2013