Many aspects of selecting data backup tools have stayed consistent over the past few years. Enterprise data storage and backup administrators should always understand what they are backing up and how specific data needs to be protected before choosing a data backup solution. They also need to understand data security issues and the value of encryption.
However, several things have changed when considering data backup tools and applications in recent years. Vendors have started to combine more features into one product, and specific data backup features are emerging as must-haves in a storage solution. This article explores what's changing in data backup solutions and how to choose the right one for your data backup environment.
Changes in choosing data backup applications
Mike Karp, founder and principal analyst at Infrastructure Analytics, said the changes in choosing backup applications mainly boil down to the fact that customers want choices. Because many vendors have enhanced their product offerings, customers can now find a complete spectrum of products tailored to the specific kind of data backup they want. And instead of assembling those capabilities from multiple vendors, it is increasingly possible to get them all in one package, from one vendor. "More and more products have built-in choices like continuous data protection [CDP] and you can choose from a menu or at least have the option of buying an extra module," Karp said. Other features that have begun to appear as built-in backup options include centralized storage policy administration, messaging compliance to align corporate policies with regulatory requirements, and special modules to address the needs of remote offices, said Karp.
Features in data backup applications growing
To some extent, what buyers want may be changing, too. For instance, Karp said business expectations have made disk-to-disk (D2D) backups more crucial to buyers than in the past. Karp said CommVault offers D2D capabilities, and BakBone Software Inc., EMC Corp./Data Domain, IBM, Overland Storage, and many other vendors also offer D2D backup. "This is really important because D2D means you can recover faster and more reliably," said Karp.
Lauren Whitehouse, an analyst at Enterprise Strategy Group (ESG), agrees that data backup choices are growing. Her top "must have" data backup feature is data deduplication. Deduplication can be a flexible process: depending on the goal, technology and vendor, dedupe can occur inline at the source or post-process at the target, and vendors often offer one type of deduplication with their data backup software. In addition, Whitehouse said some backup vendors include deduplication features for free in their data backup software. These vendors include Asigra Inc. Cloud Backup (inline), CA ARCserve (inline), EMC's Avamar (inline) and IBM Tivoli Storage Manager (post-process).
Other vendors include dedupe as an option within their standard backup software for an additional fee. Examples include Acronis Backup and Recovery (inline), Atempo Inc. Time Navigator (inline), BakBone Software Inc. NetVault (post-process), CommVault Simpana (inline), Symantec Corp. Backup Exec (source or inline) and Symantec Veritas NetBackup (inline).
Whitehouse said implementing deduplication can potentially streamline the whole backup process, "particularly if it is done at the source system -- at the client -- since you are saving bandwidth in transmission as well as capacity on whatever type of disk storage you are using," she said. "It is great to have and probably long overdue," she added.
Whitehouse also said ease of use in dedupe products is generally very good. "In most cases, deduplication is simply a policy that you set and then it just 'happens' as a natural part of backup," she said. The system handles the management itself as it runs. Depending on the product, some offer a choice of where deduplication occurs (for example, at the client or at the target storage system). "I would expect that over time there will be more flexibility in that area. We are still at the 1.0 level for these applications," she said.
According to Whitehouse, one downside associated with data deduplication is the processing it entails. "For some customers, deduplication may require adoption of more powerful hardware systems. However, you will get a payback in terms of reducing total storage costs," she said.
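Conceptually, the space savings Whitehouse describes come from storing each unique block of data only once. The following sketch is hypothetical illustration code, not any vendor's implementation: it shows the core idea behind chunk-level inline deduplication, where data is split into fixed-size chunks, each chunk is identified by a cryptographic hash, and a chunk already present in the store is replaced by a reference.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks; real products often use variable-size chunking


class DedupStore:
    """Toy content-addressed store: each unique chunk is kept exactly once."""

    def __init__(self):
        self.chunks = {}       # hash -> chunk bytes
        self.raw_bytes = 0     # bytes seen before deduplication
        self.stored_bytes = 0  # bytes actually stored

    def backup(self, data: bytes) -> list:
        """Split data into chunks, store only unseen chunks, return a recipe."""
        recipe = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            self.raw_bytes += len(chunk)
            if digest not in self.chunks:   # new chunk: store it once
                self.chunks[digest] = chunk
                self.stored_bytes += len(chunk)
            recipe.append(digest)           # duplicates cost only a reference
        return recipe

    def restore(self, recipe: list) -> bytes:
        """Reassemble the original data from its recipe of chunk hashes."""
        return b"".join(self.chunks[d] for d in recipe)


store = DedupStore()
monday = b"A" * 8192 + b"B" * 4096
tuesday = b"A" * 8192 + b"C" * 4096  # mostly unchanged since Monday's backup
r1 = store.backup(monday)
r2 = store.backup(tuesday)
assert store.restore(r1) == monday and store.restore(r2) == tuesday
print(f"raw: {store.raw_bytes} B, stored: {store.stored_bytes} B")
```

Because Tuesday's backup repeats most of Monday's chunks, the store holds half the raw bytes, which is also why source-side (inline) dedup saves transmission bandwidth: only unseen chunks would need to cross the wire.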
Whitehouse said her other favorite innovation is the Symantec NetBackup OpenStorage API, an option for NetBackup that she said will soon be available for Symantec Backup Exec. Using OpenStorage, a vendor that offers data deduplication in a storage target device can certify its system with NetBackup. A customer purchasing the product with that option can then incorporate many kinds of disk storage seamlessly within one large system, treating all disk storage as a generic entity. It is a great step forward in usability and provides performance benefits, too, she said.
In fact, said Whitehouse, Data Domain was able to benchmark performance improvements of around 200% when using the OpenStorage API. "It is very cool; this puts Symantec in a leadership position. It's just too bad the rest of the backup industry hasn't followed suit."
Data backup application shopping list
Determining which data backup application is the correct fit for a company depends on a few factors, including environmental fit, scalability, backup window and recovery objectives, compliance and budget. The following is a list of considerations when choosing a data backup app:
- Environmental fit: According to Whitehouse, before a user chooses a backup application, they must ask themselves, "Does it support the applications, platforms, and hardware in my environment?", and, "Does the solution scale to meet my performance and capacity needs?" Whitehouse noted that organizations need to look closely at their environment and understand what systems and applications require protection, starting with documentation of the underlying hardware (chipset, number of processors), OS (type, version), hypervisor (type, version), and application (type, version) specifications.
- Scalability: Organizations need to determine how much data has to be backed up and within what timeframe, said Whitehouse. Then they can calculate the performance requirements, such as the size of the backup window, the amount of data to be protected and the throughput of the backup application and media. "They also need to review growth rates and retention policies to determine how the environment will change over some period of time. This will be especially important if the solution has capacity-based licensing and/or disk as the target storage," said Whitehouse. In addition, the use of data deduplication could be a determining factor in both performance and capacity.
- Backup window and recovery objectives: Based on a review of backup window requirements, organizations need to determine whether alternative backup methods are required to stay within the window. Specifically, are snapshot or continuous data protection (CDP) solutions, or equivalent features of backup applications, required? Organizations should also determine whether a specific type of media must be supported -- for example, some form of disk-based backup to accelerate data backup, she said. "The most aggressive recovery time objectives (RTOs) may require add-on options, such as bare-metal recovery (BMR), system-state recovery, or whole server recovery," said Whitehouse. "Recovery objectives will also dictate the type of media that should be used." The same considerations may factor into whether cloud storage is a viable resting place for data backups.
- Compliance: According to Whitehouse, if an organization has privacy or retention mandates, then reviewing a backup solution's compliance-related features makes sense. "Encryption of data in flight and at rest, and access control to backup data are top concerns regarding privacy, [while] for retention, you must determine if the backup solution offers flexibility in retention time, and whether it has immutability features to make backup data tamper-proof," said Whitehouse.
- Budget: Budget is an important topic for many companies since many of them are going to be constrained by it, said Whitehouse. "[But] for many companies, skimping on data protection will cost them in the long run because downtime and/or data loss can have huge implications on an organization's ability to conduct business," she said.
However, when it comes to making tradeoffs, "good enough" often wins. "I think where many organizations miss the boat is in only evaluating solution cost from a CAPEX standpoint," said Whitehouse. "A solution could be inexpensive to acquire, but could cost a lot more in operational expenses." For example, she noted, it is worthwhile to consider how much operator intervention will be required and whether or not there are automation features.
With so many things to consider, "It probably makes sense to do a three-year total cost of ownership (TCO) model for different solutions," she said. Furthermore, it makes sense to compare on-premises, licensed backup software with either backup Software-as-a-Service (SaaS) or a hybrid option (on-premises software with off-premises cloud storage) to see the feasibility of the various approaches. "Organizations that are budget-constrained may prefer the pay-as-you-go and pay-as-you-grow approach of a backup SaaS subscription to reduce upfront CAPEX expenses," said Whitehouse.
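The sizing and cost questions in the list above reduce to simple arithmetic. The sketch below uses entirely hypothetical figures for illustration: it derives the throughput needed to finish a full backup within a nightly window, then builds a rough three-year TCO comparison of an on-premises license (CAPEX plus annual OPEX) versus a per-TB backup SaaS subscription under assumed data growth.

```python
# All figures below are hypothetical, for illustration only.

# --- Backup window sizing -------------------------------------------------
data_tb = 20        # full backup size, in TB (decimal: 1 TB = 1,000,000 MB)
window_hours = 8    # nightly backup window
required_mb_s = data_tb * 1_000_000 / (window_hours * 3600)
print(f"Required throughput: {required_mb_s:.0f} MB/s")

# --- Simple 3-year TCO comparison -----------------------------------------
years = 3
onprem_license = 60_000        # upfront software + hardware (CAPEX)
onprem_opex_per_year = 15_000  # admin time, maintenance, media (OPEX)
saas_per_tb_month = 50         # assumed subscription price per TB per month
growth = 1.25                  # assumed 25% annual data growth

onprem_tco = onprem_license + onprem_opex_per_year * years
# SaaS cost grows with the protected data set each year (pay-as-you-grow).
saas_tco = sum(data_tb * growth**y * saas_per_tb_month * 12 for y in range(years))

print(f"On-prem 3-yr TCO: ${onprem_tco:,.0f}")
print(f"SaaS    3-yr TCO: ${saas_tco:,.0f}")
```

With these made-up inputs the SaaS route is cheaper over three years and avoids the upfront CAPEX, but the crossover point shifts quickly with data volume and growth rate, which is exactly why Whitehouse suggests modeling TCO rather than comparing acquisition prices alone.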
About this author:
Alan R. Earls is a Boston-area freelance writer focused on business and technology, particularly data storage.