However, introducing new backup software can be complex and disruptive to the enterprise. New software may need to interoperate with old backups while supporting a heterogeneous mix of backup storage platforms. Data reduction and encryption technologies are critically important in today's backups. Server virtualization, which can dramatically improve server utilization, also demands support, yet backup jobs can easily overtax virtualized servers. This section of our Backup Best Practices Guide offers a series of best practices that can help avoid common implementation mistakes and get the most from new backup software.
Shift away from tape technologies if possible
While tape storage remains a viable backup technology, the combined pressures of increased backup volumes, shorter recovery points, and poor media reliability are pushing tape out of the backup mainstream in favor of disk-based storage technologies. New backup software should support disk storage systems. The easiest transition is to select backup software that supports virtual tape libraries (VTLs), allowing the disk system to emulate tape. This embraces the speed and reliability of disk, while maintaining the backup policies and procedures established for a tape environment. Other organizations may opt for a more dramatic shift to other backup schemes such as snapshots, remote replication, continuous data protection (CDP), or other types of disk-based data protection -- each approach using specialized backup software.
Establish a plan to accommodate previous backups
Backups carry retention requirements, so any change to the backup software must involve careful consideration of the impact on previous backups. Ideally, the new software should support existing backup volumes, but that's often impractical when there's a significant change in backup types. For example, upgrading to a new version from the same software maker may extend support for older backups, but more significant changes, such as shifting from a tape to a disk architecture or moving from backup volumes to snapshots, may cause incompatibility with prior backups. When the new software does not support previous backups directly, you'll need to make some accommodations for the older media. Some organizations opt to maintain the existing backup server, software and storage platform while running the new software in tandem on a different server and storage system. Eventually the old backup system can be decommissioned when retention periods expire on the old backups.
Implement data reduction technologies wherever possible
Many enterprises are seeing data volumes increase at rates up to 60% per year, yet the demands of 24/7 operation are shrinking backup windows and recovery points. Data reduction technologies are essential tools for maintaining manageable backup windows, so any new backup software should include a suite of data reduction features. Conventional compression is still used, offering modest data reductions of up to 2-to-1. Deduplication saves only a single iteration of any unique file or block to disk, and can achieve effective reductions of up to 50-to-1 over time. Delta differencing saves only the changes made to data since the last backup operation, and can be accomplished in a small fraction of the space (and time) needed for a full backup.
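To illustrate how block-level deduplication achieves its savings, the sketch below (a simplified, hypothetical model, not any vendor's implementation) splits a backup stream into fixed-size chunks, hashes each chunk, and stores only one copy of each unique chunk. Commercial products typically use variable-size chunking and far more sophisticated indexing.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks; real products often use variable-size chunking

def deduplicate(data):
    """Split data into fixed-size chunks and store each unique chunk once.
    Returns the chunk store (keyed by SHA-256 digest) and the ordered
    recipe of digests needed to rebuild the original data."""
    store, recipe = {}, []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # keep only the first copy seen
        recipe.append(digest)
    return store, recipe

def restore(store, recipe):
    """Rebuild the original data from the chunk store and recipe."""
    return b"".join(store[d] for d in recipe)

# A backup stream full of repeated blocks deduplicates dramatically:
block = b"0123456789abcdef" * 256   # exactly one 4 KB chunk
backup = block * 9                  # nine identical chunks
store, recipe = deduplicate(backup)
stored_bytes = sum(len(c) for c in store.values())
print(f"raw: {len(backup)} bytes, stored: {stored_bytes} bytes")
```

Repeated data (unchanged files across nightly backups, for instance) collapses to a single stored chunk plus a list of references, which is why deduplication ratios climb as more backup cycles accumulate.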
Take the opportunity to optimize the backup set
While backups must be created and maintained, the data included in a backup set should be evaluated closely. Traditional "full backups" include every file, application and operating system. However, this approach is losing favor in the data center. As enterprises work to understand and classify data according to its relative business value, it's becoming apparent that not all data must be included in every backup. For example, data types like databases may be critical to every enterprise backup cycle, while MP3 and PPT files might have little (if any) value -- and can be excluded from the backup set entirely. Any backup software update or upgrade is a perfect opportunity to re-evaluate the value of data and optimize the backup sets accordingly.
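A minimal sketch of this kind of backup-set filtering is shown below. The exclusion list is purely illustrative; in practice it would come from a data-classification review, and most backup products express this as a policy rather than code.

```python
import os

# Hypothetical exclusion policy: extensions judged to have little
# business value in this environment (set per data-classification review).
EXCLUDED_EXTENSIONS = {".mp3", ".ppt"}

def build_backup_set(root):
    """Walk a directory tree and return the paths worth backing up,
    skipping files whose extensions appear on the exclusion list."""
    selected = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            _, ext = os.path.splitext(name)
            if ext.lower() not in EXCLUDED_EXTENSIONS:
                selected.append(os.path.join(dirpath, name))
    return selected
```

Even a simple filter like this can shrink the backup set substantially when low-value media files make up a large share of a file server's contents.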
Move toward granular restoration
Recovery processes can be just as challenging as backup creation. Traditionally, data has been "packaged" into a unique backup volume format which is then saved to media like tape or disk. When data loss occurs, the backup file must first be restored, and then the lost data can be recovered. It can be a cumbersome and time-consuming process that requires extensive intervention from the storage or backup administrator. Move toward backup types and software tools that support granular file recovery. For example, snapshots allow individual files to be recovered without first restoring an entire backup volume.
Use encryption to protect both local and remote backups
Compliance obligations also carry security responsibilities, so an enterprise must take steps to safeguard sensitive or "personally identifiable" data contained in the backup set. This usually involves the use of encryption in the backup software, though some organizations prefer a dedicated encryption appliance, deployed outside of the backup server or tape drives, that can encrypt backup streams. Encryption gained notoriety as a security measure for tape backups sent offsite, but security is just as important for local and remote disk-based backups.
Plan backup support for virtual environments
Backing up virtual machine content like VMDK (VMware's Virtual Machine Disk) or VHD (Microsoft's Virtual Hard Drive) files presents a unique challenge for backup tools -- the virtual machine must be quiesced so that no changes take place until the backup is complete. Snapshot technology can also copy/replicate a virtual machine file, but this often requires that the virtual server be shut down first. Note that backups do not need to include the entire virtual machine file; individual guest files can be backed up instead. In that case, however, the virtual server itself must be rebuilt before the file-level backup can be restored. Capturing the entire virtual machine image can greatly simplify the restoration process, especially during disaster recovery.
Rather than employ general-purpose backup software to protect virtual machines, many organizations are opting for vendor-specific virtualization backup tools such as VMware Consolidated Backup (VCB) or Microsoft's Virtual Machine Manager (VMM). These tools interface directly with their respective virtualization platform, and are designed to capture point-in-time snapshots of the entire VMDK, VHD or other virtual machine file. Virtual server backup tools can capture the entire virtual machine state to disk without the need to quiesce the virtual machine manually or take it offline.
Don't overtax virtual machines with backup tasks
When selecting new software for backups in a virtualized environment, pay particular attention to virtual server utilization. The goal of server virtualization is to improve the utilization of CPU cycles, memory, and I/O resources on the server by breaking the physical machine into multiple virtual machines. It is possible to have numerous virtual machines running on the same physical server. However, backups are resource-intensive tasks, so running backups on virtual machines may compromise performance. When planning backups on virtual machines, it's best practice to leave some server capacity unused to accommodate backups. It's also recommended to stagger the backup process so that only one virtual machine on a server is backed up at any given time.
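The staggering advice above can be sketched with a simple concurrency primitive. In this hypothetical model (not how any particular backup product schedules jobs), a semaphore of size one serializes the backup jobs of all virtual machines sharing a physical host, so no two backups overlap:

```python
import threading
import time

# One backup slot per physical host: a semaphore of size 1 ensures
# only one VM on this host is backed up at any given moment.
host_backup_slot = threading.Semaphore(1)

def backup_vm(vm_name, log):
    """Hypothetical per-VM backup job, serialized by the host's slot."""
    with host_backup_slot:
        log.append(f"start {vm_name}")
        time.sleep(0.01)  # stand-in for the real backup work
        log.append(f"end {vm_name}")

log = []
threads = [threading.Thread(target=backup_vm, args=(f"vm{i}", log))
           for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Because the jobs are serialized, every "start" entry is immediately
# followed by the same VM's "end" entry -- no two backups overlap.
print(log)
```

Raising the semaphore's size would allow a controlled degree of overlap, which maps to leaving a measured amount of spare host capacity for backup work.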
Cost is another consideration. Backup software is often licensed based on the number of machines that the software is installed on. Since a virtualized environment can have many more machines in operation, the total license costs can be much higher because the backup software will need to be installed on every virtual machine.
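To make the cost impact concrete, here is a back-of-the-envelope comparison with invented figures (the per-agent price and machine counts are purely illustrative; actual licensing models vary widely by vendor):

```python
# Hypothetical figures, purely for illustration.
license_cost = 500        # per installed backup agent
physical_servers = 10
vms_per_server = 8

# Physical environment: one agent per server.
physical_total = physical_servers * license_cost

# Virtualized environment: one agent per VM on every host.
virtual_total = physical_servers * vms_per_server * license_cost

print(physical_total, virtual_total)  # prints "5000 40000"
```

With per-machine licensing, consolidating onto the same hardware multiplies the license bill by the VM density, which is why per-host or per-socket licensing terms are worth negotiating for virtualized environments.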
This was first published in April 2008