Backups are typically performed during off hours (evenings and weekends), when few personnel are available, so backup software often emphasizes automation.
LAN-free and server-free backups
Traditional backups stage data onto a server and then push it out to a tape drive, tape library, virtual tape library or other storage target. This approach is inefficient: the backup server sits idle outside of the backup process, and moving backup data to the target demands so much network bandwidth that the LAN is almost unusable during the backup. Organizations are addressing this by systematically moving backups off the LAN and onto the SAN, so you should understand the difference between LAN-free and server-free backups.
LAN-free backups avoid the user network by employing the SAN directly. Data is read from a SAN disk, handled by an application server on the SAN and then passed directly to another SAN storage device. The Fibre Channel SAN infrastructure allows backup speeds of up to 100 megabytes per second (MBps). Only metadata about the backup passes across the LAN, so the impact on LAN bandwidth is minimal.

Server-free backups also avoid the LAN, but streamline the LAN-free process further by reducing the backup work performed by the application server -- ideally moving data directly between SAN devices. While the resulting backup throughput still tops out at about 100 MBps, the technique uses the Extended Copy command set (a set of SCSI commands not yet approved by the American National Standards Institute T10 committee) to minimize the CPU, RAM and I/O overhead on the application server.
Performance monitoring and reporting
Monitoring is an important part of the backup process -- it helps administrators understand how efficiently backups are executing in their particular environment. By quantifying the elements of backup performance, administrators can implement improvements to optimize or streamline the process. As one example, performance monitoring might reveal better backup throughput between 2:00 a.m. and 5:00 a.m., when network utilization is lowest; this, in turn, might justify a shift in the backup window. Similarly, low throughput from the backup server to the tape library may explain why excess "shoe shining" is inflating backup windows and reducing tape life. Performance results can highlight the need for network infrastructure upgrades or media changes.
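The time-window analysis above can be approximated from backup job records. A hedged sketch, assuming job data (start hour, megabytes moved, elapsed seconds) has been pulled from the backup software's logs -- the figures here are made up for illustration:

```python
from collections import defaultdict

# Hypothetical job records: (start hour, megabytes moved, elapsed seconds)
jobs = [
    (1, 180_000, 2400), (2, 300_000, 2500), (3, 310_000, 2450),
    (4, 290_000, 2600), (22, 90_000, 2000), (23, 120_000, 2300),
]

# Aggregate megabytes and seconds per starting hour, then report throughput
totals = defaultdict(lambda: [0, 0])  # hour -> [megabytes, seconds]
for hour, mb, secs in jobs:
    totals[hour][0] += mb
    totals[hour][1] += secs

for hour in sorted(totals):
    mb, secs = totals[hour]
    print(f"{hour:02d}:00  {mb / secs:6.1f} MBps")
```

A report like this makes the case for shifting the backup window concrete: the hours with the highest sustained MBps are the ones worth scheduling into.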
Backup software should also provide comprehensive and configurable reporting features. High-level reports help management follow overall backup statistics on a weekly or monthly basis, while low-level reporting can identify possible backup bottlenecks or media with frequent problems. Alerting is another vital feature of backup software, allowing notable events or status updates to be forwarded to the appropriate IT staff. For example, an alert can indicate that a backup process failed to run properly and requires immediate attention. Monitoring and reporting are sometimes implemented as standalone products separate from the backup software itself; Backup Advisor from EMC Corp. is one such product.
The IT perspective on backup is changing. Rather than performing backups simply for the sake of copying data, storage administrators are increasingly approaching backups from the standpoint of recoverability -- backed-up data is useless unless it can actually be recovered. This makes backup verification and testing features more important, and any backup software should include features that simplify backup testing.
Beyond verification, organizations must practice their recovery on a regular basis. In many cases, organizations perform recovery drills by deleting unneeded "test" files that are maintained on the server, and then using the backup software to recover those files.
General purchase considerations
Backup software must be selected for its feature set and suitability for your own particular environment. However, there are some common issues to consider:
Ease-of-use. Tools that are cumbersome or overly complicated will not be used to their best potential. IT staff should have the opportunity to test several prospective tools in a lab environment and provide feedback on the feature set and user interface. Advanced features may require a modicum of training, but basic features should demand little, if any, formal instruction.
Compatibility. It's important for software to support the current -- and possible future -- hardware in your environment. Compatibility may not be much of an issue in homogeneous environments, but heterogeneous environments with a variety of hardware can prove more problematic. Network Data Management Protocol (NDMP) offers an open protocol that supports backup tasks in heterogeneous network environments.
Specialized features. Backup software will typically transfer files to tape or disk storage. However, an increasing number of software tools support data protection features like archival backups (e.g., content-addressed storage), continuous data protection, snapshots, mirroring or replication. Select backup software that complements your backup emphasis. For example, a tool like EMC's RecoverPoint allows for frequent snapshots to disk, while Symantec Corp.'s NetBackup offers general-purpose tape/disk backup and restoration.
Application integration. If your goal is to support specific enterprise applications, consider the level of integration that the backup software provides for those applications. For example, EMC NetWorker software supports modules integrated with vendor-specific application programming interfaces, eliminating custom script development for applications like IBM Lotus Notes/Domino, Microsoft Exchange or Sybase.