
The role of NAS in data backup

The role of network-attached storage (NAS) devices is evolving from a data backup problem into a data backup solution for the enterprise. NAS vendors are now positioned to take on a much greater role in the backup infrastructure as they blend traditional NAS architectures with virtual tape library (VTL) and data deduplication functions. To set the stage, let's first look back at the last five years of industry progress.

Early adoption: NAS and traditional backup

The typical NAS deployment includes one or more NAS filers, which are backed up via traditional backup software (e.g., CommVault, EMC Corp. NetWorker, IBM Corp. Tivoli Storage Manager, Symantec Corp. Veritas NetBackup). The most basic approach uses a surrogate backup host running backup client software, which mounts NAS volumes via NFS/CIFS and copies files for backup over the network. Another approach is to leverage NDMP, which allows the backup software to orchestrate data movement directly from the NAS device to physical or virtual tape devices, skipping a step in the data movement process.
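
For illustration, here is a minimal sketch of the surrogate-host approach, assuming the NAS export is already mounted on the backup host; the paths and the simple timestamp-based change detection are assumptions for the example, not the behavior of any particular backup product. It also shows why this model is network-bound: every changed file travels over the network to the backup host.

    import os
    import shutil
    import time

    # Hypothetical paths for the example: the NAS export is assumed to be
    # mounted on the surrogate backup host (e.g., via NFS or CIFS).
    NAS_MOUNT = "/mnt/nas"
    BACKUP_TARGET = "/backup/nas_copy"
    STATE_FILE = "/var/lib/nas_backup/last_run"

    def last_run_time():
        """Return the timestamp of the previous backup run (0.0 if none)."""
        try:
            with open(STATE_FILE) as f:
                return float(f.read().strip())
        except (OSError, ValueError):
            return 0.0

    def backup_changed_files():
        """Copy files modified since the last run from the NAS mount."""
        cutoff = last_run_time()
        started = time.time()
        for root, _dirs, files in os.walk(NAS_MOUNT):
            for name in files:
                src = os.path.join(root, name)
                if os.path.getmtime(src) <= cutoff:
                    continue  # unchanged since the last backup
                rel = os.path.relpath(src, NAS_MOUNT)
                dst = os.path.join(BACKUP_TARGET, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)  # file data crosses the network here
        os.makedirs(os.path.dirname(STATE_FILE), exist_ok=True)
        with open(STATE_FILE, "w") as f:
            f.write(str(started))

    if __name__ == "__main__":
        backup_changed_files()

The issues listed below stem largely from this file-by-file data movement over the network.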

Issues with a typical NAS deployment include:

  • I/O bottlenecking, often at the network level
  • Long backup windows
  • Performance impacts to users
  • The number of tape drive mount points required for large NDMP backups

Benefits include:

  • Native version control by backup software
  • Centralized backup schedules and management

Full adoption: NAS and native copy technology

Most NAS architectures include a copy function that creates and maintains multiple copies of changed files, and does so efficiently by storing only block-level changes to files. This local copy function is typically complemented by replication and snapshot clones, and can provide another layer of protection for disaster recovery. Most of these implementations still include traditional backup technology, as customers rarely eliminate traditional backup from the NAS environment.
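
To see why block-level incremental copies use bandwidth and disk so efficiently, consider the rough sketch below, which compares a file against its previous version in fixed-size blocks and keeps only the blocks that changed. The 4 KB block size and the file-level comparison are simplifications for illustration; actual NAS copy and snapshot implementations track changed blocks inside the filer's own file system.

    import hashlib

    BLOCK_SIZE = 4096  # assumed block size for the example

    def read_blocks(path):
        """Yield a file's contents as fixed-size blocks."""
        with open(path, "rb") as f:
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                yield block

    def changed_blocks(current_path, previous_path):
        """Return (block_index, block_data) pairs that differ from the previous version."""
        previous = list(read_blocks(previous_path))
        changes = []
        for i, block in enumerate(read_blocks(current_path)):
            old = previous[i] if i < len(previous) else None
            if old is None or hashlib.sha1(block).digest() != hashlib.sha1(old).digest():
                changes.append((i, block))
        return changes

    # Only the changed blocks need to be stored for the new copy or replicated
    # offsite, which is why this approach moves far less data than a full backup.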

Issues include:

  • Limited visibility into NAS copy functions (monitoring and reporting capabilities are immature)
  • As environment grows, copy and replication scheduling may grow complex to manage
  • Limits on the number of NAS copies (and versions) that can be retained, which constrains archive and long-term retention
  • Deleted data (how long do your copies exist, and can you restore deleted data?)
  • Space consumption by "copies" of deleted data (because of policies to avoid the problem above)

Benefits include:

  • Eliminated bottlenecks associated with traditional data backup
  • Efficient use of bandwidth and disk because of the block-level incremental nature of copies
  • Easy to configure and deploy
  • Backups of data can still be made from replicated volumes at a remote site using NDMP
  • Reduced data movement for backup and offsite replication

The future: NAS with VTL interfaces and data deduplication

This is where the story gets interesting. Here's a brief market recap:

Data Domain had one of the first data deduplication products, which provided a native file-system mount point for backup software storage. As the market ripened for VTL solutions, Data Domain built VTL interfaces in addition to its NAS interfaces, and it has capitalized heavily on the small to midsized business (SMB) market with a simple, viable alternative to tape infrastructure for core and remote sites.

NetApp has recently made a major push into the backup market by incorporating native deduplication and VTL interfaces into its NearStore product line, which is specifically targeted at the backup storage infrastructure market. The NAS implementation provides a deduplicated disk storage device that can serve either as a target for regular backup images or as a traditional NAS implementation with the benefits of deduplication. The NearStore VTL implementation (with deduplication coming soon) provides a VTL solution for backup software, and again serves as a suitable target for backup images. This solution also includes an innovative capability to perform deduplication "inline" or "post-process" depending on workload conditions.
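
To make the deduplication idea concrete, here is a toy sketch of an inline, hash-based deduplicating store: incoming data is split into chunks, each chunk is fingerprinted, and only chunks not seen before are written. The fixed chunk size, SHA-256 fingerprints and in-memory index are assumptions for the example; shipping products differ in chunking strategy, indexing and whether they deduplicate inline or post-process.

    import hashlib

    CHUNK_SIZE = 8192  # assumed fixed chunk size; real products often use variable chunking

    class DedupStore:
        """Toy inline-deduplicating store: content-addressed chunks plus a recipe per object."""

        def __init__(self):
            self.chunks = {}    # fingerprint -> chunk bytes (unique data only)
            self.recipes = {}   # object name -> ordered list of fingerprints

        def write(self, name, data):
            fingerprints = []
            for i in range(0, len(data), CHUNK_SIZE):
                chunk = data[i:i + CHUNK_SIZE]
                fp = hashlib.sha256(chunk).hexdigest()
                if fp not in self.chunks:  # store new data only (inline dedup)
                    self.chunks[fp] = chunk
                fingerprints.append(fp)
            self.recipes[name] = fingerprints

        def read(self, name):
            return b"".join(self.chunks[fp] for fp in self.recipes[name])

    # Two backup images with mostly identical content share almost all chunks,
    # so the second image consumes only the space of its new or changed chunks.
    store = DedupStore()
    store.write("full_monday", b"A" * 50000)
    store.write("full_tuesday", b"A" * 50000 + b"new data")
    assert store.read("full_monday") == b"A" * 50000
    print(f"unique chunks stored: {len(store.chunks)}")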

NEC Corp. of America is coming to market with an innovative NAS architecture, HydraStor, which provides a grid-based, globally deduplicated storage solution for backup applications. NEC's architectural approach is unique: most competitors (VTL vendors included) have adopted an appliance-based system architecture, which, while simple to deploy, has created major design challenges around deduplication processing and performance.

Technology convergence usually results in increased competition and better products, which will ultimately be good for users but potentially bad for early adopters. The deduplication designs of NAS devices are evolving, and vendors have yet to prove which deduplication method, architecture and overall integration model (NAS vs. VTL) will become the future industry standard. In the meantime, keep a close eye on the NAS/deduplication leaders and newcomers, as the NAS-meets-backup story continues to evolve.

About the author: John Merryman is Services Director for Recovery Services at GlassHouse Technologies Inc., where he's responsible for service design and delivery worldwide. Merryman often serves as a subject matter expert in data protection, technology risk and information management-related matters, including speaking engagements and publications in leading industry forums.


This was first published in April 2008
