Management isn't the first concern that comes to mind for many end users of cloud data backup. One reason companies outsource their data backups to cloud backup service providers is to offload the burden of monitoring, maintaining and supporting the infrastructure. But tracking key metrics can be helpful, and management options are improving, depending on the cloud backup provider.
Managing cloud backup often starts with a Web-based graphical user interface (GUI) that provides a single-pane or dashboard view showing such information as which backups were successful, where the files are located and how much space was consumed. Users also generally get basic reporting capabilities and can keep tabs on billing and costs.
"Not unlike how you manage data storage in your on-premise environment, you've got to have some visibility into when you might have to add additional capacity to meet your needs," said Lauren Whitehouse, a senior analyst at Enterprise Strategy Group in Milford, Mass.
Walter Petruska, information security officer at the University of San Francisco, said he also sets user quotas for storage space, creates policies to govern the types and frequency of data collection and generates graphs and reports to show how individuals or groups are using the cloud service, when pre-set policies aren't met and how much money is being spent.
Petruska said the Web GUI of the university's EMC Corp. Mozy cloud service -- which backs up data from select departmental file servers and end users' personal computers, through MozyPro client software -- allows him to define a hierarchy of users and assign permissions at a granular level. He can, for instance, give help desk staffers the ability to view a user's backup history and increase his or her quota, if necessary.
"I only spend maybe 10 minutes a week working on things that get escalated to me," said Petruska. "If we were running it ourselves and it wasn't a service, there would have to be a dedicated high-level system administrator."
Petruska noted that he does not, however, permit the help desk to recover a user's data or destroy backups. End users can restore files on their own, through a Web interface or a tool, or via overnight mail on some form of media if they need to recover a particularly large amount of data. Petruska said he is able to track the files that were restored, the date, the IP address of the computer and the requester.
Rogers Towers P.A., a Jacksonville, Fla.-based law firm with more than 100 employees, relies on a server-based console to manage its backups and monitor its cloud usage, which stands at about 700 GB for active data and 100 GB for archive. Based on technology from Asigra Inc., the system copies data to local drives in the office and then synchronizes with the cloud, providing disaster recovery and "peace of mind" in the hurricane-prone region, according to Kevin Rorabaugh, the law firm's IT director.
Yotta280, the firm's local service provider, set up the system, including the backup schedule, the permission levels for administrators, and the log notifications. When a local backup is no longer needed, the IT staff can move it to archive or remove it, Rorabaugh said.
"The old fashioned way, I had a person who had to pop tapes out or cartridges in, doing backups at night," on a manual basis, said Rorabaugh. That individual had to be available to tell staffers from where to restore, whereas under the new system, multiple IT staffers have that capability, he added. "I didn't want us to have to rely on one individual to be responsible for knowing what the status of our data is."
ESG's Whitehouse said it's not uncommon for off-the-shelf data backup software, with its built-in management features, to enable a cloud service. Some systems synchronize byte for byte between the local machine and the cloud, while others allow users to set policies for what is stored locally and what is stored in the cloud, she said. Either way, on-site appliances can come in handy for recovery in the event of an outage.
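The local-versus-cloud placement policies Whitehouse describes can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual policy engine; the `placement` function and the 30-day window are assumptions chosen for the example.

```python
# Hypothetical sketch of a local-vs-cloud placement policy: files modified
# within a retention window are kept on the local appliance as well as in
# the cloud; older files are kept in the cloud only.
def placement(mtime, now, local_window_days=30):
    """Return 'local+cloud' for recently modified files, 'cloud-only' otherwise."""
    age_days = (now - mtime) / 86400  # seconds per day
    return "local+cloud" if age_days <= local_window_days else "cloud-only"
```

A byte-for-byte synchronization system, by contrast, would effectively apply "local+cloud" to everything, trading storage cost for faster local restores.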
Steve Brasen, a principal analyst at Enterprise Management Associates Inc. in Boulder, Colo., said users should look at two different types of tools to manage their cloud backup or cloud storage environments. The first type provides the ability to migrate data to and from the cloud. That includes traditional backup, migration and mirroring tools that have been extended to support the cloud, storage management software and specialized software for especially complex environments.
The second set of tools monitors the cloud environment to make sure it's secure and performing correctly. Brasen cited examples from Hyperic Inc., which does not focus solely on storage, and Symantec Corp., which last month announced integration of its Veritas Storage Foundation Basic management software with Amazon.com Inc.'s Elastic Compute Cloud (EC2) technology.
Users who hope to use their traditional backup applications in conjunction with a cloud service need to make sure the vendor supports it, since the application vendors must write to the differing application programming interfaces (APIs) of the cloud providers.
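The reason vendor support matters is that each cloud provider exposes a different API, so the backup application needs a separate adapter per provider. The sketch below is purely illustrative: `ProviderA`, `ProviderB` and their method names are invented stand-ins, not any real provider's API.

```python
# Hypothetical illustration: two cloud providers with differing upload APIs,
# and the per-provider adapters a backup application would have to write.

class ProviderA:
    """Invented provider whose API takes (bucket, key, data)."""
    def __init__(self):
        self.store = {}
    def put_object(self, bucket, key, data):
        self.store[(bucket, key)] = data

class ProviderB:
    """Invented provider whose API takes a single blob path and a payload."""
    def __init__(self):
        self.blobs = {}
    def write_blob(self, path, payload):
        self.blobs[path] = payload

class ProviderAAdapter:
    """Maps the backup app's uniform upload() onto ProviderA's API."""
    def __init__(self, client, bucket):
        self.client, self.bucket = client, bucket
    def upload(self, name, data):
        self.client.put_object(self.bucket, name, data)

class ProviderBAdapter:
    """Maps the same upload() call onto ProviderB's different API."""
    def __init__(self, client, container):
        self.client, self.container = client, container
    def upload(self, name, data):
        self.client.write_blob(f"{self.container}/{name}", data)
```

The backup application codes against `upload()` everywhere; adding a new cloud provider means writing one more adapter, which is exactly the work the application vendor must do before it can claim support.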
Another access methodology is through file-system emulators, in which a vendor makes the cloud storage look like a network-attached storage (NAS) system, added Adam Couture, a principal research analyst at Gartner Inc.
"The application sees this CIFS or NFS file system and just writes to it," Couture said. "It's really doing some emulation of the protocols and kind of spoofing the server to think it's something else. That way, you don't have to write to the API. You can do a backup to what the server thinks is a native file system."
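With the emulation approach Couture describes, the only cloud-specific detail is the mount point; the backup itself is ordinary file I/O against what the server believes is a local or NAS path. A minimal sketch, assuming a gateway has already presented the cloud storage at some mounted directory (the mount path here is hypothetical):

```python
import os
import shutil

# Minimal sketch: with a file-system gateway, "backing up to the cloud"
# is just copying to a mounted path that the OS treats as a native
# file system (e.g., a CIFS or NFS mount exposed by the gateway).
def backup_to_mount(source_file, mount_point):
    """Copy a file to the gateway's mount point using ordinary file I/O."""
    dest = os.path.join(mount_point, os.path.basename(source_file))
    shutil.copy2(source_file, dest)  # preserves timestamps and mode bits
    return dest
```

No provider API appears anywhere in the code, which is the point: the gateway translates file-system operations into the cloud provider's protocol behind the scenes.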
Some enterprise data storage industry analysts said they expect vendors to offer additional tools and capabilities to enable users to manage their cloud and non-cloud storage through the same console. Couture said tools from vendors such as EMC and Symantec could move further in that direction this year.
CA Inc., for instance, plans to improve centralized management in its ARCserve backup and recovery application to enable users to manage both their cloud and non-cloud environments, according to Don Kleinschnitz, senior vice president of software engineering for CA's recovery management business unit. He said CA is expanding its architecture to "provide the management, movement and data protection elements that people need to operate in the cloud."
"As we have looked at the whole cloud problem, we believe that this is more than just adding a feature into a current backup product to properly allow our customers to manage this environment," Kleinschnitz said. He added that CA intends to permit customers to use its technology with any cloud. Plans call for products to support the clouds that are most important to customers, such as Amazon and Rackspace Hosting Inc., he said.
One management option for private clouds is Symantec's approach with respect to its own online backup services. Sean Derrington, director of storage management and high availability at Symantec, said the vendor uses its File Store and Veritas Storage Foundation for Windows to manage the backup services. He noted that customers could use the same technology to manage private clouds.
Commenting on managing in-house and cloud backups, Brasen said, "The idea is to create a dynamic infrastructure file system so that you can migrate data back and forth from external clouds, internal clouds and regular server file systems and seamlessly do that. So, you can extend from one file system to the other and transfer data or mirror data or archive data to any of these resources. That's sort of the holy grail."