ORLANDO, Fla. -- A sales slump beginning in 2014 left Commvault in danger of becoming pigeonholed as a legacy backup vendor in an era of cloud and other evolving data protection technologies. Commvault adjusted by moving beyond its traditional market with a broader data management platform, and is now back on track with its financials. CEO Bob Hammer discussed the changes this week at the first Commvault Go user conference. In a talk with reporters, Hammer spoke about Commvault's Data Platform, how data should be protected in the cloud, and why he thinks Commvault is ahead of Veritas Technologies despite its rival's similar strategy.
Your keynote focused on Commvault's vision for its data management platform. What parts exist and what parts need to be built?
Hammer: Everything I talked about is in the market except for the business process automation and analytics. Most of the capabilities have been in the market a long time. We've just enhanced them. We have opened the platform up and made it easy to federate between the clouds over the last year.
Veritas recently came out with its version of a data management platform. How is it different from what Commvault is proposing?
Hammer: The difference is we have been building this now for 18 years. Veritas built their company through acquisitions and they are talking about building something like this in two years. You've got vision and you've got reality. Ours is reality. It is here. What Veritas' CEO was talking about is a vision. Our message is you don't have to wait for that vision. We already have it and it's well-matured and it's highly functional.
It's not an easy thing to do. You have to tie 20 or 30 things into a platform this big and make sure it all works. It took us 10 years to build an automated test environment to deliver something like this. It's not just the code. It's all the integrated testing work that needs to be done. These platforms now have massive amounts of open source code that has to be integrated and tested. Then you have to build the support structure and professional service structure to enable customers to architect these systems and support them. It's not so simple.
In your experience in the market, how many customers are still backing up to on-premises systems versus backing up on cloud systems?
Hammer: I would say 100% of the customers I'm having conversations with say they are doing something in the cloud or planning to do something with the cloud. There are a few areas where the customers are more tied to on-premises systems. You see that more in healthcare. You see some government agencies that want their own cloud. They don't want it outside their cloud. Now, AWS and Microsoft are building lockdown data centers for some of these government agencies to enable that to happen.
Veritas CEO Bill Coleman said that backups to the cloud are not true, traditional backups. They're snapshots. Is the creation of more traditional backups to the cloud the next area of development?
Hammer: The issue is moving massive amounts of data over the network, and you can't do it with the older techniques of streaming backup. It just does not work. So you have to do it with images or snapshots. But with snapshots, you are taking a picture and replicating it. If it gets corrupted, what I just replicated is a corrupted image. What you have to be able to do is marry replication and backup so we can move massive blocks of data and reassemble them into point-in-time file copies that will lock down. We started doing this several years ago.
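The distinction Hammer draws — a replicated snapshot silently propagates corruption, while a true backup copy can be validated on its own — can be illustrated with a minimal sketch. This is not Commvault's implementation; all names here are hypothetical. The idea is simply that each replicated block is paired with a checksum recorded at copy time, so the assembled point-in-time copy can later be verified independently of the source:

```python
import hashlib
from dataclasses import dataclass


@dataclass
class PointInTimeCopy:
    """A locked-down copy: replicated blocks plus recorded checksums."""
    timestamp: str
    blocks: dict      # block_id -> bytes
    checksums: dict   # block_id -> sha256 hex digest


def replicate_with_verification(source_blocks, timestamp):
    """Copy blocks and record a checksum for each, so the assembled
    point-in-time copy can be validated without the source -- unlike
    a bare snapshot replica, which is only as good as its source."""
    copy = PointInTimeCopy(timestamp=timestamp, blocks={}, checksums={})
    for block_id, data in source_blocks.items():
        copy.blocks[block_id] = data
        copy.checksums[block_id] = hashlib.sha256(data).hexdigest()
    return copy


def verify(copy):
    """Detect corruption introduced after replication: recompute each
    block's checksum and compare against the value recorded at copy time."""
    return all(
        hashlib.sha256(data).hexdigest() == copy.checksums[block_id]
        for block_id, data in copy.blocks.items()
    )
```

A plain snapshot has no equivalent of `verify`: if the source image was already corrupt, the replica is corrupt too, and nothing in the replica can tell you so.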
You talked about how dynamic indexing is a key piece of Commvault's holistic data management platform. How does that indexing help customers get value from their data?
Hammer: You have to be able to put an unlimited number of attributes in a data object. A simple index with a limited number of static attributes will just not work. You need a rich, broad, dynamic and scalable index to collect, modify, organize, access and easily serve up data objects. Your index needs to be able to index content information that resides in that data object. Image copies are dumb copies. They're only relevant to the source, assuming the original source exists and assuming it didn't get corrupted. You can't see inside image copies. Ultimately, they're not as usable because image copies lack the intelligence that a rich index brings. They cannot describe the data object in more than one way.
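The contrast Hammer describes — an open-ended index that can describe an object many ways versus an opaque image copy — can be sketched in a few lines. This is an illustrative toy, not Commvault's indexing technology; every name is an assumption. Each data object carries an arbitrary set of attributes plus extracted content terms, and can be retrieved by any combination of them:

```python
from collections import defaultdict


class DynamicIndex:
    """Toy dynamic index: objects carry an open-ended attribute set plus
    content terms, and are findable by any combination of them -- unlike
    an image copy, which cannot be queried at all."""

    def __init__(self):
        self.objects = {}                  # object_id -> attribute dict
        self.postings = defaultdict(set)   # (key, value) -> set of object ids

    def add(self, object_id, attributes, content_terms=()):
        """Register an object with arbitrary attributes; content that
        resides inside the object is indexed too, as terms."""
        self.objects[object_id] = dict(attributes)
        for key, value in attributes.items():
            self.postings[(key, value)].add(object_id)
        for term in content_terms:
            self.postings[("term", term)].add(object_id)

    def query(self, **criteria):
        """Return the ids matching every (attribute, value) pair given,
        i.e. the object described in more than one way at once."""
        sets = [self.postings[(k, v)] for k, v in criteria.items()]
        return set.intersection(*sets) if sets else set()
```

For example, `query(owner="finance", type="pdf")` narrows by two attributes at once, and `query(term="invoice")` finds objects by what's inside them — neither lookup is possible against a dumb image copy.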