A first look at data backup trends today can elicit comments like "boring" or "a necessary pain," but the reality is far from that. While it's true that traditional in-house backup had plateaued in innovation, becoming a backwater in the data center, the cloud is changing all of that. As a new vehicle for stored backups and a way to combine backup and disaster recovery, the public cloud offers a good deal of convenience, economic value and flexibility over disk- or tape-based offerings.
According to Gartner, approximately 10% of enterprises use the cloud as a backup target. This number may seem low, but within three years, it is expected to double. Archiving flourishes in the cloud, bringing us to one of 2017's bigger data backup trends: dropping traditional backup in favor of archiving in the cloud.
Backup and archiving today
To understand these data backup trends, we have to consider what backup and archiving are intended to achieve, coupled with the changes to the risk environment occurring today. Backup is intended to protect hot, active data from threats, such as hackers and system failures, and especially to allow recovery from a myriad of operator errors, such as erasing a file or directory. Archiving is the parking of cold, inactive data in the cheapest storage media available just in case it needs to be resurrected for a lawsuit or other reason.
One underlying trend in converging backup and archiving is that we are cataloging data more effectively, while at the same time the hot-cold model is becoming more of a spectrum than a binary choice. This has pushed us toward a one-stop-shop approach, using the same product to achieve both ends. An enabler for this is the rise of snapshot technology as a way to protect active data.
Snapshots essentially create perpetual storage. Old versions of data aren't erased; pointers are redirected to the changed data, and the old pointers can be recovered quickly to expose an earlier version of the storage pool. This is all well and good, but tape backups had a major advantage over disk-based and snapshot products: tapes could be sent to a salt mine, so a rootkit or ransomware attack had no chance of clobbering both primary and backup data.
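The pointer mechanics above can be illustrated with a toy copy-on-write store. This is a deliberately simplified sketch, not any vendor's implementation: the class and method names are invented for illustration, and real snapshot systems add reference counting, garbage collection and on-disk structures this omits.

```python
class SnapshotStore:
    """Toy copy-on-write store. A snapshot is just a frozen copy of the
    pointer table; data blocks are never overwritten, only superseded."""

    def __init__(self):
        self.blocks = {}      # block_id -> immutable data
        self.pointers = {}    # logical name -> current block_id
        self.snapshots = {}   # label -> frozen pointer table
        self._next_id = 0

    def write(self, name, data):
        # A write allocates a fresh block and redirects the pointer.
        # The old block stays intact, so earlier snapshots still reach it.
        block_id = self._next_id
        self._next_id += 1
        self.blocks[block_id] = data
        self.pointers[name] = block_id

    def snapshot(self, label):
        # Copying the pointer table is cheap: no data is duplicated.
        self.snapshots[label] = dict(self.pointers)

    def restore(self, label):
        # Recovery swaps the old pointers back in; again, no data moves.
        self.pointers = dict(self.snapshots[label])

    def read(self, name):
        return self.blocks[self.pointers[name]]


store = SnapshotStore()
store.write("ledger", "v1")
store.snapshot("before-update")
store.write("ledger", "v2")
store.restore("before-update")
print(store.read("ledger"))  # the earlier version is back: v1
```

Note that restore only touches the pointer table, which is why snapshot rollback is near-instant regardless of how much data changed.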
With care, a snapshot can be replicated in near-real time to the cloud, but it still shares the primary data's access path and so remains vulnerable: if primary data is destroyed, the replica snapshot goes with it. The answer is a continuous backup approach, which behaves like a snapshot but keeps the data encrypted and keyed differently from the primary copy.
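The separate-keying idea can be sketched in a few lines. This is a toy illustration only: the cipher below is SHA-256 run in counter mode, which is not production-grade cryptography (a real system would use an authenticated cipher such as AES-GCM), and the key names are invented. The point it demonstrates is custody: an attacker who owns the primary key still cannot read or silently alter the backup copy.

```python
import hashlib
import secrets

def keystream_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher (SHA-256 counter mode) used ONLY to
    illustrate independent keying. XOR is its own inverse, so this
    one function both encrypts and decrypts."""
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

# Two keys with separate custody: the primary key lives with the
# application; the backup key lives only with the backup service.
primary_key = secrets.token_bytes(32)
backup_key = secrets.token_bytes(32)

record = b"customer ledger, latest version"
primary_copy = keystream_cipher(primary_key, record)
backup_copy = keystream_cipher(backup_key, record)

# Ransomware holding primary_key gets gibberish from the backup...
assert keystream_cipher(primary_key, backup_copy) != record
# ...while the backup service can still restore cleanly.
assert keystream_cipher(backup_key, backup_copy) == record
```

The design choice being modeled is that the backup copy is an independent cryptographic object, not just a second pointer into the same key material and access path.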
Until native snapshot tools address the issue of an independent replica, backup remains very necessary, especially as successful ransomware attacks are currently doubling roughly every three years. This will limit the growth of self-protecting storage systems for a year or so, until their software fully addresses the problem.
A competitive data backup market ahead
The winners and losers in the backup game right now might surprise you. While the largest player remains Veritas, Veeam is growing fast and looks like a very serious challenger for the top spot in 2018 or 2019. Veeam has a current-generation code platform, and the resulting app looks fresh and is nimble both in use and in feature growth.
This is true of a bunch of new players, particularly Actifio, but a notable up-and-coming newbie is Rubrik, which offers a data management system based on extended metadata. All of the newbies are cloud-ready, a critical box-tick for any enterprise contemplating hybrid cloud structures for its IT future. Cost-effective licensing practices are likely as well, making these newcomers, as a group, the stars of 2017.
At the other end of the spectrum, Hewlett Packard Enterprise is exiting the backup business as it sells off a good part of its software portfolio to Micro Focus. It joins Arcserve and Unitrends in lagging behind in innovation in the cloud area.
We've covered some of the data backup trends of 2017, but what will 2018 bring? The public cloud is moving in on the data center. All of the big three cloud service providers are looking at the private segment of a hybrid cloud deployment to homogenize data access and instance deployment. Microsoft Azure is the furthest along and will deliver a private/public product in the first half of 2018. This is a major challenge for backup vendors, which need to quickly match what will surely be popular products and handle multiple segments of cloud seamlessly. Restore is likely the biggest challenge.
Another trend will develop out of the code bases of the newbies. With backup targets being accessible in the cloud, data analytics can go to town on both backup and archive data. Actifio is already on the road to supplying this type of service.
Analytics should also reach into billing optimization. With a complex portfolio of public cloud options, an automated approach is necessary to achieve lowest-cost storage in an environment complicated by short- and long-term contracts and the cost of moving data or closing archives. In concert, global deduplication approaches will proliferate, as will encryption solutions.
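Global deduplication, mentioned above, boils down to content-addressed storage: identical chunks arriving from any source are kept once, keyed by a cryptographic fingerprint. A minimal sketch, with invented class and method names and a tiny fixed chunk size (real systems use variable-length chunking and much larger chunks):

```python
import hashlib

class DedupStore:
    """Minimal content-addressed chunk store: identical chunks from
    any backup stream are stored once, keyed by SHA-256 fingerprint."""

    def __init__(self, chunk_size=4):
        self.chunk_size = chunk_size
        self.chunks = {}   # fingerprint -> chunk bytes

    def put(self, data: bytes):
        """Store data; return the list of fingerprints (the 'recipe')
        needed to reassemble it later."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            fp = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(fp, chunk)   # duplicate chunks are free
            recipe.append(fp)
        return recipe

    def get(self, recipe):
        return b"".join(self.chunks[fp] for fp in recipe)


store = DedupStore()
monday = store.put(b"abcdabcdwxyz")    # "abcd" chunk stored only once
tuesday = store.put(b"abcdabcdwxyz")   # second backup adds zero new chunks
print(len(store.chunks))               # unique chunks, not total chunks
```

When the dedup index is global across sites and tenants, the second, third and hundredth backup of mostly unchanged data cost almost nothing to store, which is the economics driving the trend.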