Tip

How to estimate your data deduplication ratio

What you will learn: This tip details the factors that influence the data deduplication ratio (the ratio of the amount of data before deduplication to the amount of data after deduplication), so you can estimate the deduplication ratio you can reasonably expect to achieve.
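As a quick illustration of that definition, here is a minimal sketch; the figures are assumed for the example, not measured values:

    # Deduplication ratio = data before deduplication / data after deduplication
    data_before_gb = 5000   # logical backup data protected, in GB (assumed figure)
    data_after_gb = 500     # physical data actually stored after deduplication (assumed figure)
    print(f"{data_before_gb / data_after_gb:.0f}:1")  # -> 10:1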


Every data deduplication vendor claims that its product offers a certain ratio of data reduction. However, the actual data deduplication ratio varies according to many factors, some of which are within a user's control. Below are the main variables.

Redundant data

The more redundant data you have on your servers, the higher the deduplication ratios you can expect to achieve. If you have primarily Windows servers with similar files and/or databases, you can reasonably expect higher deduplication ratios. If your servers run multiple operating systems with different files and databases, expect lower ratios.

Rate of data change

Data deduplication ratios are closely tied to how much the data changes. Each percentage-point increase in the data change rate lowers the ratio; the commonly cited 20:1 ratio assumes an average data change rate of approximately 5%.
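One simplified way to reason about this relationship -- an assumed model for illustration, not any vendor's formula -- is to treat each retained full backup as storing only the blocks that changed since the previous one. Under that assumption, the achievable ratio climbs toward 1 divided by the change rate, which is where the 20:1 figure for a 5% change rate comes from:

    # Minimal sketch of a simplified model (assumption: each retained full backup
    # stores only the blocks that changed since the previous one).
    def estimated_dedup_ratio(change_rate, num_fulls_retained):
        logical = num_fulls_retained                            # what the backups represent
        physical = 1 + (num_fulls_retained - 1) * change_rate   # what is actually stored
        return logical / physical

    # At a 5% change rate the ratio climbs toward 1 / 0.05 = 20:1
    print(round(estimated_dedup_ratio(0.05, 20), 1))    # ~10.3:1 after 20 fulls
    print(round(estimated_dedup_ratio(0.05, 200), 1))   # ~18.3:1, approaching 20:1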

Precompressed data

Data compression is a key component in every vendor's data-reduction algorithm. Vendors base their advertised data-reduction ratios on the premise that compression will shrink already deduplicated data by a further 2:1. In a case where deduplication alone achieves 15:1, compression could take the overall ratio as high as 30:1. However, users with large amounts of data stored in precompressed formats such as JPEG, MPEG or ZIP aren't likely to realize the extra bump that compression provides.
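The arithmetic behind that example, treating compression as a simple multiplier on top of the deduplication ratio (the 15:1 and 2:1 figures are the illustrative numbers above, not guarantees):

    # Compression modeled as a multiplier on top of the deduplication ratio
    dedup_ratio = 15           # deduplication alone, from the example above
    compression_factor = 2     # assumed 2:1 compression of already-deduplicated data
    print(dedup_ratio * compression_factor)   # 30 -> an overall 30:1 reduction
    # With mostly precompressed data (JPEG, MPEG, ZIP), expect little or no extra factor:
    print(dedup_ratio * 1)                    # 15 -> roughly the deduplication ratio alone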

Data retention period

The length of time data is retained affects the data-reduction ratio. For example, to achieve a data-reduction ratio of 10:1 to 30:1, you may need to retain and deduplicate a single data set over a period of 20 weeks. If you don't have the capacity to store data for that long, the data-reduction ratio will be lower.
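Using the same simplified model sketched earlier (weekly full backups at an assumed ~5% change rate), you can see how retention length alone moves the ratio:

    # Same simplified model: ratio = fulls_retained / (1 + (fulls_retained - 1) * change_rate)
    change_rate = 0.05   # assumed ~5% weekly change
    for weeks in (4, 12, 20, 52):
        ratio = weeks / (1 + (weeks - 1) * change_rate)
        print(f"{weeks:>2} weekly fulls retained -> ~{ratio:.1f}:1")
    # 4 -> ~3.5:1, 12 -> ~7.7:1, 20 -> ~10.3:1, 52 -> ~14.6:1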

Frequency of full backups

Full backups give data deduplication software a more comprehensive and granular view into the backup data. The more frequently full backups occur, the higher the level of data deduplication you'll achieve. Deduplicating backup software products have a slight edge over disk libraries here because they run a full server scan every time they execute a server backup, even though they only back up new files or changes to existing files. In between full backups, disk libraries usually receive only the changes sent as part of the backup software's daily incrementals or differentials.
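To see why full-backup frequency matters, compare a weekly-full schedule with a daily-full schedule over the same 20-week window under the same simplified model. Both schedules and change rates here are illustrative assumptions; the point is that the daily schedule ingests far more redundant data, so the measured ratio is much higher even though the physical storage consumed is about the same:

    # Same 20-week window and the same total change, different full-backup frequency.
    # All figures are illustrative assumptions, not measured values.
    weekly_ratio = 20 / (1 + 19 * 0.05)            # 20 weekly fulls, ~5% change per week
    daily_ratio = 140 / (1 + 139 * (0.05 / 7))     # 140 daily fulls, change spread per day
    print(f"weekly fulls: ~{weekly_ratio:.1f}:1, daily fulls: ~{daily_ratio:.1f}:1")
    # The daily schedule's higher ratio reflects more redundant data ingested,
    # not less physical storage used.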

Check out the complete text of the Storage magazine article, Catching up with deduplication.

Jerome M. Wendt is a storage analyst specializing in open-systems storage and SANs.
