It is tempting to say that there is never a good reason to disable the dedupe process. However, there are two situations that may warrant disabling data deduplication features.
One such situation involves multi-tier deduplication, which refers to deduplicating the same data multiple times using different deduplication methods to achieve the highest possible dedupe ratio.
Global deduplication products are a good example of multi-tier deduplication. Let's say you need to back up 10 different servers running the same operating system. There is a lot of redundancy across these servers because they all have the same system files. A local, block-level dedupe process would eliminate redundancy within a server's file system. Without a second deduplication pass, however, the backup media would still contain a high degree of redundant data because the first deduplication pass only eliminates redundant data on a per-server basis. Cross-server redundancy still exists. A second deduplication pass at the backup target level would eliminate the remaining redundancy.
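The two-pass scheme described above can be sketched in a few lines. This is a simplified illustration, not a real backup product: the servers, chunk contents, and the `dedupe` helper are all hypothetical, and real systems chunk data and index hashes far more efficiently.

```python
import hashlib

def dedupe(chunks, store):
    """Store only chunks whose SHA-256 hash is not already in the store."""
    written = 0
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = chunk
            written += 1
    return written

# Hypothetical servers sharing identical operating system chunks.
system_chunks = [b"kernel", b"libc", b"drivers"]
servers = {
    "srv1": system_chunks + [b"app-data-1"],
    "srv2": system_chunks + [b"app-data-2"],
    "srv3": system_chunks + [b"app-data-3"],
}

# First pass: local dedupe with a separate store per server.
# Redundancy within each server is removed, but every server still
# stores its own copy of the shared system chunks.
local_total = 0
for chunks in servers.values():
    local_total += dedupe(chunks, {})

# Second pass: global dedupe at the backup target with one shared store.
# The cross-server copies of the system chunks collapse to one each.
global_store = {}
for chunks in servers.values():
    dedupe(chunks, global_store)

print(local_total)        # 12 chunks survive local-only dedupe
print(len(global_store))  # 6 unique chunks after the global pass
```

The gap between the two counts (12 versus 6) is exactly the cross-server redundancy that only the second, target-level pass can eliminate.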
This type of multistep deduplication is perfectly acceptable. However, data deduplication has become so commonplace that you could have multiple products attempting to deduplicate the same data using an identical dedupe process. Performing a local, block-level deduplication on a per-server basis at the file system level and then running the same process again at the storage level would find little or no additional redundancy and would only add overhead. At worst, it could result in data corruption.
Another possible reason to disable the dedupe process is that the associated overhead could impact system performance. If the deduplication process allows system performance to fall to an unacceptable level, then it is time to disable data deduplication.