Jim Simon, Senior Director of Marketing, APJ for Quantum
Data&StorageAsean: What is the difference between deduplication and other Data Reduction technologies such as compression?
Jim Simon: Deduplication, particularly variable deduplication, can find redundancies at the block level, achieving data reduction as high as 90%. The actual reduction depends, of course, on the nature of the data itself. Quantum’s variable deduplication algorithm is patented; compression utilities use different, less efficient algorithms.
Data&StorageAsean: Why do the use cases we see for deduplication seem to be limited to backup appliances and all-flash arrays?
Jim Simon: Backup appliances often have redundant data because backups are made of files that are similar but not identical. For example, making even the smallest change to a presentation file results in an entirely new file as far as a typical backup application is concerned. Deduplication, however, looks at the data at the block level, recognizes that nearly all of the blocks are identical between the two versions, and backs up only the changed blocks. Multiply this across an organization and the potential savings are immense.
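To make the block-level idea concrete, here is a minimal Python sketch of fixed-block deduplication (an illustration of the general technique, not Quantum's patented algorithm): two versions of a file that differ in only one block share all of their other blocks in the store.

```python
import hashlib
import random

BLOCK = 4096  # illustrative block size; real appliances vary


def fingerprints(data: bytes) -> list[bytes]:
    """Split data into fixed-size blocks and fingerprint each one."""
    return [hashlib.sha256(data[i:i + BLOCK]).digest()
            for i in range(0, len(data), BLOCK)]


# Two "backups" of the same file: version 2 edits only the last block.
random.seed(0)
v1 = bytes(random.randrange(256) for _ in range(10 * BLOCK))
v2 = v1[:9 * BLOCK] + bytes(random.randrange(256) for _ in range(BLOCK))

store = set()  # the appliance keeps one copy per unique fingerprint
for version in (v1, v2):
    store.update(fingerprints(version))

# 20 blocks were "backed up", but only 11 unique blocks are stored:
# v1's 10 blocks plus the single block that changed in v2.
print(len(store))
```

Instead of storing both files in full, the appliance stores each unique block once and keeps per-file lists of fingerprints pointing into the store.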
Data&StorageAsean: Are there different approaches to deduplication and, if so, what are the benefits and downsides of each?
Jim Simon: The answer to this can challenge even the greatest mathematical minds! Probably the two main areas for consideration are variable versus fixed deduplication, and the size of the data sets being examined.
Variable deduplication, as its name suggests, adjusts the size of the blocks being deduplicated on the fly, which makes it the most efficient approach. Fixed deduplication compares blocks of a fixed size, so duplicate data that falls out of alignment with those fixed block boundaries may go undetected.
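The alignment problem can be illustrated with another small Python sketch, again only a toy version of the general technique (content-defined chunking), not Quantum's patented algorithm. Inserting a few bytes at the front of a file shifts every fixed-size block, while variable-size chunks whose boundaries are derived from the data itself quickly realign:

```python
import hashlib
import random


def fixed_blocks(data: bytes, size: int = 256) -> list[bytes]:
    """Cut the data at fixed offsets, regardless of content."""
    return [data[i:i + size] for i in range(0, len(data), size)]


def variable_chunks(data: bytes, window: int = 16, mask: int = 0xFF) -> list[bytes]:
    """Content-defined chunking: cut wherever the fingerprint of the
    trailing window matches a pattern, so boundaries follow the data
    (average chunk size here is roughly 256 bytes)."""
    chunks, start = [], 0
    for i in range(window, len(data)):
        fp = int.from_bytes(hashlib.sha256(data[i - window:i]).digest()[:2], "big")
        if fp & mask == 0:  # ~1-in-256 positions become a boundary
            chunks.append(data[start:i])
            start = i
    chunks.append(data[start:])
    return chunks


random.seed(1)
original = bytes(random.randrange(256) for _ in range(20_000))
edited = b"edit!" + original  # insert 5 bytes at the front

shared_fixed = len(set(fixed_blocks(original)) & set(fixed_blocks(edited)))
shared_variable = len(set(variable_chunks(original)) & set(variable_chunks(edited)))
# Almost no fixed blocks match after the shift; most variable chunks still do.
print(shared_fixed, shared_variable)
```

Because each boundary depends only on the bytes in a small trailing window, the insertion disturbs at most the first chunk; everything after the first shared boundary is byte-identical and deduplicates as before.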
The larger the data set that a deduplication engine can examine, the greater the potential for identifying data to be deduplicated. To oversimplify, if there are identical files on two different servers, the best disk-based backup appliance will be able to recognize the duplication and store just one copy. “Islands” of backup appliances can only examine their own data, so their comparative data sets are limited.
Data&StorageAsean: Is deduplication technology relevant as companies virtualise and cloud-enable?
Jim Simon: Absolutely!
From the perspective of Quantum’s DXi backup appliances, virtualized servers work just as well as traditional servers. In fact, our DXi-series is optimized to work with Veeam software, an industry standard for virtualized environments. Many customers do not want to risk sending their data to the cloud; for them, onsite deduplication is critical.
For those who do want to leverage the cloud, Quantum’s DXi-series of backup appliances is cloud-enabled. We can deduplicate data on premises and auto-store the backup in the cloud as a service to our customers. Even better is to deduplicate on site, store the deduplicated backup on site, and automatically send a second copy to the cloud as a service. This follows the 3-2-1 best practice (3 copies of data, on 2 different media, 1 of which is offsite for disaster recovery).
Data&StorageAsean: Are there any unique features you would like to share about your own deduplication offerings?
Jim Simon: Quantum is the pioneer in deduplication, having been awarded key patents nearly two decades ago. Our DXi-series (DXi-V, 4700, 6900, 6900-S) meets the needs of SMBs up to the most demanding enterprise and government environments. Our experience enables us to provide the highest performance in the smallest footprint, often also at the lowest cost per TB stored. Whether our customers choose us because they prefer best-in-class solutions or best-value solutions, they can be confident of our pre-sales, installation, integration, and post-sales support. Our website, www.quantum.com, has not only product information but also a number of public customer success stories (i.e., case studies).