Our test bed consisted of two servers connected to a QLogic SANbox 5200 Fibre Channel switch and to a Nexsan SATABlade storage system. All the Fibre Channel components were rated at 2Gbps. Each virtual tape library (VTL) system was connected to the switch, and connectivity to the servers was verified.
The SATABlade was configured with two 1.5TB partitions, each containing 1TB of data, and both partitions were backed up to the VTL system using Symantec Backup Exec 11d. A full backup was made of both partitions, followed by a second full backup with no changes made to the data. Next, a script changed 1,361 files, totaling just over 60GB of changes, and then we ran a third full backup.
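The change script itself isn't described in detail, but its job is straightforward: overwrite a fixed number of files under the test partitions so the next backup sees them as modified. The sketch below illustrates one way to do that; the mount points, file selection, and per-file byte counts are assumptions for illustration, not the actual test configuration.

```python
import os
import random

# Hypothetical stand-in for the change script used between full backups.
# It walks the test partitions, picks a fixed number of files at random,
# and overwrites the start of each with fresh random bytes so the backup
# software treats them as changed.

DATA_ROOTS = ["/mnt/part1", "/mnt/part2"]          # assumed mount points
FILES_TO_CHANGE = 1361                             # from the test description
BYTES_PER_FILE = (60 * 1024**3) // FILES_TO_CHANGE # ~60GB of changes total

def change_files(roots, count, nbytes):
    """Overwrite the first `nbytes` of `count` randomly chosen files."""
    candidates = []
    for root in roots:
        for dirpath, _, names in os.walk(root):
            candidates += [os.path.join(dirpath, n) for n in names]
    for path in random.sample(candidates, min(count, len(candidates))):
        with open(path, "r+b") as f:
            f.write(os.urandom(nbytes))  # new content -> file looks modified
```

Calling `change_files(DATA_ROOTS, FILES_TO_CHANGE, BYTES_PER_FILE)` would produce roughly the churn described above, spread randomly across both partitions.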
Backup speeds were limited by server I/O rather than by the VTLs; both VTLs sustained similar speeds on each server used in testing. On a single-CPU 2.8GHz server with 2GB of RAM, speeds were about 36Mbps; on a dual-CPU 3.4GHz server with 3GB of RAM, speeds were around 60Mbps.
In addition to backing up the 2TB of data, we backed up an Exchange server repeatedly while running Microsoft's LoadSim utility to simulate a heavy message load. Unfortunately, LoadSim offers no way to vary message size or content, so virtually all the messages were identical.
The result was that the Exchange Store grew from a few megabytes to 40GB over two days of creating messages and running full backups back to back. Yet the total size of the deduplicated store was only about 50MB on both systems: essentially a single copy of the roughly 2KB of unique message data plus 20 million pointers, meaning the pointers accounted for many times the size of the actual data, which is reasonable given that there were 20 million messages to point to. This shows that even when every single file being backed up is subject to deduplication (that is, it is identical to a file already stored), the backup process doesn't slow down, indicating that the deduplication function is not a bottleneck in the backup process.
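The mechanics behind that result can be sketched simply. The example below uses whole-item hashing for clarity (real VTLs typically deduplicate at the block or chunk level, and this is not either vendor's implementation): identical messages collapse to one stored copy plus a small pointer per message, which is why millions of near-identical LoadSim messages produced only a tiny deduplicated store.

```python
import hashlib

# Minimal content-addressed deduplication sketch. Each item is hashed;
# the data itself is stored only the first time that hash is seen, and
# every subsequent duplicate costs just one pointer entry.

class DedupStore:
    def __init__(self):
        self.blobs = {}      # hash -> unique data, stored once
        self.pointers = []   # one entry per backed-up item

    def put(self, data: bytes) -> None:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blobs:
            self.blobs[digest] = data   # new unique content
        self.pointers.append(digest)    # duplicates only add a pointer

    def unique_bytes(self) -> int:
        return sum(len(b) for b in self.blobs.values())

store = DedupStore()
message = b"x" * 2048        # ~2KB message body, identical every time
for _ in range(1000):        # small stand-in for 20 million messages
    store.put(message)

# store.unique_bytes() stays at one message's worth of data (2048 bytes),
# while the pointer list grows with every backup.
```

Because a duplicate costs only a hash lookup and a pointer append, the write path is no slower when everything deduplicates, which matches the behavior we observed.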