Microsoft's Data Protection Manager moves toward backup, but leaves some machines unprotected
Microsoft has made its first attempt at tending to enterprisewide data protection. In our Clear Choice Test of the newly released Data Protection Manager 1.0, we found that it does a respectable job at data backup/restore, archiving and management, but it's strictly limited to Microsoft-only machines and does not yet support Microsoft's full range of applications.
DPM requires highly patched versions of Windows 2003 server editions as its base platform. This server becomes the nexus for most storage-management operations. Each compatible component of a Microsoft-based network (meaning one that is a member of an Active Directory domain and is running the Remote Registry service) can be readily managed.
DPM cannot be used on 64-bit hardware, either as a protected device or as a server (even in x86 emulation mode), until Microsoft releases a planned service pack. DPM also doesn't support numerous Windows client types, including older versions of Windows, Windows CE devices, Windows-based mobile phones and Windows Media Center.
Users of these Windows platforms must fall back on their own primitive back-up facilities to store data onto a DPM-compatible platform, gaining only peripheral inclusion in DPM's platform services. This level of Windows platform compatibility is surprisingly immature for a product that would be purchased to address the critical issues of archiving and availability.
A more egregious omission is that DPM is unable to fully protect Microsoft Exchange. While other Microsoft and third-party methods exist to increase Exchange availability, DPM can't do comparatively simple tasks, such as taking a snapshot of Exchange, without shutting down the application.
The primary DPM server can't be an Active Directory domain controller, but must be a domain member. In turn, storage-area network components must have direct, fast links to this server (nothing less than T-1 speeds).
Through experimentation, we found it best to build a DPM server from scratch, which takes about two hours, compared with four hours to take an existing Active Directory controller server and run it through the processes needed.
Immediately after media installation, it's time to allocate storage pools. These pools can be optical media, locally attached storage or a virtualized infrastructure, such as iSCSI-connected drives. DPM found all of the storage components that we used in our test without issue.
The software agent is pushed to each node that will be included in the DPM archiving scheme, and the node is then rebooted. Clients or servers found in the Active Directory service can be included in scheduled full, incremental, snapshot or other back-up plans, depending on organizational needs. We tested each of these methods, both as incremental and scheduled archives, as well as a number of different types of restorations. Snapshots (the originally backed-up full data set with added periodic backups) are managed most thoroughly. This snapshot method was better than traditional file-by-file, folder-by-folder, application-by-application back-up processes because it's easier to track, requires no user intervention and makes the state of a server or client easier to restore.
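The snapshot scheme described above, a full base copy plus periodic deltas replayed in order at restore time, can be sketched in a few lines. This is an illustrative model only, not DPM's documented internals; the dictionaries standing in for file system state are an assumption for clarity.

```python
# Illustrative sketch (not DPM's actual implementation): a snapshot chain
# keeps one full backup of the data plus a series of periodic deltas.
# Restoring means replaying the deltas onto the base copy, which is why
# snapshot restores need no file-by-file, folder-by-folder tracking.

def take_snapshot(base: dict, deltas: list, current: dict) -> None:
    """Record only the entries that changed since the last known state."""
    known = restore(base, deltas)
    delta = {k: v for k, v in current.items() if known.get(k) != v}
    # Record deletions explicitly so restores stay exact.
    delta.update({k: None for k in known if k not in current})
    deltas.append(delta)

def restore(base: dict, deltas: list) -> dict:
    """Rebuild machine state: start from the full backup, replay each delta."""
    state = dict(base)
    for delta in deltas:
        for k, v in delta.items():
            if v is None:
                state.pop(k, None)   # entry was deleted in this snapshot
            else:
                state[k] = v
    return state

# Usage: one full backup, then two small scheduled snapshots.
base = {"config.sys": "v1", "data.db": "v1"}
deltas = []
take_snapshot(base, deltas, {"config.sys": "v1", "data.db": "v2"})
take_snapshot(base, deltas, {"data.db": "v3"})  # config.sys deleted
assert restore(base, deltas) == {"data.db": "v3"}
```

Each delta records only changes, so the chain stays small even as snapshots accumulate, and any point in the chain can be restored deterministically.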
Multiple concurrent backups, snapshot storage cycles and restore jobs can run at once with no detected DPM platform instability. DPM takes advantage of both hardware speed and dynamic RAM (especially in restore jobs to remote clients) and can become extremely busy. Very active organizations might consider having several DPM-based servers online if their storage activities are strenuous or frequent, as DPM can become hardware-limited in the sheer amount of work it can do. Several vendors have announced appliances based on DPM, and these might make a convenient fit for some organizations.
SQL Server 2000 aptly keeps track of what's going on and handles multiple concurrent requests for a mixture of backups and restores without errors. The limits imposed on performance were the number of network cards/link speed and the speed of the SCSI drives we used. We tested DPM in two profiles, using a direct connection to emulate a local network back-up server, and via a local VPN to emulate a branch network.
Our CPU utilization tests showed that DPM is capable of drawing at very near the wire speed of the host bus adapter (SCSI or Fibre Channel) and the drives we used. When files change, only the deltas are backed up. For example, when a 2.2G-byte file changes by 100M bytes, about 130M bytes of data is sent to the server in the form of a snapshot back-up job. This iterative back-up method tends to make backups faster, but might require heavy rethreading of files in the case of large file loss.
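The gap between the 100M bytes changed and the roughly 130M bytes transferred is consistent with block-level delta detection: whole blocks are resent even when only part of a block changed, plus metadata overhead. The sketch below illustrates the general technique; the block size and hashing scheme are assumptions for illustration, not DPM's documented internals.

```python
# Illustrative sketch of block-level delta backup (assumed mechanism, not
# DPM's documented design): split the file into fixed-size blocks, hash
# each block, and ship only the blocks whose hashes changed. A small edit
# still costs a full block on the wire.

import hashlib

BLOCK_SIZE = 64 * 1024  # 64KB blocks; an arbitrary choice for illustration

def block_hashes(data: bytes) -> list:
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).digest()
            for i in range(0, len(data), BLOCK_SIZE)]

def changed_blocks(old: bytes, new: bytes) -> list:
    """Return (index, block) pairs that must be sent to the backup server."""
    old_h = block_hashes(old)
    new_h = block_hashes(new)
    sent = []
    for i, h in enumerate(new_h):
        if i >= len(old_h) or old_h[i] != h:
            sent.append((i, new[i * BLOCK_SIZE:(i + 1) * BLOCK_SIZE]))
    return sent

# Usage: change 100 bytes in the middle of a 1M-byte file. The entire
# 64KB block containing the edit is resent, not just the 100 bytes.
old = bytes(1024 * 1024)
new = bytearray(old)
new[500_000:500_100] = b"x" * 100
delta = changed_blocks(old, bytes(new))
assert len(delta) == 1 and len(delta[0][1]) == BLOCK_SIZE
```

Scaled up, resending whole blocks plus per-block metadata readily turns a 100M-byte change into a transfer in the neighborhood of the 130M bytes we observed.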
We attempted several iterative backups of slight changes in the aforementioned 2.2G-byte file, but were unable to detect delays in the reformation/rethreading of the file when we restored it via snapshot. Jobs are hashed and encrypted when traveling from agent machine to the DPM server, and doubly encrypted when we used our VPN as the transport.
The DPM GUI allowed us to heavily schedule jobs and replicate jobs for multiple users, making administration of the jobs simple. Logs are very easy to understand.
Problems with restores
Bare-metal restores were a bit more problematic. There is no method to jump-start the process: you have to put a base Windows server operating system onto bare-metal server hardware, then authenticate, obtain the DPM agent and effect a restore job with DPM. Restoration to a known state thus requires at least two steps: initial operating-system installation, then a link to a snapshot to restore the machine to the desired state.
In all, DPM is a decent first step, although hobbled by its lack of compatibility, even with some of Microsoft's own applications, and by being confined to recent 32-bit platforms: Windows 2003 Server editions, Microsoft Storage Server, Windows 2000 Server and Windows Powered NAS.
The upside for highly homogeneous, late-model 32-bit Microsoft shops is that the product has strong back-up and restore features for file and Web servers, as well as other compatible machines. It is deceptively simple, and for all of the abuse we put it through, it worked without flaw within its constrained operating environment.
Henderson is principal researcher for ExtremeLabs of Indianapolis. He can be reached at firstname.lastname@example.org. Laszlo Szenes contributed to this story.
Henderson is also a member of the Network World Lab Alliance, a cooperative of the premier reviewers in the network industry, each bringing to bear years of practical experience on every review. For more Lab Alliance information, including what it takes to become a member, go to www.networkworld.com/alliance.