Essentials is a systems management suite for midsize businesses; Data Protection Manager helps with backup and recovery
Last month, Microsoft released the first two products in the System Center 2010 wave: the management suite System Center Essentials 2010, and the backup-and-recovery-focused Data Protection Manager 2010. I've been evaluating both in late prerelease form; here are my impressions.
System Center Essentials 2010
System Center Essentials 2010 is the latest revision to the integrated systems management suite designed for midsize businesses. Microsoft already offers the full System Center suite for enterprises but has recently focused on filling the gap that often exists between product offerings for tiny shops with fewer than 25 people and global corporations with 5,000-plus users.
Microsoft has a pretty wide definition of what a midsize business looks like: It expects such a business to have between 25 and 400 PCs and five to 50 servers, to lack existing systems management software, to employ fewer than five IT generalists in its internal support operation and to predominantly use Microsoft software. (Clearly that's an idealized picture in Microsoft's view.) This vision is more likely to hold at the smaller end of that range, with more heterogeneity probable at the upper end of the midsize market.
In any event, Essentials 2010 is designed to help make an environment not managed by specialists every bit as efficient and healthy as one that is. Essentials 2010 attempts to achieve three key goals:
* Monitor the total environment of a network and proactively alert administrators to, and sometimes automatically fix, problems that crop up -- anywhere from a client PC to a server to a piece of software.
* Deploy software and patches in an efficient, streamlined way, rather than sneakernetting CDs and DVDs around the office.
* Integrate virtualization strategies and techniques into a market that traditionally hasn't been ready for that complexity.
Essentials 2010's interface will be comfortable for Outlook users; the team redesigned the interface to be more fluid and expose more functionality with fewer clicks. The administrative console is easy to navigate. Thanks to the comprehensive task list that appears in the pane on the right, I didn't spend a lot of time looking for features.
Updates and software deployment
Essentials 2010 integrates Windows Server Update Services more fully with Essentials' administrative console, monitoring capabilities and deployment features.
At its core, WSUS attempts to automate the patching process as much as possible. Between Essentials and WSUS, the tools can discover which updates are required in your environment and set auto-approval deadlines for update deployment; these are the dates at which a particular update will automatically be deployed, even without an administrator's explicit approval.
Another feature is the ability to perform those update installations according to the class of machine -- workstation or server. Since patching is likely a manual process in these environments, Essentials tries to take the menial work out of the task and improve system health. In the end, it works pretty well.
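The auto-approval deadline behavior described above can be sketched in a few lines. This is a hypothetical model, not WSUS or Essentials code: the `Update` class, the 14-day grace period and the `machine_class` field are all illustrative assumptions.

```python
from datetime import date, timedelta

# Hypothetical model of WSUS-style auto-approval deadlines: each update
# gets a deadline; once that date arrives without an explicit admin
# decision, the update is treated as approved for its machine class.

class Update:
    def __init__(self, title, released, machine_class, grace_days=14):
        self.title = title
        self.machine_class = machine_class   # "workstation" or "server"
        self.deadline = released + timedelta(days=grace_days)
        self.explicitly_approved = False

    def is_approved(self, today):
        # An admin's explicit approval always wins; otherwise the
        # update auto-approves once the deadline date is reached.
        return self.explicitly_approved or today >= self.deadline

patch = Update("KB123456", released=date(2010, 4, 1), machine_class="workstation")
print(patch.is_approved(date(2010, 4, 10)))  # before the deadline -> False
print(patch.is_approved(date(2010, 4, 20)))  # past the deadline  -> True
```

The point of the deadline is exactly this fallback: a forgotten patch still goes out, just later than an attentive administrator would have pushed it.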
Essentials also attempts to make the process of deploying software much more streamlined than running software discs around an office. It puts an attractive, easy-to-use interface around Group Policy-based software deployment and also adds some intelligence found in the suite's big brother, System Center Configuration Manager, so that pushing out Office to 150 clients, for example, doesn't take weeks.
I found the deployment wizards much simpler to understand than the native Windows Server/Active Directory tools. The additional capabilities for non-Microsoft Installer-based software packages not found natively in Windows are also welcome.
Supporting virtualization in the midmarket
The virtualization phenomenon is in full swing, and one could reasonably wonder why Essentials 2010 took as long as it did to integrate virtual machine technology into its core. Quibbling aside, in Essentials 2010 Microsoft has added virtualization hand-holding for the IT generalist: There are wizards to perform many common VM-related tasks, including building new virtual machines from scratch and importing existing VMs into the Essentials management console, with support for both Microsoft's VHD format and VMware's VMDK format. Templates can be used when creating VMs to keep the performance and specifications of your VM farms consistent, and a tool that intelligently decides the best host for a particular VM is included as well.
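To give a feel for what a host-placement tool does, here is a minimal sketch of one plausible heuristic: pick the host that can fit the VM and retains the most headroom afterward. The field names, scoring weights and data are my own illustrative assumptions, not Essentials 2010's actual algorithm.

```python
# Hypothetical host-rating heuristic for VM placement: reject hosts
# that can't fit the new VM, then prefer the host with the most
# remaining memory and CPU headroom after placement.

def best_host(hosts, vm_mem_gb, vm_cpus):
    def score(host):
        free_mem = host["mem_gb"] - host["mem_used_gb"]
        free_cpus = host["cpus"] - host["cpus_used"]
        if free_mem < vm_mem_gb or free_cpus < vm_cpus:
            return None  # host can't fit this VM at all
        # Arbitrary weighting: CPU headroom counts double.
        return (free_mem - vm_mem_gb) + 2 * (free_cpus - vm_cpus)

    candidates = [(score(h), h) for h in hosts]
    candidates = [(s, h) for s, h in candidates if s is not None]
    if not candidates:
        return None
    return max(candidates, key=lambda sh: sh[0])[1]["name"]

hosts = [
    {"name": "HOST-A", "mem_gb": 32, "mem_used_gb": 28, "cpus": 8, "cpus_used": 6},
    {"name": "HOST-B", "mem_gb": 32, "mem_used_gb": 12, "cpus": 8, "cpus_used": 2},
]
print(best_host(hosts, vm_mem_gb=4, vm_cpus=2))  # -> HOST-B
```

Real placement tools weigh more inputs (disk, network, host reserve thresholds), but the shape of the decision is the same.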
Converting existing physical machines to VMs, a process known as P2V, can be handled from within the Essentials 2010 console. And once the VMs are created and placed, and the Essentials agent installed on them, they can of course be managed and monitored like any computer on the network. You can also live-migrate VMs (that is, move a virtual machine from one host to another without any downtime).
I think Essentials 2010 serves a purpose in this market segment. It adds some welcome touches to update deployment and removes a lot of the mystique around how to deploy virtualization technology in the middle market. It can also save staffers a lot of time through better monitoring and better software installation methods. The redesigned interface is simple and comfortable, and the product's limitations are few -- and all in all, it's appropriate for a product that's lighter than the full System Center suite.
On the other hand, it's not right for your business if you're already in the upper bounds of the projected user range (more than 400 PCs), if you run a very heterogeneous environment either on the client or on the server, or if virtualization isn't at the top of your company's to-do list.
System Center Data Protection Manager 2010
This package delivers unified data protection for Windows servers and clients in the form of backup and recovery. DPM 2010 provides strong protection and supportable restore scenarios from disk, tape and cloud -- in a scalable, reliable, manageable and cost-effective way. Admittedly, it's a heavy solution, especially for the lower and middle bounds of the midsize market. It's not a simple Windows backup system, because it requires server and client elements; also, it's not inexpensive, and it requires some time to deploy. But if you anticipate growth in data, users or computers, it's worth a look.
The traditional backup medium is tape, but tape is expensive: You pay for the media, for storing and rotating the tapes, and for the IT staffers who handle them. Backing up to disk (via SAN or another method) is ultimately much cheaper, and that's what DPM provides, although DPM can also help archive rarely updated data to tape for record-retention purposes.
What DPM does
DPM can back up and restore data from the following sources via a lightweight agent:
* Virtual Server
* Hyper-V (both the separate product and the Hyper-V role within Windows Server 2008 R2)
* File shares
* The Active Directory system state
* Windows OS clients
The backups can be configured to occur every 15 minutes, with data being transmitted via the agent directly to the DPM server machine. From there, the DPM server can archive up to 512 disk-based snapshots for fast recovery from problems, and also manage record retention on tape-based media through customizable policies. New to DPM with this release is the ability to replicate to other DPM servers for fault tolerance, and the availability of an online cloud backup service from Iron Mountain that can be closely integrated with DPM 2010.
I'm not necessarily against the agent approach, as it affords a few capabilities that would probably not be available without the agent -- such as advanced monitoring and logging, as well as some interesting features, like the ability for laptop users to instantly retrieve incremental backups.
Still, many administrators strongly dislike deploying agents on servers because this approach can slow things down. I did not find the DPM agent to be noticeably detrimental to performance, most likely because the heavy lifting for DPM is done by the Volume Shadow Copy Service, an element already built into the underlying Windows operating system.
I was able to configure policies to back up a variety of machines, including Windows Vista and Windows 7 clients and Windows Server 2008 machines running Exchange Server 2007, in about five minutes using the well-laid-out administration console. While I did not test backing up and restoring SQL Server databases, Microsoft has included a way for SQL Server administrators to retrieve previous versions of any SQL database and restore it to either the original SQL Server machine or an alternate without involving the DPM administrator.
In other words, it provides self-service recovery. Restoring files on clients was very easy, and I was able to recover a system-state backup for a Windows domain controller without any fuss, although it was a time-consuming process. That is likely not a limitation of DPM but a function of the restore process for Active Directory itself.
DPM 2010 introduces protection for what it calls "roaming laptops," those machines that often go for days, weeks or even months without connecting to the corporate network. Often, these laptops are unable to be backed up according to policies set by administrators, which can create a real problem when things go badly on these machines -- whether they're stolen, suffer a hardware failure or otherwise are rendered inoperable.
DPM 2010 allows these machines to be backed up at a very granular level: The administrator -- and, in some cases, the user -- can define which parts of those machines should be backed up, eliminating the need to constantly back up an entire system whose OS and applications can easily be reinstalled.
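That kind of granular selection amounts to include/exclude filtering over the file system. Here is a minimal sketch, assuming a simple glob-pattern policy; the pattern syntax, policy shape and sample paths are illustrative, not DPM's actual API.

```python
import fnmatch

# Hypothetical per-path backup selection for a roaming laptop: an
# admin (or user) supplies include and exclude glob patterns, and
# only files matching an include -- and no exclude -- are protected.

def files_to_protect(paths, include, exclude):
    selected = []
    for p in paths:
        included = any(fnmatch.fnmatch(p, pat) for pat in include)
        excluded = any(fnmatch.fnmatch(p, pat) for pat in exclude)
        if included and not excluded:
            selected.append(p)
    return selected

laptop_files = [
    r"C:\Users\kim\Documents\budget.xlsx",
    r"C:\Users\kim\Documents\notes.txt",
    r"C:\Users\kim\AppData\cache.tmp",
    r"C:\Windows\system32\kernel32.dll",
]
policy_include = [r"C:\Users\*"]             # protect user data only
policy_exclude = [r"*\AppData\*", "*.tmp"]   # skip caches and temp files

print(files_to_protect(laptop_files, policy_include, policy_exclude))
# keeps only the two Documents files
```

The payoff is exactly what the article describes: the backup covers the data that can't be recreated, not the gigabytes of OS and application binaries that can.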
In addition, the DPM agent integrates with the local shadow copies feature in Windows Vista and Windows 7. This allows the user to perform a restore himself from local copies if the machine is offline, or from DPM-based copies if the machine happens to be connected to the network. These policies can be centrally managed from the DPM 2010 administrative console.
One of the criticisms of the original version of DPM was that there wasn't a good way to make the DPM server machines themselves fault-tolerant. After all, a backup doesn't do a lot of good if you can't get to it when you need to restore. In DPM 2010, Microsoft has added a simple mechanism to configure failover and failback among DPM server machines, including support for an off-site DPM server for improved fault tolerance.
The Microsoft-centric nature of DPM 2010
Microsoft is unapologetic about its position that DPM 2010 is designed first and foremost with Microsoft shops in mind. There isn't a lot of cross-platform support -- indeed, the agent is designed for just Windows machines, including Windows XP, Vista, Windows 7 and a bevy of server products. If you have a mixed-breed data center, you'll want to look elsewhere for comprehensive products.
In a larger, mixed environment, DPM 2010 could work as a backup for your Microsoft world and then help coordinate those backups with a more enterprise-minded archiving package.
The last word
Overall, I get the feeling from using these products in my lab that Microsoft-centric shops will find these tools genuinely useful. DPM feels geared to a different market than the Essentials package. Because it's a heavier approach, as I've already said, DPM seems oriented toward a firm with an experienced IT staff rather than the IT generalists typically employed by midsize firms.
Also, DPM is a full component of the larger System Center suite, as opposed to Essentials, which is more of a "lite" amalgamation of other System Center products. But I think both are useful to an extent in midsize businesses. Enterprises should investigate DPM at a minimum for their Microsoft installations, since it integrates very well with the existing Windows operating system features and provides many self-service capabilities for users, although it lacks cross-platform support.
Pricing is unavailable at this time, but I expect to see it within a few weeks.
Jonathan Hassell is an author, consultant and speaker on a variety of IT topics. His published works include a variety of books on Windows clients and servers, including Learning Windows Server 2003. His work appears regularly in such periodicals as Windows IT Pro, PC Pro and TechNet Magazine. He also speaks worldwide on topics ranging from networking and security to Windows administration. You can reach Jon at email@example.com.
This story, "Microsoft System Center 2010" was originally published by Computerworld.