Sandra Gittlen
Contributing Writer

First steps in consolidating your data center

Opinion
Sep 17, 2003 | 3 mins
Enterprise Applications


As we’ve discussed in this newsletter, there is a tremendous amount of buzz surrounding data centers. And it’s not just from vendors, either. Judging by the response we received to the kickoff of our New Data Center Technology Tour, it’s clear this issue is firmly in the hands of IT organizations.

IT managers from a cross-section of industries say they are beginning to consolidate their distributed and varied data centers into fewer, better managed data centers. They say the benefits in savings and personnel alone are good enough reasons. But the gains in network management, higher efficiency and the ability to roll out new applications quickly are bonuses.

Keynoter Johna Till Johnson, president and chief research officer at Nemertes Research, points out that data centers can deliver the “faster, better, cheaper” IT that organizations are looking for, and that specific operational and architectural trends and technologies can be mapped to each of those three goals.

For instance, if the goal is to reduce operational costs, then an IT manager would look toward IT centralization, data center consolidation, resource virtualization and centralized management. She says the technologies that would achieve this goal are grid computing, blade computing, open source operating systems and applications, and storage networking.

If the goal is to improve reliability and performance, then the trend should be toward so-called “lights-out” or utility computing, disaster recovery and business continuity process architectures, and virtualized storage and computing. The technologies you would apply in this instance are clustering and high-availability computing, grid computing, application quality management, SSL VPNs and identity management.

Finally, if the goal is to deliver services faster, then you want to focus on common application frameworks and standardized data classification. To that end, you’ll want to think about rolling out Web services and creating a data classification and data quality management system.
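Johnson’s three goals and their associated trends and technologies can be captured as a simple lookup table. This is purely an illustrative sketch — the structure, names and helper function below are our own shorthand for the mapping she describes, not anything she presented:

```python
# Illustrative sketch: Johnson's goal -> trend/technology mapping as a
# lookup table. All keys and names here are our own labels for the
# categories described in the article.
STRATEGIES = {
    "reduce operational costs": {
        "trends": ["IT centralization", "data center consolidation",
                   "resource virtualization", "centralized management"],
        "technologies": ["grid computing", "blade computing",
                         "open source operating systems and applications",
                         "storage networking"],
    },
    "improve reliability and performance": {
        "trends": ["lights-out/utility computing",
                   "disaster recovery and business continuity architectures",
                   "virtualized storage and computing"],
        "technologies": ["clustering and high-availability computing",
                         "grid computing", "application quality management",
                         "SSL VPNs", "identity management"],
    },
    "deliver services faster": {
        "trends": ["common application frameworks",
                   "standardized data classification"],
        "technologies": ["Web services",
                         "data classification and data quality management"],
    },
}

def technologies_for(goal: str) -> list:
    """Return the candidate technologies mapped to a stated goal."""
    return STRATEGIES[goal]["technologies"]
```

The point of the exercise is simply that the goal comes first: pick the motivation, and the candidate trends and technologies fall out of it.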

Once you’ve decided on your main motivation and which approach you’re going to take (and you aren’t limited to just one), Johnson says you have to create a data profile model.

A data profile model defines security, restoration priorities, fault tolerance requirements, impact of downtime and design criteria, among other issues. This is critical, Johnson says, because the results of your profile will stand as the foundation for your infrastructure requirements, security strategies, service-level agreements, business continuity, disaster recovery and backup-and-restore policies.
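One way to picture such a profile is as a per-dataset record covering the dimensions Johnson lists. The field names and the sample entry below are hypothetical — a minimal sketch of what one profile entry might hold, not a format she prescribes:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of one entry in a data profile model, covering the
# dimensions named in the article: security, restoration priority, fault
# tolerance, downtime impact and design criteria. Field names are ours.
@dataclass
class DataProfile:
    name: str
    security_level: str          # e.g. "public", "internal", "restricted"
    restoration_priority: int    # 1 = restore first after an outage
    fault_tolerance: str         # e.g. "none", "mirrored", "clustered"
    downtime_impact: str         # business consequence if unavailable
    design_criteria: list = field(default_factory=list)

# Example: profiling a (hypothetical) customer-order database ahead of a
# consolidation project.
orders_db = DataProfile(
    name="customer-orders",
    security_level="restricted",
    restoration_priority=1,
    fault_tolerance="clustered",
    downtime_impact="order processing halts; direct revenue loss",
    design_criteria=["synchronous replication", "off-site backup"],
)
```

A completed set of such profiles is what then drives the infrastructure requirements, SLAs and backup-and-restore policies the article mentions.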

These are just some of the points we discussed at the event. I encourage you, if you are located near Chicago, Washington, D.C. or New York, to join us at the upcoming events. I guarantee you’ll walk away with invaluable information to help you create your own data center strategy.

Register at https://www.nwfusion.com/events/datacenter/index.html