Utility computing requires updates in multiple layers

This is part of a series of newsletters based on a presentation I gave at cdXpo in Las Vegas.

Utility computing is a work in progress that may be slowed by a number of problems in the way data centers are designed, one analyst says. Jamie Gruener, senior analyst at the Yankee Group, calls them the “sins of today’s data center.” Among the sins:

* Servers are less than 50% utilized.
* Strategies are not aligned: there is no commonality of management or policies across servers, storage and network infrastructure.
* IT staffs are overworked and confused by the complexity of managing data centers.
* Standards across systems, storage and so forth don’t exist.
* It is difficult for IT staff to set service-level agreements in a vacuum of disparate, unrelated systems.

For utility computing to work, Gruener says, the infrastructure layer – which consists of servers, storage, clients, peripherals and content – will need to be engineered differently to allow better resource allocation.

On top of the infrastructure layer is the virtualization layer, where computing and storage resources are pooled and doled out as a service.

The management layer sits on top of the virtualization layer. This is where service-level agreements are needed, along with software that lets IT managers automate the tasks that improve the management process: policy-based automation, service management, provisioning, security, metering and billing.

The application and business-process layers, which sit on top of the management layer, need to become more service-focused, Gruener says.

Utility computing is only just getting started. IBM claims it has converted companies to utility computing, and so does Sun. Analysts say, however, that this new data center concept will take eight years to develop. Why so long? Gruener points to several factors:

* There aren’t any universal standards for Web services, policy-based management, or integration of third-party management tools.
* Customers will want to roll out parts of their environments in a prototyping approach, not with a rip-and-replace data center philosophy.
* Pricing needs to change over the longer term, and there needs to be a way to meter use and charge business units directly (see the chargeback sketch below).
* There needs to be incremental pricing, which Gruener says should “be by the glass, not by the tanker truck.”
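To make the layered model a little more concrete, here is a minimal sketch of a virtualization layer that pools capacity and a management-layer policy that allocates from it. The classes, limits and numbers are invented for illustration; they are not drawn from Gruener’s presentation or any vendor’s product.

```python
# Illustrative sketch only: a pooled resource layer plus a trivial
# management-layer policy check. All names and figures are hypothetical.

class ResourcePool:
    """Virtualization layer: pools CPU and storage from many boxes."""
    def __init__(self):
        self.cpu_units = 0
        self.storage_gb = 0

    def add_server(self, cpu_units, storage_gb):
        self.cpu_units += cpu_units
        self.storage_gb += storage_gb

    def allocate(self, cpu_units, storage_gb):
        """Carve a slice out of the pool, or refuse if it would overcommit."""
        if cpu_units > self.cpu_units or storage_gb > self.storage_gb:
            return None
        self.cpu_units -= cpu_units
        self.storage_gb -= storage_gb
        return {"cpu_units": cpu_units, "storage_gb": storage_gb}


def provision(pool, request, max_cpu_per_app=8):
    """Management layer: apply a policy before touching the pool."""
    if request["cpu_units"] > max_cpu_per_app:
        raise ValueError("request exceeds per-application policy limit")
    return pool.allocate(request["cpu_units"], request["storage_gb"])


pool = ResourcePool()
pool.add_server(cpu_units=4, storage_gb=500)
pool.add_server(cpu_units=4, storage_gb=500)
print(provision(pool, {"cpu_units": 6, "storage_gb": 200}))
```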
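And here is an equally rough sketch of the “by the glass” chargeback idea, assuming a simple per-unit meter that bills each business unit only for what it consumes. The rates, units and department names are hypothetical.

```python
# Illustrative chargeback sketch: meter consumption per business unit and
# bill per unit used, not a flat "tanker truck" fee. Rates are invented.

from collections import defaultdict

RATE_PER_CPU_HOUR = 0.25      # assumed price per CPU-hour
RATE_PER_GB_MONTH = 0.10      # assumed price per GB-month of storage

usage = defaultdict(lambda: {"cpu_hours": 0.0, "gb_months": 0.0})

def meter(business_unit, cpu_hours=0.0, gb_months=0.0):
    """Record consumption as it happens, per business unit."""
    usage[business_unit]["cpu_hours"] += cpu_hours
    usage[business_unit]["gb_months"] += gb_months

def invoice(business_unit):
    """Charge the unit only for what it actually used."""
    u = usage[business_unit]
    return u["cpu_hours"] * RATE_PER_CPU_HOUR + u["gb_months"] * RATE_PER_GB_MONTH

meter("marketing", cpu_hours=120, gb_months=50)
meter("finance", cpu_hours=40, gb_months=200)
print(f"marketing owes ${invoice('marketing'):.2f}")  # 120*0.25 + 50*0.10 = 35.00
print(f"finance owes ${invoice('finance'):.2f}")      # 40*0.25 + 200*0.10 = 30.00
```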