The demands of utility computing on the network

The greater the complexity of your computing ecosystem, the more likely you'll derive benefits from a utility model. If enterprise storage for you really means "enterprise," it is something you should consider.

The emerging utility computing models offer many benefits under a variety of circumstances, but utility computing really shines when demands fluctuate. One hundred users kicking off different processes that run on multiple servers and accessing networked storage in a dynamically changing setting are much more likely to get value from utility computing than 1,000 users in a comparatively static environment.

A utility model assumes that the infrastructure provisions and manages services and assets on an as-needed basis. "On-demand" is the preferred marketing parlance, of course, but I am starting to think that "as-needed" may be the more suitable term in a managed environment, because there "need" is determined by policy, not merely by demand.

It is important to note that in complex environments this is likely to be no trivial task. As the amount of networked storage increases, for example, the networks themselves play an increasingly important role in determining storage I/O performance. Because most networks are relatively non-deterministic, storage I/O that was once easily characterized on a dedicated storage bus becomes much harder to predict: other traffic on the network affects your I/O, and your I/O affects the network.

The situation becomes even more demanding as IT resources are reconfigured to meet changing needs. Even if the new pathways are predetermined, the task of bringing them online and offline is a daunting one. Daunting, but not impossible.

Two technologies would seem to be fundamental to the on-the-fly provisioning required here: virtualization and automation, both discussed often in this column. But automation and virtualization of what? It is a good bet that to get the most out of a complex system you will have to manage more than just the storage.

One simple but useful way to look at IT is to divide it into three broad categories of assets: servers, networking and storage. A completely managed computing utility should be able to virtualize and automatically manage across all three categories, getting the most out of the components as they operate individually and as they interact with other systems.

Those of you on the East Coast of the U.S. may remember with varying degrees of fondness the Ballantine Beer of years past, with its logo of three interlocking rings symbolizing "purity, body and flavor." Substitute "servers, networks and storage" as names for the three rings, and the place where they all overlap (the "intersection," for those of you who like set theory) represents the complex interworking of the three components that make up the IT system. This is what must be managed in order to have a fully functioning utility.

History would seem to support this idea: many managers have found that the thorniest problems to deal with are the ones whose root cause lies at an intersection between these categories, the places where, for example, storage and networking meet but where today's root-cause analysis tools do not extend.
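To make that last point concrete, here is a minimal sketch, in Python, of what cross-domain event correlation might look like. Everything in it is invented for illustration: the event records, field names and five-minute window are hypothetical, not any vendor's actual API. The idea is simply that events from the three domains, harmless in isolation, point at a root cause only when joined.

    # Hypothetical sketch: correlating events across the server, network
    # and storage domains to surface a root cause that no single-domain
    # tool would see. All event data and field names are invented.
    from datetime import datetime, timedelta

    events = [
        {"domain": "storage", "time": datetime(2004, 5, 3, 9, 14), "msg": "LUN 7 latency > 50 ms"},
        {"domain": "network", "time": datetime(2004, 5, 3, 9, 13), "msg": "switch port 12 at 95% utilization"},
        {"domain": "server",  "time": datetime(2004, 5, 3, 9, 15), "msg": "db01 I/O wait spike"},
    ]

    def correlate(events, window=timedelta(minutes=5)):
        """Cluster events that fall within `window` of one another,
        keeping only clusters that span more than one domain: the
        "intersections" where the thorniest root causes tend to hide."""
        ordered = sorted(events, key=lambda e: e["time"])
        clusters, current = [], [ordered[0]]
        for event in ordered[1:]:
            if event["time"] - current[-1]["time"] <= window:
                current.append(event)
            else:
                clusters.append(current)
                current = [event]
        clusters.append(current)
        return [c for c in clusters if len({e["domain"] for e in c}) > 1]

    for cluster in correlate(events):
        for e in cluster:
            print(f'{e["time"]:%H:%M} [{e["domain"]}] {e["msg"]}')

A tool watching only the storage domain would report the 9:14 LUN alarm in isolation; the correlated view ties it to the saturated switch port a minute earlier, which is exactly the blind spot single-domain management lives with.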
Certainly, managing individual components within the system is useful, but give this a bit of thought and you will probably agree that managing a single component, or a subset of components, will always result in sub-optimized operations when you look at the system as a whole. Managing across two subsets as they interoperate is clearly better than managing only one, but coming to grips with the entire complexity of an IT system is surely the best approach.

Alas, it is a given that no single vendor can do it all, and only a few can adequately manage more than one of these three categories. That being the case, the best we can hope for right now is to deal with vendors that can provide us with sub-categories of the utility model: a storage utility, for example, or virtualized server operations that provide an agile approach to using processing power.

This won't address all of the complexity, of course, and we will still need to ensure the integrity of those points within the system that we have no current ability to manage. That is why, when it comes to protecting data, assets and processes, we still have to make sure they operate in as secure an environment as we can provide. In any computing environment, utility or not, data security will always have to be part of the solution.