For some time now, many industry observers - particularly outsourcing companies - have been touting terms such as "grid computing" and "utility computing" as the buzzwords for the next generation of computer services. But as millions of citizens in the northeastern and midwestern U.S. found two weeks ago, dependence on an interconnected grid of utilities can occasionally be highly inconvenient, if not downright dangerous.

In the case of the great blackout of 2003, regional power companies found themselves at the mercy of power "grids" designed to balance the load across power stations and geographic regions and to provide redundancy in case of failure. The utilities apparently failed to adequately maintain the aging relays between the power stations - and between the regional grids themselves - because of disagreement and confusion over who should maintain them.

The creation of a utility computing environment - in which many users pull computing power from a common, interconnected infrastructure - has the potential to deliver many of the same efficiencies and cost savings that the electrical power grid does. But it could also put corporations at risk of similar blackouts that surprise their constituents and leave them helpless, much as the citizens of the Northeast and Midwest were left.

To create a cost-effective utility computing service, outsourcing providers must build a computing infrastructure that can be efficiently shared across many different enterprises.
Unlike current outsourcing environments, in which multiple corporations may share the same contractor but are typically given dedicated servers and other resources, the utility computing customer will depend on a common infrastructure of interdependent, shared networks and servers, much as power customers rely on their regional power grids.

To be fair, this "utility computing grid," even if it is built and maintained by a single outsourcing service, will likely be designed to be highly secure and reliable. The utility service model lets a provider focus its resources and attention entirely on a single infrastructure, rather than assigning separate teams to separate infrastructures for each customer. Built correctly, a utility computing environment would likely provide a higher level of reliability than any single enterprise environment, just as the public telephone network is generally more reliable than any corporate network.

Nevertheless, enterprises considering the utility computing model must weigh the possibility that a major outage could cause a complete computing blackout, just as the northeastern and midwestern power companies experienced. The tight integration and interdependency that make the utility computing infrastructure efficient can also produce cascading failures among interconnected systems.

In the utility computing model, many outsourcers envision harnessing multiple computing environments - those they have built themselves and/or those they inherit from outsourcing customers - to create an infrastructure that can dispense computing power to multiple customers on an as-needed, pay-as-you-use basis. But like the power companies' relays - or the public telephone network's network-to-network interfaces - there must be interfaces that connect these formerly independent networks and enable them to work together.
These interfaces are costly, and there is often confusion over who should maintain them. Utility computing providers must address these interfaces early, particularly if an infrastructure is to extend across multiple service providers.

The concept of utilities - whether for electrical power or for computing power - offers huge potential for lowering costs, increasing the efficiency of resource usage, and improving performance. But it also creates the opportunity for major downtime - not just within a single enterprise, but across the entire spectrum of users that depend on it.

Just ask any New Yorker.