The concept of "grid computing" was created in the late 1990s by researchers at Argonne National Laboratory and elsewhere. Like many revolutionary concepts in IT, including the World Wide Web and supercomputers, grid computing emerged from particle physicists' need to push the boundaries of computing. A grid is a framework in which heterogeneous, distributed computational, networking, memory and data resources can be linked together to serve the needs of particular user applications.

Over the past few years, grid computing has evolved from a buzzword into a set of concrete standards and implementations. Two groups, the Globus Alliance (sponsored by several universities) and the Global Grid Forum (whose members include vendor giants IBM and Microsoft, plus many small research organizations), have become the sources of standards and software solutions for this emerging technology.

The Globus Alliance publishes the open-source Globus Toolkit, which includes software libraries and services for building grid applications. The toolkit's libraries address issues such as security, information infrastructure, resource management, networking, reliability and portability.

The Global Grid Forum manages the standards process for emerging grid specifications such as the Open Grid Services Architecture (OGSA) and the Open Grid Services Infrastructure (OGSI).

The Globus Toolkit and the OGSA/OGSI standards have enabled the deployment of grid computing solutions at many major research facilities in the U.S. and Europe. There are also popular "consumer" distributed-computing initiatives, including SETI@home, a distributed number-crunching application that searches for extraterrestrial signals, and distributed.net, a distributed effort to crack various cryptographic keys.
It is the popularity of these examples and the initial success of grid computing in the research environment that have led to increased interest in grid technology from industries such as financial services, pharmaceuticals and energy.

But how applicable is grid computing in industry, given that its roots are in research science? One of the main drivers for grid computing in the research space was to increase the raw computing power available to each institution without buying new supercomputers. Most companies, however, aren't interested in computing on the scale required for global weather simulation. IT executives should still care about grid computing, because it can offer the following:

* Lower costs through increased utilization. Instead of acquiring a bigger set of resources to do more computing, IT executives can use the technology to make better use of their existing infrastructure to cover current needs.

* Improved application performance. For compute-intensive applications, grid computing can deliver an order-of-magnitude improvement in run time. The fixed-income derivatives trading group at financial company Wachovia, for example, reportedly saw a 400% increase in trading volumes after migrating to grid technologies.

* Enhanced reliability and redundancy. Running applications on a grid can enable rapid service provisioning from a remote or backup data center in the event of a failure.

Even if your industry doesn't have an extreme computing requirement, such as protein folding or global climate modeling, you may want to explore how grid computing can support your solutions. And the good news? You don't have to wait for large vendors to roll out expensive product suites. The Globus Toolkit is open source, free of charge and lightweight, so you can download it and experiment with it as part of your planning and prototyping exercises.

Andreas M. Antonopoulos is principal research analyst at Nemertes Research.
Reach him at email@example.com.

Nemertes Research specializes in quantifying the business impact of emerging technologies. For more information, visit http://www.nemertes.com, write firstname.lastname@example.org, or call 888-241-2685.