Traditionally associated with scientific and technical applications, grid computing is making its first forays into corporate networks as a way to increase utilization of existing corporate systems and networks.

Several software vendors are shipping general-purpose grid applications such as video streaming, large file transmission and shared data access. Meanwhile, a growing number of companies – in industries such as pharmaceuticals, electronics, auto manufacturing, energy and financial services – are piloting grid projects across their backbone networks. These trends point to a growing demand for grid computing on corporate networks during 2003.

“The technology to hook up fast computers to attack a compute-intensive problem has been deployed in academic circles for years,” says Alex Linden, research director for emerging trends and technology at Gartner. “The current surge in interest in grid computing is due to the availability of thousands-of-GHz PCs and Fast Ethernet connections to string them together. . . . One could argue that the average company has a few terahertz of computing power idling during the nighttime.”

In grid computing, a compute-intensive or data-intensive application is processed by many distributed computer systems connected via a LAN or WAN. Today’s grids range in size from dozens to hundreds of individual systems, which can be PCs, Unix workstations or servers. Most corporations deploy grids on their private IP networks rather than the Internet.

For years, scientists have used grids to solve complex problems in areas such as forecasting weather, modeling nuclear explosions, sequencing genes and analyzing seismic data. What’s new is that grids are being applied to more practical business problems, including risk analysis, digital content creation and data mining.

“There are two big benefits to deploying a grid,” says Peter Jeffcock, group marketing manager for grid computing at Sun. “The first one is obvious, and it is to dramatically increase utilization of systems – compute, network and storage – that you’ve already got. Typically, systems run at 10% or 20% utilization. With a grid, that can be at 90% utilization. The other benefit is to be able to use more compute power to attack a more challenging problem or to solve a problem quicker.”

Companies traditionally have used open source middleware such as the Globus Toolkit 2.2 or the Sun One Grid Engine to manage grid applications, schedule network resources and track system utilization. The free Sun One Grid Engine has been deployed on more than 6,500 grids during the last two years, Sun officials say.

But now vendors including Sun and several start-ups are offering enterprise versions of their grid middleware. Sun last year began shipping an enterprise edition of its Sun One Grid Engine that handles policy setting and resource scheduling. The enterprise edition has attracted customers such as Ford Motor, which uses a grid in the design of its automotive powertrains, and Sun itself, which has 7,000 systems connected via a grid for semiconductor design.

Similarly, other grid vendors are updating their software with enterprise features such as support for Java, Java 2 Platform Enterprise Edition (J2EE), .Net and other standards, including Lightweight Directory Access Protocol (LDAP), SNMP and security protocols. Vendors say these features will make it easier for corporate IT departments to integrate grid applications into network infrastructures.
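To make the idea concrete, the following Python fragment is a purely conceptual sketch of how a scheduler might farm independent work units out to whatever machines are sitting idle. It does not reflect the actual API of Sun One Grid Engine, Globus Toolkit or any other product mentioned here; the local worker processes simply stand in for networked PCs, and the task is a placeholder for a real compute-intensive job.

# Conceptual sketch only: a pool of worker processes stands in for idle
# desktop PCs, each picking up independent work units from one large job.
# This is not the API of Sun One Grid Engine, Globus Toolkit or any other
# product named in this article.
from multiprocessing import Pool

def price_scenario(scenario_id):
    # Placeholder for a compute-intensive task such as risk analysis.
    result = sum(i * i for i in range(100_000))
    return scenario_id, result

if __name__ == "__main__":
    work_units = range(1_000)          # independent pieces of one big job
    with Pool(processes=8) as grid:    # eight local processes ~ eight idle PCs
        for scenario_id, result in grid.imap_unordered(price_scenario, work_units):
            pass                       # collect or store each partial result
    print("all work units processed")

In a real grid, the pool would be machines scattered across a LAN or WAN rather than local processes, and middleware such as the Grid Engine or Globus Toolkit supplies the scheduling, policy and accounting that this toy loop ignores.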
In December, start-up Avaki upgraded its Data Grid software with several features designed to attract corporate IT buyers. A dozen pharmaceutical companies use Avaki Data Grid software to provide secure access to large amounts of data stored across distributed systems. “Pharmaceutical companies have research teams spread around the world, and they need to ensure that the data being used by all their teams is current and consistent,” says Tim Yeaton, president of Avaki. “With our software, the data is grid-enabled. . . . The alternatives are using FTP, which is complex and expensive to deploy, or setting up a separate Web site.”

Avaki Data Grid 3.0 is implemented in Java and J2EE and can take advantage of external directory services for user authentication via LDAP. It also supports SNMP, so a company’s existing network management software can manage it. In addition, it has built-in failover support for increased reliability. While Avaki’s Data Grid software provisions large amounts of data to a grid, the Avaki Compute Grid handles scheduling for compute-intensive applications across a grid.

Also in December, Kontiki added a grid component to its software suite, which provides managed delivery of video and other large files across an enterprise network. Kontiki’s Grid Delivery Server can be added to its Delivery Management System 2.0 to create a grid out of the existing servers in the network for file delivery. “It provides more control over delivery parameters, and it can increase network efficiency by as much as 25 times,” says Mark Szelenyi, director of enterprise marketing at Kontiki.

Kontiki’s new grid software supports a company’s existing directory infrastructure and security mechanisms. Any server in the grid can deliver the video content or large file to a user on demand, with the software determining the most efficient way to deliver the file over the network. The grid approach is particularly useful for remote offices with smaller bandwidth connections because it lets a local server provide the video file rather than tying up the WAN connection to corporate headquarters.

Meanwhile, start-up DataSynapse will ship Version 3.1 of its LiveCluster software this quarter, adding support for Java, .Net, J2EE and Web services standards. Several financial services firms, including Bank of America and Abbey National Group, use LiveCluster to run compute-intensive risk analysis and pricing programs over grids.

Also scheduled to ship this quarter is a new offering from grid pioneer Platform, which has offered a suite of workload, service and performance management tools for distributed computing environments since 1992. Platform’s new Symphony product was designed for enterprise grids, and it already has attracted JP Morgan Chase as a flagship customer. Symphony supports standards such as Java and J2EE, as well as Web services standards such as XML and Simple Object Access Protocol (SOAP).

On the horizon is a new set of standards called the Open Grid Services Architecture (OGSA) that will make it easier for companies to roll out grid applications that work across heterogeneous networks and the Web. The Global Grid Forum, whose hundreds of members include Sun and IBM, is developing OGSA. The group plans to release the final OGSA documents next month.
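Because OGSA layers grid services on top of Web services standards, invoking a grid resource ends up looking much like calling any other Web service. The short Python sketch below is purely hypothetical: the host, path, XML namespace and SubmitJob operation are invented for illustration and are not taken from the OGSA specification or any vendor product; only the SOAP envelope format and the HTTP plumbing are standard.

# Hypothetical sketch of submitting a job to a grid service over SOAP/XML.
# The endpoint, namespace and SubmitJob operation are invented for
# illustration; they are not defined by OGSA or any product in this article.
import http.client

soap_envelope = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <SubmitJob xmlns="urn:example:grid">
      <Executable>risk_model</Executable>
      <CpuCount>64</CpuCount>
    </SubmitJob>
  </soap:Body>
</soap:Envelope>"""

conn = http.client.HTTPConnection("gridservice.example.com", 8080)
conn.request("POST", "/grid/JobFactory", body=soap_envelope,
             headers={"Content-Type": "text/xml; charset=utf-8",
                      "SOAPAction": "urn:example:grid#SubmitJob"})
response = conn.getresponse()
print(response.status, response.read().decode())

The appeal for corporate networks is that this is ordinary XML over HTTP, so grid requests can cross heterogeneous systems with the same plumbing companies already use for Web services.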
Grid vendors anticipate a flood of product announcements this spring, including Web server, operating system and network management software with OGSA support. The Globus Project this month will release the beta version of its Globus Toolkit 3.0, which includes support for OGSA. A final version of Globus Toolkit 3.0 is due out this spring.

Grid proponents say that widespread adoption of OGSA standards is critical for enterprise usage of grids. Sun and IBM plan to support OGSA in their grid offerings during 2003. IBM plans to introduce OGSA support in its Tivoli, DB2, WebSphere and Storage Tank software this year. In addition, IBM will ship OGSA support in all of its operating systems, including AIX and Linux.

“In 2003, we see that the financial industry, governments, life sciences, higher ed and the industrial sector will be hot areas for grids,” says Dan Powers, vice president of grid strategy at IBM. “Towards the second half of 2003, when the standards come out and products include them, that’s when we’ll see more general-purpose uses of grids.”