Filling the app gap

Tech entrepreneur Bill Coleman discusses the promise of automated application management.

If the new data center is to become intelligently self-healing and unquestionably reliable, trustworthy automation technologies had better begin to materialize. Enter start-up Cassatt, which aims to fill in the automation gap with products that perform real-time provisioning of IT resources. Over time, such automation will nudge hardware toward commoditization, which will lower the data center's total cost of ownership, predicts Bill Coleman, CEO of Cassatt, one of Network World's 10 start-ups to watch for 2004. Here Coleman shares his IT automation vision.

Your marketing materials say that Cassatt's flagship product, Collage, creates a virtual pool of servers. How is this different from what server virtualization vendors, such as VMware, do?

Collage automates IT operations. We happen to use, as one aspect, the ability to virtualize how the application and data are profiled across the network. That is different from what VMware or [Microsoft] Virtual Server or XenWorks do. We can provide run-time automation to any application running on any system, including a virtual machine such as VMware. The other thing is that if a specific server fails, VMware might not actually know, because it may be running on several physical servers simultaneously. But we can automate its deployment, we can automate its failover, we can automate its scalability, and we can automate attachment management and version management - on the fly.

How does your technology differ from grid computing, where CPUs can be used across servers as if they were a single pool?

Grid computing, virtualization and utility computing are all way overloaded terms whose definitions are not detailed enough to say what they actually are. Our focus is limited to one thing only: optimizing how we profile what you're doing with your applications and data based on the physical resources available. You can call that a grid. You could actually, at some point [when you figure out how to measure and meter the capacity you're selling and bill it], call it a utility.

As we get closer and closer to a world in which you can consider the hardware components separately fungible, then we will have the reality. Something like us has to sit there and not require any changes to the software and not require any changes to the hardware, but dynamically be able to reconfigure how the software and data are profiled onto and use the physical world.

Does this type of on-demand automation ease business continuity and disaster-recovery processes? If so, in what ways?

We virtualize the connection between the physical world and the logical world. There is something that is needed beyond provisioning, beyond virtual machines, beyond system management to, in run-time, be able to [adjust to changing conditions] without people having to physically take down servers and rebuild them and put another server up. We don't care what breaks in the hardware. We're not trying to manage the hardware. All we're trying to do is take a set of policies - and those policies can be utilization, they can be failure, they can be time of month - and change how we're re-profiling the application data flow to maximize it. Automating run-time operations means a failover site or business continuity is a byproduct - it just happens.
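The policy-driven loop Coleman describes - utilization, failure, even time of month driving how workloads are re-profiled onto hardware - can be illustrated with a small sketch. This is not Cassatt's actual product or API; every name here (the `Server` record, the policy functions, the action strings) is hypothetical, intended only to show how declarative policies could map a pool's current state to re-provisioning actions at run time:

```python
# Illustrative sketch only (not Cassatt Collage's API): policy-driven
# re-profiling. Each policy inspects the current server pool and may
# request a re-provisioning action; an automation loop collects them.

from dataclasses import dataclass

@dataclass
class Server:
    name: str
    healthy: bool
    utilization: float  # fraction of capacity in use, 0.0 - 1.0

def failure_policy(pool):
    """If a server has failed, redeploy its workload to an idle spare."""
    failed = [s for s in pool if not s.healthy]
    spares = [s for s in pool if s.healthy and s.utilization < 0.1]
    if failed and spares:
        return f"failover: move workload from {failed[0].name} to {spares[0].name}"
    return None

def utilization_policy(pool, threshold=0.8):
    """If average utilization of active servers is high, scale out."""
    active = [s for s in pool if s.healthy and s.utilization > 0]
    if active and sum(s.utilization for s in active) / len(active) > threshold:
        return "scale-out: provision one additional server"
    return None

def evaluate(pool, policies):
    """Run-time loop: return the actions the policies currently demand."""
    return [a for p in policies if (a := p(pool)) is not None]

pool = [Server("web1", healthy=False, utilization=0.0),
        Server("web2", healthy=True, utilization=0.9),
        Server("spare1", healthy=True, utilization=0.0)]

actions = evaluate(pool, [failure_policy, utilization_policy])
```

In this toy run, the failed server triggers a failover to the spare and the hot server triggers a scale-out request; in Coleman's framing, the point is that no operator takes servers down and rebuilds them - the policy engine re-profiles the application onto whatever physical resources remain.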

What are your thoughts about the direction of IT generally? When IT reaches the Holy Grail of the fully automated data center running on inexpensive hardware, then what?

What we're going through is the commoditization of computing, both on the hardware side [call it grid, call it whatever] and also on the application side. It's going to take a while, but the role of IT automation is to enable it. We're going to get to a point very quickly in which if we don't automate at least a part of IT operations, we will not be able to manage the scale. Without IT automation, at some level, it won't happen. This is the fundamental missing piece in the distributed computing paradigm today.

Once we enable IT automation on commodity computing, we dramatically drive down the hardware cost and the operation and maintenance costs. It's going to take several years, but as that happens, it will accelerate outsourcing of the running of the back end of the utility by every corporation in the world because it not only drives down the capital expenditure cost, but it also drives down the operating expenditure cost.

There will no longer be, 10 years from now, high-priced switches, high-priced storage and high-priced servers. They will be so commoditized that companies that are in those businesses today will have to find ways to increase their value proposition - and they will; IBM has done it before.

IT automation will take the economics out of the outsourcing industry as we know it today, given its focus on back-end operations at very high cost. But it will also disintegrate the application stacks that we know, so it will change the software world. Once that happens - and we're talking about 10 years from now - then the economics of everybody that uses computers will be affected. ... We are finally moving toward a model of computing in which it isn't about the technology, it's about the information and the business model.

Schaibly is a freelance writer in Fort Collins, Colo.

Copyright © 2005 IDG Communications, Inc.