Automation: Will I lose control?

Opinion
May 24, 2004 | 5 mins
Computers and Peripherals | Data Center

Securing our servers is our IT organization’s biggest concern. With new regulations concerning the protection of data and new viruses popping up almost daily, what can I do to ensure that our servers are up-to-date with the latest patches and service packs?

We hear a lot of talk from analysts and big-name companies about the need to automate. As someone who manages a large data center with thousands of servers, I’m concerned that in the rush to automate, I’ll lose control of my server environment. Can you tell me what the benefits of automating the IT server management process are, and how I can guarantee my CIO that automating the process won’t cause unwanted disruptions in service?

First, your concerns are shared by a lot of other people, including many CIOs and data center managers I’ve talked with in recent years. And you’re right: the hype around data center automation gets pretty thick. I believe that automation in the data center is inevitable: just as automation came to manufacturing facilities and to processes as complex as flying jets, automation is making its way into the world of software and hardware.

Why? Because of the advantages it brings. Done right, automation can mean lower costs, lower error rates, greater reliability, and increased stability and availability.

All that being said, I can assure you that any automated process that makes you feel a loss of control is ugly automation. Good, successful automation provides selective control and transparency.

Let me make a comparison:

An automatic transmission takes care of many processes that were once manual in our cars. We put the car in gear and watch the tachometer and the speedometer, but in our daily driving we don’t need to see the gear ratio, the RPMs, and so on to operate the vehicle comfortably. Intelligence built into the vehicle takes care of those things; we maintain control over certain aspects of the machine, but not all of them.

Cruise control allows us to set a speed, and the car maintains it automatically.  We don’t see all of the systems involved in maintaining that speed, and perhaps most important of all, control is immediately returned to us when we communicate that we need it, by braking.

In our cars, we trust that automation is doing what we need, and we can verify it through selective transparencies (like the speedometer). The same applies in the data center. Well-designed automation allows you to verify that applications, processes, and systems are doing what you need them to do. You’ve got costs, budgets, SLAs, and so on, and you must have the ability to see what’s happening. You can’t afford unplanned disruptions, and good automation ensures that you don’t have them.

It’s important to understand that well-designed automation does not force you to do everything at once: it’s a step-by-step process of automating selected processes based on your comfort level and business priorities.

Being concerned about control is as natural as a fighter pilot’s unease the first time the jet goes on autopilot. But not having autopilot at all can be dangerous.

On the subject of danger, let me warn you about first-generation automation. It tends to be Rube Goldberg machinery. There are products out there claiming to be automation that are really just miles and miles of script attempting to duplicate the choices and conditions a human would handle in any situation.

The first automated machines used rods and pulleys to automate the same motions a human being used to accomplish a physical task. The scripts are like those rods and pulleys. OK, maybe they are a step forward: a steel rod grafted onto a lever. But the problem in the case of software is that we’re talking about hundreds of thousands of “moving parts”: a server might have a certain volume of files of various types on it at one point, and the next moment, because something was written to a database, for example, it might have totally different ones. The system is dynamic, with constantly changing sets of information. It’s impossible to script all of the possible events that can occur.

And worst of all, none of this so-called automation has anything to do with business objectives.  IT automation must be driven by business policy, and the sophisticated, proven approaches do this.

I’ve seen software automation go through its second, third, fourth, and even fifth generations. What works today, and more important, what will work into the future, is automation that is model-based and policy-driven. Bringing intelligence to the system is the only real solution to automating something that is constantly changing. And this intelligence is also the key to control: being able to trust and verify what the system is doing at any given time, and what it will do in any situation in the future, based on business policy.
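To make the contrast concrete, here is a minimal sketch in Python of the model-based, policy-driven idea: declare a desired state (the model) and let a business policy gate the corrective actions, rather than scripting every event that could produce drift. The state keys, patch names, and `maintenance_window` policy here are all hypothetical illustrations, not any particular product’s design:

```python
# Hypothetical sketch: model-based, policy-driven reconciliation.
# We declare the desired state once; whatever events caused drift,
# reconciliation compares observed vs. desired and lets business
# policy decide which corrective actions are allowed.

DESIRED = {"patch_level": "SP4", "service": "running"}

def reconcile(observed, desired=DESIRED, maintenance_window=True):
    """Return the corrective actions needed to reach the desired state.

    The policy (patch only inside the maintenance window) gates the
    actions, so business rules, not scripts, drive the automation.
    """
    actions = []
    if observed.get("patch_level") != desired["patch_level"]:
        if maintenance_window:
            actions.append("apply_patch:" + desired["patch_level"])
        else:
            actions.append("defer_patch")  # policy says wait
    if observed.get("service") != desired["service"]:
        actions.append("restart_service")
    return actions

# A server that drifted: old patch level, service down.
print(reconcile({"patch_level": "SP3", "service": "stopped"}))
# A compliant server needs no action, regardless of what events
# produced its current state.
print(reconcile({"patch_level": "SP4", "service": "running"}))
```

Note that the script-based alternative would need a branch for every sequence of events that could lead to each observed state; the declarative model only needs to know where it is and where it should be.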

Fitzgerald is CTO and Director of Product Development of HP Change & Configuration Software Operations (formerly Novadigm, Inc.). Along with Albion Fitzgerald, he co-authored the patent underlying its adaptive management technology. Before joining Novadigm, he served as director of product development at Pansophic Systems and its predecessor company ASI/Telemetrix, where he engineered large-scale, high performance networking software that is used in mission-critical applications by Fortune 500 companies worldwide. He has a solid background in systems management, specifically in the areas of security, networking and operating systems.

