Performance management from the client's point of view

Client-side performance management can save you from being blindsided by user complaints.

Application performance management vendors are dangling a new carrot in front of network executives looking to align IT with business goals: the user experience. As network executives try to ensure that critical applications operate at peak performance, a number of vendors promise to deliver the only measurement that matters.

"We wanted something akin to us sitting next to end users as they logged on to our Web site, seeing what they see in terms of performance," says Steve Weiskircher, CIO at Crutchfield.com, a direct integrated marketer in Charlottesville, Va.

"We needed a level of detail beyond, 'There is a performance problem.' We wanted to see the steps that lead up to it, how the application responded at each stage and what the end user experienced with our application."

Weiskircher got the visibility he wanted by deploying TeaLeaf Technology appliances.

Many views on app performance

TeaLeaf is one among many vendors banging the user-experience drum. Others include Citrix (via its Reflectent Software buy), Compuware, Coradiant, NetQoS, PremiTech, ProactiveNet and Symphoniq. Each has its own take on how to collect client-side performance information.

Citrix and PremiTech deliver management software with distributed agents deployed on client machines to capture application performance data. Compuware, ProactiveNet and Symphoniq rely on monitoring software that collects performance metrics across an infrastructure without requiring agents on every managed client machine. For instance, Compuware's ClientVantage software (which starts at about $31,500) uses agents installed on a handful of standard PCs that act as robots, tapping into applications and simulating transactions. IT managers place the robots in various network segments, where they behave like real users.
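
Compuware doesn't publish its robot internals, but the underlying technique - scripting a user journey and timing each step - is simple to sketch. The Python fragment below plays a scripted transaction against a hypothetical storefront and records per-step response times; the URLs and step names are illustrative stand-ins, not ClientVantage's actual scripting interface.

```python
"""Minimal synthetic-transaction robot: replay a scripted user journey
and record per-step response times. Illustrative only; the URLs and
step names are hypothetical, not ClientVantage's scripting interface."""
import time
import urllib.request

# The scripted "user journey" a robot replays from its network segment.
STEPS = [
    ("home page", "https://shop.example.com/"),
    ("search",    "https://shop.example.com/search?q=speakers"),
    ("product",   "https://shop.example.com/product/1234"),
]

def run_transaction(steps, timeout=10):
    """Fetch each step in order, timing the full download as a user would."""
    results = []
    for name, url in steps:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                resp.read()                    # include body-transfer time
                status = resp.status
        except Exception as exc:               # the step failed outright
            status = repr(exc)
        results.append((name, status, time.monotonic() - start))
    return results

if __name__ == "__main__":
    for name, status, elapsed in run_transaction(STEPS):
        print(f"{name:10s} status={status} {elapsed:.3f}s")
```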

Coradiant, NetQoS and TeaLeaf use appliances to monitor traffic and capture metrics, such as response time, while users interact with applications. And synthetic Web-application and site-performance tests from Gomez and Keynote Systems use agents distributed throughout the Internet to determine how a Web site and the applications running on it react to peak loads and how they perform from various geographies.

The tools typically offer their own management interfaces and reporting features, and many vendors partner with companies such as BMC, CA, HP and IBM to deliver the client perspective to larger management consoles. Some IT managers do the integration on their own. For instance, Weiskircher integrates the data collected by TeaLeaf with "raw server metrics" gathered by Microsoft Operations Manager.

Beyond server response times

Trends such as Web services and service-oriented architectures also are driving the need to collect performance metrics at the client. As application components spread across multiple servers in an infrastructure - and as virtualization continues to proliferate in data centers - IT managers will find it increasingly difficult to infer application performance from server response times.

"Applications in general are becoming unbundled from infrastructure in this trend toward the desegregation of IT," says George Hamilton, director of enterprise computing and networking at Yankee Group. "IT managers can no longer infer application performance from infrastructure metrics."

In Weiskircher's case, TeaLeaf software runs on a dedicated appliance that connects to a span, or mirror, port on a switch to observe HTTP and HTTPS traffic. The software, which costs about $80,000 for an average deployment, monitors user sessions with a Web site and its applications, generates alerts based on specified events, correlates captured data and provides detailed reporting. A viewer component replays the user sessions, letting IT groups reproduce problems; using the viewer, IT managers can see exactly what the user saw, TeaLeaf says.
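
TeaLeaf's capture-and-replay internals are proprietary, but the span-port technique itself can be sketched. The Python fragment below, which assumes the third-party scapy packet library and a capture interface fed by the switch's mirror port, pairs clear-text HTTP requests with their responses per TCP flow and prints response times. It handles only unencrypted traffic (passively decoding HTTPS requires access to the server's keys, which this sketch doesn't attempt), and it illustrates passive monitoring in miniature rather than anything TeaLeaf ships.

```python
"""Bare-bones passive HTTP monitor for a switch span/mirror port.
Assumes the scapy library and a capture interface fed by the switch;
a sketch of the technique only, not TeaLeaf's appliance (which also
reassembles and replays whole sessions)."""
from scapy.all import IP, Raw, TCP, sniff

pending = {}  # (client, cport, server, sport) -> request timestamp

def watch(pkt):
    if not (pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt.haslayer(Raw)):
        return
    payload = bytes(pkt[Raw].load)
    ip, tcp = pkt[IP], pkt[TCP]
    if payload.startswith((b"GET ", b"POST ")):        # client request
        pending[(ip.src, tcp.sport, ip.dst, tcp.dport)] = pkt.time
    elif payload.startswith(b"HTTP/1."):               # server response
        key = (ip.dst, tcp.dport, ip.src, tcp.sport)   # reverse the flow
        start = pending.pop(key, None)
        if start is not None:
            status = payload.split(b" ", 2)[1].decode(errors="replace")
            print(f"{key[0]} -> {key[2]} status={status} "
                  f"response_time={pkt.time - start:.3f}s")

# Needs root; 'eth1' is a placeholder for the mirror-port interface.
sniff(iface="eth1", filter="tcp port 80", prn=watch, store=False)
```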

"Prior to TeaLeaf, we relied on customer logging, which required scraping log files," Weiskircher says. "But with that approach, we had to know what we were looking for. We always had too much or not enough information."

Using the data collected by TeaLeaf, Weiskircher says his team is able to fix the application code that causes performance problems on the Web site, though he declined to provide more specific details. "For us as a retailer, we know no one in their right mind would allow someone to abandon their purchase because of a problem during checkout in an actual store," he says.

App performance, piece by piece

Sid Siegel, director of corporate application development at Milliman, an actuarial consulting firm in Seattle, uses Symphoniq's TrueView software (licenses start at about $10,000) to get an idea of where performance breaks down on the application-delivery path. "We want to understand as best we can from the end-user perspective what kind of performance they are getting and then be able to break it down to the pieces of infrastructure that support the application," Siegel says. "Our purpose is to make sure these people can do their jobs under the best possible conditions."

TrueView software uses a tracking mechanism to measure application performance along the path between client and server. When a client requests the application and it is sent back from the server, TrueView tracks it through to the browser and back to build a picture of end-to-end latency, error conditions and throughput, the vendor says. The result is a view of overall performance that, for instance, isolates problems on a database server from problems on the Web site itself.
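
Symphoniq doesn't document the tracking mechanism in detail, but the decomposition it enables is easy to illustrate: if the server stamps its own processing time on each response, a client-side measurement can be split into server time and everything else. The sketch below - a hypothetical WSGI middleware plus a measuring client, with a made-up X-Server-Time-Ms header - shows that idea, not TrueView's implementation.

```python
"""Hypothetical sketch of latency decomposition: the server stamps its
processing time on each response so a client-side measurement can be
split into server time vs. network-and-client time. Illustrates the
idea behind TrueView, not Symphoniq's actual mechanism; the
X-Server-Time-Ms header is made up for this example."""
import time
import urllib.request

class ServerTimingMiddleware:
    """WSGI middleware that reports server processing time in a header."""
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        start = time.monotonic()
        def timed_start_response(status, headers, exc_info=None):
            elapsed_ms = (time.monotonic() - start) * 1000
            headers.append(("X-Server-Time-Ms", f"{elapsed_ms:.1f}"))
            return start_response(status, headers, exc_info)
        return self.app(environ, timed_start_response)

def measure(url):
    """Client side: total time minus the server-reported time isolates
    the network/browser share of the latency."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        resp.read()
        server_ms = float(resp.headers.get("X-Server-Time-Ms", 0))
    total_ms = (time.monotonic() - start) * 1000
    print(f"total={total_ms:.1f}ms server={server_ms:.1f}ms "
          f"network/client={total_ms - server_ms:.1f}ms")
```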

Kamal Jain is director of ASP operations at BrassRing in Waltham, Mass. The company, which provides talent management and recruiting services, depends on Coradiant's TrueSight appliance to help determine whether a user-performance problem is caused by the application or the network. The appliance, which costs $20,000 to $90,000, captures data via a network tap, a mirrored port on a switch or a similar feature on a load-balancing device.

"We are Web-based, so I can't guarantee all the stuff in the middle, and I can't put agents on all the clients, and for security reasons, I can't see all the data," Jain says. "We didn't have a solid idea about our client performance except for anecdotal complaints. Now we can see user sessions, tweak application traffic and know ahead of time where the problem lies and who is really responsible."

Scott Grinna, director of IT services at West Bend Mutual Insurance in West Bend, Wis., also uses Compuware's ClientVantage to build a profile of how applications perform. He runs ClientVantage agents on a standard PC build, and those workstations act as any other user might, he says. He has placed a half-dozen of them across the infrastructure to monitor latency, response time and other performance metrics in different network segments, and he integrates ClientVantage with his HP OpenView management software. The software's biggest benefit, he says, is the ability to map a client-performance problem back through the infrastructure to a specific event or time.

"Unfortunately, most problems are due to an unauthorized or undocumented change," Grinna says. "The faster I can relate that the end-user performance problem is due to a specific change, the better."

Tool drawbacks

Grinna says the software is just part of a bigger initiative for him at West Bend Mutual. He is working toward integrating IT management with business processes and admits he'll need more from the technology going forward.

"We envision our organization as a series of business processes, and we ultimately want to line up our IT management with those processes," he says. "The technology will need to more intelligently measure the transactions that make up those processes from various perspectives, including the end user."

Despite customer successes with various technologies, industry watchers concur that, to date, no single approach can deliver all the user-experience data IT managers need to improve application performance for the majority of users. The types of applications vary widely, as do the types of users.

"There is no one tool that can tell you everything, and there are drawbacks to each approach," Yankee Group's Hamilton says.

For instance, products that require agents installed on client machines to capture performance data - such as Citrix's EdgeSight ($70 per managed user) and PremiTech's Performance Guard (about $100 per seat, plus a server license) - don't work for Web-based applications, because IT managers can't install agents on machines outside their own infrastructure. That means performance can't be measured for every client of a Web-based application.

To capture Web-application performance, measurement tools often use appliances or software probes attached to a mirror port on an edge router for passive monitoring of all client and data-center interactions. Yet these products collect so much data that pinpointing the source of a problem in real time can be difficult. And techniques that use synthetic testing get metrics on only a sample of transactions and can't measure real-world user traffic.

"The complexity of applications has risen to such a degree that IT managers need to evaluate how they want to approach managing them and which tiers are critical for them to monitor," says Stephen Elliot, a senior analyst with IDC. "IT managers need to identify their key applications and determine from there if client-side monitoring is appropriate, or if another method could work for them."

