
5 reasons why IT can’t tame the user experience for the network manager

Nov 15, 2017 · 5 mins
Network Management Software | Networking

Why today's user experience for network professionals sucks and why it has become so hard for IT staff to tame.


Every vendor today is spewing about the importance of managing the user experience. What this actually means, however, remains a mystery to most, and there are precious few approaches available to help you get a handle on the issue.

Good and predictable user experience is no longer negotiable in this age of constant online business communications. Computer networks have effectively become the single most important tool driving corporate productivity.

But user experience is one of the most difficult problems to address, especially on enterprise access networks, because each experience is influenced by a long list of moving parts, many of which are increasingly outside the control of IT. 

For network professionals, there are five main reasons why today’s user experience sucks and why it has become so hard for IT staff to tame:

  1. Too much data to analyze and correlate across the stack.
  2. Too many disparate management and monitoring tools.
  3. More mobile users armed with smart devices accessing services not under IT’s control.
  4. No single source of network truth that can be shared among the different IT factions.
  5. Lack of a holistic, end-to-end, view of the client experience.

When a user connects to an enterprise network, a series of transactions kicks off, each of which can impact the experience: associating to Wi-Fi, authenticating to the network, obtaining an IP address, resolving a URL, getting routed to the appropriate domain, contending for access to an application on a server, the health of the app itself, and so on.

To really understand how the network is behaving from the user’s point of view, volumes of data for each transaction must be correlated and analyzed. The problem is, IT network folks aren’t paid to be data scientists and can’t sit around staring at data 24 hours a day.
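The correlation step described above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: the stage names, record format, and `first_failure` helper are all hypothetical, chosen to mirror the connection sequence above.

```python
# Hypothetical per-client transaction records; the stage names follow the
# connection sequence described above (associate -> auth -> DHCP -> DNS -> app).
STAGES = ["wifi_assoc", "auth", "dhcp", "dns", "app"]

def first_failure(records):
    """Return the earliest failed stage for each client seen in the records."""
    by_client = {}
    for rec in records:
        if rec["ok"]:
            continue
        stage_idx = STAGES.index(rec["stage"])
        cur = by_client.get(rec["client"])
        # Keep only the earliest failing stage per client.
        if cur is None or stage_idx < STAGES.index(cur):
            by_client[rec["client"]] = rec["stage"]
    return by_client

records = [
    {"client": "aa:bb", "stage": "wifi_assoc", "ok": True},
    {"client": "aa:bb", "stage": "auth", "ok": False},
    {"client": "cc:dd", "stage": "dhcp", "ok": True},
    {"client": "cc:dd", "stage": "dns", "ok": False},
]
print(first_failure(records))  # {'aa:bb': 'auth', 'cc:dd': 'dns'}
```

Even this toy version shows the point: the raw records alone say little, but correlating them against the known transaction sequence immediately localizes each client's problem.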

Just scouring logs from a single server to figure out why one user can’t authenticate to the network can require an inordinate amount of time, and forget it if the problem involves hundreds or thousands of users.
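To make the log-scouring problem concrete, here is a sketch of the kind of one-off script an engineer ends up writing. The RADIUS-style log format and the `Login incorrect` pattern are illustrative assumptions; real server log formats vary.

```python
import re
from collections import Counter

# Hypothetical RADIUS-style auth log; real formats differ by server and vendor.
LOG = """\
Oct 12 09:01:02 radius1 auth: Login OK: [alice] (from client ap-17)
Oct 12 09:01:05 radius1 auth: Login incorrect: [bob] (from client ap-03)
Oct 12 09:02:11 radius1 auth: Login incorrect: [bob] (from client ap-03)
Oct 12 09:03:40 radius1 auth: Login incorrect: [carol] (from client ap-09)
"""

# Capture the username from each failed-login line.
FAIL_RE = re.compile(r"Login incorrect: \[(?P<user>[^\]]+)\]")

failures = Counter(m.group("user") for m in FAIL_RE.finditer(LOG))
print(failures.most_common())  # [('bob', 2), ('carol', 1)]
```

That works for one user on one server, but it does not scale: a different regex per log format, per server, per vendor, with no shared context across them, which is exactly the tooling sprawl described next.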

Oh, and mobility only makes things worse. Smart mobile devices outside IT’s control run too many different operating systems to keep track of, each of which behaves differently with different parts of the network.

To get by, network managers have been forced to use a mix of discrete vendor and homegrown tools, each of which provides some sort of view into a specific part of the network. But this can actually raise more questions than it answers when it comes to finding and fixing problems impacting clients. Is it the Wi-Fi network? Is it ARP? Is it DNS, DHCP, AAA or some application problem? Network managers shouldn’t have to use 10 different tools to figure it all out.

What’s more, the tools a WLAN engineer might use to troubleshoot individual or systemic client issues are most likely much different from the tools the applications or systems admin might use, none of which tell the entire story. Good user experience is predicated on successful transactions up and down the entire network stack.

So what’s next?

What network managers are really looking for is a heterogeneous platform that can provide specific insights into the user experience across the wired and wireless network, the wide area network and application services. They need tools that automate the process of learning how the myriad user devices actually behave with every part of the network and deliver a single source of truth that examines and quantifies every aspect of the client experience. But that is easier said than done.

Suppliers of infrastructure components will tell you their solutions address user experience, but be cynical. They don’t. They might provide some raw data or pretty charts covering one sliver of the network, but that’s about it. Since the user experience depends on so many different client transactions, network professionals need to see it all and understand it all with some level of network-wide context. Today that’s just not possible without going broke or crazy.

Instead, new software approaches are emerging that look promising. Far from perfect, these tools leverage recent advances in machine learning and big data analytics and marry them with cloud computing.

Wired and wireless traffic is typically siphoned off the network and delivered to a localized engine that crunches the data, looking for problematic trends and patterns impacting user connectivity.
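The pattern-finding part of such an engine can be sketched simply: aggregate observations per network stage and flag any stage whose failure rate crosses a threshold. The stage names and the 20% threshold below are illustrative assumptions, not any vendor's actual algorithm.

```python
# Toy trend detection: flag any network stage whose failure rate across
# observed client transactions exceeds a threshold (20% here, arbitrarily).
def flag_stages(samples, threshold=0.2):
    """samples: iterable of (stage, ok) observations; returns flagged stages."""
    totals, fails = {}, {}
    for stage, ok in samples:
        totals[stage] = totals.get(stage, 0) + 1
        if not ok:
            fails[stage] = fails.get(stage, 0) + 1
    return sorted(
        stage for stage in totals
        if fails.get(stage, 0) / totals[stage] > threshold
    )

# 4 of 10 DNS transactions failing, but only 1 of 10 DHCP transactions.
samples = (
    [("dns", True)] * 6 + [("dns", False)] * 4
    + [("dhcp", True)] * 9 + [("dhcp", False)]
)
print(flag_stages(samples))  # ['dns']
```

Real products replace the fixed threshold with learned baselines per site, device type and time of day, which is where the machine learning comes in.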

While these user performance management tools vary from vendor to vendor, the good news is they aren’t tied to a specific infrastructure vendor. And, what’s more, they typically measure and analyze every client network transaction to learn how the network, services and applications are behaving with all client devices, providing a holistic view that exposes issues in the access network.

If there’s a Wi-Fi coverage problem in a specific location, a DNS connectivity issue for a given group of clients or an application response time problem, these solutions will see it and flag it for remediation. 

This represents a huge win for IT because these types of problems typically account for a large percentage of the issues impacting user experience.

And while there is no silver bullet when it comes to improving user experience, the emergence of these new tools and technologies is providing a way to gain unparalleled visibility into exactly what’s going on with users on your network, wherever they are, whatever they’re doing.

by GT Hill

GT Hill is currently the Director of Product and Technical Marketing at Nyansa. He was formerly the Director of Technical Marketing at Ruckus Wireless. He has been working with Wi-Fi since 2002, when he started a Wireless ISP covering over 1,000 square miles in rural Oregon. Since then he has become Certified Wireless Networking Expert (CWNE) No. 21, worked as an independent consultant, and held roles at various technology vendors. He currently resides in Arkansas on his decommissioned Titan II Nuclear Missile Base.

GT’s extensive understanding of computer networking includes Wi-Fi protocol behavior, network architecture and specialized topics such as dynamic beamforming, 802.11n and RF interference. He has designed and implemented successful Wi-Fi networks in environments ranging from state capitol buildings to 1,000 square miles of high desert for remote Internet access, and has trained company personnel to maintain and troubleshoot them.

GT’s strength lies in his ability to take complex technical topics and communicate their value and operation, whether in simple terms or in deep technical detail.

The opinions expressed in this blog are those of GT Hill and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.