What’s your problem? Survey uncovers top sources of IT pain

The struggle is real — find out if your top issues match up with these common problems for enterprise IT admins


It’s hardly surprising that IT professionals have their hands full in the age of IoT (Internet of Things) and Big Data. Supporting rapidly growing data volumes, new data types, and many more data sources is making it harder than ever for IT to meet service level agreements (SLAs) while keeping spending in check. 

The complexity IT manages is clear in the results of a recent Storage Census of over 300 IT professionals my company, Primary Data, conducted at VMworld 2017. The survey showcased the conflicting pressures currently faced by IT leaders. Respondents ranked delivering performance, executing data migrations, meeting expectations within existing budgets, and integrating the cloud into their infrastructure among the biggest challenges facing their departments today. Let's examine the factors that contribute to these challenges and how IT can solve them.

Performance under pressure

Performance has always been a key issue for IT, but the business demand for performance has never been higher. Today’s enterprises are analyzing more data from more data sources than ever before. At the same time, the ability to perform this analysis and apply insights quickly has become a key source of competitive advantage for leaders—or a disadvantage for laggards. The Storage Census confirms that performance remains IT’s most pressing issue, with 38% of respondents selecting performance as their biggest challenge.

Several technologies target these performance challenges today. Flash memory can deliver the low-latency performance that applications require, but flash-based storage is still too expensive to use for all data in the enterprise. Hybrid systems attempt to address these costs by tiering data between flash and hard disks, but with data volumes growing so quickly, the upfront costs and the cost and disruption of upgrades limit the use of hybrid systems. Distributed systems can enable data to tier across performance and capacity tiers, but they can have manageability, scalability, and performance limits that make them unsuitable for many enterprise applications. They can also lock customers into using a single vendor's storage system, or relegate a tier to only serving archival operations.

A metadata engine gives enterprises flexibility in choosing from some or all of these cost-performance options, as it provides the ability to transparently move data across any type of storage, according to objectives demanded by the business. When used with advanced file systems, such as NFS v4.2, a metadata engine can transparently eliminate the performance constraints inherent in common distributed storage systems. A metadata engine moves the control (metadata) path outside the data path to logically combine different storage resources within a global namespace. It can then distribute data across storage systems to provide applications with parallel data access. Metadata engine software can also intelligently tier data across any type of storage, including flash, shared, and cloud storage. This ensures data is always on the right storage type to meet its performance, cost or protection requirements.
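To make the tiering idea concrete, here is a minimal sketch in Python of objective-driven placement across tiers in a global namespace. The tier names, latency and cost figures, and the `place` function are all hypothetical illustrations of the concept, not a real product API.

```python
# Minimal sketch of objective-driven data placement across storage tiers.
# Tier names, figures, and the place() function are hypothetical.
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    latency_ms: float   # typical access latency for this tier
    cost_per_gb: float  # relative cost per GB

TIERS = [
    Tier("nvme-flash", latency_ms=0.1, cost_per_gb=0.25),
    Tier("shared-nas", latency_ms=5.0, cost_per_gb=0.05),
    Tier("cloud-object", latency_ms=50.0, cost_per_gb=0.01),
]

def place(max_latency_ms: float) -> Tier:
    """Pick the cheapest tier that still meets the latency objective."""
    eligible = [t for t in TIERS if t.latency_ms <= max_latency_ms]
    return min(eligible, key=lambda t: t.cost_per_gb)

print(place(1.0).name)   # hot database file -> nvme-flash
print(place(60.0).name)  # cold archive -> cloud-object
```

Because the objective, not the admin, drives placement, the same logic can continuously re-evaluate data as business requirements or available resources change.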

Automating migrations

The architecture of legacy enterprise infrastructure has created storage islands that don't easily connect with each other, and as a result, enterprise IT professionals spend a lot of time planning and performing storage migrations. Conventional migrations and upgrades can take weeks or months to perform, in large part due to the need to minimize risk to data and the disruption of downtime when migrating storage systems. Even with all this planning, problems still occur. In fact, the Uptime Institute estimates that about 70 percent of data center problems are caused by human error, and no IT professional wants to be a part of that statistic. Knowing this, it's no wonder that our Storage Census revealed that migrations are the second most common headache for IT professionals, named as a top issue by 36% of those surveyed.

Hybrid storage systems that blend storage media like disk and flash attempt to address this problem but can be costly, and actually just add more storage overprovisioning to the enterprise. The other challenge is that even if an enterprise decided to rip and replace all existing systems with pure flash or hybrid solutions, the IT team would still have to struggle through a conventional migration to make it happen. A metadata engine automates the placement and movement of data—transparently to applications—to virtually eliminate the risk of downtime.

With a metadata engine, admins assign objectives that set the levels of performance, protection and price that data requires. A metadata engine can tier to the lowest cost resource in its global namespace that meets the business requirements. IT can migrate data via a simple policy change and perform hardware refreshes, even during business hours, without impacting business. Importantly, the concept of migration, whether for storage refreshes or application optimization, is replaced by one of continuous, non-disruptive resource optimization.

Slashing data management and infrastructure costs

Cost-cutting has always been a focus for enterprise IT, but as the diversity and volume of data increases, complexity and storage sprawl are straining budgets. Indeed, budget challenges were a top challenge for 35% of IT pros surveyed at VMworld 2017. 

A metadata engine can cut costs in several ways. First, it can automatically and transparently move warm and cold data off high-performance resources, optimizing the use of an organization's most expensive infrastructure. Second, it enables organizations to dramatically increase the utilization of their existing resources. With global visibility and control, organizations can view data location and alignment against SLAs, as well as available performance and capacity resources. This allows IT to instantly observe resources that might be nearing thresholds, and subscribe to alerts and notifications for these thresholds. It also enables IT to add more of any type of resource within minutes, reducing the amount of over-provisioning enterprises need to protect the business.
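The threshold monitoring described above can be sketched in a few lines. The resource names, capacity figures, and the `near_threshold` helper below are hypothetical, purely to illustrate how global visibility lets IT spot resources approaching capacity before SLAs are at risk.

```python
# Hypothetical sketch of global-namespace telemetry: flag storage
# resources nearing a capacity threshold so IT can expand in time.
resources = {
    "flash-01": {"used_gb": 9200, "total_gb": 10000},
    "nas-01": {"used_gb": 40000, "total_gb": 100000},
}

def near_threshold(resources, pct=0.85):
    """Return names of resources at or above pct utilization."""
    return [name for name, r in resources.items()
            if r["used_gb"] / r["total_gb"] >= pct]

print(near_threshold(resources))  # ['flash-01']
```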

Advancing cloud adoption

Cloud adoption ranked fourth on the list of enterprise IT challenges, with 27% of respondents ranking cloud adoption among their top issues. This is not surprising, as respondents also estimated that at least 60% of their data is cold. In fact, Gartner's 2017 Cloud Forecast Overview reports that four out of five organizations say they plan to increase spending on cloud adoption this year, with cost savings, agility, and modernization key drivers behind those plans.

A metadata engine enables organizations to seamlessly integrate cloud storage with any application, using it as just another storage tier. It can automatically identify cold data, accelerate cloud transfers, enable organizations to move snapshots into the cloud, and use the cloud as an active archive by automating granular data movement between cloud targets and on-premises storage.
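A simple way to picture cold-data identification is a metadata-only scan of last-access times; no file contents need to be read. The paths, timestamps, and `cold_files` helper below are hypothetical, assuming a 90-day cutoff for "cold."

```python
# Hypothetical sketch: find files untouched for 90 days as candidates
# for cloud tiering, using access-time metadata only.
import time

NOW = time.time()
DAY = 86400
files = {
    "/logs/2016/app.log": NOW - 400 * DAY,  # last access (epoch secs)
    "/db/orders.ibd": NOW - 1 * DAY,
}

def cold_files(files, days=90):
    """Return paths whose last access is older than the cutoff."""
    cutoff = NOW - days * DAY
    return [path for path, atime in files.items() if atime < cutoff]

print(cold_files(files))  # ['/logs/2016/app.log']
```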

IT has its hands full delivering performance for data demands, protecting uptime, and implementing "cloud first" initiatives, all on a tight budget. Fortunately, there are solutions to meet these seemingly conflicting challenges, and a metadata engine can help IT find balance between delivering performance, avoiding budget breakdowns, and making it possible to easily move data across existing storage and into the cloud.

Copyright © 2017 IDG Communications, Inc.