Storage vendors pitch new systems in countless ways. Whether they tout IOPS and low-latency performance; protection, reliability, and security features; or convenience, capacity, cost, or brand reputation, vendors have no shortage of options to offer an IT team looking to fix a problem.
Although these capabilities have been around for many years, they have long been confined to a storage-centric ecosystem. With the advent of advanced data management software, it finally becomes possible to shift to a data-centric architecture, one that lets IT admins automatically align data with storage that meets the enterprise's business objectives.
What gets measured gets managed
The roots of the proverb “what gets measured gets managed” are often traced back to Lord Kelvin, the physicist who helped formulate the laws of thermodynamics. That wisdom holds true today in the modern data center. Whether flash, cloud, or shared storage, the resources architects choose for their infrastructure offer quantifiable amounts of performance, capacity, and network bandwidth, and they are typically purchased and managed within the boundaries of the specific product.
Traditionally, since the pain of data migrations means a storage resource will serve an application throughout its lifetime, IT is forced to choose a resource that can meet expected peak performance and capacity needs. This one-size-fits-all approach keeps applications safe, but it results in significant waste at most enterprises, as well as misalignment between storage capabilities and actual data demands. Hot data that becomes cold takes up valuable capacity on performance storage, forcing IT to keep spending its way out of performance problems.
This inefficiency points to the limitations of the enterprise technology legacy we’ve inherited from generations past. Since the early days of enterprise computing, there has been an architecturally rigid relationship between applications and storage, which is the root cause of the inefficiency in storage today. But it doesn’t have to be this way any longer.
Making omelets to automate storage efficiency
Similar to how you have to break a few eggs to make an omelet, getting past this inefficiency requires some separation and some new connections. By separating the metadata path from the data path through virtualization, it becomes possible to bring different types of storage into a single global namespace. A metadata engine manages these two paths separately, and it understands the attributes of the different types of storage in the system. With this knowledge, the metadata engine can automatically move data to the storage resource that meets IT’s defined objectives, down to the individual file.
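To make the idea concrete, here is a minimal, hypothetical sketch (in Python) of how a metadata engine might track file placement separately from the data itself. The class names, fields, and endpoints are illustrative assumptions, not any particular product’s API.

```python
from dataclasses import dataclass

@dataclass
class Backend:
    name: str          # e.g. "nvme-pool", "object-archive"
    endpoint: str      # where clients send I/O directly (the data path)

class MetadataEngine:
    """Keeps the global namespace: a mapping of file path -> backend."""

    def __init__(self) -> None:
        self._placement: dict[str, Backend] = {}   # the metadata path

    def place(self, path: str, backend: Backend) -> None:
        # Moving a file only updates this mapping (plus a background copy);
        # the namespace that applications see never changes.
        self._placement[path] = backend

    def resolve(self, path: str) -> Backend:
        return self._placement[path]

# A client resolves the location through the metadata engine, then performs
# I/O directly against that backend's endpoint.
engine = MetadataEngine()
engine.place("/projects/renders/frame_0001.exr", Backend("nvme-pool", "nvme01.local"))
print(engine.resolve("/projects/renders/frame_0001.exr").endpoint)
```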
By assigning objectives to data, such as how many IOPS a file needs, how much redundancy it requires, or the maximum storage cost a dataset may consume, data management software can automatically match that data to a suitable storage resource.
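A simple sketch of this matching step, again with illustrative field names rather than a real vendor interface, might look like the following: each resource advertises what it can deliver, each file carries objectives, and the software picks the cheapest resource that still satisfies them.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StorageResource:
    name: str
    iops: int              # deliverable IOPS
    redundancy: int        # protection level (copies)
    cost_per_gb: float     # $/GB per month

@dataclass
class Objective:
    min_iops: int = 0
    min_redundancy: int = 1
    max_cost_per_gb: float = float("inf")

def match(objective: Objective, resources: list[StorageResource]) -> Optional[StorageResource]:
    candidates = [
        r for r in resources
        if r.iops >= objective.min_iops
        and r.redundancy >= objective.min_redundancy
        and r.cost_per_gb <= objective.max_cost_per_gb
    ]
    # Prefer the cheapest resource that still meets every objective.
    return min(candidates, key=lambda r: r.cost_per_gb, default=None)

resources = [
    StorageResource("all-flash", iops=500_000, redundancy=2, cost_per_gb=0.30),
    StorageResource("hybrid",    iops=50_000,  redundancy=2, cost_per_gb=0.10),
    StorageResource("cloud",     iops=5_000,   redundancy=3, cost_per_gb=0.02),
]
print(match(Objective(min_iops=40_000, max_cost_per_gb=0.15), resources).name)  # hybrid
```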
More ease, less effort
Once IT can manage data by objectives, it becomes easy to set requirements that ensure service-level objectives are continually met. Objectives can be changed in a few clicks, with data moving automatically, if needed, to maintain service-level compliance when requirements change. IT can even offer users a catalog of objectives, with users paying accordingly for Platinum, Gold, Silver, or Bronze levels of service, for example.
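Such a catalog can be as simple as a mapping from tier names to objectives. The tiers and thresholds below are hypothetical, but they show why a re-tiering request is just a metadata change; the data management layer then relocates files as needed to stay in compliance.

```python
# Hypothetical service catalog: each tier maps to the objectives data in that
# tier must meet. Tier names and thresholds are illustrative only.
catalog = {
    "Platinum": {"min_iops": 100_000, "min_redundancy": 3},
    "Gold":     {"min_iops": 50_000,  "min_redundancy": 2},
    "Silver":   {"min_iops": 10_000,  "min_redundancy": 2},
    "Bronze":   {"min_iops": 0,       "min_redundancy": 1, "max_cost_per_gb": 0.05},
}

def retier(dataset: str, tier: str, assignments: dict) -> None:
    # Changing the tier only changes the objectives attached to the dataset;
    # data movement, if any, happens afterward to restore compliance.
    assignments[dataset] = catalog[tier]

assignments: dict = {}
retier("/projects/genomics", "Gold", assignments)
print(assignments["/projects/genomics"])
```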
Objective-based management also makes it easy to ensure newly added resources are used effectively. When a new storage resource is added to the global namespace, files that meet the criteria of the new resource can automatically be load-balanced onto it, whether it sits on premises or in the cloud. This intelligence finally takes the pain out of data migrations and upgrades, and it even extends the life of existing storage systems, which can continue to serve data that doesn’t need the newest, highest-performance flash.
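A rough sketch of that rebalancing decision, under the assumption that the goal is to satisfy each file’s objectives at lower cost, could look like this; all paths, fields, and numbers are made up for illustration.

```python
def rebalance_candidates(files, new_resource):
    """Return the files that should move to a newly added resource."""
    moves = []
    for f in files:
        meets = (new_resource["iops"] >= f["objective"]["min_iops"]
                 and new_resource["cost_per_gb"] <= f["objective"]["max_cost_per_gb"])
        cheaper = new_resource["cost_per_gb"] < f["current"]["cost_per_gb"]
        if meets and cheaper:
            moves.append(f["path"])
    return moves

files = [
    {"path": "/archive/2019/logs.tar",
     "objective": {"min_iops": 500, "max_cost_per_gb": 0.10},
     "current": {"cost_per_gb": 0.10}},
    {"path": "/scratch/sim/run42.dat",
     "objective": {"min_iops": 80_000, "max_cost_per_gb": 0.40},
     "current": {"cost_per_gb": 0.30}},
]
new_resource = {"iops": 2_000, "cost_per_gb": 0.02}   # e.g. a newly added cloud tier
print(rebalance_candidates(files, new_resource))      # ['/archive/2019/logs.tar']
```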
As data continues to proliferate at every enterprise, objectives deliver unprecedented agility for IT. Automation and efficiency are critical for keeping IT a step ahead of the next application challenge, and objective-based management is rapidly becoming a key asset for petabyte-scale enterprises.