
How to shrink your data-migration schedule

Opinion
Oct 06, 2017 | 4 mins
Data Management

Migrate data faster, then put manual migrations behind you forever.

Credit: Thinkstock

If there’s one problem just about every IT professional can relate to, it is the pain of a storage migration. Aging is part of life not only for us IT veterans, but also for the storage systems we manage. Although we have been moving data off old storage for decades, the challenge of moving data from one storage resource to another without disrupting the business remains one of the most time-consuming and stressful projects an IT team faces.

Many of the IT professionals I speak with tell me that their migrations are scheduled over months and can even take a year to plan and execute. It’s no surprise, then, that IT professionals named migrations the number two issue facing their departments in a recent survey. Only performance presents a bigger challenge for today’s IT teams.

Migrations are not performed only when storage hardware is retired; they are also performed to clear inactive data off still-usable hardware. This task is typically done less often than it could be, because it is hard to determine which data is inactive (and where it lives) so it can be archived or moved. The result is significant overspending on storage capacity. As a business matures, IT manages more and more storage over time, creating significant complexity and risk. If a long-time IT employee leaves, it is not uncommon for teams to struggle to reconstruct the history of the data and the storage infrastructure that person managed.

Migration pain also creates challenges for IT teams who are now eager to adopt the cloud for agility and savings. The savings the cloud makes possible mean that it is best used for archival today, but again there is the problem of knowing which data is cold and when that data can be safely archived without affecting applications. The breadth and diversity of storage resources found in most enterprises today severely limits visibility into data activity across the storage ecosystem.

How to fix the problem

The problem at the root of migrations is that storage systems have long been islands unto themselves. Fixing the pain of storage migrations starts with an architectural shift that makes these different storage resources simultaneously available to applications.

Data virtualization abstracts the logical view of data from where it physically resides, making it possible to create a global namespace that pools previously isolated storage resources. With software that analyzes metadata (the data about your data), IT can finally see which files are active, which are sitting idle, and when they were last opened or edited. Insight into the cost of each storage tier can then help IT set policies and determine which resources best fit business needs, whether the goal is to maximize application performance, minimize cost, or strike a balance between the two.
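
As a rough illustration of the metadata analysis described above, here is a minimal sketch that walks a filesystem tree and buckets files by how recently they were accessed or modified. It assumes a POSIX-style filesystem, uses only the Python standard library, and picks an arbitrary 90-day threshold; a real metadata engine would gather this information continuously across a global namespace rather than via a one-off scan.

```python
# Minimal sketch: classify files as active or inactive from filesystem metadata.
# Assumes a POSIX-style filesystem; the 90-day threshold is an arbitrary example.
import os
import time

INACTIVE_AFTER_DAYS = 90  # hypothetical policy threshold

def classify_files(root):
    """Walk `root` and bucket files by their last access/modify time."""
    cutoff = time.time() - INACTIVE_AFTER_DAYS * 86400
    active, inactive = [], []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            last_used = max(st.st_atime, st.st_mtime)
            (active if last_used >= cutoff else inactive).append((path, st.st_size))
    return active, inactive

if __name__ == "__main__":
    active, inactive = classify_files("/data")  # "/data" is a placeholder path
    cold_bytes = sum(size for _, size in inactive)
    print(f"{len(inactive)} inactive files, about {cold_bytes / 1e12:.2f} TB of cold data")
```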

Once IT has set the operating requirements for its datasets, a metadata engine can automatically ensure those policies are met. When a performance or cost threshold is crossed, data can be moved without application interruption, automatically placing files on the storage resource best suited to business needs.
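
To make the policy idea concrete, the sketch below shows threshold-driven placement. The tier names, idle thresholds, costs, and the move_file() callback are hypothetical placeholders rather than any particular product’s API; in a real metadata engine the move itself would be transparent to applications.

```python
# Sketch of policy-driven data placement; tiers and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class TierPolicy:
    name: str                 # e.g. "flash", "capacity", "cloud-archive"
    max_idle_days: float      # data idle longer than this is demoted to a colder tier
    cost_per_gb_month: float  # used for cost reporting, not enforced here

POLICIES = [  # hypothetical tiers, ordered hot to cold
    TierPolicy("flash", 30, 0.20),
    TierPolicy("capacity", 180, 0.05),
    TierPolicy("cloud-archive", float("inf"), 0.01),
]

def choose_tier(idle_days):
    """Return the hottest tier whose idle threshold still covers this file."""
    for policy in POLICIES:
        if idle_days <= policy.max_idle_days:
            return policy
    return POLICIES[-1]

def enforce(files, move_file):
    """files: iterable of (path, idle_days, current_tier); move_file relocates data."""
    for path, idle_days, current_tier in files:
        target = choose_tier(idle_days)
        if target.name != current_tier:
            move_file(path, target.name)  # a real engine does this without app interruption
```

Ordering the tiers from hot to cold keeps the selection logic to a single pass, and the same loop can be re-run whenever a fresh metadata scan reports new idle times.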

Serious savings

Beyond the ease of automated data mobility, one of the great benefits of integrating different storage resources into a global namespace is that it can give existing resources a second life. Once the pain of migration becomes obsolete, older storage can be repurposed as secondary or intermediate archival tiers rather than being retired. With the majority of IT professionals estimating that 60 percent or more of their data is inactive, repurposing older systems this way gives IT far more capacity to put to use than simply making a bulk purchase of high-performance storage. In addition, intelligent data management software can give IT real data to show executives which investments are actually required to enable business success.
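
To put a rough number on the capacity argument, the back-of-the-envelope calculation below combines the 60 percent figure with a hypothetical 500 TB environment and an illustrative price for new high-performance capacity; every input here is an assumption, not survey data.

```python
# Back-of-the-envelope savings estimate; all inputs are illustrative assumptions.
total_tb = 500           # hypothetical total managed capacity
inactive_share = 0.60    # estimate cited above: 60% or more of data is inactive
new_flash_per_tb = 400   # assumed cost of new high-performance capacity, USD per TB

cold_tb = total_tb * inactive_share
# Parking cold data on repurposed, already-owned arrays frees premium capacity
# that would otherwise have to be purchased new.
avoided_spend = cold_tb * new_flash_per_tb
print(f"{cold_tb:.0f} TB of cold data; roughly ${avoided_spend:,.0f} in new purchases avoided")
```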

Shorter migration schedules

Data management has long been a one-size-fits-most practice. With data virtualization and intelligent metadata management, IT can balance the competing demands of delivering performance and keeping budgets in check by making the most of existing storage resources. Not only do migration schedules get shorter; manual migrations can become a thing of the past as automated data mobility helps IT respond in real time to changing application needs with more agility than ever before.

Lance Smith

Primary Data CEO Lance Smith is a strategic industry visionary who has architected and executed growth strategies for disruptive technologies throughout his career. Prior to Primary Data, Lance served as Senior Vice President and General Manager of SanDisk Corporation’s IO Memory Solutions business, following SanDisk’s acquisition of Fusion-io in 2014. He had served as Chief Operating Officer of Fusion-io since April 2010.

Lance has held senior executive positions at companies that transacted for billions of dollars, holds patents in microprocessor bus architectures, and received a Bachelor of Science degree in Electrical Engineering from Santa Clara University.

The opinions expressed in this blog are those of Lance L. Smith and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.