
Can Your Infrastructure Meet the Challenge of Moving from Analytics 1.0 to Analytics 2.0?

Feb 04, 2019 · 4 mins

Credit: istock/MicroStockHub

Nearly every business is becoming analytics-driven as more and more organizations seek to support their digital transformations. According to one industry survey, 53% of companies were using analytics as of the fourth quarter of 2017, up from 17% in 2015. Explosive growth like that makes it clear that most organizations have only just begun to utilize analytics. And there’s every indication that the use of big data analytics will snowball, with analytics soon being employed by every department and countless individual employees. Call it the shift from Analytics 1.0 to Analytics 2.0.

One problem for IT is that analytics is a highly demanding workload. As the volume of analytics workloads increases exponentially, existing legacy IT infrastructure can’t keep up. And it’s not just the compute requirements that are impacted; the need for huge datasets to inform the analytics creates new demands on storage infrastructure as well.

While the use of analytics remained limited in the Analytics 1.0 phase, the workloads could usually be supported without new infrastructure. Legacy systems could handle pilot projects and early analytics applications, and in most organizations the workloads of a few data scientists and well-trained analysts weren’t overwhelming. But as analytics becomes more firmly entrenched, legacy infrastructure limits the number of jobs that can be run, and the time and resources needed to integrate disparate data for analytics work reduce the number of analyses that can be undertaken. In fact, it isn’t unusual in some organizations for data preparation to be so onerous that the analyses themselves take a back seat.

It’s a state of affairs that will have to change when Analytics 2.0 hits.

The coming tidal wave of analytics and big data workloads will be ushered in by several new technologies. The two most important are:

  • “Analytics for everyone” software tools. The big trend in analytics is often described as empowering the citizen data scientist. What this really means is that the user base for analytics software is expanding across the organization to include most employees. That will spike the number of analyses being run. And because many of these new users lack formal training, their projects will often be poorly optimized, adding still more demand on IT infrastructure.
  • Data ponds, lakes, and oceans. Analytics-driven companies want to use all their internal data, and as much external data as possible, to inform their analyses. That means much larger datasets to store and crunch. Data growth is now exponential, and once billions of IoT and other sensors begin providing real-time data streams, it becomes almost unimaginable. A data explosion is not too strong a term.

Successfully meeting this demand is critical. And many organizations are moving toward hyperconverged infrastructure (HCI) to meet the challenge. This modern architectural approach delivers a highly scalable infrastructure solution that can grow in hours, not weeks. Further, the tight coupling of compute and storage resources provides the higher performance levels necessary for Analytics 2.0. And with simplified management, adding HCI will not increase demand on scarce IT resources. In fact, HCI solutions require less management time and attention than legacy systems.

Lenovo is a leader in providing HCI solutions for analytics workloads. The company is a deep technical partner with SAP HANA, a leading platform for data management and analytics. Lenovo also has a dedicated center of expertise for HANA. In addition, Lenovo works with HCI leader Nutanix to deliver pre-integrated, tested, and certified platforms that you can stand up quickly.

For more information on Lenovo’s industry-leading HCI solutions for analytics, please visit our site.