Is there any doubt that the use of analytics is taking off? According to Dresner Advisory Services, 53% of companies are currently using big data analytics, up from just 17% three years ago. And in Competing on Analytics, a recent book from Harvard Business Review Press, the authors document how the use of analytics is determining winners and losers in the market. They further argue that analytics-driven enterprises will eventually apply analytics across most functional areas of the business, using advanced predictive, prescriptive, and autonomous analytical tools for marketing, supply chain, finance, M&A, operations, R&D, and HR. The rise of the "citizen data scientist" foreshadows a trend in which virtually every employee will use analytics daily.

For IT organizations, however, the coming wave of analytics usage will demand such substantial resources that legacy systems and architectures will be unable to keep up. The largest challenges center on infrastructure silos, data storage limitations, performance, and the inability to scale at the speed of business. We are in uncharted territory in terms of changing workloads, shortened provisioning times, and the sheer scale of the datasets in use.

Virtualization was an important first step toward pooling resources for demanding workloads, but broad and deep analytics workloads require more. The clearest and simplest answer is to move toward hyperconverged infrastructure (HCI). HCI solves some of the most challenging problems facing analytics workloads, including:

Performance: Analytics work is quickly moving from standard reporting and dashboards to a more interactive and iterative usage model. Ensuring that performance and response times meet user expectations is essential.
Using HCI, IT operations staff have real-time information on application performance and access to automated tools that help ensure systems meet SLAs and user expectations.

Scalability: Scaling up or out is an essential capability that HCI brings to analytics. The integration of storage and compute is key to meeting the twin challenges of supporting ever-larger datasets and sustaining performance as analyses become more comprehensive and frequent.

Efficiency: Legacy architectures with long lead times for adding resources force organizations to overbuild infrastructure as headroom for future demand. With HCI, you can deploy exactly what is needed today and quickly provision new resources as demand grows. Further, workloads can be moved seamlessly to free up resources for Tier 1 applications experiencing a transient peak.

Simplification: HCI simplifies the deployment and operation of IT resources, which benefits the organization in several ways. First, the cost of running infrastructure drops significantly because there are far fewer low-level tasks for the operations team. Second, there is less need for specialists focused solely on compute or solely on storage; with a consistent, simplified operations console, a general skill set is sufficient. And with simpler infrastructure, IT can respond at the speed of business because there are fewer variables to consider.

Lenovo partners with Nutanix, the leader in HCI, to offer solutions that provide the platform for your current and future analytics workloads. In addition, through its strong partnership with SAP, whose HANA platform is ideal for data management and analytics, Lenovo brings a complete solution to digital business.
Coupled with a strong services offering that spans design through operational support, Lenovo provides an outstanding HCI option to support analytics.

For more information, visit Lenovo HX Solutions.