Data centers are becoming more complex and still run the majority of workloads, despite promises that automation and hyperconverged infrastructure (HCI) would simplify deployment, not to mention that the cloud was supposed to take over workloads.

That's the finding of the Uptime Institute's latest annual global data center survey (registration required). The majority of IT loads still run in enterprise data centers even in the face of cloud adoption, pressuring administrators to manage workloads across hybrid infrastructure.

With workloads like artificial intelligence (AI) and machine learning coming to the forefront, facilities face greater power and cooling challenges, since AI is extremely processor-intensive. That strains data-center administrators and power and cooling vendors alike as they try to keep up with the growth in demand.

On top of it all, everyone is struggling to find enough staff with the right skills.

Outages, staffing problems, lack of public cloud visibility among top concerns

Among the key findings of Uptime's report:

The large, privately owned enterprise data-center facility still forms the bedrock of corporate IT and is expected to be running half of all workloads in 2021.

The staffing problem affecting most of the data-center sector has only worsened. Sixty-one percent of respondents said they had difficulty retaining or recruiting staff, up from 55% a year earlier.

Outages continue to cause significant problems for operators. Just over a third (34%) of all respondents had an outage or severe IT service degradation in the past year, while half (50%) had one in the past three years.

Ten percent of all respondents said their most recent significant outage cost more than $1 million.

A lack of visibility, transparency, and accountability in public cloud services is a major concern for enterprises with mission-critical applications. A fifth of operators surveyed said they would be more likely to put workloads in a public cloud if there were more visibility. Half of those using public cloud for mission-critical applications also said they do not have adequate visibility.

Improvements in data-center facility energy efficiency have flattened out and even deteriorated slightly in the past two years. The average PUE for 2019 is 1.67.

Rack power density is rising after a long period of flat or minor increases, causing many to rethink cooling strategies.

Power loss was the single biggest cause of outages, accounting for one-third of them. Sixty percent of respondents said their data center's most recent outage could have been prevented with better management/processes or configuration.

Traditionally, data centers have improved reliability through "rigorous attention to power, infrastructure, connectivity and on-site IT replication," the Uptime report says. That approach, though, is pricey. Instead, data-center operators are achieving distributed resiliency through active-active data centers, where at least two active data centers replicate data to each other. Uptime found that up to 40% of those surveyed were using this method.

The Uptime survey was conducted in March and April of this year, surveying 1,100 end users in more than 50 countries and dividing them into two groups: the IT managers, owners, and operators of data centers, and the suppliers, designers, and consultants that service the industry.
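For readers unfamiliar with the PUE metric cited in the survey findings, PUE (power usage effectiveness) is simply the ratio of total facility energy to the energy delivered to IT equipment, with 1.0 being ideal. A minimal illustrative sketch follows; the function name and the kilowatt figures are hypothetical, and only the 1.67 average comes from the survey:

```python
# Hypothetical illustration of the PUE metric mentioned above.
# PUE = total facility energy / IT equipment energy; 1.0 is ideal.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Return the power usage effectiveness ratio."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# At the survey's 2019 average PUE of 1.67, a 1,000 kW IT load implies
# roughly 1,670 kW of total facility draw: about 670 kW going to cooling,
# power-distribution losses, and other overhead rather than to compute.
print(round(pue(1670.0, 1000.0), 2))  # 1.67
```

In other words, a flat or rising industry-average PUE means the overhead share of every delivered watt has stopped shrinking, which is why the survey flags it alongside rising rack power density.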