Hyperconverged infrastructure (HCI) is one of the key building blocks of next-generation data centers. Originally, HCI was deployed primarily by small and medium-sized businesses that wanted a faster, easier way to deploy data center technology such as servers, storage and networks. Over the past few years, HCI adoption has skyrocketed and is now being deployed by large enterprises looking to shift to a software-defined model.
Initially, HCI was driven by start-ups, most notably SimpliVity and Nutanix. But recently Cisco, VCE and Hewlett Packard Enterprise (HPE) have jumped into the market, and Juniper and Lenovo have formed a partnership that will likely lead to a combined HCI solution.
All of the vendor activity is a good indicator of the demand for HCI. However, the one topic that hasn’t really been investigated is how HCI impacts IT security.
To fully understand all of the security implications of HCI, I interviewed Bryan Pelham, senior manager of business development at Illumio, one of the hottest security start-ups.
The first thing I asked Pelham was, from his perspective, why there is so much interest in HCI. Pelham said IT organizations are attracted to the cloud because they want its cost benefits and elasticity, but they are not thrilled about giving up control and are concerned about privacy. HCI offers the best of both worlds by providing the benefits of the cloud inside the four walls of the data center.
I then asked Pelham how HCI changes the way IT operates. He told me that most companies he talks to want to automate as many processes as they can. However, you can't effectively automate complex environments. HCI simplifies the data center and makes it easier to automate. It also makes it easy to spin compute infrastructure up and down and puts that power in the hands of developers. IT can now evolve into a service for the business rather than a bottleneck: application developers can click a mouse and have a server provisioned five seconds later.
We then shifted the discussion to security and how HCI changes things. Before I get into Pelham’s answer, I want to weigh in myself. Security needs to be in a state of constant evolution. Each time an organization introduces something new—think of things like mobility, SDNs, cloud computing and IoT—security needs to change. The fact that HCI changes security requirements shouldn’t be a surprise. The key is to understand what changes and how to adapt quickly.
Evolution of Security
Pelham started off with a bit of a history lesson. Traditional security focused on protecting the perimeter with a big firewall. That was sufficient 15-20 years ago, but in an age where threats such as malware and compromised internal hosts are increasingly prevalent, more focus needs to be placed on the inside of the data center. Securing only the perimeter doesn't work as well anymore. Now we use firewalls to create network segments that protect the information and users on one segment from things that happen on another. The irony is that networks were designed to enable endpoints to communicate with each other, and now we use the network to prevent communication. Segmentation also has constraints because segments have to be defined by an application, department, user or other factor. This can get complicated in large-scale environments, particularly when changes are required.
Micro-segmentation takes the challenges of segmentation and increases them exponentially. What's needed is a security model that is as flexible as the underlying infrastructure. That means we need security that supports infrastructure that is spun up, spun down and automated.
Pelham said there are really two approaches to secure micro-segmentation. The first is to try to micro-segment the network. The problem with this is that the network isn't flexible or multi-dimensional, so it can't be segmented by several factors at once. This approach typically requires lots of hardware and is also not application-aware, which is why most of the firewall vendors have been investing in deep packet inspection (DPI). Ultimately this can create a number of bottlenecks and performance problems, which is why there is so much focus on network hardware acceleration.
A better approach is to shift to a policy-based model and attach security to the applications. Think of this as looking at the security problem through the lens of the application rather than the network. Policies can be defined based on a server's attributes, the role it plays or other factors.
For example, if a company wanted to enable development servers to connect only to other development servers, this would be impossible with network segmentation because an application developer could spin a server up anywhere. With policy-based security, a rule can be created to enable that. Policies can be created to allow each computing instance (workload, server, user, etc.) to talk to any other instance regardless of network segment.
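The "development servers talk only to development servers" rule above can be sketched as a label-based policy check. This is a minimal illustration of the policy-driven model Pelham describes, not any vendor's actual policy language; the workload labels and rules are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Workload:
    """A compute instance described by labels, not by its network location."""
    name: str
    env: str   # e.g. "dev" or "prod"
    role: str  # e.g. "web" or "db"

# Each policy is a predicate over (source, destination) labels,
# independent of IP addresses, VLANs or network segments.
policies = [
    # Development servers may talk only to other development servers.
    lambda src, dst: src.env == "dev" and dst.env == "dev",
    # Production web servers may talk to production databases.
    lambda src, dst: (src.env, src.role) == ("prod", "web")
                     and (dst.env, dst.role) == ("prod", "db"),
]

def allowed(src: Workload, dst: Workload) -> bool:
    """Default-deny: traffic is allowed only if some policy permits it."""
    return any(rule(src, dst) for rule in policies)

dev_web = Workload("dev-app-1", "dev", "web")
dev_db = Workload("dev-db-1", "dev", "db")
prod_db = Workload("prod-db-1", "prod", "db")

print(allowed(dev_web, dev_db))   # dev-to-dev is permitted
print(allowed(dev_web, prod_db))  # dev may not reach prod
```

Because the rules key off labels rather than segments, a developer can spin the server up anywhere and the same policy still applies.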
Pelham made it clear that he wasn't dismissing the value of firewalls, but he wanted to caution security professionals to understand where they add value. At the perimeter of the network, firewalls have worked and remain the best solution. In the heart of the data center, however, a policy-based approach can reduce complexity and adapt security to meet the demands of HCI. The policies are written in terms of applications and servers, not ports, VLANs, ACLs and the like.
Policies allow security to follow a workload through its lifecycle. When the virtual server is decommissioned, the security rule can be removed along with it. This is a much simpler approach and can help customers realize the full potential of HCI.
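The lifecycle idea can be sketched the same way: security rules are attached when a workload is provisioned and disappear when it is decommissioned. Again, this is a hypothetical illustration of the model, with made-up names, not a real product API.

```python
class PolicyStore:
    """Toy policy store: rules live and die with the workload they protect."""

    def __init__(self):
        self.rules = {}  # workload name -> set of peer labels it may reach

    def provision(self, workload: str, allowed_peers: set) -> None:
        # Security is attached the moment the instance is spun up.
        self.rules[workload] = set(allowed_peers)

    def decommission(self, workload: str) -> None:
        # When the virtual server goes away, so does its rule --
        # no stale firewall entries left behind.
        self.rules.pop(workload, None)

    def allowed(self, src: str, dst_label: str) -> bool:
        return dst_label in self.rules.get(src, set())

store = PolicyStore()
store.provision("dev-app-1", {"dev"})
print(store.allowed("dev-app-1", "dev"))  # permitted while provisioned
store.decommission("dev-app-1")
print(store.allowed("dev-app-1", "dev"))  # denied after decommission
```

The point of the sketch is the cleanup step: because policy is bound to the workload rather than to network plumbing, decommissioning removes the rule automatically instead of leaving ACLs to be audited later.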