This column is available in a weekly newsletter called IT Best Practices.
Last year the world watched in awe as NASA's New Horizons spacecraft sent stunning pictures of Pluto back to Earth. New Horizons had traveled 3 billion miles across the solar system over a decade's time to make its closest approach to Pluto—about 7,750 miles above the surface. That's roughly the distance from New York to Mumbai, India.
This is quite an impressive scientific achievement. But what if one small bug in the navigation software had sent the spacecraft millions of miles off course? Instead of viewing the mesmerizing Pluto terrain nicknamed "the heart," disappointed NASA scientists would be looking at a whole lot of black nothingness. To ensure that nothing like that happens, NASA engineers use a methodology called formal verification to validate every possibility in the spacecraft's software code.
Formal verification is a field of computer science in which a design is analyzed mathematically to see whether it works before it is deployed in the field. Typically it is applied to software, where developers want to find any bugs that would cause problems in the production code or model. In addition to NASA missions, formal verification is used in other critical fields such as airplane design and microchip fabrication.
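To make the concept concrete, here is a toy sketch of model checking, one common formal verification technique: exhaustively explore every state a design can reach and confirm that no unsafe state is among them. The state machine and safety property below are invented purely for illustration and have nothing to do with NASA's actual tooling.

```python
# Invented state machine for a hypothetical flyby controller; each state
# is (mode, thrusters) and TRANSITIONS lists the allowed next states.
TRANSITIONS = {
    ("cruise", "off"):   [("approach", "off")],
    ("approach", "off"): [("approach", "on"), ("cruise", "off")],
    ("approach", "on"):  [("flyby", "off")],
    ("flyby", "off"):    [],
}

def verify(start, bad):
    """Exhaustively explore every reachable state; return True only if
    no reachable state satisfies the 'bad' predicate."""
    seen, frontier = {start}, [start]
    while frontier:
        state = frontier.pop()
        if bad(state):
            return False
        for nxt in TRANSITIONS.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True

# Safety property: thrusters must never be firing during the flyby itself.
ok = verify(("cruise", "off"), bad=lambda s: s == ("flyby", "on"))
print(ok)  # True: the property holds in every reachable state
```

Unlike testing, which samples a few runs, the exhaustive search gives a yes-or-no answer over every possible execution of the model.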
Now the science of formal verification is being applied to the process of verifying network changes in order to prevent outages and breaches. Veriflow Systems claims to be the first company to utilize this sophisticated technology to eliminate all change-induced outages and breaches in the network.
An enterprise can build an incredible network that is robust, secure, protected, agile—all those things that the ideal network should be. But it's hard to maintain this ideal state because the network needs to change so often. There are day-to-day changes, like configuration changes or device additions or moves. There are architectural changes, such as implementing segmentation, micro-segmentation or overlays. And then there are transformational changes, such as evolving to virtualized and cloud infrastructures. It's not uncommon for a large enterprise to make a thousand or more changes to its network in a single month.
Change is good, if it helps the organization meet its business objectives. But change can also introduce misconfigurations of network devices and violations of security policy. One network manager who says his organization makes about 1,500 changes a month admits that 0.5% to 1% of those changes result in security incidents—roughly eight to 15 incidents every month.
According to IBM and other sources, more than 80% of outages and breaches are due to human factors. The fact is, we humans are poor judges of whether a configuration change on a firewall or a switch is going to have an impact somewhere on an enterprise network. The complexity is often too much for people to deal with. Technology has to be applied to determine the outcome of a change before it is made and an incident arises.
Veriflow Systems provides mathematical verification of whether network-wide policies are actually realized, by predicting possible data-flow behavior before it happens. If a proposed change would create a policy conflict or violate standard best practices, Veriflow flags the problem before the change takes effect. Here's how it works.
The Veriflow solution is contained within a virtual appliance that can be installed on premises or in the cloud. It focuses on network infrastructure layers 1 through 4—physical and virtual devices such as routers, switches, load balancers, and firewalls. Veriflow says it understands those devices deeply at the data plane.
The Veriflow appliance collects data from each device in the network using read-only accounts. This data is then synthesized into a network-wide predictive model of data flow. The model predicts possible data-flow behavior, not just at one device but across the whole network as a system.
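As a rough illustration of what such a model might look like (the device names, prefixes, and data layout below are invented, not Veriflow's), per-device forwarding tables can be fused so that a packet's path can be predicted hop by hop across the whole network:

```python
import ipaddress

# Invented per-device forwarding tables: destination prefix -> next hop.
FIB = {
    "edge-router": {"10.1.0.0/16": "core-switch"},
    "core-switch": {"10.1.2.0/24": "firewall", "10.1.0.0/16": "server-lan"},
    "firewall":    {"10.1.2.0/24": "db-lan"},
}

def next_hop(device, dst):
    """Longest-prefix match over one device's forwarding table."""
    matches = [p for p in FIB.get(device, {})
               if dst in ipaddress.ip_network(p)]
    if not matches:
        return None
    best = max(matches, key=lambda p: ipaddress.ip_network(p).prefixlen)
    return FIB[device][best]

def trace(device, dst_ip):
    """Predict a packet's hop-by-hop path through the modeled network."""
    dst = ipaddress.ip_address(dst_ip)
    hops = [device]
    while device in FIB:
        device = next_hop(device, dst)
        if device is None:
            break
        hops.append(device)
    return hops

print(trace("edge-router", "10.1.2.5"))
# → ['edge-router', 'core-switch', 'firewall', 'db-lan']
```

The point of a network-wide model is visible even in this sketch: the firewall hop for 10.1.2.5 emerges only when all the devices' tables are analyzed together, not from any single device's configuration.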
The challenge here is computational scalability, because the solution is trying to predict a huge number of possibilities. It's not feasible to literally step through each possibility one at a time, simulating what will happen to this packet and then that packet. Instead, Veriflow achieves the equivalent mathematically. The company's core intellectual property comprises smart algorithms that reason about the network in a way that covers everything that could happen. These draw on technical advances in verification algorithms, such as reasoning about equivalence classes in the network and about dependencies between changes, so Veriflow can perform incremental verification.
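The equivalence-class idea can be sketched as follows: rather than simulating all 2^32 possible destination addresses, observe that every address falling between the same rule boundaries is handled identically, so checking one representative per interval covers the entire address space. This is a simplified illustration with invented rules, not Veriflow's actual algorithm:

```python
import ipaddress

# Invented forwarding rules; a real network has thousands.
RULES = ["10.0.0.0/8", "10.1.0.0/16", "10.1.2.0/24"]

def equivalence_classes(rules):
    """Cut the 2**32-address space at every rule boundary; all addresses
    inside one resulting interval are treated identically by every rule,
    so one representative packet per interval covers the whole space."""
    bounds = {0, 2 ** 32}
    for r in rules:
        net = ipaddress.ip_network(r)
        bounds.add(int(net[0]))       # first address covered by the rule
        bounds.add(int(net[-1]) + 1)  # first address past the rule
    cuts = sorted(bounds)
    return list(zip(cuts, cuts[1:]))

classes = equivalence_classes(RULES)
print(len(classes))  # 7: three nested prefixes yield just seven classes
```

Three rules collapse more than four billion possible destinations into seven classes to check, which is what makes verification over the whole packet space computationally tractable.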
As a result, Veriflow says it can understand network-wide reality, i.e., how the network is truly configured. For Veriflow customers that have deployed this solution, the vendor says this is often the first time a network or security engineer has been able to see the network-wide reality. They know it is different from that years-old Visio diagram on the wall, but they don't know how, and they don't know where they are vulnerable. Veriflow claims to provide these answers.
The next step is to determine how that reality aligns with the organization's ideal policies for its network. Many organizations don't have policies fully documented, or haven't kept the documentation up to date. For those cases, Veriflow has a best-practice library of policies as a reference. Veriflow can show the organization its vulnerabilities relative to its own policies as well as to the best-practice policies.
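A hedged sketch of what checking predicted behavior against policy might look like (the zones, port numbers, and policy names here are invented for illustration, not drawn from Veriflow's product):

```python
# Invented predicted flows: (source zone, destination zone, port).
predicted_flows = [
    ("guest-wifi", "internet", 443),
    ("guest-wifi", "db-lan", 5432),   # new flow opened by a proposed change
]

# Each policy is a predicate that every predicted flow must satisfy.
POLICIES = {
    "no-guest-to-database": lambda f: not (f[0] == "guest-wifi"
                                           and f[1] == "db-lan"),
}

violations = [(name, flow)
              for flow in predicted_flows
              for name, check in POLICIES.items()
              if not check(flow)]
print(violations)
# → [('no-guest-to-database', ('guest-wifi', 'db-lan', 5432))]
```

Running the same predicates over a reference library of best-practice policies, instead of only the organization's own, is a natural extension of this pattern.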
The process of formally verifying a network change is interactive. The company says a network engineer can see within seconds to minutes whether a proposed change is going to cause a problem anywhere in the network.
The need for a solution such as this has increased greatly in recent years. The level of complexity, the degree to which networks are being pushed to evolve, and the threat of cyber attacks are putting pressure on enterprises to closely manage their network changes. Veriflow says its use of formal verification takes the doubt out of the results of those changes.