
Layered security defenses: What layer is most critical, network or endpoint?

There is little argument that a defense in depth model is the best way to safeguard the enterprise, but which layer is most critical? Some would say the application layer will ultimately emerge as the Holy Grail, and that may prove true down the pike (a future debate?), but here we examine two of the more common approaches, the network layer vs. the endpoint.


John Dix, Network World Editor in Chief, sets up the debates and recruits the experts. Contact him with thoughts and ideas.

The experts

Eric Knapp, director of critical infrastructure markets at NitroSecurity

argues that the network sees everything and is the one layer that can add up the threats to give the big-picture view.

James Lyne, director of technology strategy at Sophos

says the explosion of mobile devices, new work habits and encrypted data make the endpoint the most critical component of a defense in depth strategy.

The Net layer delivers situational awareness

While endpoint security is an important component of a strong defense-in-depth posture, the network layer is most critical because it helps eliminate inbound vectors to servers, hosts and other assets while providing an excellent basis of activity monitoring that improves our overall situational awareness.


This is important because, while endpoint security has improved significantly with the introduction of application whitelisting and other technologies, our systems and devices are simply too diverse and too interconnected to ensure that host security can be deployed 100% ubiquitously and 100% effectively. All it takes is a single chink in the endpoint security armor to create a beachhead for attackers, so having a holistic viewpoint of how everything interacts on the network is critical.


Network security isn't a silver bullet either, of course. Even using unidirectional gateways (the network-layer equivalent to application whitelisting, where absolute protection is provided at the physical layer), there's the chance that a hardened network shell can be bypassed, exposing the gooey interior of networked hosts. However, the network is the common denominator, the nexus of all systems, applications and services. By properly monitoring it, the larger threats are detectable and the hosts themselves are ultimately more secure.

Active protection using standard network security devices such as firewalls and intrusion-prevention systems (IPS) is a start. Network activity monitoring using intrusion-detection systems, network flow analysis and more holistic systems such as network behavior analysis tools, log management and Security Information and Event Management (SIEM) systems rounds out point protection devices and provides a broader threat detection capability.
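As a rough illustration of the cross-source correlation a SIEM performs, the sketch below flags hosts that appear in events from several distinct sensor types. The event format, sensor names and threshold are illustrative assumptions, not any vendor's actual schema:

```python
from collections import defaultdict

def correlate_events(events, threshold=3):
    """Toy SIEM-style correlation: a host reported by events from
    multiple distinct sensor types (firewall, IPS, flow analysis, ...)
    is more interesting than one flagged by a single point device."""
    sources_by_host = defaultdict(set)
    for event in events:
        sources_by_host[event["host"]].add(event["sensor"])
    return [host for host, sensors in sources_by_host.items()
            if len(sensors) >= threshold]

events = [
    {"host": "10.0.0.5", "sensor": "firewall"},
    {"host": "10.0.0.5", "sensor": "ips"},
    {"host": "10.0.0.5", "sensor": "netflow"},
    {"host": "10.0.0.9", "sensor": "firewall"},
]
print(correlate_events(events))  # ['10.0.0.5']
```

The point is the aggregation itself: no single sensor in the example is alarming, but the combination across sources is.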

In other words, network-based security is more than just a layer of defense, it's a keystone to obtaining situational awareness, showing security analysts how all of those discrete host security events relate to each other and to the important security and compliance policies of the company.

When utilized properly, network-layer security information can be used in conjunction with application whitelisting on the host to create something even better. The term "Smart Listing," first coined at a SANS Institute security conference in London, introduced the concept of using security events from application whitelisting agents on the host to complete the feedback loop to network security devices, which typically block traffic based on blacklists, or defined signatures that tell the firewall or IPS what we know is "bad."

When a zero-day exploit slips past these blacklist defenses and hits a host protected with some form of application control, the exploit will be blocked and the details will (hopefully) be logged.

But where did that exploit come from? Was it an insider threat, something more advanced originating from another country? How did it get past the network layer security controls? The only way to answer those questions is to look at the network itself, specifically at the network layer security events, as well as network flow information.

When we see something that is clearly of malicious intent attempting to execute applications on a protected host, we can intuit that the application is malicious and adjust our blacklists accordingly. In other words, we create a "smart list" of what we infer to be malicious, based upon intelligence obtained from the host, but assessed within the context of the network layer.
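The "smart listing" feedback loop described above can be sketched as follows. The event fields (`action`, `origin_ip`, `binary`) and the rule format are hypothetical, chosen only to show the flow from host-side block events to network-layer block rules:

```python
def build_smart_list(host_events, existing_blacklist):
    """Toy 'smart listing' loop: application-control block events from
    hosts carry the network origin of the blocked payload; any origin
    not already blacklisted becomes a new network-layer block rule."""
    smart_list = set(existing_blacklist)
    new_rules = []
    for ev in host_events:
        if ev["action"] == "exec_blocked" and ev["origin_ip"] not in smart_list:
            smart_list.add(ev["origin_ip"])
            new_rules.append({"block_ip": ev["origin_ip"],
                              "reason": f"blocked exec of {ev['binary']}"})
    return new_rules

host_events = [
    {"action": "exec_blocked", "origin_ip": "203.0.113.7", "binary": "dropper.exe"},
    {"action": "exec_allowed", "origin_ip": "198.51.100.2", "binary": "notepad.exe"},
]
print(build_smart_list(host_events, existing_blacklist={"192.0.2.1"}))
# → [{'block_ip': '203.0.113.7', 'reason': 'blocked exec of dropper.exe'}]
```

In a real deployment the output rules would be pushed to the firewall or IPS, closing the loop from host intelligence back to perimeter enforcement.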

Only with this level of automated intelligence and network-layer awareness can the most sophisticated attacks be detected and then blocked at the perimeter using network-layer security controls. Because if the network lets the attack in, it will eventually find its beachhead: that one desktop, server, printer or other device that isn't adequately protected.

There's a lot of covert, mutating and otherwise sophisticated malware available, so if an attack does successfully land it's going to gnaw away at systems until a weakness is found. When both network and host security are hardened, the resulting security Gobstopper is going to be difficult for attackers to chew on.

NitroSecurity provides both intrusion-prevention systems and the only Security Information and Event Management (SIEM) system to include integrated network-based application content and database transaction monitoring.


It all hangs on the endpoint

Radically changing attack patterns, roaming users, a plethora of platforms that need to be protected and the increasing need to encrypt more data are factors that are conspiring to make endpoint security the critical control for security delivery.



The encryption factor alone mandates the change in thinking. With traffic encrypted at the transport or data layer, network-based inspection becomes unrealistic, keeping network devices from doing their job. The endpoint, on the other hand, sees the data pre-encryption, allowing traffic to be inspected in full.

Furthermore, greater context is available at the endpoint for security operations, which is increasingly critical. Today in SophosLabs, we see over 95,000 individual pieces of malicious code every day and find a new infected Web page every few seconds, an astounding increase in the quality and quantity of malware over previous years. The content-based detection techniques that have been used for the past 25 years are increasingly ineffective against this mass of malicious code. At the endpoint, visibility of applications, data, behaviors and system health can be used to make more accurate decisions and provide better proactive protection.

Compare, for example, the task of trying to identify and block Skype (to say nothing of trickier malicious code). At the endpoint you simply identify Skype.exe (using a variety of mechanisms, not just the name), whereas on the network you need to decode the packets to find the interesting information within them, which can be exceedingly challenging given that there are thousands of different formats. Oftentimes these can be disguised as other forms of legitimate traffic.
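A minimal sketch of the endpoint side of that comparison, assuming a hypothetical policy of blocked executable names and file hashes (the hash value shown is a placeholder, not a real Skype hash):

```python
def is_blocked_app(process, blocked_names, blocked_hashes):
    """Endpoint-side application control: match a running process
    against policy by executable name OR file hash, so a renamed
    binary is still caught."""
    return (process["name"].lower() in blocked_names
            or process["sha256"] in blocked_hashes)

policy_names = {"skype.exe"}             # names the policy forbids
policy_hashes = {"ab12placeholderhash"}  # placeholder hash of a forbidden binary

print(is_blocked_app({"name": "Skype.exe", "sha256": "0000"},
                     policy_names, policy_hashes))  # True (name match)
print(is_blocked_app({"name": "chat.exe", "sha256": "ab12placeholderhash"},
                     policy_names, policy_hashes))  # True (hash match despite rename)
```

The equivalent decision on the network would require reassembling and decoding the application's wire protocol, which is the asymmetry the paragraph above describes.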

More users are also accessing data and applications from the road, in many cases now directly from cloud services. If the traffic isn't backhauled through the business, network security loses visibility traditionally provided at the perimeter and the fabric of the network. Security capabilities like URL lookup for infected Websites therefore need to be available wherever the device is, even when it is out of the network. Endpoint and cloud-based protection allows this.
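A sketch of how an endpoint-resident URL lookup might work, with a local dictionary standing in for the cloud reputation service; the service interface, verdict strings and URLs are all illustrative assumptions:

```python
def check_url(url, reputation_service):
    """Endpoint-resident URL filtering: consult a reputation service
    before allowing the request, so protection follows the device even
    off the corporate network. reputation_service is any callable
    returning 'malicious' or 'clean'."""
    return reputation_service(url) != "malicious"

# Stand-in for a real cloud lookup:
known_bad = {"http://bad.example/payload": "malicious"}
lookup = lambda url: known_bad.get(url, "clean")

print(check_url("http://bad.example/payload", lookup))  # False (blocked)
print(check_url("https://news.example/", lookup))       # True (allowed)
```

Because the check runs on the device rather than at a perimeter gateway, a roaming laptop on hotel Wi-Fi gets the same protection as one in the office.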

Network security, admittedly, is easier to deploy than endpoint security because companies can roll it out in a few places on the network instead of having to deploy on every individual PC. But when security goes wrong and a device gets infected, endpoint protection offers the ability to clean up malicious code and reverse the damage or remediate problems, something network-layer security cannot do.

To be fair, being at the endpoint is a constant battle, because a lot of malicious code is designed to disable endpoint security software. Inspection from the network does not have this problem. The good news: from the network, malware on an endpoint can be detected as it attempts to infect others or dial home.

All in all, both forms of security are important to protect against the modern threat. Some security functions that were traditionally delivered at the network need to be transitioned to the endpoint for effective performance and compatibility with the new army of roaming users.

Conversely, network solutions can cover devices where agent deployment is not realistic, such as visiting guests' machines or systems whose endpoint software has been disabled by malware, and the network layer is where network-level attacks and snooping can more readily be identified.

With such a large quantity of malware out there and more targeted attacks, the more layers you run, the bigger the net you spread to catch cyber criminals. Over the coming years, many of the traditions of security will be challenged and changed, but both approaches will continue to offer value.

Traditionally endpoint and network security have been handled as isolated areas by different teams. Increasingly, in response to broader threats and new devices, there are benefits in having them work together to deliver better security. Sharing information between the network, endpoint, and cloud will undoubtedly be the direction of modern security.

Sophos is both an endpoint and network security provider. It believes that both layers are a necessary part of the solution and increasingly need to be joined up and work together to provide a more complete security solution.

