The movement toward centralizing applications continues to grow stronger. Nemertes' latest research shows 67.7% of companies store their applications centrally. Just one year ago, this figure was at 56%.
We don’t anticipate centralization will ever reach 100%; in fact, it’s close to topping out. About one quarter of organizations have adopted a “hybrid” approach to application architecture, and we don’t expect that share to shrink much further.
This is because many companies want to store some applications centrally for ease of management, but still need to keep other key applications at remote locations for reliability. For example, retailers may want to keep their point-of-sale application local in case the WAN fails and the application becomes unreachable (taking the ability to accept payments with it).
The real driver for application architecture decisions is how robust the WAN is. Without a reliable, redundant network, IT staffs can’t reasonably place applications or databases in a central data center for fear that remote sites won’t be able to access them.
The real issue behind that driver is cost.
On one hand, installing redundant backup lines or contracting for 24/7 maintenance programs certainly increases WAN costs. The benefit is the ability to store and manage applications centrally.
On the other hand, storing applications remotely at dozens or hundreds of branch locations carries its own costs. IT staffs must buy more management tools to oversee the remote applications, and it usually takes more staff time to troubleshoot application issues across multiple branches.
Obviously, there are pros and cons to each approach, and the trend is toward centralization. But with either approach, providing redundancy is crucial.