How one of the leading investment firms is evolving its application-development strategy to adapt to the future of infrastructure

From an infrastructure perspective, Fidelity Investments uses a combination of private cloud hosted in company data centers plus multiple public cloud platforms, which raises the question: how do you manage this hybrid infrastructure? One key is being flexible, says Maria Azua Himmel, senior vice president of distributed systems at the 71-year-old multinational with $2.13 trillion in assets under management.

Azua is attempting to implement strategies among Fidelity's application developers to ensure that when new apps are built, they can run in almost any environment, whether one of the public clouds the company uses or inside its own data centers. To do this, Azua advocates the use of application containers and software-defined infrastructure that can be controlled via application programming interfaces (APIs).

"Cloud is not about infrastructure," Azua said in an interview before her presentation at the Open Networking User Group in New York earlier this month. "Cloud is about automation; it's about the application pipeline, standardizing processes and scaling horizontally."

Cloud is not magic

Azua says that despite the cloud market's maturity, misconceptions about it persist. "A lot of consumers of cloud expect magic," she says. "They want to go from spaghetti code that is not well defined to a hybrid world."

The way most traditional applications scale is by moving to larger servers to increase capacity. Azua says this scaling up is an inefficient way to grow because it results in perpetually higher costs. "If you're going to the cloud, you need to refactor your application," she says, noting that the preferred method is to scale horizontally.
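Scaling horizontally only works if every app instance is interchangeable. A minimal sketch of the idea (the class and store names here are illustrative, not Fidelity's code): replicas hold no local session state, so any of them can serve any request once state lives in a shared backing store such as Redis.

```python
# Sketch: horizontal scaling depends on stateless app instances.
# A plain dict stands in for an external store (e.g. Redis) that all
# replicas share; the names below are hypothetical.

shared_store = {}  # stands in for a shared backing service

class Replica:
    """One interchangeable app instance; it keeps no local session state."""
    def __init__(self, name, store):
        self.name = name
        self.store = store

    def handle(self, session_id, value=None):
        # Writes and reads go to the shared store, never to replica memory.
        if value is not None:
            self.store[session_id] = value
        return self.store.get(session_id)

# Two replicas behind a (pretend) load balancer:
a = Replica("replica-a", shared_store)
b = Replica("replica-b", shared_store)

a.handle("user-42", "cart:3-items")  # first request lands on replica A
print(b.handle("user-42"))           # later request lands on B: same answer
```

Because the replicas are interchangeable, capacity grows by adding more of them rather than by buying a bigger server, which is the scale-out model Azua describes.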
This means building out a series of infrastructure components that can be provisioned and managed through APIs, and developing applications in a microservices architecture. A key to this is embracing the 12-factor application development approach, she says. Adam Wiggins, one of the founders of the popular application development platform Heroku (which Salesforce purchased in 2010 for $212 million), created the now-popular 12-factor methodology. Its concepts include: declaring and isolating the application's dependencies; treating back-end services as attached resources; executing the app as one or more stateless processes; making apps disposable, meaning they can start up fast and shut down gracefully; offering maximum portability across execution environments; and keeping development, staging and production environments as similar as possible.

"Gone are the days of writing code and specifying an IP address of where the app will run," Azua explains. "That just doesn't scale." Developers should build apps that are "componentized," she says, able to run in any infrastructure environment.

Inside Fidelity's app-dev production line

So how does Fidelity make this a reality? Azua says the key is developing applications that run in a world of software-defined infrastructure. Applications are written in microservices frameworks, with their dependencies outlined and the underlying infrastructure requirements clearly defined.

Fidelity has not standardized on any one technology; rather, it uses a variety of tools to accomplish this. The company builds its apps in Docker containers. Apps that need to stay on the company's premises run on an OpenStack private cloud, while Amazon Web Services and Microsoft Azure are used for public cloud. Fidelity uses a combination of cloud-native management tools: CloudFormation for AWS, Heat templates for OpenStack, and Terraform, which runs across both public and private environments.
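The 12-factor practice behind Azua's "no hard-coded IP addresses" point is to treat backing services as attached resources located via configuration in the environment. A minimal sketch, assuming the conventional `DATABASE_URL` variable name (the URLs below are made up):

```python
# Sketch of 12-factor config: the app reads its backing-service location
# from the environment instead of hard-coding an address, so the same
# build runs unchanged in dev, staging, and production.
import os

def get_database_url():
    # DATABASE_URL is the conventional 12-factor variable name; the
    # fallback value here is purely illustrative for local development.
    return os.environ.get("DATABASE_URL", "postgres://localhost:5432/dev")

# In practice the platform (container runtime, PaaS) injects this value;
# we set it here only to demonstrate the lookup.
os.environ["DATABASE_URL"] = "postgres://db.prod.internal:5432/app"
print(get_database_url())
```

Swapping environments then means swapping environment variables, not rebuilding or editing the application, which is what makes the same artifact portable across Fidelity's private and public clouds.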
It also uses Cloud Foundry as a PaaS layer that spans both public and private clouds. But which tools the company uses is almost beside the point, she says: "Process trumps tools." Applications should be built in a certain way, and if they are, it doesn't matter what underlying technology is used to run or manage them, she argues.

There is no set rule for determining where an app will run, but Azua says that, generally speaking, if an application runs 24 hours a day, 7 days a week, Fidelity can run it more efficiently internally than in the public cloud. For short-term workloads, or ones that spike in infrastructure resource needs, the public cloud is a more natural landing spot. But that's not for the developers to worry about. "The (application development) pipeline is so important," she says. "If you build the applications in a declarative way, then we should not have to worry about what the target environment is."
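The placement rule described above can be sketched as a simple decision function. This is a hypothetical illustration of the heuristic, not Fidelity's actual pipeline logic; the function and target names are assumptions.

```python
# Hypothetical sketch of the workload-placement heuristic: steady 24/7
# workloads run more cheaply on the private cloud, while short-lived or
# spiky workloads suit the public cloud's elasticity.

def choose_target(runs_continuously: bool, bursty: bool) -> str:
    """Pick a deployment target from two declared workload traits."""
    if runs_continuously and not bursty:
        return "private-cloud"  # steady load is cheaper to run in-house
    return "public-cloud"       # elasticity absorbs spikes and short jobs

print(choose_target(runs_continuously=True, bursty=False))   # steady service
print(choose_target(runs_continuously=False, bursty=True))   # month-end spike
```

The point of Azua's "declarative" remark is that developers only declare traits like these; the pipeline, not the developer, resolves them to a target environment.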