5 ways to modernize your legacy applications

Global businesses will spend $3.5 trillion on IT this year, $1.3 trillion of which will go toward enterprise software and IT services. Unfortunately, much of that software and services spend is dedicated to just keeping the lights on — maintaining existing enterprise applications that run the business. That’s a lot of money just to maintain the status quo.

That's why moving legacy applications onto a modern infrastructure, such as cloud and containers, holds great promise for businesses that want to reduce IT spending and convert the savings into a competitive advantage. Modernizing makes sense, if you do it right and know how to avoid the pitfalls.

The modernization conundrum

Many IT organizations and DevOps teams have embarked on application modernization projects. The problem is that these projects are taking too long and are creating vendor lock-in. Organizations are forced to choose a single cloud or container vendor, which can lead to unexpected (and unplanned) price increases down the road.

Modernizing legacy versions of applications such as SAP, Siebel, Oracle, or PeopleSoft is hard work, since they and their custom-built brethren were often designed as single, unbreakable monoliths. The applications, including associated data, security, and networking configurations, are tightly coupled with the underlying infrastructure. This tight coupling makes it difficult to upgrade components of an application individually. Even small updates trigger a long, slow regression-testing process that involves manually setting up a near-production testing environment, along with the appropriate data, configurations, and so on. This process can take weeks, even for the smallest changes.

Applications at larger enterprises also tend to live in silos. At a bank, for instance, the retail business unit may have legacy applications installed on completely different infrastructure than a commercial business unit running the same applications. This not only compounds the testing problem but also makes it difficult for IT to consolidate and optimize its infrastructure budget for platforms that offer the best combination of speed, agility, and cost. Even when applications are deployed in cloud environments, CIOs are wary of vendor lock-in and the specter of unexpected, unplanned price increases.

Finally, managing a diverse portfolio of legacy applications can be challenging for the IT operations team because the tools available to manage applications are either infrastructure-specific (e.g., CloudFormation, VMware) or application-specific (e.g., SAP Landscape Management). Most IT operations teams are quickly overwhelmed with the scope and quantity of tools they must master, not to mention the challenge of managing multiple vendor contracts, all with different pricing, terms, and upgrade schedules. It’s no wonder that CIOs often complain about “tool fatigue” and the hard integration work it takes to weave all these point products together into a cohesive application delivery process.

To overcome these challenges, organizations must change the way they think about modernizing legacy applications. Here are five ideas that can help.

1. Break down the monolith

Create a model of what the application looks like—comprehensively. Model every individual piece of that application: the network configurations, the storage configurations, the servers, how those pieces are organized, and how the application will deploy on the servers. Model all of the networking between the individual components. Then deconstruct that model into its different building blocks and configurations. Breaking down the monolith into its individual working parts makes it easier to create a virtualized application environment using tools such as containers. While this approach has been tried before, advances in software-defined infrastructure now make it possible to implement at scale.
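
The modeling step can be sketched in a few lines of code. This is a minimal illustration, not any particular tool's API: the component kinds, names, and configuration fields below are all hypothetical stand-ins for a real modeling exercise.

```python
from dataclasses import dataclass, field

# Hypothetical building blocks of a legacy application model.
@dataclass
class Component:
    name: str
    kind: str                      # e.g., "server", "storage", "network"
    config: dict = field(default_factory=dict)

@dataclass
class ApplicationModel:
    name: str
    components: list = field(default_factory=list)
    links: list = field(default_factory=list)   # network links between parts

    def add(self, component):
        self.components.append(component)

    def connect(self, a, b):
        self.links.append((a, b))

# Deconstruct a monolithic ERP deployment into discrete building blocks.
model = ApplicationModel("legacy-erp")
model.add(Component("app-server", "server", {"cpu": 8, "ram_gb": 32}))
model.add(Component("erp-db", "storage", {"engine": "oracle", "size_gb": 500}))
model.add(Component("dmz-net", "network", {"cidr": "10.0.1.0/24"}))
model.connect("app-server", "erp-db")

# Each block can now be inspected, versioned, or swapped individually.
servers = [c for c in model.components if c.kind == "server"]
print(len(model.components), len(servers))  # 3 1
```

Once the model exists as data like this, it can be handed to containerization or software-defined-infrastructure tooling one building block at a time instead of as one monolith.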

2. Unshackle applications from infrastructure

Enterprise applications must be abstracted and separated from any dependency on the underlying infrastructure. Data sources, network configurations, and security configurations can all be abstracted. By abstracting the functions of an application into components that can run anywhere, it’s possible to move the application to different infrastructure combinations without changing a single line of code. Software-defined infrastructure allows you to compose the application from these components, thus achieving complete portability. It’s only through complete portability between cloud environments, container tools, storage options, and servers that IT organizations will break vendor lock-in and gain the flexibility to move their applications to vendors that offer the best combination of price, performance, reliability, and features.
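
Coding against an interface and injecting the backend is one common way to achieve this kind of abstraction. The sketch below is illustrative only; `BlobStore` and both backends are hypothetical, but swapping the injected backend changes the infrastructure without changing a line of application logic.

```python
import os
from abc import ABC, abstractmethod

# Hypothetical storage abstraction: application code depends only on the
# interface, never on a specific backend.
class BlobStore(ABC):
    @abstractmethod
    def put(self, key: str, data: bytes): ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(BlobStore):
    def __init__(self):
        self._data = {}
    def put(self, key, data):
        self._data[key] = data
    def get(self, key):
        return self._data[key]

class LocalFileStore(BlobStore):
    def __init__(self, root):
        self.root = root
        os.makedirs(root, exist_ok=True)
    def put(self, key, data):
        with open(os.path.join(self.root, key), "wb") as f:
            f.write(data)
    def get(self, key):
        with open(os.path.join(self.root, key), "rb") as f:
            return f.read()

def archive_invoice(store: BlobStore, invoice_id: str, payload: bytes) -> bytes:
    # Application logic is identical no matter which backend is injected.
    store.put(f"invoice-{invoice_id}", payload)
    return store.get(f"invoice-{invoice_id}")

# Moving to different infrastructure means swapping the component, not the code.
print(archive_invoice(InMemoryStore(), "42", b"total: 99.00"))
```

In a real portability effort the same pattern would cover networking and security configuration as well as storage.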

3. Create context to reduce costs

An organization’s entire application lifecycle is made up of many different application environments—different versions with different deployment flavors. Once an organization separates a legacy application into its essential components, it can maintain a catalog that makes it easy to create an almost unlimited number of new versions of the application. Using this approach, developers, test engineers, and DevOps teams can pick and choose whatever combinations of components they need, or quickly clone existing complex application environments for testing or deployment. This cloning process should take just minutes, since the catalog has all the necessary information required. This also dramatically speeds up integration testing, performance testing, planning, and migration processes.
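
A component catalog plus cheap cloning can be sketched briefly. Everything here is hypothetical (the catalog schema, the template name, the override keys); the point is that a clone is a deep copy that can diverge from the original without touching it.

```python
import copy

# Hypothetical catalog of application environments built from components.
catalog = {
    "erp-prod": {
        "components": ["app-server", "erp-db", "cache"],
        "config": {"replicas": 4, "tier": "production"},
    }
}

def clone_environment(catalog, template, new_name, **overrides):
    # Deep-copy so the clone can diverge safely, then apply
    # per-environment overrides (e.g., smaller sizing for testing).
    env = copy.deepcopy(catalog[template])
    env["config"].update(overrides)
    catalog[new_name] = env
    return env

perf_env = clone_environment(catalog, "erp-prod", "erp-perf-test",
                             replicas=1, tier="test")
print(perf_env["config"]["tier"])             # test
print(catalog["erp-prod"]["config"]["tier"])  # production (unchanged)
```

Because the catalog holds all the necessary information, spinning up a new environment becomes a lookup and a copy rather than weeks of manual setup.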

4. Build security into applications

Application security should not be tacked on after deployment. Doing so slows down continuous delivery processes such as DevOps and creates friction between the DevOps and the security teams. Instead, consider security an essential component of your overall application environment and treat it the same as any other component, by baking it into the application from the start. In this way, organizations can protect legacy applications the instant they are deployed, regardless of the infrastructure used.
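
One lightweight way to bake security in is to make a security component a hard requirement of every environment definition, checked before deployment rather than audited afterward. The gate below is a hypothetical sketch, not a real tool:

```python
# Hypothetical deploy-time gate: refuse to deploy an environment that
# lacks a security building block.
def validate_environment(components):
    kinds = {c["kind"] for c in components}
    missing = {"security"} - kinds
    if missing:
        raise ValueError(f"environment missing required components: {missing}")
    return True

env = [
    {"name": "app-server", "kind": "server"},
    {"name": "waf-policy", "kind": "security"},  # security as a first-class component
]
print(validate_environment(env))  # True
```

A check like this runs in the same pipeline as every other component, so the application is protected the instant it is deployed, on any infrastructure.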

5. Integrate tightly with DevOps

Upgrading enterprise applications is much easier when you take a point-and-click, push-button approach through a modern UI and pursue tight integration with continuous-deployment tools and DevOps processes.

In addition, enterprise applications should be on-demand, meaning the provisioning of new instances should be hands-free and fully automated. This can be accomplished by integrating virtualized application environments with the orchestration tools most companies already use to provision infrastructure. The key enhancement is directly connecting a catalog of portable, virtualized application components with those orchestration tools, including the provisioning of storage options, which also needs to be fully automated. This works because modern applications have no hard-coded dependencies on the specific data stores they use.
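
The hands-free provisioning described above amounts to iterating over a catalog entry and delegating each component to whatever orchestrator the organization already runs. The sketch simulates the orchestrator with a plain function; in practice that call would be an API request to a tool such as Terraform or Kubernetes (named here only as familiar examples).

```python
# Hypothetical glue between a component catalog and an orchestrator.
def provision(entry, orchestrate):
    # Fully automated: no human steps between "request" and "running".
    return [orchestrate(component) for component in entry["components"]]

# Simulated orchestrator for illustration; a real one would make API calls.
def fake_orchestrator(component):
    return f"provisioned:{component}"

result = provision({"components": ["app-server", "erp-db"]}, fake_orchestrator)
print(result)  # ['provisioned:app-server', 'provisioned:erp-db']
```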

Modular equals modernized

A modular view of applications should be the crux of your modernization process. It allows organizations to run and test applications in a virtualized environment and achieve three things. First, organizations can manage applications at the individual component level. Second, the data on which those applications rely becomes just another building block, rather than one more piece of the monolithic puzzle. Third, applications become infrastructure-agnostic: they can use infrastructure resources no matter where those resources are located or how many clouds are involved.
