Many organizations have successfully run their businesses for decades despite rapid changes in the technology landscape. In many cases, these businesses are enabled by applications running on legacy technologies. However, keeping such applications running poses many challenges on most legacy platforms. Although the scope of this article doesn't require a detailed treatment of the drivers for modernization, I list them below so that the reader can appreciate the need for it.
Challenges faced, and solutions that work, when embarking upon legacy application modernization.
[Table: business drivers and technology drivers for legacy modernization]
A Gartner report published in early 2019 estimated that around 90% of legacy applications would continue to serve critical business needs for many enterprises, with less than 10% moving to modern platforms. The primary reason is the number of challenges that legacy modernization projects face. This article covers those challenges and indicates how to tackle each of them.
Knowledge Deficit
This is the primary challenge in legacy application modernization because:
- The original developers have left the organization or are no longer in touch with the application
- Functional and technical documentation is missing, irrelevant, or not up to date
- The source code is heavily patched
- The sources of third-party applications and libraries in use are not available
One possible approach is to go through the code manually and identify the functionality, which is an enormous task, especially because a lot of code (usually around 30-40% of the entire code base) is redundant or no longer in use. For a leading bank in Australia, we are helping derive the documentation for a few applications using a home-grown tool. In one of the modernization projects that we delivered successfully for a leading healthcare provider in the USA, the knowledge of the business users, put together, covered hardly 60% of the application codebase. Here we used a tool-assisted approach to convert their existing code, written in the mainframe language Natural, to Java; it is now deployed and running successfully. With this approach, the dependency on technical or functional documentation is greatly reduced.
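To give a flavour of what such tooling does, here is a minimal sketch of a dead-code scan, loosely modeled on Natural's DEFINE SUBROUTINE / PERFORM pairing: subroutines that are defined but never performed anywhere in the codebase are flagged as candidates for exclusion. The regular expressions are simplified assumptions; a real tool would use a proper parser for the legacy language.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.*;
import java.util.regex.*;

// Minimal sketch: flag subroutines that are defined but never called.
// The patterns below loosely follow Natural's DEFINE SUBROUTINE / PERFORM
// syntax and are illustrative, not a faithful grammar.
public class DeadCodeScanner {
    private static final Pattern DEFINE =
            Pattern.compile("^\\s*DEFINE\\s+SUBROUTINE\\s+(\\w+)", Pattern.MULTILINE);
    private static final Pattern CALL = Pattern.compile("\\bPERFORM\\s+(\\w+)");

    public static void main(String[] args) throws IOException {
        Set<String> defined = new HashSet<>();
        Set<String> called = new HashSet<>();
        List<Path> sources;
        try (var paths = Files.walk(Path.of(args[0]))) {
            sources = paths.filter(Files::isRegularFile).toList();
        }
        for (Path p : sources) {
            String src = Files.readString(p);
            DEFINE.matcher(src).results().forEach(m -> defined.add(m.group(1)));
            CALL.matcher(src).results().forEach(m -> called.add(m.group(1)));
        }
        defined.removeAll(called);                 // definitions with no callers
        defined.forEach(name -> System.out.println("Possibly dead: " + name));
    }
}
```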
Heavy Legacy Ecosystem
In every enterprise that has been running its IT for more than two decades, a plethora of applications still exist on legacy technologies even today. In such cases, when management decides to modernize the entire legacy footprint, the impact on the business is heavy. Even when we take a single application for modernization, its upstream and downstream applications would still be on legacy technologies, with a lot of legacy protocols, file formats, etc. It is highly risky to move them all to the modern platform in one go.
A chunking approach needs to be followed in such cases: take the critical applications first and then handle the rest. We may also need to build certain tools and design the modernized application to support the legacy ways, so that there is no impact on the existing business flows during the interim; a sketch of such an interim adapter follows below. Once all applications are modernized, support for the legacy footprint can be turned off. This is the approach that we are pursuing in modernizing a large enterprise system for a leading global insurance provider.
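As an illustration of supporting the legacy ways during the interim, the sketch below shows an adapter that lets a modernized Java service keep consuming a fixed-width file produced by a legacy upstream system. The record layout (a 10-character account ID followed by a 12-digit amount in cents) is a made-up example, not a format from any of the projects mentioned.

```java
import java.io.IOException;
import java.math.BigDecimal;
import java.nio.file.*;
import java.util.*;

// Interim adapter: parses a legacy fixed-width feed into domain objects so
// the modernized service can consume it unchanged. The layout below
// (10-char account ID + 12-digit amount in cents) is a hypothetical example.
public class LegacyFeedAdapter {
    public record Payment(String accountId, BigDecimal amount) {}

    public static List<Payment> parse(Path feedFile) throws IOException {
        List<Payment> payments = new ArrayList<>();
        for (String line : Files.readAllLines(feedFile)) {
            if (line.length() < 22) continue;          // skip malformed rows
            String accountId = line.substring(0, 10).trim();
            BigDecimal amount = new BigDecimal(line.substring(10, 22))
                    .movePointLeft(2);                 // cents -> currency units
            payments.add(new Payment(accountId, amount));
        }
        return payments;
    }
}
```

Once the upstream systems are themselves modernized, the adapter is simply retired and the service switches to its native, message- or API-based input.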
Quality Assurance
Another challenge is quality assurance: the legacy application has been running successfully for years, and the modernized application has to be tested to confirm that the two are functionally equivalent.
As with every IT project, we need to ensure that all possible test scenarios and test cases are well defined, well ahead of the testing phase. User acceptance testing has to be planned well, and involving as many business users as possible helps ensure the soundness and completeness of the modern application. One of the success factors in legacy modernization projects is to involve business users right from the application discovery phase.
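One concrete technique for confirming functional equivalence is a parallel run: feed the same inputs to the legacy and the modernized system and compare their outputs record by record. The sketch below assumes line-oriented output files; the paths and format are illustrative only.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.List;

// Parallel-run comparison: the same batch input is processed by both the
// legacy and the modernized system, and their outputs are diffed record by
// record. The line-oriented file format is an illustrative assumption.
public class EquivalenceCheck {
    public static boolean outputsMatch(Path legacyOut, Path modernOut) throws IOException {
        List<String> legacy = Files.readAllLines(legacyOut);
        List<String> modern = Files.readAllLines(modernOut);
        if (legacy.size() != modern.size()) {
            System.out.printf("Record count differs: %d vs %d%n", legacy.size(), modern.size());
            return false;
        }
        boolean match = true;
        for (int i = 0; i < legacy.size(); i++) {
            if (!legacy.get(i).equals(modern.get(i))) {
                System.out.printf("Record %d differs:%n  legacy: %s%n  modern: %s%n",
                        i + 1, legacy.get(i), modern.get(i));
                match = false;
            }
        }
        return match;
    }
}
```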
Achieving Performance
Pure legacy systems such as mainframes are highly performant, especially with respect to batch processing. Achieving the same benchmark on the modern platform is a difficult task.
In these cases, the legacy applications should be analyzed to replace parts of the existing code with relevant COTS products or frameworks. In most mainframe applications, the interface between applications or programs is generally via an intermediate file. These interfaces can be redesigned to pass the data directly, avoiding file I/O altogether. Another key factor is arriving at the optimal hardware sizing. In a pilot that we performed for one of the leading record-keeping agencies in the US, we replaced a piece of code that behaved like a business rules engine with a real business rules engine. We also replaced the file-based program interfaces by passing the data directly across calls. The application was hosted on the cloud, and demand-based scaling helped keep costs under control. This helped us successfully execute a job that processed 1 million records in 1-2 minutes, whereas the legacy application processed 30 million records in 4 hours.
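The sketch below illustrates the file-elimination idea: instead of one batch step writing an intermediate file that the next step re-reads and re-parses, the steps exchange data in memory. The record layout and the steps themselves are hypothetical stand-ins.

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch of removing an intermediate-file hop between two batch steps.
// Previously, the extract step wrote its records to a flat file that the
// enrich step re-read and re-parsed; here the steps exchange data in memory.
// Record, extract() and enrich() are hypothetical stand-ins.
public class BatchPipeline {
    public record Record(String key, long amountCents) {}

    static List<Record> extract() {
        return List.of(new Record("ACC-1", 1250), new Record("ACC-2", 990));
    }

    static List<Record> enrich(List<Record> input) {
        // Example transformation: apply a 10% surcharge to every record.
        return input.stream()
                .map(r -> new Record(r.key(), Math.round(r.amountCents() * 1.10)))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Before: extract -> write file -> read file -> enrich.
        // After: the output of one step is passed directly to the next,
        // eliminating serialization, parsing, and file I/O on every hop.
        enrich(extract()).forEach(System.out::println);
    }
}
```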
Data Migration
Most legacy ecosystems would have had a long life by the time modernization begins. Hence, they would have accumulated a lot of data through business transactions. In almost all cases, this runs into gigabytes of data, if not terabytes or petabytes, typically organized across databases and files of various formats (text and binary). In addition, the legacy systems could be using EBCDIC or other encodings, whereas most modern systems operate on ASCII or ASCII-based encodings. This adds another dimension of difficulty to data migration when it is not planned well.
To take care of the differences in encoding, there are tools in the market, as well as certain ETL plugins, that help migrate the data seamlessly. If the modernized application is still expected to deal with files containing non-contemporary encodings, then we need the necessary framework to handle these at runtime. VSAM files in the legacy system can be migrated to SQL or NoSQL databases.
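As a minimal sketch of the encoding conversion, the snippet below decodes EBCDIC bytes into Java's internal Unicode representation and re-encodes them as UTF-8. It assumes the JDK's extended charsets are available (IBM037 is the common US EBCDIC code page; the right one depends on the source system), and it deliberately ignores packed-decimal (COMP-3) and binary fields, which need dedicated handling.

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

// Minimal EBCDIC-to-UTF-8 conversion. IBM037 is the common US EBCDIC code
// page shipped with the JDK's extended charsets; the correct code page
// depends on the source system. Packed-decimal (COMP-3) and binary fields
// must be handled separately: a plain charset decode would corrupt them.
public class EbcdicConverter {
    private static final Charset EBCDIC = Charset.forName("IBM037");

    public static byte[] toUtf8(byte[] ebcdicBytes) {
        String decoded = new String(ebcdicBytes, EBCDIC);   // EBCDIC -> Unicode
        return decoded.getBytes(StandardCharsets.UTF_8);    // Unicode -> UTF-8
    }

    public static void main(String[] args) {
        // 0xC8 0x85 0x93 0x93 0x96 is "Hello" in IBM037.
        byte[] sample = {(byte) 0xC8, (byte) 0x85, (byte) 0x93, (byte) 0x93, (byte) 0x96};
        System.out.println(new String(toUtf8(sample), StandardCharsets.UTF_8));
    }
}
```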
Impact on BAU
During production cut-over, there is usually a longer downtime due to the high complexity. After modernization starts, there could be continuous changes to the legacy source code for bug fixes or enhancements. Also, the UX of the application would have changed considerably, leading to an initial drop in the productivity of business users.
When the application is modernized in chunks, the downtime during roll-over can be reduced. A code-freeze period should be enforced after baselining, and once the baseline is modernized, further changes can be incorporated into the target application. Also, when business users are on-boarded into the project right from the discovery phase, their buy-in is ensured, along with their familiarization with the new UX. Extensive training needs to be planned to ensure that users are ready to work with the new application.
Conclusion
D&A’s ADvantage Modernize methodology offers a wide set of tools and approaches that come in handy in navigating almost all of the challenges detailed above.