The traditional datacenter is witnessing massive change with the arrival of new technologies, which are emerging by the day. IoT, Big Data, PaaS, and containers, among others, are being increasingly adopted. The public cloud is pervasive in any discussion of the next-generation data center built on the latest technology. On the on-premise data center side, enterprises are increasingly looking at adopting software-defined everything to utilize these technologies.
Experts observe that enterprises must adopt a nimbler, shorter-term approach to exploit new technologies fruitfully. Organizations following a standard five-year refresh plan and migrating ‘like to like’ will be at a competitive disadvantage when it comes to adopting new technologies. However, cost is an important consideration: huge capex invested in technology updates cannot simply be written off. A balance between new technology and cost must be achieved.
Technology Refresh Approach: A Closer Look
In general, enterprises used to simply upgrade EOL assets with like-to-like technology updates. There was no interlock between the application and infrastructure architects; both worked in silos. There was no established communication framework to inform stakeholders about technology updates. On top of that, an inaccurate asset register made the whole technology-refresh cycle more cumbersome, less productive, and less beneficial to the business. Enterprises must have a proactive IT technology-refresh strategy instead of waiting for assets to depreciate over five to seven years.
The primary guiding factors for an IT technology refresh are the application modernization roadmap, new business requirements, and technology optimization, which together yield significant improvements in performance, availability, and recoverability and may bring business differentiation. Other considerations are cost optimization, with a potential reduction in BAU cost, and capacity enhancements.
The latest IT technology refresh plan must focus on:
- Discovery and planning: The first step in technology-refresh planning is to assess the servers in scope. This may be done by leveraging existing data center discovery tools, monitoring tools, the CMDB, and the existing asset register maintained in spreadsheets. Enterprises may also deploy a new set of discovery tools that provide a more accurate asset database and give insight into application-to-infrastructure dependencies and app-to-app mappings. This asset database acts as the single source of truth for the whole project.
Furthermore, after finalizing the baseline volume, workshops for business-need analysis and risk assessment are conducted. A key element of the technology-refresh process is to address every phase of the technology lifecycle so that future business needs, technology requirements, financial considerations, and expansion plans are anticipated and addressed from the start.
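The consolidation into a single source of truth can be sketched as a simple merge of asset records from several sources. This is a minimal, hypothetical illustration: the source names, record fields, and precedence rule are assumptions, not a prescribed schema.

```python
# Hypothetical sketch: merge asset records from several discovery sources
# (spreadsheet register, CMDB export, discovery-tool output) into one
# baseline keyed by hostname. Field names are illustrative assumptions.

def consolidate_assets(*sources):
    """Merge asset lists keyed by hostname. Later sources override
    earlier ones, so pass sources in ascending order of trust."""
    baseline = {}
    for source in sources:
        for record in source:
            host = record["hostname"].lower()  # normalize the key
            baseline.setdefault(host, {}).update(record)
    return baseline

spreadsheet = [{"hostname": "APP01", "owner": "payments"},
               {"hostname": "db02", "owner": "finance"}]
cmdb = [{"hostname": "app01", "os": "RHEL 6", "eol": "2024-06"}]

# CMDB is passed last, so it wins on conflicting fields.
assets = consolidate_assets(spreadsheet, cmdb)
```

In practice the precedence order and the matching key (hostname, serial number, or asset tag) would be agreed during the discovery workshops.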
- Target technology selection: The most crucial part of technology management is selecting the target technology platform.
Due-diligence workshops with the enterprise architect, application architect, and infrastructure architect are required to kick-start this process. Special attention needs to be given to the application roadmap for business-differentiating applications. Subsequently, the performance, availability, and recoverability requirements need to be examined when choosing the technology platform.
A holistic approach is required to decide on the future technology platform; hence, we need to look at these processes through various technology-selection lenses.
While upgrading any particular component, we need to look at its potential impact on other components, both horizontally and vertically. For instance, we may align a re-platforming strategy with the tech refresh, such as a Unix-to-Linux migration. Other changes, such as a SAN refresh, might have no such impact.
Similarly, we need to decide on the target platform based on the business use cases: whether the application will be hosted in a public or private cloud, and at a central or an edge location.
While deciding on the new platform, the following factors must be considered: cost and technology optimization opportunities; security and compliance requirements; new or improved functionality and capabilities; reduction in administrative expenses; improved performance, availability, and recoverability; and capacity utilization.
In the meantime, the application and infrastructure architects should agree on an ‘integrated app-infra refresh calendar’ to ensure minimum disruption to the business.
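One common way to compare candidate platforms against the factors above is a weighted scoring matrix. The sketch below is purely illustrative: the criteria names, weights, and ratings are assumptions, not benchmark data.

```python
# Hypothetical weighted-scoring sketch for target-platform selection.
# Weights mirror the selection factors discussed above; all numbers
# are illustrative assumptions, not recommendations.

WEIGHTS = {"cost": 0.25, "security": 0.20, "functionality": 0.15,
           "admin_overhead": 0.10, "performance": 0.20, "capacity": 0.10}

def score(ratings):
    """ratings: criterion -> rating on a 1-5 scale."""
    return sum(WEIGHTS[c] * r for c, r in ratings.items())

candidates = {
    "public_cloud":  {"cost": 4, "security": 3, "functionality": 5,
                      "admin_overhead": 4, "performance": 4, "capacity": 5},
    "private_cloud": {"cost": 3, "security": 5, "functionality": 4,
                      "admin_overhead": 3, "performance": 4, "capacity": 3},
}

# Pick the candidate with the highest weighted score.
best = max(candidates, key=lambda p: score(candidates[p]))
```

The weights themselves should come out of the due-diligence workshops, since a compliance-heavy workload would weight security far higher than cost.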
- Technology acquisition: After the target technology and location are selected, we may progress to the PoC or technology-evaluation stage. In this phase, we test the integrations between the various components and between locations, and validate the use cases.
- Project execution: Once we have all the prerequisites, i.e., a sanitized asset database, the target technology platform, the target location, the technology-evaluation or PoC results, and the integrated app-infra refresh calendar, we proceed to write the design documents, plan the move groups, and subsequently start the migration.
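Move-group planning can lean directly on the app-to-app mappings captured during discovery: applications that depend on each other should migrate in the same window. A minimal sketch, assuming the dependency pairs shown are illustrative, groups apps into connected components of the dependency graph.

```python
# Hypothetical sketch: derive move groups from app-to-app dependency
# mappings so tightly coupled applications migrate together.
# The dependency pairs below are illustrative assumptions.

def move_groups(dependencies):
    """Group apps into connected components of the dependency graph."""
    adjacency = {}
    for a, b in dependencies:
        adjacency.setdefault(a, set()).add(b)
        adjacency.setdefault(b, set()).add(a)
    seen, groups = set(), []
    for app in adjacency:
        if app in seen:
            continue
        stack, group = [app], set()
        while stack:                     # iterative graph traversal
            node = stack.pop()
            if node in group:
                continue
            group.add(node)
            stack.extend(adjacency[node] - group)
        seen |= group
        groups.append(sorted(group))
    return groups

deps = [("billing", "crm"), ("crm", "reports"), ("hr", "payroll")]
groups = move_groups(deps)   # billing/crm/reports together; hr/payroll together
```

Each resulting group then maps to one slot on the integrated app-infra refresh calendar.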
Communication among the key stakeholders is the most vital part of the project-execution phase. The PMO group plays an anchor role in ensuring that the defined communication framework is followed. During the discovery phase, the PMO and all key architects must work in resonance. Similarly, during the technology-selection phases, the business stakeholders should also get involved to clearly convey the business requirements, the application modernization roadmap, and the performance and availability requirements. This helps in finalizing the optimal future technology platform.