
Containers and microservices: Leveraging AI to make businesses agile, flexible, and scalable

October 19, 2021

What is on the menu when I think of a container? Does it come as a combo meal, or can I customize and tune containers to build my own services? Let me begin with a short drive through the evolution containers have witnessed.

While virtual machines are independent, heavyweight compute machines that run on dedicated operating systems, containers are lightweight processes that share the OS of the underlying machine. This is why containers are replacing VMs in the world of rapid and continuous development.


Figure 1: Evolution of containers

Turning challenges into opportunities

With the emergence of containers, the challenge of managing them across different container runtimes (not just Docker) was addressed with orchestration tools such as Kubernetes and Mesos. The OpenShift Container Platform fills the gaps Kubernetes leaves in areas such as monitoring, enterprise log management, and CI/CD integration. Microservices, on the other hand, have brought pace to the application architecture space, addressing issues around maintainability, coupling, testing, surgical changes, and more. Containers are best suited for microservices: both are lightweight and highly distributed, offer portability, and follow the develop-test-destroy model. So, migrating to containerized microservices or opting for greenfield deployment is a no-brainer.
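To make the orchestration idea concrete, here is a minimal sketch of deploying a containerized microservice with the official Kubernetes Python client. The service name, image, and namespace are hypothetical placeholders, not part of any real deployment.

```python
# Minimal sketch: deploying a containerized microservice on Kubernetes
# using the official Python client. Names and image are hypothetical.
from kubernetes import client, config

config.load_kube_config()  # uses the local kubeconfig for cluster access

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="orders-service"),
    spec=client.V1DeploymentSpec(
        replicas=3,  # scale the microservice horizontally
        selector=client.V1LabelSelector(match_labels={"app": "orders"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "orders"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="orders",
                        image="registry.example.com/orders-service:1.0",  # hypothetical image
                        ports=[client.V1ContainerPort(container_port=8080)],
                    )
                ]
            ),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```

The orchestrator then keeps the declared replica count running, restarting or rescheduling containers as nodes come and go.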

But what about the applications that cannot be containerized? Worry not. A container platform facilitates the coexistence of VMs and containers on the same platform. Kubernetes supports Custom Resource Definitions (CRDs), which allow external resources to be integrated as if they were native. KubeVirt is one such add-on for Kubernetes built on CRDs. It provides an API for virtualization that works through the same mechanisms as other Kubernetes entities, such as pods, helping in the quick and seamless creation of VMs inside pods. This way, VMs can run and behave the same way as before while benefiting from Kubernetes capabilities.
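As a rough illustration of the CRD mechanism, the sketch below creates a KubeVirt VirtualMachine through the generic custom-objects API of the Kubernetes Python client. The VM spec is a simplified example; the container-disk image is taken from the public KubeVirt demos and is used here purely illustratively.

```python
# Minimal sketch: creating a KubeVirt VirtualMachine via its CRD,
# using the generic CustomObjectsApi. Spec is simplified/hypothetical.
from kubernetes import client, config

config.load_kube_config()

vm_manifest = {
    "apiVersion": "kubevirt.io/v1",
    "kind": "VirtualMachine",
    "metadata": {"name": "legacy-app-vm"},
    "spec": {
        "running": True,
        "template": {
            "spec": {
                "domain": {
                    "devices": {"disks": [{"name": "rootdisk", "disk": {"bus": "virtio"}}]},
                    "resources": {"requests": {"memory": "1Gi"}},
                },
                "volumes": [
                    {"name": "rootdisk",
                     "containerDisk": {"image": "quay.io/kubevirt/cirros-container-disk-demo"}},
                ],
            }
        },
    },
}

# The VM is created and managed like any other Kubernetes resource.
client.CustomObjectsApi().create_namespaced_custom_object(
    group="kubevirt.io", version="v1", namespace="default",
    plural="virtualmachines", body=vm_manifest,
)
```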

Once I have a VM, can I run Kubernetes inside that VM? The answer is yes. This is Kubernetes inside Kubernetes, a use case that helps create a test Kubernetes environment for any upgrade, and the loop never ends. We know that the sky is the limit for innovation. Given the exponential increase in digital data, with ever wider variants and trends, data analytics has conquered both the static and streaming data space, building artificial intelligence with advanced, flexible compute engines. The data fabric is an architecture with a set of data services that provides consistent capabilities, harnessing the wealth of current data to create new business value through data analytics.

In other words, it requires a uniform view of the data, irrespective of where the data resides. Containers help achieve this, including persisting the configuration of the data they hold. Container platforms also provide built-in tools for distributed data access, enabling common data-oriented interfaces that support multiple data models. Toolkits such as Kubeflow make it possible to use ML pipelines to orchestrate complex workflows running on Kubernetes at production scale.
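A minimal sketch of what such an ML pipeline can look like, assuming the Kubeflow Pipelines v2 SDK (kfp) is installed; the component names, the data path, and the model identifier are illustrative only.

```python
# Minimal sketch of a Kubeflow Pipelines workflow (kfp v2 SDK assumed).
# Component logic is a placeholder; names and paths are hypothetical.
from kfp import dsl, compiler

@dsl.component
def prepare_data() -> str:
    # placeholder for a real data-preparation step
    return "s3://example-bucket/prepared"  # hypothetical location

@dsl.component
def train_model(data_path: str) -> str:
    # placeholder for a real training step
    print(f"training on {data_path}")
    return "model-v1"

@dsl.pipeline(name="demo-training-pipeline")
def training_pipeline():
    data = prepare_data()
    train_model(data_path=data.output)

# Compile to a pipeline definition that Kubeflow can execute on Kubernetes.
compiler.Compiler().compile(training_pipeline, "pipeline.yaml")
```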

Leaders in the oil and gas, automobile, defense, and clinical research spaces have already adopted Kubernetes to accelerate data science workflows and build intelligent applications. Various technology and microservices companies continue to collaborate on additional capabilities, from management and monitoring to security and access controls, to build business-scale AI applications that can be deployed into production.

Synergizing containers, microservices, and artificial intelligence

AI and microservices are parallel disruptive technologies that evolved on their own ladders over the past decade and were never expected to converge and work together. Today, AI and microservices complement each other.


Figure 2: How containers, microservices and AI complement each other

AI applications are typically developed across various technology stacks, programming languages, and databases. These applications reap the capabilities of each of these technologies, which makes the resulting services difficult to integrate and manage. With the inception of microservices, however, creating independent, loosely coupled, and scalable applications across a diverse technology spectrum has become easier. With the level of isolation microservices provide, the failure of one service does not impact any other, expediting troubleshooting, changes, and testing. This way, microservices enable quicker, parallel development of AI models.
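A minimal sketch of this pattern: an AI model wrapped as an independent microservice with FastAPI so it can be containerized and scaled on its own. The scoring function is a stand-in for a real trained model, and the endpoint name is hypothetical.

```python
# Minimal sketch: an AI model exposed as an independent microservice.
# The "model" is a placeholder; a real service would load a trained model.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class PredictionRequest(BaseModel):
    features: list[float]

def fake_model(features: list[float]) -> float:
    # placeholder scoring logic standing in for real inference
    return sum(features) / max(len(features), 1)

@app.post("/predict")
def predict(request: PredictionRequest):
    score = fake_model(request.features)
    return {"score": score}

# Run locally with:  uvicorn service:app --port 8080
```

Because the service owns its own stack and dependencies, it can be rebuilt, redeployed, and scaled without touching any other part of the application.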

How can AI strengthen microservices? While microservices are the architecture of choice for any distributed application, they come with challenges: complex networking, fault tolerance, latency, load balancing, message formats, frequent changes, and managing services across multi-cloud platforms. Innovations in microservices implementation have delivered self-healing capabilities for faster recovery through automated remediation. But auto-remediation relies on static thresholds such as message length, response time, wait duration, and average memory utilization. When these thresholds are exceeded, the system self-heals using the configured recovery mechanism. However, this is reactive rather than proactive recovery.
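A rough sketch of such threshold-based remediation, assuming a hypothetical monitoring query and service name: when a static response-time threshold is breached, the deployment is simply restarted.

```python
# Minimal sketch of static-threshold auto-remediation: restart a
# deployment when response time exceeds a fixed limit.
# Metric source, threshold, and service name are hypothetical.
import datetime
from kubernetes import client, config

RESPONSE_TIME_THRESHOLD_MS = 500  # static threshold, as described above

def get_avg_response_time_ms(service: str) -> float:
    # placeholder: in practice this would query a monitoring system
    return 620.0

def restart_deployment(name: str, namespace: str = "default") -> None:
    # Trigger a rollout restart by patching the pod template annotation.
    now = datetime.datetime.utcnow().isoformat()
    patch = {"spec": {"template": {"metadata": {
        "annotations": {"kubectl.kubernetes.io/restartedAt": now}}}}}
    client.AppsV1Api().patch_namespaced_deployment(name, namespace, patch)

config.load_kube_config()
if get_avg_response_time_ms("orders-service") > RESPONSE_TIME_THRESHOLD_MS:
    restart_deployment("orders-service")  # reactive recovery step
```

The limitation is visible in the code itself: nothing happens until the threshold has already been crossed.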

This is where AI can help, by analyzing data patterns and making the system intelligent and robust. For instance, as microservices grow in complexity and scale, capacity planning becomes unpredictable and unstable. AI can offer smart resource management and deliver substantial cost savings without compromising the stability and robustness of the service. AI can alert support staff not only when anomalies are observed but also when the system is expected to experience one. Microservices security is another area where AI already adds value by learning from security-breach patterns.
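As one illustrative sketch of the proactive approach, a simple anomaly detector can learn normal resource-usage patterns and flag drift before any static threshold trips. The metrics below are synthetic, and IsolationForest is just one of many techniques that could be used.

```python
# Minimal sketch: learning normal resource-usage patterns so anomalies
# can be flagged before static thresholds are breached.
# Metrics are synthetic; IsolationForest is one illustrative choice.
import numpy as np
from sklearn.ensemble import IsolationForest

# columns: CPU %, memory %, response time (ms), one sample per minute (synthetic)
rng = np.random.default_rng(42)
normal_metrics = rng.normal(loc=[40, 55, 120], scale=[5, 5, 15], size=(1000, 3))

detector = IsolationForest(contamination=0.01, random_state=42)
detector.fit(normal_metrics)

# a new sample drifting upward, though still below any hard threshold
latest = np.array([[58, 70, 180]])
if detector.predict(latest)[0] == -1:
    print("Anomalous resource pattern detected - alert support staff early")
```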

The HCLTech advantage

Microservices are used to build AI models in containers, and these AI containers in turn strengthen microservices. That is how they complement one another. So, an atomic unit, the container, can offer me a "combo meal" of complicated solutions or an "à la carte" of required services with the desired agility. The architects' work does not end here, however. They are actively researching ways to advance automation in containers, centralized management, polyglot services, and state-of-the-art AI models in the innovative IT world.

Going beyond simple AI microservices, HCLTech Hybrid Cloud has developed a framework called EdgeLITy, which offers IoT solutions for remote and edge locations and covers operations, security governance, network modernization, and process management. EdgeLITy uses containerized microservices of AI models on edge compute machines. To know more about EdgeLITy and HCLTech Hybrid Cloud, click here.
