AI adoption is accelerating across telecom and adjacent industries, but the panel at HCLTech’s stand at Mobile World Congress 2026 made a clear distinction: efficiency gains are only the beginning. The deeper shift comes when AI stops being layered onto existing systems and begins reshaping how enterprises operate at their core.
Moderated by Dr. Saikat Chaudhuri, Professor, UC Berkeley, the discussion brought together Loh John Wu, Chief Operating Officer at Axiata Digital Labs; Pavan Ramkishan Bachwal, Head of Mobile Financial Services at Ericsson; and Hari Nair, Senior Vice President & Head – Strategic Engagements at HCLTech. Across their perspectives, a common thread emerged: AI-native transformation is less about tooling, and more about the redesign of data, processes, culture and business models.
From productivity gains to revenue lift
For Wu, the starting point has been tangible efficiency. Within Axiata Digital Labs’ development teams, enabling engineers to use AI to reduce the amount of code they write has been impactful, with the organization seeing “about a 20, 30%...efficiency gains” in its own development centers. That productivity shift, however, is not the end goal. It’s a stepping stone.
The more strategic impact lies in personalization and revenue growth. In lower-ARPU markets across Southeast Asia, the focus has been on how to uplift ARPU through more targeted plans and products. Wu pointed to success in Cambodia, where AI-enabled personalization helped drive “almost $5 plus…ARPU,” a significant milestone for an Asian telco.
Beyond consumer segmentation, Wu described how AI is democratizing capability creation itself. What began as a traditional API marketplace evolved into an AI-powered app creation model, where “the next version of that product…called App Maker, is now powered with AI,” allowing businesses to use natural language to describe what they want to build.
AI-native, in other words, is not just about internal efficiency. It’s also about reducing friction so that more participants in the ecosystem can create and monetize value faster.
Speed, security and segmentation
Bachwal framed impact through two lenses: internal operations and customer-facing services.
On the operational side, AI has dramatically compressed remediation cycles. Securing vulnerabilities “used to take three weeks,” he said, but is “now looking three days.”
The significance goes beyond productivity: it is resilience and responsiveness in environments where zero-day risks and compliance pressures continue to rise.
Externally, the opportunity lies in segmentation and behavioral insight. AI is enriching data and making it easier to detect patterns, enabling differential services and more granular targeting. But he was clear that this only works if the foundations are strong. Before advanced personalization, the organization had to fix its data because, as he put it, “data in junk is data out junk.”
Bachwal also challenged the notion of a standalone AI-native firm. In his view, “there's no independent AI native enterprise…they all need to work in an ecosystem.” For telecom and financial services players in particular, hyperscalers, database partners and operational tooling providers form a critical value chain. AI-native, in practice, is ecosystem-native.
Modernization as a butterfly effect
Nair brought the conversation back to enterprise architecture. One of the most powerful mindset shifts he has seen among CIOs has come through legacy application modernization. It is now possible to compress the timelines of a multi-year transformation effort “with the help of GenAI.” The acceleration changes the economics and feasibility of modernization programs.
What follows, he argued, is a cascading effect. As application landscapes collapse, operations simplify, supplier ecosystems shrink and infrastructure footprints reduce. He described this dynamic as “a butterfly impact,” noting that a “simple legacy app modernization can change your complete operating model itself.”
Yet Nair was adamant that scaling AI-native capabilities hinges less on the toolset and more on organizational readiness. If there is one lever to prioritize, it is change management.
“Focus on change management. That's the singular aspect which will help in adoption of AI and scaling of AI.”
Technology may be abundant, but adoption requires people to trust, adapt and step back.
Culture, trust and the mindset shift
Across the panel, culture emerged as the primary constraint.
Wu reflected on how even non-technical teams must rethink how they work, recalling how marketing outreach was being done manually when “you can have an AI agent” to filter and prioritize engagement. The challenge is not access to AI. It is habit.
Bachwal pointed to workforce transition: experienced engineers with decades of coding experience must learn to integrate AI without fearing displacement, while newer entrants must balance AI fluency with foundational skills. Blending those cultures is as critical as upgrading infrastructure.
For Nair, the transformation requires a deeper shift in trust. If AI is automating cognition and supporting decision-making, leaders must be willing to let it take a more central role. Without that cultural acceptance, even the most sophisticated deployments will stall.
Governance, power and partnership
When asked what they would change to accelerate progress, the answers underscored how multidimensional AI-native transformation has become.
Bachwal raised a pragmatic infrastructure concern: the transition from CPUs to GPUs and the associated cost and power implications. The challenge is not simply to “rip and replace,” but to design a transitional model that remains sustainable while enabling AI workloads at scale.
Wu focused on governance. With a proliferation of tools, models and platforms, organizations risk fragmentation. Guardrails, whether through LLM gateways or shared standards, are essential to prevent teams from diverging in ways that create long-term complexity.
And Nair returned to organizational alignment. Even in a world of powerful models, success depends on leadership focus and coordinated change.
From efficiency to reinvention
As the discussion closed, a clear synthesis emerged that Chaudhuri summarized. The industry is on an important journey, but there is still significant work ahead. AI has already demonstrated cost efficiencies and productivity gains. The real end game, however, lies in moving from cost reduction to revenue enhancement and from automation to new value creation.
Crucially, the panel reinforced that the technology itself is not the primary constraint. Models are advancing rapidly. Infrastructure is evolving. The harder challenges are strategic and organizational: redesigning business models, reshaping processes, aligning ecosystems and driving cultural change.
Culture and mindset surfaced repeatedly as the biggest barriers. Without leadership conviction and workforce alignment, even the most advanced AI deployments will stall. At the same time, trust cannot be treated as a purely ethical debate. Reliable data, dependable systems and robust governance are prerequisites for sound decision-making. If enterprises can’t rely on the outputs, they will hesitate to embed AI into core workflows.
Finally, scale will not come from trying to transform everything at once. Progress requires focus: selecting a few high-impact use cases, working with the right partners, building ecosystems deliberately and iterating with discipline.
AI-native enterprises will not emerge overnight. But those that move beyond isolated experiments, and instead redesign foundations, operating models and value chains, will define the next era across platforms, networks and industry.