The intelligent network economy: Emerging trends in AI, automation and ecosystem growth

At MWC 2026, a discussion at HCLTech’s booth examined whether today’s networks are prepared for AI-scale workloads and what must change for the intelligent network economy to take shape.
3 min read
Nicholas Ismail
Global Head of Brand Journalism, HCLTech
At MWC 2026, Hari Nair, Senior Vice President & Head – Strategic Engagements, HCLTech, and Manish Gulyani, Vice President and Head of Marketing for Network Infrastructure at Nokia, explored a question that often receives less attention than models and GPUs: is the network ready for AI?

AI focus, network gap

Nair opened with a comparison to early railroads. Bigger locomotives did not transform economics on their own; standardizing and interconnecting the rail infrastructure did.

He suggested the AI industry may be in a similar phase: focused heavily on models and compute, while the underlying network foundation remains underexamined.

“All the discussion in all the forums is mostly about compute models and the GPU,” he said. “There’s almost no talk about the network,” Gulyani agreed.

Yet AI workloads, particularly as they expand beyond text to image, video and audio, depend on large-scale connectivity. Inferencing workloads often require significant uplink traffic, placing new pressure on performance and capacity.

Citing recent Nokia research surveying 2,000 experts in telco, cloud and enterprise across Europe and North America, Gulyani noted that “over 80% of the customers responded that they didn’t think the network was ready to take on AI workloads.”

Nokia’s Global network traffic report reinforces that concern, estimating that about 70% of global WAN traffic will still come from non-AI sources by 2034. Yet AI traffic will be the biggest area of growth over that period, expanding at a 23% CAGR versus 15% for other traffic.

The implication: AI scale requires network evolution, not just compute expansion.

Enterprise readiness: Data, programmability and automation

From an enterprise standpoint, Nair identified three structural gaps.

1. Data liquidity

Enterprise data remains siloed, limiting its usability. Organizations must modernize governance and establish a trusted data foundation before AI can operate effectively.

2. Networks remain largely static

Programmability is required to align network behavior with dynamic AI workloads.

3. Operational automation remains incomplete

Moving toward ticketless, automated workflows is critical.

“Right now, it’s still miles to go for an infrastructure to be AI-ready. I would say most of the infrastructure which we have is AI-enabled,” he said.

Autonomous networks: Progress and reality

On autonomous and zero-touch networks, Gulyani was candid: “The marketing slides claim we are already autonomous, but the reality of our customers is [that] most are probably best case, level two, partially level three.”

That typically means rule-based automation and limited closed-loop integration between provisioning and assurance.

True autonomy requires intent-based networking, including defining outcomes rather than configurations, along with digital twins, programmable APIs and highly automated architectures capable of following workloads wherever compute resides.
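The distinction between declaring outcomes and writing configurations can be sketched in code. The example below is a hypothetical illustration, not any vendor’s API: a single declarative intent is compiled into per-device settings by a controller, which is the one-way half of what a real intent-based system does (the other half, continuous closed-loop verification, is omitted here).

```python
from dataclasses import dataclass

# Hypothetical sketch of intent-based networking: the operator states the
# outcome the workload needs; a controller derives device configuration.

@dataclass
class Intent:
    workload: str          # e.g. "ai-inference"
    max_latency_ms: float  # outcome the network must deliver
    min_uplink_gbps: float # capacity the workload needs

def compile_intent(intent: Intent, devices: list[str]) -> dict[str, dict]:
    """Translate one declarative intent into per-device configuration.

    A production controller would also monitor the live network and
    re-plan when the outcome is violated; this sketch only translates.
    """
    configs = {}
    for device in devices:
        configs[device] = {
            "qos_class": "low-latency" if intent.max_latency_ms < 10 else "best-effort",
            "reserved_uplink_gbps": intent.min_uplink_gbps / len(devices),
            "workload_tag": intent.workload,
        }
    return configs

# Usage: one intent fans out into configuration for every edge device.
intent = Intent(workload="ai-inference", max_latency_ms=5, min_uplink_gbps=40)
configs = compile_intent(intent, ["edge-1", "edge-2"])
```

The point of the pattern is that the operator never touches `qos_class` or uplink reservations directly; those are derived, so the same intent can follow a workload to whichever devices currently host it.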

“We have the tools,” he said, “we just have to get it going.”

From system integration to ecosystem orchestration

As AI ecosystems become more distributed, spanning hyperscalers, connectivity providers, model developers and sovereign frameworks, Nair suggested the traditional role of system integrator is evolving.

Rather than connecting discrete systems, organizations must now coordinate entire ecosystems.

No single player controls the full stack. Network providers deliver connectivity. Hyperscalers deliver compute. Model developers build AI capabilities. Governments impose sovereignty and governance requirements. Orchestrating these layers into cohesive, operational environments becomes central.

Gulyani emphasized that different AI use cases, such as frontier model training, sovereign inferencing and enterprise deployments, require different architectural blueprints. Economic viability depends on optimizing each layer for cost, power efficiency and performance.

“It comes down to cost per token,” he said. “How do you reduce power? How do you reduce the CapEx required?”
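The cost-per-token framing can be made concrete with a back-of-envelope calculation. All figures below are illustrative assumptions for the sketch, not numbers from the discussion: amortized CapEx and power cost per second are summed and divided by token throughput.

```python
# Illustrative arithmetic only: every input value here is an assumption.

def cost_per_token(capex_usd: float, lifetime_years: float,
                   power_kw: float, usd_per_kwh: float,
                   tokens_per_second: float) -> float:
    """Approximate USD cost per token for a single inference node."""
    seconds_per_year = 365 * 24 * 3600
    capex_per_second = capex_usd / (lifetime_years * seconds_per_year)
    power_per_second = power_kw * usd_per_kwh / 3600  # kWh price -> per second
    return (capex_per_second + power_per_second) / tokens_per_second

# Example: a $250k node amortized over 5 years, drawing 10 kW at
# $0.10/kWh, serving 50,000 tokens per second.
cpt = cost_per_token(250_000, 5, 10, 0.10, 50_000)
```

Even this toy model shows why the two levers Gulyani names dominate: halving power draw or CapEx each cuts its term of the numerator directly, while throughput improvements divide the whole expression.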

What success looks like in five years

Looking ahead, Nair outlined four signals that AI adoption is meaningful rather than cosmetic.

  1. AI must move into revenue-generating processes, not remain confined to support functions.
  2. Decision velocity must accelerate. “If your decision velocity is still the same old, then I think AI is not really making an impact,” he said.
  3. Cost transformation must be structural, not incremental.
  4. Cultural adoption is essential. “It’s all about people, and a culture and mindset change.”

He described AI as “the mechanizing of thought.” If organizations continue making decisions manually without trusting automated intelligence, transformation remains incomplete.

Network as foundation

The session closed with a shared conclusion: AI cannot scale without network evolution.

Compute innovation alone is insufficient. Capacity, programmability, automation and security must advance in parallel for AI workloads to become economically sustainable.

As enterprises pursue AI-driven growth, the network becomes less a background utility and more a central enabler of the intelligent network economy.
