Organizations that were innovators or early adopters, or that had prior implementations, began with a focus on incubating new technologies. This led to use case-based deployments that were built from scratch, primarily in greenfield mode. However, the situation has begun to change.
Enterprises are becoming more distributed even as they grow ever more connected. By leveraging cloud-based, data-driven analytics, these organizations are tapping into new dimensions through which they can leverage data for better insights. In fact, we at HCLTech have used these principles for many of our customers across a number of varied industries to drive maximum business value from data transformation platforms. Among many others, we are proud to count a US-based telecommunications giant, a leading market research company, and a Europe-based financial services enterprise as satisfied customers.
Today, most enterprises are either in the process of scaling digital across all lines of business or have already done so. This is particularly true where analytics and data scaling initiatives need to align with new business models. Organizations are now working overtime to ensure this alignment, modernizing their legacy systems through new architectures and principles.
Clearly, the time has come for organizations with heavy investments in cloud and big data ecosystems to reimagine their traditional architecture. Rather than continuing to create monolithic platforms, companies need to focus on a connected data ecosystem where big data, cloud, hybrid, and traditional systems can coexist in mutually beneficial harmony.
The New Paradigm
There is little doubt in my mind that we are witnessing a paradigm shift in the digital data ecosystem. One of the critical features of this change is the increasing focus on keeping data federated while staying connected. This shift has driven the integration of advanced technologies like artificial intelligence and machine learning into data management and operations, unlocking greater business value from data.
At first, businesses simply incubated big data capabilities in entrenched silos, primarily for point use cases. Over time, as they began to experience the benefits of distributed and connected systems, they moved towards cloud data ecosystems offered by players like Azure and AWS. But the overall approach remained siloed, and modern cloud-based implementations typically occurred in isolated pockets across the enterprise. Now, however, organizations no longer want to be constrained to point solutions. They want to deploy these ecosystems at the enterprise level.
Our own big data and cloud analytics customers are now asking us to help them scale up these pocket ecosystems. Today they want to leverage digital across every line of business in their organizations. This is no surprise. After all, data ecosystems connected across digital platforms enable businesses to execute multiple use cases and potentially generate exponential value from their data assets. In addition to scaling up their digital data ecosystems, future-facing organizations also seek to apply Agile and DevOps processes across their data ecosystem. And this is what makes this new paradigm such an interesting challenge.
So how do we bring all these potential capabilities together and connect this distributed ecosystem in such a way that it helps companies drive digital innovation and business outcomes?
Data Fabric: Taking Data to Scale
I like to think of this new seamless ecosystem as a Data Fabric, where each thread is a data set with its own capability. These multiple threads are woven together to create new functionality. This is the crux of the paradigm shift. The key goal is to make data more accessible, faster, and actionable. And we can accomplish this through automation, simplicity, repeatability, and discovery.
Adhering to these four tenets can ensure that a large data ecosystem remains stable as it is scaled up and out. From our perspective, a modern Data Fabric architecture pattern is the key to connecting application performance with application functionality. By adopting a rapid, pattern-based development approach, businesses can reduce data ingestion and preparation time in one go. In this way, a Data Fabric approach can encompass data of any scale and origin while honoring its unique sensitivity.
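To make the pattern-based idea concrete, here is a minimal, hypothetical sketch (the names `SourceConfig` and `ingest` are illustrative, not part of any HCLTech product): each data source is described declaratively, and one reusable pipeline pattern handles ingestion for all of them, so onboarding a new source is configuration rather than new code, and each source's sensitivity is honored from the start.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch of pattern-based ingestion: every source is
# described by a small config, and a single reusable function applies
# the same ingestion pattern to all of them.

@dataclass
class SourceConfig:
    name: str
    fmt: str            # e.g. "csv", "json"
    sensitivity: str    # e.g. "public", "pii" — honored per source

def ingest(config: SourceConfig, raw_records: List[dict]) -> List[dict]:
    """One pattern, many sources: tag each record with its origin and
    sensitivity so downstream harmonization stays uniform and repeatable."""
    return [
        {**record, "_source": config.name, "_sensitivity": config.sensitivity}
        for record in raw_records
    ]

# Registering a new source is configuration, not new code:
billing = SourceConfig(name="billing", fmt="csv", sensitivity="pii")
records = ingest(billing, [{"account": 1, "amount": 42.0}])
```

Because every source flows through the same pattern, the ecosystem stays stable as more threads are woven into the fabric: automation and repeatability come from the shared function, simplicity from the declarative config, and discovery from the uniform metadata tags.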
In practical terms, the Data Fabric approach lets businesses speed up their data warehousing processes and gain a single, harmonized view of their operations across the enterprise. They can also reduce overall costs in the immediate term and benefit from minimal incremental expense as the organization continues to grow.
Case Study: Wireless Operator Deploys Data Fabric to Gain a Single View of Customer
The benefits of this approach are best explained through a real-world example. A leading wireless network operator in the US, with over 70 million customers, was looking for a way to gain an edge over the competition, and partnered with HCLTech to achieve this goal. We knew the market was fierce and mature, and that the client needed a data-driven strategy that would let them offer unique, tailored offers and promotions to their customers, leading to market disruption.
For this, they would need to consolidate over 80 distributed data sources, with over 1.7 petabytes of data, while also managing a daily incremental load of ten billion records. HCLTech helped them establish a next-generation solution that wasn't just another monolithic silo. Instead, we developed a data lake designed for business intent, drawn from a number of distributed data lakes within the enterprise.
We employed a Data Fabric approach, like the one described earlier, to connect these disparate sources. Through it, the client was able to ingest, harmonize, and visualize raw data on a centralized platform for sharper business insights.
HCLTech also helped the client establish a Transformation Library of reusable data transformation functions, as well as a Data Catalogue that enabled business users across departments and functions to access data sets centrally. A key value addition in this project was HCLTech's creation of a sandbox environment that allowed our client to perform data discovery and execute preliminary advanced analytics.
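The Transformation Library idea can be sketched in a few lines. This is an illustrative toy, not the client's actual implementation: the registry `TRANSFORMS`, the `transform` decorator, and the `normalize_phone` function are all hypothetical names chosen to show how reusable transformations might be registered once under a discoverable name and then shared across teams and pipelines.

```python
from typing import Callable, Dict

# Hypothetical sketch of a Transformation Library: a registry of
# reusable, named transformation functions that any team can look
# up and apply, instead of re-implementing the same cleanup logic.

TRANSFORMS: Dict[str, Callable[[dict], dict]] = {}

def transform(name: str):
    """Register a reusable transformation under a discoverable name."""
    def register(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
        TRANSFORMS[name] = fn
        return fn
    return register

@transform("normalize_phone")
def normalize_phone(record: dict) -> dict:
    """Strip formatting so phone numbers compare consistently."""
    digits = "".join(ch for ch in record.get("phone", "") if ch.isdigit())
    return {**record, "phone": digits}

# Any pipeline (or business user's tool) can discover and reuse it:
cleaned = TRANSFORMS["normalize_phone"]({"phone": "(555) 010-1234"})
# cleaned["phone"] == "5550101234"
```

Pairing such a registry with a catalogue of data sets is what turns scattered one-off scripts into a shared, governed library, which is precisely the reuse the Transformation Library delivered.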
The results of these transformations were remarkable. The client was able to rapidly compile numerous disparate data sources and use them to analyze new business models for marketing and operations. Further, an Agile Analytics approach enabled new promotions to move quickly from inception through delivery and monitoring. Overall, the client gained a single unified view of their customer, which made processes like billing, plans, promotions, and devices pain-free, leaving both customers and the business delighted.
Scaling New Heights
In an increasingly competitive and cutthroat marketplace, organizations need to be able to leverage data to make lightning-fast decisions. With petabytes of information amassing over time, companies need this data to become more easily available, accessible, consumable, and actionable, centrally. The complexity of data adoption at scale demands solutions not just within certain pockets but at the enterprise level.
Only through an effective data scaling strategy can organizations gain the insights that allow them to learn fast and act faster. From using social data to gauge the effectiveness of a new product launch campaign, to assessing real-time data generated by devices in clinical trials, to providing well-defined and prompt responses to regulatory queries, fast and accurate data is key.
HCLTech’s digital-at-scale propositions have already enabled a number of organizations to successfully execute such transformations. With our DevOps-based operating model and our Data Fabric approach, HCLTech’s solutions are aimed at engineering a large data transformation landscape while also modernizing legacy and traditional data ecosystems. This is the summit of digital transformation that modern enterprises need to conquer.