IT leaders, and the executive teams they report to, have been bombarded with a virtual “shock and awe” campaign around Big Data. IDC estimates that the 1.8 zettabytes (1.8 trillion gigabytes) of information generated in 2011 will grow by a factor of nine over the next five years. Gartner takes a similar view of the segment, predicting that the Big Data market, now valued at $5 billion in annual revenues, will explode to $53 billion by 2016.
The initial reaction, and rightfully so, is “Wow!” and “How in the world are we going to deal with all this?”
While considerable attention has been placed on the three Vs of Big Data — volume, velocity, and variety — the most important aspect has been on the back burner: the actual value to the business.
What will separate the Big Data winners from the losers is their ability to sift through the mounds of new and emerging data to uncover the few precious nuggets with significant business value. And that will require something that few, if any, companies have today: the ability to make Big Data seem small.
Clearly, there is more — and more valuable — data available to corporations than ever before. But it’s all over the place and in all kinds of formats. Figuring out how to efficiently capture, process and analyze all that information is daunting and, at the moment, virtually impossible — which is why companies shouldn’t even try.
Instead, we’re going to see sophisticated companies creating methods for discerning which types of incoming data are likely to have business value. They won’t form a dozen executive committees, hold endless meetings, and develop five-year plans. They will instead create agile, yet relevant and actionable, data blueprints that, overlaid onto their business objectives, clarify their Big Data analytics initiatives. If a priority is to enhance customer experience, for example, they will focus only on information that can improve the supply chain, time to market, or customer service, as appropriate. This data blueprint will be mapped to a subset of new IT capabilities that quickly deliver value to the business.
The Big Data focus will differ for every company. But one thing is clear: without that blueprint and frequent iterations to test for business value, there will be some major Big Data disasters. Businesses that remain absorbed by the enormity of the task will overspend on data warehousing and capacity, and the computing power required to keep pace with the explosion of data will send costs skyrocketing.
Meanwhile, companies will have to absorb the opportunity costs of spinning their wheels on the wrong priorities. Big Data spending may not only consume the IT budget; it could become one of the business’s biggest costs. For a preview, those that refuse to focus their efforts and iterate rapidly need only look back at the multimillion-dollar ERP failures of companies that overspent without a value compass.
What will separate wasted investment from effective transformation, and determine who converts the Big Data opportunity into value, will be focus and agility. That’s what will make Big Data small.
This doesn’t mean that a tight, central group will oversee an enterprise’s Big Data strategy and operations. Instead, the owners and stakeholders of Big Data efforts will expand and change over time — the CIO or CMO one day, the COO or line of business leader the next. Companies will collect, process and analyze Big Data in different places throughout the organization. But in each place, they will maintain a single-minded focus on business alignment and value so that even the largest amounts of information can be made relevant.
Big Data success won’t be a big bang for the enterprise. Those who succeed will take an incremental, iterative approach to unlocking its value over time. Not every company will transform itself into the next Amazon.com, Google, or Facebook overnight, or ever; nor should they try.
But a health insurance company will be able to reduce fraudulent claims. A pharmaceutical company will improve drug efficacy and safety. Manufacturers will create predictive supply chains. Financial services firms will manage risk more effectively. Telecom companies will reduce customer churn. Retailers will master real-time inventory and pricing.
Big Data can effect big transformation, but one focused step at a time.
Vikram Duvvoori heads the Enterprise Transformation Services (ETS) business at HCL. In addition to delivering technology- and process-enabled transformation for key clients, he is chartered with scaling incubated propositions in mobility, analytics, and customer experience management. He heads both sales and delivery for his portfolio of services and solutions, leading a team that collaborates with industry verticals and other HCL lines of business on joint propositions.
In addition to his current role, he is responsible for creating an integrated Business Intelligence and Analytics business unit, which will incorporate BI capabilities currently residing in service lines such as ETS, EAS, and the industry verticals. He is also responsible for developing specific practices in mobility, social media, and IT strategy, drawing together the related capabilities that currently reside in EAS or the industry verticals.
Based in Silicon Valley, he was the founder and CEO of Aalayance Inc., a middleware and SOA company that was acquired by HCL. Before starting Aalayance in November 1999, Vikram founded and led another Silicon Valley startup, Wyatt River Software, which specialized in security and enterprise rights management and was acquired by a NASDAQ-listed public company. His products were sold to the NSA, the U.S. federal government, and a wide range of Fortune 500 companies. At Aalayance, Vikram worked with TIBCO to lay the core IT infrastructure for Reliance Infocom, helping it launch telecom services in India while delivering IT services at premium pricing.