The energy and utility (E&U) sector seems to have gained a better understanding of how to generate new data-driven business models in the wake of an ongoing digital transformation. While data is a key resource, generating value from data visualization remains a challenge when innovating data-driven business models. HCLTech's Energy and Utility Data Analytics team is exploring the design of a data- and insights-based advanced analytics platform: a PaaS that can support the design process of data-driven business models at the crucial step of connecting data to value propositions. The Data Integration Group (DIG) connects two key elements of the business model, key resources and value proposition, through specific functions based on business needs such as asset performance, load forecasting, and capacity planning. Further, the DIG acts as a boundary object between the business and data science/IT participants in business model innovation, supporting an iterative process for discovering these elements. Based on our formative evaluation, we demonstrate the usability and utility of the DIG.
Over the last few years, the strategic importance of information technology and the Internet of Things (IoT) has grown, driving an ever-increasing number of business innovations led by digital transformation. Organizations have turned away from product-based offerings toward complex, service-oriented business models. As data capabilities increase exponentially, they are also driving data-driven innovation in business models and making digital transformation a central component of those models. Research likewise points to an increase in data-driven innovations across commercial and non-commercial fields. This trend is driving the development of data-driven business models (DDBMs), whose growing importance in research was noted as early as 2016 by Schüritz and Satzger.
As Hunke et al. observed in 2017, by collecting, extracting, and analyzing data, an organization can drive its development further, which is necessary for survival in today's competitive environment. Companies that do not explore the potential of their own data risk losing opportunities that their competitors will seize. Organizations that integrate data analytics as one of their key resources, by contrast, can gain a significant advantage over the competition. Rolls-Royce is an example of this development. Instead of selling a product-based aerospace jet turbine, it sells a 'power by the hour' service: customers pay only for the hours during which they operate the machine, not for the machine itself. Because performance data from the turbines are available, Rolls-Royce can offer such a service without charging for turbine maintenance time. Such a performance-based service is only possible with data, which are collected by several sensors in the machine.
This may sound very similar to the current state of the E&U industry. During the 1990s, the pure sale of products became increasingly unprofitable for organizations, primarily because manufacturers were too far removed from customers and did not recognize their real demand. As a result, more and more organizations decided to offer services related to their products. This shift marked a change in perspective from organization-centric to customer-centric. Instead of buying products, customers buy services, which in turn creates value for both customers and organizations. Value creation can be seen as a collaborative process that may include the customer, the supplier, or other involved partners. The so-called service-dominant logic is the co-creation of value by several actors applying their different competencies to benefit from the other actors in the network.
In recent years, the focus of value creation has shifted from selling pure products to selling services. Such business models show special characteristics. Compared with a product-based offering, a service focuses on interaction with the customer. Service business models involve customers and partners more than traditional models do and enable value co-creation between the customer and the organization.
With the emergence of the service-oriented paradigm, new services were developed based on data science and data, such as data-as-a-service and analytics-as-a-service, discussed by Chen in 2011 and later by Hartmann in 2014. This gave rise to new business models, called data-driven business models (DDBMs), as a subset of service-oriented business models. Such business models constitute the next step from servitization through digitization to 'datatization'. A key resource for such a business model is data analytics. However, there is no defined threshold of data use that separates traditional business models from DDBMs. For this reason, we define data-driven business models as those that use data as a key resource to create new insights that form a value proposition for customers.
To deliver on this concept, HCLTech, along with its partners, has built a Utility Operations Analytics Platform as a Service (PaaS) to enable E&U companies to transition to a DDBM-based business model. The platform is built on open-source components and comes with pre-built solutions for electric utilities that accelerate the implementation of key use cases. It can also serve as an overlay on the proposed Azure-based data analytics platform to accelerate the development of current and future use cases that each utility client plans to execute.
Listed below are our solutions addressing standard utility use cases:
- HCLTech UTILITIES OPERATIONAL INTELLIGENCE
This is a fully containerized solution that leverages the underlying partner platform. Its objective is to serve as the data acquisition pipeline and historian repository for technical data and signals associated with network assets. Operational intelligence comprises multiple interactive processes for real-time and historical data acquisition, pre-processing, storage, analysis, and notification, along with data visualization and reporting tools.
Through operational intelligence, any signal coming from the field and any point defined in a SCADA system is available for analysis, archiving, visualization, and reporting. Data can be archived and organized with great flexibility, and analyzed and displayed to accommodate the data visualization requirements of the whole utility. Among the most valuable functionalities are:
Data storage
- Raw data from SCADA and other digital assets
- Calculated data derived from raw SCADA data: aggregated values, averages, max/min values, etc.
- Constants: max/min nominal values, thresholds, etc.
- State of assets as acquired by the SCADA system: on/off, open/closed, etc.
Data organization (a minimal illustrative data-model sketch follows this list)
- Master hierarchical organization of data points by voltage level: company, substation, transformers, circuits, and data points
- Association of each data point with its corresponding grid asset
- Specific hierarchical organizations of assets, such as geographic regions, crew work areas, city borders, etc., derived from the master hierarchy (asset tree)
- Association of attributes with each asset, such as address, GPS coordinates, technical data, manufacturer, model, date of commissioning, etc.
Data visualization
- Interactive dashboards
- Structured reports
- Data analysis
- User-defined graphs, bar charts, dispersion diagrams, and several other data visualization tools
- Data tables
- Export of data in formats such as XLSX, CSV, TXT, etc.
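To make the storage and organization concepts above concrete, the following is a minimal, illustrative Python sketch of how data points can be attached to a hierarchical asset tree and aggregated into average and max/min values. The class and field names are assumptions for illustration only and do not reflect the platform's internal data model.

```python
from dataclasses import dataclass, field
from statistics import mean
from typing import Dict, List, Optional

@dataclass
class DataPoint:
    """A single measured signal (e.g., a SCADA point) with its raw samples."""
    name: str
    unit: str
    samples: List[float] = field(default_factory=list)

    def aggregates(self) -> Dict[str, float]:
        # Calculated values derived from raw samples: average and max/min
        return {"avg": mean(self.samples), "max": max(self.samples), "min": min(self.samples)}

@dataclass
class Asset:
    """A node in the master asset tree: company -> substation -> transformer -> circuit."""
    name: str
    kind: str                                               # e.g. "substation", "transformer"
    attributes: Dict[str, str] = field(default_factory=dict)  # address, GPS, manufacturer, ...
    points: List[DataPoint] = field(default_factory=list)
    children: List["Asset"] = field(default_factory=list)

    def find(self, name: str) -> Optional["Asset"]:
        """Depth-first search for an asset by name anywhere in the tree."""
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit:
                return hit
        return None

# Build a tiny illustrative hierarchy and attach a point to a transformer
company = Asset("ExampleUtility", "company")
substation = Asset("Substation-A", "substation", {"gps": "41.9,12.5"})
transformer = Asset("TR-01", "transformer", {"manufacturer": "ACME", "commissioned": "2012-06-01"})
transformer.points.append(DataPoint("oil_temperature", "degC", [61.0, 63.5, 64.2, 62.8]))
substation.children.append(transformer)
company.children.append(substation)

print(company.find("TR-01").points[0].aggregates())
# {'avg': 62.875, 'max': 64.2, 'min': 61.0}
```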
Operational intelligence leverages all the data analytics capabilities of the platform, applying data models and tools specifically designed for the transmission and distribution functions of utilities. The available interfaces include industrial protocols such as IEC 104 Master/Slave, IEC 101 Master, ICCP Client/Server, OPC DA Client, OPC-XML Client, OPC UA Server, Modbus TCP Master, Modbus Serial Master, DNP3.0 Master, etc.
Operational intelligence can additionally import data from a variety of other sources (including OSIsoft PI) through the above-mentioned protocols as well as through APIs, web services, and flat files. Furthermore, leveraging the platform's edge computing component (EDGE), operational intelligence can import data from IoT devices directly connected to field assets, enriching asset data with information that is not usually collected by the SCADA system.
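As an illustration of the flat-file ingestion path mentioned above, the sketch below parses a CSV export of SCADA signals and posts each record to an ingestion endpoint. The endpoint URL, payload shape, file name, and column names are hypothetical assumptions, not the platform's documented API.

```python
import csv
import requests  # assumed available; any HTTP client would do

# Hypothetical ingestion endpoint -- the real platform API is not documented here
INGEST_URL = "https://historian.example.com/api/v1/signals"

def ingest_csv(path: str) -> int:
    """Read rows like: timestamp,point_name,value and push them to the historian."""
    sent = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            payload = {
                "timestamp": row["timestamp"],     # ISO 8601 assumed in this sketch
                "point": row["point_name"],
                "value": float(row["value"]),
            }
            response = requests.post(INGEST_URL, json=payload, timeout=10)
            response.raise_for_status()
            sent += 1
    return sent

if __name__ == "__main__":
    # "scada_export.csv" is a placeholder file name for this sketch
    print(f"ingested {ingest_csv('scada_export.csv')} rows")
```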
- UTILITIES ASSET PERFORMANCE MANAGEMENT
Another solution built on this platform is HCLTech's Utilities Asset Performance Management (APM).
This is a fully containerized solution that leverages several components of the underlying platform. Its objective is the predictive analysis of asset performance and the calculation of each asset's health index and risk matrix.
Data can be collected from the SCADA system or directly from data loggers or assets through the platform's edge computing module. This software module can ingest data through several industrial protocols, such as Modbus and DNP, and process the data as it is streamed in. The data, either raw or processed, can then be passed on to APM for further analysis.
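The edge module's ability to process data as it is streamed in can be pictured as a small windowed pre-processing stage. The sketch below is an illustrative downsampler that forwards one averaged value per fixed-size window; it is not the actual EDGE component, and the window size and record shape are assumptions.

```python
from statistics import mean
from typing import Iterable, Iterator, Tuple

def downsample(samples: Iterable[Tuple[float, float]], window: int = 10) -> Iterator[Tuple[float, float]]:
    """Collapse every `window` raw (timestamp, value) samples into one averaged sample.

    Raw readings polled from a device (e.g., over Modbus or DNP3) arrive as a stream;
    only the reduced stream is forwarded for further analysis.
    """
    buffer = []
    for ts, value in samples:
        buffer.append((ts, value))
        if len(buffer) == window:
            # emit the timestamp of the last sample with the window's average value
            yield buffer[-1][0], mean(v for _, v in buffer)
            buffer.clear()

# Example: 20 raw readings become 2 averaged readings
raw = [(t, 50.0 + (t % 3)) for t in range(20)]
print(list(downsample(raw, window=10)))
# [(9, 50.9), (19, 51.0)]
```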
APM, like other applications built on this platform, leverages multiple interactive processes for real-time and historical data acquisition, pre-processing, storage, analysis, and notification, along with data visualization and reporting tools.
- Health Index Analysis: APM collects operational data and maintenance inspection data for every electric linear asset and calculates the corresponding health index based on the procedures and methodologies published by Toronto Hydro (a simplified illustration is sketched after this list).
- Risk Matrix: APM also calculates a risk matrix for the managed assets, simulating the risk of failure and its impact over time. Maintenance inspections and reconditioning actions are used to update the risk values, and APM can simulate future scenarios to assess their impact on risk.
- Predictive Analysis: In addition to the health index and risk matrix, APM manages machine learning models that simulate the operational parameters of assets. These models are trained with real data to produce empirical algorithms that reproduce the behavior of assets (for example, transformers) under normal operating conditions. The algorithms run during operations, and their output is compared with the measured values to spot any deviation between the two. When a deviation is detected, APM generates predictive warnings and alarms and notifies users with specific reports.
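The following is a simplified, illustrative sketch of two of the calculations described above: a weighted condition-score health index and a deviation check between a model's predicted value and the measured value. The factors, weights, condition scores, and threshold are assumptions for illustration; they do not reproduce the Toronto Hydro methodology or APM's internal models.

```python
from typing import Dict, List, Tuple

# --- Health index: weighted condition scores, normalized to 0-100 ---
# Each factor gets a condition score (0 = worst, 4 = best) and a weight.
# Factors and weights below are illustrative only.
HEALTH_FACTORS: Dict[str, float] = {
    "oil_quality": 3.0,
    "dissolved_gas": 4.0,
    "winding_condition": 5.0,
    "age": 2.0,
}
MAX_SCORE = 4

def health_index(scores: Dict[str, int]) -> float:
    """Weighted average of condition scores, expressed on a 0-100 scale."""
    weighted = sum(HEALTH_FACTORS[f] * scores[f] for f in HEALTH_FACTORS)
    maximum = sum(HEALTH_FACTORS[f] * MAX_SCORE for f in HEALTH_FACTORS)
    return 100.0 * weighted / maximum

# --- Predictive analysis: compare modelled vs. measured values ---
def deviation_alarms(pairs: List[Tuple[float, float]], threshold: float = 0.10) -> List[int]:
    """Return indices where |measured - predicted| exceeds `threshold` (relative)."""
    return [
        i for i, (predicted, measured) in enumerate(pairs)
        if abs(measured - predicted) > threshold * abs(predicted)
    ]

scores = {"oil_quality": 3, "dissolved_gas": 2, "winding_condition": 4, "age": 1}
print(round(health_index(scores), 1))                   # 69.6 on the illustrative scale
print(deviation_alarms([(60.0, 61.0), (60.0, 70.0)]))   # [1] -> second sample deviates >10%
```

In practice, the factors, weights, and scoring bands would come from the adopted methodology and asset class, and the deviation threshold would be tuned per monitored parameter.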
- UTILITIES METER DATA MANAGEMENT
Utilities Meter Data Management (MDM) is another solution that leverages components of the open-source platform.
MDM is a software system that centralizes the processing, storage, and certification of metering data, with the capability to automate every task and significantly reduce the need for human intervention. It is designed to communicate with any head-end system, certify the completeness of the measures, and export them in the formats required by third-party systems.
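One way to picture the completeness certification described above is a simple check of received versus expected interval reads. The 15-minute interval length and record layout in the sketch below are assumptions, not MDM's actual certification rules.

```python
from datetime import datetime, timedelta
from typing import Dict, List

INTERVAL = timedelta(minutes=15)
EXPECTED_PER_DAY = int(timedelta(days=1) / INTERVAL)   # 96 intervals per day

def completeness(reads: List[datetime]) -> Dict[str, object]:
    """Report how complete one day of interval reads is and which intervals are missing."""
    day = reads[0].replace(hour=0, minute=0, second=0, microsecond=0)
    expected = {day + i * INTERVAL for i in range(EXPECTED_PER_DAY)}
    received = set(reads)
    missing = sorted(expected - received)
    return {
        "expected": EXPECTED_PER_DAY,
        "received": len(received & expected),
        "percent": 100.0 * len(received & expected) / EXPECTED_PER_DAY,
        "missing": missing,
    }

# Example: a meter that missed two intervals on 2024-03-01
day = datetime(2024, 3, 1)
reads = [day + i * INTERVAL for i in range(EXPECTED_PER_DAY) if i not in (10, 11)]
report = completeness(reads)
print(round(report["percent"], 1), len(report["missing"]))   # 97.9 2
```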
Theft detection is handled by a specific MDM module, ECL (energy control and losses), which accounts for both technical and non-technical losses. It requires AMI data, a certain level of detail in substation energy data, and customer information system data on billed energy and service orders.
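The ECL module's split between technical and non-technical losses can be pictured with the simple energy balance below. The figures and the technical-loss factor are illustrative assumptions; ECL's actual loss models are not reproduced here.

```python
def energy_losses(substation_kwh: float, billed_kwh: float, technical_loss_factor: float = 0.05):
    """Split the gap between energy delivered by the substation and energy billed.

    `technical_loss_factor` is an assumed share of delivered energy lost in lines and
    transformers; the remainder of the gap is flagged as non-technical (e.g., theft).
    """
    total_loss = substation_kwh - billed_kwh
    technical = substation_kwh * technical_loss_factor
    non_technical = max(total_loss - technical, 0.0)
    return {"total": total_loss, "technical": technical, "non_technical": non_technical}

# 1,000,000 kWh delivered, 900,000 kWh billed -> 100,000 kWh gap,
# of which 50,000 kWh is the assumed technical loss and 50,000 kWh is unexplained
print(energy_losses(1_000_000, 900_000))
```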
- IMAGE ANALYTICS BASED SOLUTION FOR VEGETATION MANAGEMENT
On the same platform, we have integrated a geospatial image analytics solution for remote asset inspection and vegetation management. Vegetation management represents the largest preventive maintenance expense for utilities while also being the most significant contributor to system reliability. Traditional vegetation and inspection management practices are time-consuming, costly, and not always accurate, and there is increasing pressure to develop new mitigation approaches to deal with the growing threats of wildfires and system outages. The drone-based image analytics solution offers LiDAR, fixed-camera, satellite, and SAR-based image collection, data management, and advanced analytics to automatically identify areas of potential encroachment on the ground, along conductors, or at the pole top.
The solution can calculate the volume of tree trimming required, which benefits utilities by validating labor and equipment costs for work performed and enables them to pay for remediation per cubic foot instead of per circuit mile.
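As a simplified illustration of the encroachment detection described above, the sketch below flags vegetation points (for example, from a LiDAR point cloud) that fall within a minimum clearance distance of a conductor modelled as a straight segment between two poles. The clearance value and the point-to-segment geometry are assumptions for illustration; the actual solution works on full 3D point clouds and imagery.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float, float]   # x, y, z in metres

def point_to_segment_distance(p: Point, a: Point, b: Point) -> float:
    """Distance from point p to the line segment a-b in 3D."""
    ax, ay, az = a
    abx, aby, abz = b[0] - ax, b[1] - ay, b[2] - az
    apx, apy, apz = p[0] - ax, p[1] - ay, p[2] - az
    ab_len_sq = abx**2 + aby**2 + abz**2
    t = 0.0 if ab_len_sq == 0 else max(0.0, min(1.0, (apx*abx + apy*aby + apz*abz) / ab_len_sq))
    closest = (ax + t*abx, ay + t*aby, az + t*abz)
    return math.dist(p, closest)

def encroaching_points(vegetation: List[Point], conductor: Tuple[Point, Point],
                       clearance_m: float = 3.0) -> List[Point]:
    """Return vegetation points closer to the conductor than the required clearance."""
    a, b = conductor
    return [p for p in vegetation if point_to_segment_distance(p, a, b) < clearance_m]

# Conductor span between two poles, two trees near it
span = ((0.0, 0.0, 12.0), (100.0, 0.0, 12.0))
trees = [(50.0, 2.0, 11.0), (20.0, 8.0, 6.0)]
print(encroaching_points(trees, span))   # only the first tree is within 3 m of the span
```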
References
- Barrett, M., and Oborn, E. (2010). "Boundary Object Use in Cross-Cultural Software Development Teams." Human Relations 63 (8), 1199-1221.
- Beha, F., Göritz, A., and Schildhauer, T. (2015). "Business Model Innovation: The Role of Different Types of Visualisations." In: XXVI ISPIM Conference, Shaping Frontiers of Innovation. Budapest, Hungary, pp. 1-19.