Regulatory Data and its Current Challenges
Regulatory reporting for large banks involves processing huge datasets: data is collected from various sources, then enriched, corrected, validated, and attested before it is made available to regulators.
Advances in technology for capturing data from varied sources and visualizing it in a natural way have enabled large banks to reimagine how data is captured, normalized, stored, and visualized.
Didn’t these organizations have data earlier? Yes. The difference lies in the technology and data visualization tools available now: technology is the key enabler of data capture, normalization, and storage, and these tools have opened a new playing field for visualizing and normalizing data. As the American mathematician John Tukey put it:
“The greatest value of a picture is when it forces us to notice what we never expected to see.” - John Tukey
Technology and data visualization tools should let businesses see their data in a way that forces them to notice the unseen and act on it.
Achieving good-quality data for regulatory and management reporting faces several challenges. Data is sourced from a mix of modern and legacy systems, often over fragmented infrastructure, which leads to data quality issues. The stage at which visualization is implemented also matters, and adhering to regulatory changes is another key aspect: if changes are not implemented on time, data quality degrades. In this discussion we will focus on identifying data points, and visualizations of them, that can assist the business with decision-making.
Defining the Landscape
Quality data is a baseline requirement for financial institutions and regulators alike. Achieving it requires a process ingrained in the IT solution, including:
Data Source: The in-scope data for reporting is derived from various sources, some of them legacy systems. The solution should be able to ingest data from all of these sources, with appropriate source checks to identify where each record came from.
Data Normalization and Quality: Ingested data should be normalized and subjected to data quality checks. In addition, automated quality controls should ensure the data meets the requirements of regulatory reporting.
Lineage: The solution should also maintain lineage for the data, so that it can be traced at a granular level, right back to the source.
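As a minimal sketch of the lineage idea, each record can carry its source identifier and an ordered trail of the transformations applied to it. The class, field names, and the sample transformation below are illustrative assumptions, not part of any specific reporting platform:

```python
from dataclasses import dataclass, field

# Hypothetical wrapper that carries lineage metadata alongside the
# business data as it flows through each transformation step.
@dataclass
class TracedRecord:
    value: dict                                   # the business data itself
    source: str                                   # originating system identifier
    lineage: list = field(default_factory=list)   # ordered transformation log

    def transform(self, step_name, fn):
        """Apply a transformation and append its name to the lineage trail."""
        return TracedRecord(
            value=fn(self.value),
            source=self.source,
            lineage=self.lineage + [step_name],
        )

# Usage: a raw balance from a legacy ledger, normalized to a float.
raw = TracedRecord({"balance": "1,250.00"}, source="legacy_ledger_A")
clean = raw.transform(
    "strip_thousands",
    lambda v: {"balance": float(v["balance"].replace(",", ""))},
)
print(clean.value)                  # {'balance': 1250.0}
print(clean.source, clean.lineage)  # legacy_ledger_A ['strip_thousands']
```

Because every derived record keeps its source and step history, any reported figure can be traced back through the chain that produced it.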
Quality of data is achieved by incorporating validation checks on the ingested data, performed according to the standard business rules for the in-scope regulatory reports. The system should also incorporate exception handling, so that records failing validation are surfaced and corrected.
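A rule-based validation pass of this kind might look like the sketch below. The rule names and record fields are invented for illustration; real business rules would come from the in-scope regulatory reports:

```python
# Minimal sketch of rule-based validation with exception capture.
# Each rule is a (name, predicate) pair; failing records become exceptions.
RULES = [
    ("amount_non_negative", lambda r: r.get("amount", 0) >= 0),
    ("currency_present",    lambda r: bool(r.get("currency"))),
    ("counterparty_known",  lambda r: r.get("counterparty") is not None),
]

def validate(records):
    """Split records into clean rows and exceptions listing the failed rules."""
    clean, exceptions = [], []
    for rec in records:
        failed = [name for name, check in RULES if not check(rec)]
        if failed:
            exceptions.append({"record": rec, "failed_rules": failed})
        else:
            clean.append(rec)
    return clean, exceptions

records = [
    {"amount": 100.0, "currency": "USD", "counterparty": "BANK1"},
    {"amount": -5.0,  "currency": "",    "counterparty": None},
]
clean, exceptions = validate(records)
print(len(clean), len(exceptions))    # 1 1
print(exceptions[0]["failed_rules"])  # ['amount_non_negative', 'currency_present', 'counterparty_known']
```

The exceptions list is the natural input to a correction-and-attestation workflow: each entry names the record and exactly which rules it broke.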
The aim of the solution should be to reduce distortion: the very data derived from the sources is easily distorted if it is not referenced and structured. Noise in the data makes the desired result elusive, so the system needs built-in mechanisms that generate exceptions, which can then be corrected and attested accordingly.
Naturalizing the Data
Naturalizing, in the current context, refers to making the data free of noise: what is called clean data. This data should be ready for visualization, and the intent of visualization should be to present data that is meaningful, can be interpreted, and supports decision-making.
The system can make use of a Data Prism: just as a prism reveals the constituents of white light, a Data Prism can reveal the constituents of the underlying data through visual representations on a data dashboard.
Using this Data Prism, visualizing the data through dashboards, graphs, and insights drawn from individual data points will aid decision-making.
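One way to read the Data Prism idea is as splitting a single dataset into several constituent summaries, each a candidate dashboard panel. This interpretation, and the exposure records and field names below, are assumptions for illustration only:

```python
from collections import defaultdict

# Hypothetical exposure records; the "prism" splits them into
# one per-dimension summary apiece, each feeding a dashboard panel.
records = [
    {"region": "EU", "product": "loan",  "exposure": 120.0},
    {"region": "EU", "product": "deriv", "exposure": 80.0},
    {"region": "US", "product": "loan",  "exposure": 200.0},
]

def prism(rows, dimensions, measure):
    """Break one dataset into one aggregated view per dimension."""
    views = {}
    for dim in dimensions:
        totals = defaultdict(float)
        for row in rows:
            totals[row[dim]] += row[measure]
        views[dim] = dict(totals)
    return views

views = prism(records, ["region", "product"], "exposure")
print(views["region"])   # {'EU': 200.0, 'US': 200.0}
print(views["product"])  # {'loan': 320.0, 'deriv': 80.0}
```

Each view answers one question about the same underlying data, which is the prism effect: one input, several revealing decompositions.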
The application of the Data Prism method is not restricted to regulatory reporting; it can be used wherever data is a key driver of decisions: in trading to identify noise, in the retail industry to identify consumer behavior, and many more.
In today’s era of data-centric analysis, the Data Prism concept can be applied to visualize data accurately and arrive at more meaningful, data-driven decisions.