17 May 2018
To comply with new regulatory requirements, financial institutions will need to strengthen their governance, improve the quality and protection of their data, and streamline the collection of new risk data.
The assessment of risk depends largely on properly validated data: data on counterparties, default history, peer data, local and overseas markets and internal operations. According to research, managing data and the infrastructure required to manage it takes up 7-10 percent of a bank's operating income.
Data quality
For most institutions, data quality and the protection of that data have been low among board priorities. The new emphasis on regulatory risk management means that the governance and integrity of the reference data used for holistic risk calculations have now become a critical issue.
Regulators, too, are being forced to focus more closely on data collection, management and systems. They understand that management's ability to control the business and to quantify and manage risk depends entirely on the quality of the relevant data available, and they are, with good reason, becoming more concerned about the poor standards of data management they encounter.
So, while there is a regulatory push for improvement on one side, there is also a major potential benefit to be secured on the other: improved business capability.
Risk management is intimately dependent on data: its integrity, sources, completeness, relevance and accuracy. Even in the smallest financial institution, good risk management depends on the IT architecture and systems used to store and process that data.
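To make those data-quality dimensions concrete, the kinds of checks involved might look like the following. This is a minimal sketch, not any particular institution's controls; the counterparty reference fields (lei, rating, country) and the rules applied to them are hypothetical, for illustration only.

```python
# Minimal sketch of basic data-quality checks on counterparty reference data.
# Field names and rules are hypothetical, for illustration only.
import pandas as pd

def check_counterparty_data(df: pd.DataFrame) -> dict:
    """Return simple completeness and integrity metrics for a reference table."""
    checks = {
        # Completeness: no missing legal-entity identifiers or ratings
        "missing_lei": int(df["lei"].isna().sum()),
        "missing_rating": int(df["rating"].isna().sum()),
        # Integrity: identifiers must be unique
        "duplicate_lei": int(df["lei"].duplicated().sum()),
        # Accuracy (crude proxy): country codes must be two capital letters
        "bad_country_code": int((~df["country"].str.fullmatch(r"[A-Z]{2}", na=False)).sum()),
    }
    checks["passed"] = all(v == 0 for v in checks.values())
    return checks

if __name__ == "__main__":
    sample = pd.DataFrame({
        "lei": ["LEI001", "LEI002", None],
        "rating": ["A", None, "BB"],
        "country": ["LK", "GB", "U1"],
    })
    print(check_counterparty_data(sample))
```

Even checks this simple surface the missing identifiers, missing ratings and malformed codes that would otherwise distort any risk calculation built on top of the data.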
But many financial institutions, saddled with multiple aging IT systems or poorly integrated homegrown systems built up over decades of add-ons, find it very difficult to aggregate and report data to support risk management.
Experience
These shortcomings were harshly exposed by the financial crisis. A key lesson was that large parts of the financial sector were unable to identify and aggregate risk across the financial system or to quantify its potential impact.
Furthermore, exposure could not easily be aggregated across trading and banking books, across geographies and across legal entities, because risk management, governance and the underlying data infrastructure were very weak.
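To see what this aggregation amounts to in practice: once exposure data from different books, legal entities and regions is brought into one consistent format, rolling it up to a group-wide view per counterparty becomes a straightforward operation. The sketch below uses hypothetical entity names, field names and figures purely for illustration.

```python
# Minimal sketch: rolling up exposures booked in different legal entities,
# books and regions into a single group-wide figure per counterparty.
# Entity names, field names and amounts are hypothetical, for illustration only.
import pandas as pd

exposures = pd.DataFrame([
    # counterparty, legal entity,    book,      region, exposure (USD m)
    ("CPTY-A", "Bank Ltd",        "trading", "ASIA", 120.0),
    ("CPTY-A", "Bank (UK) Plc",   "banking", "EMEA",  80.0),
    ("CPTY-B", "Bank Ltd",        "banking", "ASIA",  60.0),
    ("CPTY-B", "Bank Finance Co", "trading", "EMEA",  40.0),
], columns=["counterparty", "legal_entity", "book", "region", "exposure_usd_m"])

# Group-wide exposure per counterparty, regardless of where it was booked
group_view = exposures.groupby("counterparty")["exposure_usd_m"].sum()
print(group_view)
# CPTY-A    200.0
# CPTY-B    100.0
```

The hard part during the crisis was not this final roll-up but getting the underlying records into a consistent, trusted form in the first place.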
As a result, systemic risk was both obscured and underestimated. Unfortunately, many of these challenges to effective risk data aggregation and risk reporting remain largely unresolved: data architecture and IT infrastructure, the accuracy and integrity of data, and the ability of banks to adapt to changing demands for data interpretation and reporting are still significant weaknesses.
However, in the last two years, institutions have made progress towards consistent, timely and accurate reporting of top counterparty exposures, and towards implementing recognized best practices in risk and data management.
Way forward
Weaknesses in systems and data management have also hampered the ability of institutions and their supervisors to run scenario tests and stress tests. The experience of stress-testing has shown that the systems and processes for aggregating and analysing risk in large financial institutions remain a significant challenge.
Counterparty risk (also known as default risk), the risk to each party that the other will not meet its contractual obligations, also requires high-quality data. In the final analysis, institutions in both the banking and non-banking sectors need to move away from ad hoc processes and manual interventions when producing summaries of potential risks and customer data.
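One way to see why data quality matters here: the expected loss on a counterparty is commonly estimated as probability of default times loss given default times exposure at default, so an error in any one input flows straight into the risk number. The sketch below uses purely illustrative figures.

```python
# Minimal sketch of the standard expected-loss formula:
#   EL = PD x LGD x EAD
# All input values below are hypothetical, for illustration only.

def expected_loss(pd_: float, lgd: float, ead: float) -> float:
    """Expected loss: probability of default x loss given default x exposure at default."""
    return pd_ * lgd * ead

# A 2% default probability, 45% loss given default, USD 10 million exposure
el = expected_loss(pd_=0.02, lgd=0.45, ead=10_000_000)
print(f"Expected loss: USD {el:,.0f}")  # Expected loss: USD 90,000
```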
Robust data management and protection infrastructure can therefore greatly improve the reliability of the assessments that are produced. In general, there is a long way to go before the industry can produce data of the quality necessary to satisfy stakeholder expectations fully.
To get there, institutions need to change their operating models and harmonize their systems and data, so that they can finally arrive at a single, consistent view of risk.
(Dinesh Weerakkody is a thought leader)