Big banks fall short on data requirements, but regulators may share in the blame
By Bora Yagiz, Compliance Complete
NEW YORK, FEB. 11 (Thomson Reuters Accelus) – An international study for a bank regulators’ group has found deficiencies in the way banks measured and reported counterparty exposures. But the regulators themselves may share responsibility for the shortcomings, as they have provided little specific guidance for the banks.
The report by the Senior Supervisors Group (SSG), a forum of senior officials from banking regulatory agencies in several countries, found that the 19 participating large banks fell short in some areas of data aggregation and quality. The report was based on the banks’ own self-assessments.
The report noted progress in the timeliness and frequency of trading-data submissions. It also observed, however, that banks have struggled to keep pace in certain critical areas, such as the credit valuation adjustment (CVA) calculation and data quality assurance, when measured against SSG benchmarks. The report specifically noted that increased automation has failed to remedy poor data quality, as data errors have continued to occur frequently.
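For context on why the CVA calculation is data-intensive, a textbook unilateral approximation sums expected exposure in each period, weighted by the counterparty’s marginal default probability and loss given default, and discounted to today. The sketch below is illustrative only, with made-up inputs; it is not the SSG’s benchmark methodology or any bank’s production model:

```python
# Textbook unilateral CVA approximation (illustrative inputs only):
#   CVA = LGD * sum_i EE(t_i) * PD(t_{i-1}, t_i) * DF(t_i)
# Each input series must be accurate and aligned per period, which is
# why poor data aggregation undermines the calculation.

def simple_cva(expected_exposures, marginal_pds, discount_factors, recovery=0.4):
    """All arguments are equal-length per-period series; recovery is a rate."""
    lgd = 1.0 - recovery  # loss given default
    return lgd * sum(ee * pd * df
                     for ee, pd, df in zip(expected_exposures,
                                           marginal_pds,
                                           discount_factors))

cva = simple_cva(
    expected_exposures=[10.0, 12.0, 9.0],  # expected exposure per period
    marginal_pds=[0.01, 0.015, 0.02],      # default probability in each period
    discount_factors=[0.98, 0.95, 0.92],   # discount factor at each period end
)
print(round(cva, 4))  # CVA for these made-up inputs
```

A single misstated exposure or stale discount factor feeds directly into the adjustment, which is why the report links CVA shortfalls to data quality assurance.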
It is the latest in a series of analyses finding shortcomings in bank data management. But the report urges regulators to do their part by giving higher priority, and committing more time and resources, to improving data quality and aggregation. It also encourages regulators to provide banks, in the course of their examinations, with feedback and peer comparisons on their data aggregation, reporting and counterparty risk monitoring capabilities.
The financial crisis of 2008 brought the lack of reliable data to the fore: bankers and regulators alike could not uncover the maturities or terms of Lehman Brothers’ counterparty derivative positions as they scrambled to determine what implications the firm’s bankruptcy would have for other institutions.
Importance of Data
Data quality, and issues with its aggregation and validation, should not be merely a routine check-box on compliance officers’ to-do lists. The benefits of getting data right extend far beyond fulfilling the letter of compliance requirements, in areas such as collateral management, stress-test planning and assessing the resolvability of a systemically important bank.
Take the leverage ratio, for instance. Without reliable data on off-balance sheet items such as derivative contracts, or on implicit guarantees such as loan commitments and lines of credit, the capital-ratio calculation would have little significance. Similarly, if the capital calculation for a bank’s assets is based on poor data, it may give a faulty impression of the safety of that institution.
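The point can be made concrete with a stylized sketch. Under the Basel III approach, the leverage ratio divides Tier 1 capital by a total exposure measure that includes off-balance-sheet items converted at credit conversion factors. All figures and the function below are hypothetical, for illustration only:

```python
# Stylized leverage ratio: Tier 1 capital over total exposure, where
# total exposure includes on-balance-sheet assets plus off-balance-sheet
# items (commitments, credit lines) scaled by credit conversion factors.
# All figures are made up for illustration.

def leverage_ratio(tier1_capital, on_balance_assets, off_balance_items):
    """off_balance_items: list of (notional, credit_conversion_factor) pairs."""
    off_balance_exposure = sum(n * ccf for n, ccf in off_balance_items)
    total_exposure = on_balance_assets + off_balance_exposure
    return tier1_capital / total_exposure

# With complete data: commitments and credit lines are captured.
complete = leverage_ratio(
    tier1_capital=50.0,
    on_balance_assets=1000.0,
    off_balance_items=[(400.0, 0.5), (200.0, 0.2)],  # (notional, CCF)
)

# Same bank, but the off-balance-sheet data is missing: the ratio
# looks healthier than it really is.
incomplete = leverage_ratio(50.0, 1000.0, [])

print(f"{complete:.2%} vs {incomplete:.2%}")  # overstated when data is poor
```

The gap between the two figures is exactly the distortion the article describes: missing off-balance-sheet data flatters the ratio without any change in the bank’s actual risk.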
Data is critical not only for regulatory purposes but also for internal reporting and risk management (for example, in implementing the policies and practices set at the top of the organization), and for identifying, assessing and adequately measuring risk areas.
Additionally, banks need reliable, well-managed data to conduct predictive analytics on customer transactions, account activities, credit card balances and loan portfolios, and to better gauge customers’ preferences, their receptivity to marketing programs and changes in their activity patterns.
From the banks’ perspective, revamping data infrastructure represents the biggest challenge.
A 2011 report by the Institute of International Finance estimated that surveyed firms planned to spend $390 million on average over the following five years on risk IT and operations, developing and enhancing data warehouses so that all business units can access the same information for analysis and reporting. This represented a roughly 50 percent increase from previous years.
Though expensive, this move from the traditional division-by-division information system to one that encompasses the entire organization would pay dividends in the long run. Senior management would gain access to visible, transparent and consistent data with which to properly evaluate the performance of business lines, while predictive analysis could be done at a more detailed level, allowing the bank to tap into revenue-generating opportunities more quickly.
General guidelines provided
Regulators have become increasingly interested in establishing principles of data governance since the Lehman debacle. Within the SSG framework, a “Top 20” (later expanded to Top 50) Counterparty project was set up to track firms’ ability to produce accurate and timely counterparty exposure information across legal entities and products in derivatives, securities financing, traditional lending and short-term money placement. The project was also intended to give regulators across jurisdictions a clear picture of the aggregate exposures of large firms, capturing potentially dangerous concentrations.
This is in line with the increasingly frequent and granular reporting required by regulators. For example, banks that were previously asked to provide annual and quarterly submissions as part of their comprehensive capital analysis and review (CCAR) exercises recently had to start providing monthly reports on certain types of retail portfolios covering more than 300 attributes.
Banks also have to demonstrate a clear framework for measuring and monitoring data quality as part of the examinations.
The latest SSG report is not the first to indicate deficiencies in data management. An earlier report from 2010 found that “aggregation of risk data remains a challenge for institutions, despite its criticality to strategic planning, decision making, and risk management.”
The Basel Committee also noted similar deficiencies last year in its publications on risk management practices. In two of its reports, the “Principles for Effective Risk Data Aggregation and Risk Reporting” and the “Progress in Adopting the Principles for Effective Risk Data Aggregation and Risk Reporting,” the Committee emphasized the need for banks to adopt fast, automated, firm-wide risk data aggregation platforms with less reliance on manual processes. It also urged banks to upgrade their IT systems and to improve the accuracy, completeness, timeliness and adaptability of risk data at the bank group level as well as across business lines and legal entities, and by asset type, industry, region and other groupings.
Not much specific guidance
Establishing general principles for data is not the same as ensuring its quality and reliability, however. Only a few concrete steps have so far been taken to strengthen risk data aggregation capabilities. These include the Financial Stability Board’s ongoing development of a common data template for major banks, and the development of the Legal Entity Identifier system. The former would help collect and pool consistent information on aggregate bilateral credit exposures and measure other relevant risk factors, while the latter would allow instant identification of the parties to a financial transaction.
“Regulators’ neglect of specific guidance on improving data quality, its collection, aggregation, and validation may well mean that all the work on regulatory reform including leverage ratio, capital requirements, and liquidity measures, has been done in vain,” says Mayra Rodriguez Valladares of MRV Associates.
Bad data can cause a cascade of erroneous analyses or regulatory findings, undermining all the hard work accomplished in regulatory rulemaking so far. It is a welcome step forward that the issue finally seems to be moving up the regulators’ priority list.
(This article was produced by the Compliance Complete service of Thomson Reuters Accelus. Compliance Complete provides a single source for regulatory news, analysis, rules and developments, with global coverage of more than 400 regulators and exchanges. Follow Accelus compliance news on Twitter: @GRC_Accelus)