As compliance demands more technology, here’s one approach to reporting

February 10, 2016

As financial compliance grows ever more dependent on analytical tools and automated processes, a look at one approach can help illustrate the challenges and the strategies for meeting the demands of a new era.

A regulatory reporting tool developed recently by consultancy KPMG, the financial institution reporting engine (FIRE), aims to use complex algorithms to transform unstructured data, which is otherwise difficult to apply, into usable and traceable formats, and to reduce operational risk by streamlining the reporting process through automation.

Similar tools are being deployed or developed across the industry, including at Thomson Reuters. The technological solutions aim to retire costly legacy systems, reduce headcount at firms, and lower their operational and reputational risk profiles.

Industry practitioners agree that technology is playing an increasingly important role in compliance.

“Although still in the early stages, banks are applying big data and advanced analytics across customer-facing channels, up and down the supply chain, and in risk and compliance functions,” said Michael Shepherd, chairman and CEO of Bank of the West, in an interview. Another banker, John Erickson, a director at Zions Bancorporation, agreed. “Larger banks are becoming more diverse and complex, and technological change has radically increased the speed of operations and the rate at which data is amassed, stored, processed, and analyzed,” he said.

The importance of data use and management

One of the key challenges for financial institutions in coping with the increasing regulatory requirements of the post-Dodd-Frank era has been providing timely, accurate, and precise reporting supported by quality, structured data and risk analytics.

Regulators’ requirements and expectations for reporting have risen continuously since the financial crisis. It was not until 2013, however, that the Basel Committee issued its principles for effective risk data aggregation and risk reporting, which became the formal point of reference for regulators.

In the United States, the Office of the Comptroller of the Currency (“OCC”) has issued “heightened expectations” guidelines that called for “policies, procedures, and processes for design, implementation and maintenance of data architecture and information technology infrastructure that support the bank’s risk aggregation and reporting needs.”

The Federal Reserve has also built on the Basel Committee principles by requesting evidence on data quality controls and reconciliation processes to determine the accuracy of regulatory reports and capital plan submissions, especially within the framework of the Comprehensive Capital Analysis and Review (CCAR).

The issues of data and infrastructure loomed large over banks even before the formalization of the OCC guidelines. In order to comply with various regulatory requirements, and to provide quick and reliable analyses of key areas (such as aggregate risk exposures, risk concentration across business units, risk limit breaches, and capital requirement calculations), banks felt the need to improve their data management policies and upgrade their technology capabilities.

FIRE

As KPMG sees it, banks face four main compliance challenges: changing regulations, operationally risky manual reporting practices, disparate and hard-to-use unstructured data, and rigid, costly reporting structures. The FIRE technology seeks to address these challenges through five “engines.”

The first component, a policy engine, works to keep the directory of a bank’s policies up to date in each of its risk categories.

Then, a second engine uses a content enrichment framework to ingest unstructured data (such as internal documentation, frequently asked questions, and voicemails) into mainstream computing through algorithmic reasoning based on SQL and NoSQL database languages, applying automated validity, syntax, and intra-series edit checks, and reconciling figures across multiple schedules.
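
To make the idea concrete, here is a minimal Python sketch of what such automated edit checks and cross-schedule reconciliation might look like. The schedule names, line items, and tolerance are invented for illustration; this is not FIRE’s actual logic.

```python
from decimal import Decimal

# Hypothetical figures keyed by (regulatory schedule, line item).
records = {
    ("HC-R", "total_capital"): Decimal("1250.0"),
    ("HC-R", "tier1_capital"): Decimal("900.0"),
    ("HC-R", "tier2_capital"): Decimal("350.0"),
    ("HC", "total_capital"): Decimal("1250.0"),
}

def validity_check(recs):
    """Validity edit: every figure must be a non-negative number."""
    return all(v >= 0 for v in recs.values())

def intra_series_check(recs):
    """Intra-series edit: components must sum to the reported total."""
    parts = recs[("HC-R", "tier1_capital")] + recs[("HC-R", "tier2_capital")]
    return abs(recs[("HC-R", "total_capital")] - parts) <= Decimal("0.5")

def cross_schedule_check(recs):
    """Reconciliation: the same figure must agree across schedules."""
    return recs[("HC-R", "total_capital")] == recs[("HC", "total_capital")]

for name, check in [("validity", validity_check),
                    ("intra-series", intra_series_check),
                    ("cross-schedule", cross_schedule_check)]:
    print(f"{name} edit check:", "pass" if check(records) else "FAIL")
```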

Regulatory rules are periodically monitored and updated by a third engine using government, regulatory and commercial vendor websites.

An orchestration and case management engine uses the stream of data to recognize patterns based on pre-defined, programmed regulatory reports, and derives advanced analytics by comparing similar data sets using data mapping and calculation logic.
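
A rough sketch of the kind of data mapping and calculation logic described here might look as follows; the field names and report template are made up for the example, not drawn from FIRE’s internals.

```python
# Hypothetical mapping from internal source fields to report line items.
FIELD_MAP = {
    "ledger.cash_balance": "Schedule A / Line 1",
    "ledger.loan_book": "Schedule A / Line 2",
}

# Hypothetical calculation logic for derived line items.
CALCULATIONS = {
    "Schedule A / Line 3":
        lambda src: src["ledger.cash_balance"] + src["ledger.loan_book"],
}

def build_report(source):
    """Map raw source fields onto line items, then apply calculation logic."""
    report = {line: source[field] for field, line in FIELD_MAP.items()}
    for line, calc in CALCULATIONS.items():
        report[line] = calc(source)
    return report

print(build_report({"ledger.cash_balance": 420.0, "ledger.loan_book": 1080.0}))
# {'Schedule A / Line 1': 420.0, 'Schedule A / Line 2': 1080.0,
#  'Schedule A / Line 3': 1500.0}
```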

The fifth engine, an execution engine, applies policy rules to data through intelligent deciphering of unstructured data and integration of traditional data.

This process is carried out while keeping tabs on the workflow capabilities of the banks, including activity and data flow tracking, and individual and business unit accountabilities.

As such, the platform works to synthesize different regulatory schedules that draw their figures from the same data sets, explain the formulae behind calculations, and allow knowledge confluence through an ontology, a metadata management approach that lets different users link information and data to facilitate reporting and oversight.
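
The ontology idea can be pictured as a graph of metadata links between figures, schedules, and owners. The sketch below, with invented labels, shows one simple way such links might be represented and queried.

```python
# A tiny metadata "triple store": (subject, predicate, object) links.
TRIPLES = [
    ("tier1_capital", "reported_on", "Schedule HC-R"),
    ("tier1_capital", "reported_on", "FFIEC 101"),
    ("tier1_capital", "derived_from", "general_ledger"),
    ("tier1_capital", "owned_by", "capital_reporting_team"),
]

def linked(subject, predicate):
    """All objects linked to a subject through a given predicate."""
    return [o for s, p, o in TRIPLES if s == subject and p == predicate]

# Every schedule drawing on the same figure is reachable from one node,
# which is what lets different users link data for reporting and oversight.
print(linked("tier1_capital", "reported_on"))  # ['Schedule HC-R', 'FFIEC 101']
print(linked("tier1_capital", "owned_by"))     # ['capital_reporting_team']
```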

Traceability and flexibility

For regulatory purposes, the platform also seeks to improve on an area previously seen as daunting for compliance officers: traceability. A Google-like search capability and a process for connecting different data points are meant to allow data to be traced to its source (in a model or in bank policy) and to its owner.
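
In data lineage terms, that means each reported figure carries pointers back to its source and owner. A minimal, hypothetical illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataPoint:
    """A reported figure annotated with its provenance."""
    name: str
    value: float
    source: str   # the model or bank policy it came from
    owner: str    # the accountable individual or business unit
    parent: Optional["DataPoint"] = None  # the upstream figure, if any

raw = DataPoint("gl_cash_balance", 420.0, "general ledger", "Finance Ops")
line1 = DataPoint("schedule_a_line_1", 420.0, "mapping rule M-7",
                  "Regulatory Reporting", parent=raw)

def trace(dp: Optional[DataPoint]):
    """Walk a reported figure back to its ultimate source and owner."""
    while dp is not None:
        print(f"{dp.name} <- {dp.source} (owner: {dp.owner})")
        dp = dp.parent

trace(line1)
```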

The technology is also meant to be configurable according to each bank’s interpretation of applicable rules, including how data are combined and retained.

Impact on operational cost and other areas

The FIRE platform is in its first year of operation, but its impact in reducing banks’ operating costs is already considered visible. One large bank was reportedly able to cut 300 contract staff positions owing to the platform’s ability to ingest and classify source data from unstructured documents.

“The platform also cuts down on the number of hours required for compliance related matters,” said Michael Henry, a principal at KPMG Advisory, overseeing the platform. “An analyst can process 20 to 30 regulatory forms with due diligence checks in an hour using the automated system. Skilled analysts working manually with unstructured documents can only process one or two per hour.”

Similar gains have also been observed in KYC on-boarding processes, reducing the time needed to on-board a medium-risk client to under two hours from 27, according to Henry.

In addition to helping banks in their regulatory reporting, such streamlining of data is also seen as useful in other areas, such as in risk monitoring (e.g., picking up unusual trading or money laundering activities), customer profiling, or increasing overall operational efficiency where duplicate systems can be eliminated and manual reconciliations minimized.

FIRE … and ice

FIRE should not be considered a panacea for all the complexities of regulatory reporting, nor does it aspire to be one. Beyond any technical system, banks still need an able management body, with defined roles and accountability, that fully understands what data are collected, how they are processed and used for business purposes, the associated risks, and what controls are in place to offset those risks.

Adapting new platforms and tools to legacy systems is another challenge often cited by industry professionals.

Regulatory tools may also suffer from an “exogenous shock” in the form of regulation that severely limits the use of data for data security or privacy reasons. Brazil has imposed such a curtailment: individuals now have to opt in explicitly for data collection purposes, and otherwise tracking is not allowed. Similarly, in the EU, regulations require data trackers to selectively forget certain sets of data periodically.

In the United States there is no such legislation at the federal level, although a number of state restrictions are in place. However, recent court judgments point to potentially greater federal limitations: cases involving the Consumer Financial Protection Bureau against Ally Bank over the use of selected data in setting consumer loan rates, and against GEICO in consumer insurance, and a Federal Trade Commission case against Wyndham Worldwide over data use held to constitute unfair and deceptive business practices.

What more can be done?

In coping with the issues of big data and building a competent data governance framework, banks can complement a technological platform in ways including:

  • instituting a chief data officer role that can serve as a bridge between risk, regulatory, compliance and IT functions;
  • developing an enterprise-wide management information strategy that decides on data ownership and data sharing within the organization; and
  • beefing up the staff with trained and experienced data scientists.

(This article was produced by Thomson Reuters Regulatory Intelligence and initially posted on Feb. 4. Regulatory Intelligence provides a single source for regulatory news, analysis, rules and developments, with global coverage of more than 400 regulators and exchanges. Follow Regulatory Intelligence compliance news on Twitter: @thomsonreuters)

(Bora Yagiz, FRM is a New York-based Regulatory Intelligence Expert for Thomson Reuters Regulatory Intelligence, specializing in risk. He is a certified Financial Risk Manager. Mr. Yagiz has held positions as a bank examiner for the Federal Reserve Bank of New York, as senior consultant with Ernst & Young and vice president at Morgan Stanley. Follow Bora on Twitter @Bora_Yagiz. Email Bora at bora.yagiz@thomsonreuters.com)
