CCAR — Comprehensive Capital Analysis and Review — remains one of the industry’s most daunting post-crisis reporting challenges. First established by the Federal Reserve in 2011, these tiered capital ratio-based calculations apply to any bank holding company (BHC) with $50 billion or more in assets, and have sent global financial institutions on a long and winding path to compliance.
In the process, many have partnered with external providers such as AxiomSL to integrate disparate systems, aggregate and manage the sprawling “data dump” nature of the CCAR process for both data integrity and rules engine flexibility, and ultimately produce the analytics and reports as new waves of requirements gradually become applicable to these firms year by year.
CCAR: An Overview
CCAR is a central facet of US banking regulation implemented following the 2008 financial crisis, requiring systemically important financial institutions operating in the USA to submit capital plans to the Federal Reserve reflecting their balance sheets and how their capital ratios might be affected by stresses of varying sources and severity.
The Fed scrutinizes these plans both quantitatively and qualitatively. On one hand, the firms’ positions and calculation methodology are assessed for complexity and accuracy (certain methodologies, particularly concerning loan-loss, have been active areas for debate); on the other hand, risk management practices, including governance and data quality, are also examined.
In short, the exercise is meant to provide a window into the behavior of a bank under duress: Is its book of business sturdy enough to withstand market shocks? Equally important, is it able to self-assess such a risk both frequently and creatively? At the center of both of these questions is data management capability.
Facing CCAR for the first time, one major Intermediate Holding Company (IHC) turned to AxiomSL’s ControllerView data aggregation platform to wrangle 15 different systems in an intense implementation process. Some of these systems, such as its consumer credit business, track tens of millions of retail and wholesale customer accounts; others, such as its loan-lease servicing, operate on varying data schedules. The partners also tackled seven years of historical filings, where, owing to the institution’s recent merger history, older source systems presented data in numerous different formats (such as pipe-delimited flat files), making configuration precision and optimization crucial.
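Consolidating legacy extracts like these typically means mapping each format onto one common schema before any reconciliation can run. As a minimal sketch of that idea (the field names, layout, and sample values here are hypothetical, not the bank’s actual schema):

```python
import csv
import io

# Hypothetical legacy layout: a pipe-delimited flat file with no header row.
# Field positions and names are illustrative only.
LEGACY_FIELDS = ["account_id", "as_of_date", "balance", "product_code"]

def load_pipe_delimited(text, fields=LEGACY_FIELDS):
    """Parse a pipe-delimited extract into dicts keyed by a common field
    list, so downstream mapping and reconciliation see one schema."""
    reader = csv.reader(io.StringIO(text), delimiter="|")
    rows = []
    for lineno, values in enumerate(reader, start=1):
        # Reject malformed lines early rather than letting bad data
        # propagate into the aggregated environment.
        if len(values) != len(fields):
            raise ValueError(
                f"line {lineno}: expected {len(fields)} fields, got {len(values)}"
            )
        rows.append(dict(zip(fields, values)))
    return rows

sample = "A1001|2015-06-30|2500.00|RTL\nA1002|2015-06-30|875.10|WHL\n"
records = load_pipe_delimited(sample)
```

In practice each merged entity’s source system would get its own field list and mapping, which is why configuration precision matters: one misaligned column silently corrupts every downstream aggregate.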
A senior executive director who steered the bank through its first CCAR cycle points to the organizational complexity involved—something he has discussed with colleagues across the industry since 2012. Though firms have developed a handful of CCAR approaches over the years, each bank inevitably deals with it slightly differently. “Between different source systems, calculation models and reconciliation requirements that have changed over time, such as FR Y-9C, the lines in the diagrams don’t always represent reality,” he says.
Like many institutions, the bank was able to ease into CCAR after its US-based BHC was granted an initial exemption, relying on its parent’s capital adequacy. This period gave the executive director and his team time to develop the right tactical plan and posture for its first semi-annual CCAR report, in FY-2015, as well as map out a strategic technology approach and target state for down the line.
“As we began examining this, our data infrastructure was Microsoft Access and Excel-based, with information in SharePoint and email involved as well, and we learned very quickly that this needed to be improved,” he says. “We needed a consistent approach for our two operating banks, despite their differences in size, and so ultimately that meant first building a SQL-based database and then a cloud-based tool to pull together validation exercises and the reports in a seamless manner.”
Defining those objectives was the easy part. An initial exercise pulling together BHC data in July 2015 resulted in substantial time spent on “manual collect and load, copy and paste” tasks. AxiomSL was then engaged from October through the run-up to the report submissions in April 2016, a monumental task under that timeframe: consolidating data from multiple source systems, received from numerous internal and external databases, into a Hadoop environment.
“There were a lot of challenges associated with the tight time period, so a modified approach was introduced for us to get there with AxiomSL’s expertise. Much of this was around reconciliation and the need to keep in sync as the Fed introduced changes, especially the mechanisms for loading data up to the Federal Reserve Board’s site,” he explains. For instance, Form FR Y-14A changed four times during the implementation period, while technical instructions changed six times as the Fed and OCC narrowed down what they wanted. Further, “there was maintenance involved in keeping all of those in alignment,” he adds. “For instance, there’s an XML output for FR Y-14A that AxiomSL is responsible for, and some manual reconciliation between our database and their platform was automated to improve efficiency and time to market. Expertise and training are critical, but it also takes repetition to learn how to navigate this process.”
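The “edit checks” referred to here are pre-submission validations run against the generated XML, for example confirming that a reported total ties out to its line items. A minimal sketch of that kind of check, with hypothetical element names (not the actual FR Y-14A XML taxonomy) and an assumed rounding tolerance:

```python
import xml.etree.ElementTree as ET

def check_totals(xml_text, tolerance=0.01):
    """Verify that a schedule's reported <Total> matches the sum of its
    <LineItem> values within a rounding tolerance. Element names and the
    tolerance are illustrative assumptions."""
    root = ET.fromstring(xml_text)
    items = [float(e.text) for e in root.findall("./LineItem")]
    total = float(root.findtext("Total"))
    diff = abs(total - sum(items))
    return diff <= tolerance, diff

doc = """<Schedule>
  <LineItem>100.50</LineItem>
  <LineItem>249.50</LineItem>
  <Total>350.00</Total>
</Schedule>"""

ok, diff = check_totals(doc)
```

Automating checks like this at generation time, rather than reconciling by hand after the fact, is what shortens the resubmission cycle when the form or its instructions change mid-stream.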
By deploying AxiomSL’s change management platform, the bank can now claim enhanced data transparency, better analytics and adaptable reporting automation that combines business logic, data lineage and mapping, and a sophisticated permissioning and validation framework. AxiomSL’s streamlined approach to CCAR also allows the financial firm to quickly lift shared underlying data points demanded by other reporting requirements, such as FDIC call reports, reducing duplicative work. Finally, the project elevated enterprise-wide collaboration across finance, operations, tech and risk teams and raised affinity for smart governance practices.
The tool accomplished all of these benefits without the expense of retrofitting and transforming every source system or imposing a common data format — allowing the bank to address CCAR effectively as it further pursues the next strategic technology phase, and continues to interact with regulators, with less operational disruption going forward.
As this project illustrates, even multiple years of CCAR preparation can be insufficient. As most institutions take this risk and reporting requirement on, they find the scope of the data management engineering involved only grows deeper. This is particularly true because of the need to document, validate and trace, to the lowest level, the entire composition process for the reports themselves. An effective submission, in other words, is just the start.
“Where we ended up, the integration with the cloud-based tool is underway. We see this as a positive move in the evolution towards our target state,” the executive explains. “As for the resubmission process, top-line adjustments and changes on the fly, having an audit trail in place and edit checks built in for our XML production while achieving faster time to market were all the big reasons why we worked with AxiomSL.”