Ravi, can you please tell our Risk Insights readers about yourself and your professional experience?
I was with Deutsche Bank, where I held two roles focusing on CCAR stress testing: Lead Data Architect, and Program Manager for CCAR Market Risk focusing on the delivery of the Global Market Shock. Prior to Deutsche Bank, I spent several years managing GE Capital's debt and derivatives trading platform. I am also well versed in working in an advisory and consulting capacity, having worked at Cognizant (CTSH).
You presented at the Stress Testing USA congress, where you delivered a presentation on the data challenges specific to CCAR. Why do you feel this is a key talking point at the Congress?
Data and infrastructure are key factors in the Fed's qualitative assessment, so being able to demonstrate that you have a handle on your data challenges is critical to success. To make a rudimentary point, the old adage "garbage in = garbage out" readily applies to CCAR and stress testing. CCAR, at its essence, is a massive data-gathering exercise across multiple functions of a bank. If a CCAR process sources fragmented data of poor quality, the stressed output will suffer as a result. Further, if a bank cannot source reliable data with consistent referential integrity, pulling that data together for reporting and reconciliation becomes an exercise in futility. For banks to succeed at CCAR from a data perspective, they have to demonstrate the capability to source clean, accurate data, with controls and processes around that data to establish completeness and traceability.
With the sheer volume and complexity of data required for CCAR stress tests, how can FIs ensure accurate data inputs?
Several points come to mind:
- Sourcing: adhering to golden sourcing supports referential data integrity across the various data streams
- Monitoring & Controls: data quality monitoring and scrubbing, as well as controls around manual, often Excel-based, processes
- Governance: an organization should truly treat data as an asset; data flows/streams require data stewards, and, aspirationally, the introduction of new data into an ecosystem should be subject to a change management process
- Simplification: organizations should drive simplification across their processes and application footprint; e.g., a bank should not have 10 systems that do essentially the same thing. (This does not mean taking short-cuts)
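To make the first two points concrete, the sketch below (hypothetical data, field names, and check logic, not any bank's actual implementation) shows an automated gate that validates an incoming trade feed against a golden source of counterparty reference data: a referential-integrity check and a completeness check run before the data flows downstream.

```python
# Hypothetical example: validate a trade feed against a golden source
# before it is consumed by downstream CCAR reporting processes.

REQUIRED_FIELDS = {"trade_id", "counterparty_id", "notional"}

def quality_check(trades, golden_counterparties):
    """Return a list of (trade_id, issue) breaks found in the feed."""
    breaks = []
    for trade in trades:
        # Completeness: every required field must be present and non-empty.
        missing = sorted(f for f in REQUIRED_FIELDS if not trade.get(f))
        if missing:
            breaks.append((trade.get("trade_id"), f"missing fields: {missing}"))
            continue
        # Referential integrity: the counterparty must exist in the golden source.
        if trade["counterparty_id"] not in golden_counterparties:
            breaks.append((trade["trade_id"], "unknown counterparty"))
    return breaks

# Illustrative golden source and feed (made-up identifiers).
golden = {"CP001", "CP002"}
feed = [
    {"trade_id": "T1", "counterparty_id": "CP001", "notional": 1_000_000},
    {"trade_id": "T2", "counterparty_id": "CP999", "notional": 500_000},  # bad reference
    {"trade_id": "T3", "counterparty_id": "CP002", "notional": None},     # incomplete
]

for trade_id, issue in quality_check(feed, golden):
    print(trade_id, issue)
# T2 unknown counterparty
# T3 missing fields: ['notional']
```

In practice, breaks like these would feed a monitored exception queue with data stewards assigned to resolve them, rather than being silently dropped or patched by hand in a spreadsheet.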
Data is of course the foundation of stress tests; do you see the industry moving towards a smoother, more automated collection process?
The short answer is yes. To dive deeper, I believe that standards have to be adhered to when architecting solutions from both a technology and a data perspective. The fundamentals of golden sourcing and referential data integrity across data streams need to be treated as the key to data delivery. There also needs to be a push to simplify processes and technology stacks. To come back directly to the question, manual processes should be systematically removed based on how much risk they present. Striving for 100% automation is a massive challenge, but a continuous process improvement mindset needs to be put in play to reduce manual, spreadsheet-based processes over time.
How do you see the role of the Stress Testing professional changing over the next 12-18 months?
In general, I see stress testing continuing to be an emerging space with many opportunities. I believe that RegTech is certainly here to stay, and we can anticipate that the EBA, BaFin and other bank governing entities will start to require more types of stress testing. These entities may build out their own unique scenarios apart from what the Fed is currently prescribing. Also, professionals in the stress testing space will see a continued move from an annual (CCAR) event to more of an operational process, where advanced internal stress scenarios are run on a more frequent basis.