Practical implementation of FRTB standardised approach and internal models approach

By Anthony Pereira, Founder & CEO, Percentile

Could you please tell the risk insights readers a little bit about yourself, your experience and what your current professional focus is?

My passion is for technology, and my career has focused on empowering business users to make data-driven decisions. Before finance, back in the 90s, I was working on data analytics for retail institutions. In 2002 I moved into finance and brought some of my big-data analytics background to the challenges faced by global investment banking trading and risk. Introducing the concepts of centralised risk aggregation and on-demand risk compute enabled Risk Managers to make faster, better decisions with self-service risk analytics. Doing that required robust and governed management of all underlying risk data, well before BCBS 239. On the back of that, we delivered Basel I, II and III capabilities with extremely compressed timelines and small project teams. This hands-on experience of having lived and breathed through the pre-crisis boom years and the post-crisis regulatory wave enables us to use technology to reduce the burden and cost of complex implementations like FRTB.

At Percentile, our mission is to use our collective experience and robust technology to help firms address these complex regulations and make risk technology a pleasure to use rather than a hindrance. We’re doing away with end-of-day batch, moving to real-time risk technology and intelligent analytics that go beyond the base requirements of regulations, and returning the focus to delivering business benefits.

What, for you, are the benefits of attending a conference like the FRTB Forum and what can attendees expect to learn from your session?

Unlike other, more general risk management conferences, the FRTB Forum focuses on a specific topic. The people who attend are interested in making progress and implementing this complex regulation. Our conversations here tend to be productive and insightful.

I hope our session on the Practical Implementation of FRTB – Standardised Approach and Internal Models Approach helps the audience in their implementation phase, as firms move beyond planning and prototypes.

To implement such regulations, one must think about the end goals and then about the data architecture, compute architecture and analysis tools required to reach them. We will sketch out a practical approach that should enable firms to use the FRTB architecture for more than just this regulation. Ideally, the architecture should be flexible enough to accommodate future changes to this regulation and the addition of other regulatory requirements (stress testing, SA-CCR, etc.).

Could you give more detail on tools that can be used for capital optimisation?

This is an important topic and could take up a whole day of the conference if need be. Optimisation itself has sub-topics: for example, improving desk structure and making SA vs IMA decisions while taking capital efficiency into account. Data quality analytics can be used for improved treatment of RWAs; marginal contribution analysis for pre-trade capital impact; capital reduction tools for highlighting sets of trades to unwind; and trade generation for creating capital hedging portfolios.

These use cases can be implemented with a mixture of quantitative and data science techniques. We are dedicating a significant proportion of our R&D time, alongside risk and trading quants, to producing analytics in this area. An interesting outcome is that current capital models (VaR-based capital) can benefit from such analysis and tools, even ahead of the implementation of FRTB models.
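As a rough illustration of the pre-trade capital impact idea mentioned above, the sketch below uses a simple scenario-VaR proxy standing in for a real capital model (the function names and figures are hypothetical, not Percentile's implementation). The marginal contribution of a candidate trade is simply capital with the trade minus capital without it:

```python
import numpy as np

def capital(pnl_scenarios):
    """Illustrative capital proxy: 99% VaR of a scenario P&L vector.
    A real engine would use the regulatory capital model instead."""
    return -np.percentile(pnl_scenarios, 1)

def marginal_contribution(portfolio_pnl, trade_pnl):
    """Pre-trade capital impact: capital with the trade minus capital without."""
    return capital(portfolio_pnl + trade_pnl) - capital(portfolio_pnl)

rng = np.random.default_rng(0)
portfolio = rng.normal(0.0, 1_000_000, size=10_000)  # existing book's scenario P&L
hedge = -0.5 * portfolio                             # a trade offsetting half the book
print(marginal_contribution(portfolio, hedge))       # negative => capital-reducing
```

The same incremental calculation, run over candidate trade sets, is what drives unwind-selection and hedge-generation analytics.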

In your opinion, why is it so important to source data internally and externally?

Banks have a huge amount of trading data that remains untapped. If it is collected and maintained alongside external sources of time series and trade observations, they can build an aggregate pool of data to help assess modellability for IMA non-modellable risk factors (NMRFs). We believe that such aggregation of observation data will be necessary to achieve the best picture of modellability, and a number of banks and data vendors think similarly.
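To illustrate what a check over such a pooled observation set might look like, here is a minimal sketch based on the observation-count criteria in the original January 2016 FRTB text (at least 24 real-price observations over the past year, with no more than roughly one month between consecutive observations). The 2019 revision changed these counts, so a production implementation would follow the current rules:

```python
from datetime import date, timedelta

def is_modellable(obs_dates, asof):
    """Sketch of a real-price observation test for one risk factor:
    >= 24 observations in the trailing year, with no gap between
    consecutive observations longer than ~30 days."""
    window = [d for d in sorted(set(obs_dates))
              if asof - timedelta(days=365) <= d <= asof]
    if len(window) < 24:
        return False
    gaps = [(later - earlier).days for earlier, later in zip(window, window[1:])]
    return all(g <= 30 for g in gaps)

# Pooled internal + vendor observations, roughly fortnightly
obs = [date(2017, 1, 2) + timedelta(days=14 * i) for i in range(26)]
print(is_modellable(obs, date(2018, 1, 1)))  # True
```

The value of pooling shows up directly here: observations a single bank lacks can close the gaps that would otherwise push a risk factor into the NMRF bucket.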

Why is creating a centralised pricing utility so important?

FRTB has highlighted that re-using front office models in risk engines with full revaluation is key to the closer alignment of risk management and the front office. We have witnessed this first-hand for over a decade, having operated in that mode.

Even for the Standardised Approach, having a single or centrally orchestrated risk pricing utility, integrated with front office pricing models, will enable sensitivity computation to be consistent with what is seen in those front office systems. For firms with Internal Model ambitions, the same utility can be used to compute the required value-at-risk, expected shortfall and stress scenarios. It is then a natural step to re-use the architecture for all risk pricing, not just for FRTB, and answer many different risk questions.
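As a small illustration of one calculation such a utility would serve, the sketch below computes expected shortfall at FRTB's 97.5% confidence level from a vector of scenario P&Ls. In practice the P&L vector would come from full revaluation in the pricing utility; here it is simulated:

```python
import numpy as np

def expected_shortfall(pnl, alpha=0.975):
    """Expected shortfall at FRTB's 97.5% level: the average of the
    worst (1 - alpha) fraction of scenario P&Ls, expressed as a loss."""
    losses = np.sort(-np.asarray(pnl))[::-1]           # largest losses first
    k = max(1, int(round(len(losses) * (1 - alpha))))  # size of the tail
    return losses[:k].mean()

rng = np.random.default_rng(42)
pnl = rng.normal(0.0, 1.0, size=100_000)  # unit-variance scenario P&L
es = expected_shortfall(pnl)              # ~2.34 for a standard normal
print(es)
```

The same scenario P&L vectors also feed VaR and stress calculations, which is why a single orchestrated pricing layer can answer many risk questions at once.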

Architected properly this pricing utility can scale tremendously, particularly if made cloud-enabled to take advantage of on-demand infrastructure and burst capacity.

Such an infrastructure pays dividends for years to come, as it enables firms to think beyond basic FRTB calculations and get to interesting what-if capabilities with full revaluation rather than Taylor-series expansions. On the whole, it makes for a better architecture, better model outcomes and ultimately better alignment across the firm.
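The gap between full revaluation and a Taylor-series (delta-gamma) approximation is easy to see with a textbook Black-Scholes call. The sketch below is an illustration under assumed parameters, not a claim about any particular system; the two P&L estimates diverge materially for a large spot shock:

```python
from math import erf, exp, log, sqrt

def bs_call(S, K, T, r, vol):
    """Black-Scholes call price (this is the 'full revaluation' step)."""
    N = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
    d1 = (log(S / K) + (r + 0.5 * vol**2) * T) / (vol * sqrt(T))
    d2 = d1 - vol * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

S0, K, T, r, vol = 100.0, 100.0, 1.0, 0.02, 0.2
h = 0.01
base = bs_call(S0, K, T, r, vol)

# Finite-difference delta and gamma, as a sensitivities engine would produce
delta = (bs_call(S0 + h, K, T, r, vol) - bs_call(S0 - h, K, T, r, vol)) / (2 * h)
gamma = (bs_call(S0 + h, K, T, r, vol) - 2 * base
         + bs_call(S0 - h, K, T, r, vol)) / h**2

shock = -50.0  # a severe spot shock, where the quadratic approximation breaks down
full = bs_call(S0 + shock, K, T, r, vol) - base  # full-revaluation P&L
taylor = delta * shock + 0.5 * gamma * shock**2  # delta-gamma P&L
print(full, taylor)  # full revaluation shows a much larger loss
```

For small shocks the two estimates agree closely; it is precisely in stress-sized moves, the ones that drive capital, that the expansion misleads.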

What key point would you like to put across to the audience?

Building a modular, loosely interconnected system rather than a monolithic closed solution is key to remaining flexible in the implementation of both Standardised and Internal Model approaches. Separating the concerns of data management, computation and visualisation into composable modules will allow firms to scale and adapt as volumes grow and requirements evolve. Even for the much simpler Standardised Approach, adopting such an architecture will afford firms tremendous reusability for capital analytics and decision making.
