By Dr Peter Mitic, Head of Operational Risk Methodology UK, Santander UK
Peter could not cover all of the questions asked during his session on conduct risk modelling at the New Generation Operational Risk: Europe Summit in London. Below, he answers a few of those that were left out.
For what types of risk is the formula you presented applicable? Usually, each type of risk follows a different pattern, especially in cash flow modelling.
The Pseudo Marginal method is applicable to data sets comprising one or a few very large losses alongside many relatively small ones. For example, the largest loss could be as much as 50% of the sum of all the losses. Conduct risk losses are typical. Cash flows are not, because it is very unlikely that a single cash flow will be significantly larger than all the others (if you find one, you should investigate immediately). Also, cash flows can be negative as well as positive, whereas operational risk losses are always positive.
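As a minimal illustration of the data shape described above (not the author's code), the following Python sketch checks whether the largest loss dominates the total. The 0.5 cut-off mirrors the 50% figure mentioned in the answer and the loss figures are hypothetical.

```python
# Sketch: test whether a loss data set has the "one dominant loss" shape
# for which the Pseudo Marginal method is described above.

def dominant_loss_ratio(losses):
    """Return the largest loss as a fraction of the total loss."""
    total = sum(losses)
    return max(losses) / total if total > 0 else 0.0

# Hypothetical conduct-risk losses (illustrative units): one very large
# PPI-style loss alongside many small ones.
losses = [450.0, 3.2, 1.1, 0.8, 5.4, 2.0, 0.6, 1.9, 4.3, 0.7]

ratio = dominant_loss_ratio(losses)
print(f"Largest loss is {ratio:.0%} of total losses")
if ratio >= 0.5:
    print("Data set has the dominant-loss shape discussed above")
```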
How can you ensure the input data is good quality?
As an organisation, we do our best to ensure that data is of good quality. That is done by a separate group within the Bank, and we take it on trust that they have done their job well. Three years ago we started a large project to make sure that our data is as good as it can be, and that work is maturing this year. We check for anomalies, and if we spot anything, we ensure that it's fixed.
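To give a flavour of the kind of anomaly checks mentioned above, here is a minimal sketch over a simple tabular loss record. The field names and records are hypothetical, not the Bank's actual schema.

```python
# Sketch: basic anomaly checks on loss records. Operational risk losses
# should be positive, dated, and not duplicated.
from datetime import date

records = [
    {"id": "E001", "loss": 12_500.0, "event_date": date(2016, 3, 1)},
    {"id": "E002", "loss": -300.0,   "event_date": date(2016, 4, 2)},  # negative loss
    {"id": "E002", "loss": -300.0,   "event_date": date(2016, 4, 2)},  # duplicate record
    {"id": "E003", "loss": 980.0,    "event_date": None},              # missing date
]

seen_ids = set()
for r in records:
    if r["loss"] <= 0:
        print(f"{r['id']}: non-positive loss {r['loss']}")
    if r["event_date"] is None:
        print(f"{r['id']}: missing event date")
    if r["id"] in seen_ids:
        print(f"{r['id']}: duplicate record")
    seen_ids.add(r["id"])
```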
How easy/difficult is it to source data for this model?
The data that we use for the Pseudo Marginal technique is exactly the same as we use for calculating regulatory capital. It’s easy to source as we have a comprehensive network of automated data feeds.
Has the model described been validated / back tested and if so what were your results?
The model will be validated, first by peer review for publication in the Journal of Operational Risk, and then by our internal validation team. We only have seven years of good data, so back-testing is not reliable at this stage. However, we have alternative data sets and obtained similar results from them.
What are the benefits to operational risk managers understanding the mathematics behind capital calculation, and in your experience is this something that we need to demonstrate an understanding of?
Operational risk managers should understand the basic principles of the mathematics behind this calculation, in the same way that they should understand the basics of the mathematics behind a traditional capital calculation. If they do, they can appreciate the difficulties of doing the calculation, and are necessarily involved in decision making during the process. For example, some parameter values are set by the risk managers, and they need to know the impact of any decisions they make.
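To make the last point concrete, here is a minimal sketch (not taken from the model itself) of how a single parameter chosen by risk managers, in this case a modelling threshold, can shift a fitted severity tail. The data, thresholds, and naive fitting approach are synthetic and purely illustrative.

```python
# Sketch: sensitivity of a fitted severity quantile to a modelling threshold.
import numpy as np

rng = np.random.default_rng(1)
losses = rng.lognormal(10.0, 2.0, 5_000)   # synthetic loss data (assumed)

for threshold in (5_000.0, 20_000.0):
    body = np.log(losses[losses >= threshold])      # losses entering the model
    mu, sigma = body.mean(), body.std()             # naive lognormal fit to exceedances
    q999 = np.exp(mu + 3.09 * sigma)                # approx. 99.9% severity quantile
    print(f"Threshold {threshold:>8,.0f}: fitted 99.9% severity quantile ~ {q999:,.0f}")
```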
Isn’t this problem specific to PPI-type events?
Libor and other conduct events are in fact single large penalties or fines imposed by the regulator. PPI is a prime example for which the Pseudo Marginal method is applicable, because a single huge PPI loss has a dramatic effect on calculated capital. We have seen other data sets which show similar characteristics, but the calculated capitals in those cases did not merit further analysis: they were consistent with calculations done by our Finance department.
What are your thoughts on including expected loss for conduct risk modelling?
When we model conduct risk we have the same aims as for any other risk class: to estimate unexpected losses rather than expected loss. That is the essential Basel requirement.
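The distinction between expected and unexpected loss can be seen in a short LDA-style simulation. This is a generic sketch under assumed distributions (Poisson frequency, lognormal severity) and parameters, not the Bank's calibration.

```python
# Sketch: expected loss vs unexpected loss (99.9% quantile) of a simulated
# annual loss distribution.
import numpy as np

rng = np.random.default_rng(0)
n_years = 50_000                 # simulated years
freq_mean = 20.0                 # average losses per year (assumed)
sev_mu, sev_sigma = 10.0, 2.0    # lognormal severity parameters (assumed)

counts = rng.poisson(freq_mean, n_years)
annual_loss = np.array([rng.lognormal(sev_mu, sev_sigma, n).sum() for n in counts])

expected_loss = annual_loss.mean()
var_999 = np.quantile(annual_loss, 0.999)     # 99.9% quantile of annual loss
unexpected_loss = var_999 - expected_loss     # capital is driven by this, not by EL

print(f"Expected loss:   {expected_loss:,.0f}")
print(f"99.9% quantile:  {var_999:,.0f}")
print(f"Unexpected loss: {unexpected_loss:,.0f}")
```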
In the LDA how do you handle linked events?
Linked events are handled formally at the data preparation stage. Losses that cover more than one risk class arising from the same risk event are allocated to appropriate risk classes. As a final step in the capital calculation, we do a correlation analysis which accounts for, amongst other things, linked events.
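As a simple illustration of the allocation step described above, the sketch below shares a single linked event's loss across the risk classes it touches. The weights and class names are hypothetical; in practice the allocation rule is set during data preparation.

```python
# Sketch: allocate a linked event's loss across the risk classes involved.
linked_event = {
    "total_loss": 1_000_000.0,
    "risk_classes": {"Conduct": 0.7, "Execution & Process": 0.3},  # assumed weights
}

allocated = {rc: w * linked_event["total_loss"]
             for rc, w in linked_event["risk_classes"].items()}

for rc, loss in allocated.items():
    print(f"{rc}: {loss:,.0f}")
```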
What can capital teams do to ensure that they capture the quality of the control environment in their model and fight the idea that the model is just a mathematical result rather than a management tool?
The capital calculation is merely a window on the control environment. The quality of the controls should be reflected in the results of the calculation: tighter controls would be expected to reduce the calculated capital.