FRTB: Looking Closer Into the New Standard Approach: Curvature Risk

It was not until the last consultative paper was issued in December 2014 that the Basel Committee took a final decision on the new standardised approach for market risk. It decided to implement the “enhanced delta plus method”, a sensitivity-based approach that distinguishes three risk components: delta risk as the foundation for capturing linear risks, and vega and curvature risk as two additional components applying to products with optionality. Vega risk captures the risk of price changes driven by market expectations of future volatility. Curvature risk captures the non-linear risk not accounted for by delta risk.
Here we will illustrate the concept of curvature risk in an intuitive way.

To make this concrete, let us consider a sample portfolio consisting of one long stock and two short European vanilla calls that are almost at the money.[1]
Figure 1 shows the value of the portfolio and its delta approximation as a function of the underlying risk factor. For the purpose of the regulatory capital charge calculation, stress scenarios are created by shifting defined risk factors. For equities, the risk factors are the spot prices, and the shifts range from 30% to 70% depending on the specific bucket. Let us assume the risk factor of this stock is stressed by 50%.

Figure 1
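As a minimal sketch of this setup, the following Python snippet fully revalues the sample portfolio under the ±50% stress scenarios and compares the result with the delta approximation. All pricing choices (Black-Scholes model, strike, maturity, rate, volatility) are illustrative assumptions, not prescribed by the regulation.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call (illustrative model choice)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d1 - sigma * sqrt(T))

def call_delta(S, K, T, r, sigma):
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return norm_cdf(d1)

# Hypothetical parameters: spot, strike, maturity, rate, volatility.
S0, K, T, r, sigma = 100.0, 100.0, 1.0, 0.01, 0.25

def portfolio(S):
    """Sample portfolio: long one stock, short two almost-ATM calls."""
    return S - 2.0 * bs_call(S, K, T, r, sigma)

delta = 1.0 - 2.0 * call_delta(S0, K, T, r, sigma)  # portfolio delta

for shift in (-0.5, +0.5):                 # the +/-50% equity stress
    S = S0 * (1.0 + shift)
    full = portfolio(S) - portfolio(S0)    # full revaluation P&L
    approx = delta * (S - S0)              # delta approximation
    print(f"shift {shift:+.0%}: full revaluation {full:+.2f}, "
          f"delta approximation {approx:+.2f}")
```

Because the portfolio is short gamma, the full revaluation loses money under both shifts, while the delta approximation even predicts a gain for one of them.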

The portfolio is non-linear because of the call options, so it contributes to curvature risk.
For the curvature risk framework, we have to consider the value of the delta-hedged position: the difference between the fully revalued portfolio under the stress scenario and its delta approximation (the difference between the blue line and the red line in figure 2). In contrast to the delta risk framework, it matters whether the stress scenario is an increase (positive shift) or a decrease (negative shift) in the underlying risk factor (see the different values of the two green circles in figure 2).

Figure 2

When the value of the delta-hedged portfolio is positive for a given stress scenario, the delta approximation is conservative: it underestimates the potential gain or overestimates the potential loss. On the other hand, if the value of the delta-hedged portfolio is negative (as in figure 2), the delta approximation overestimates gains and underestimates losses.

Thus, only negative values of the delta-hedged portfolio add risk. The minimum of the values in the two scenarios, if negative (in figure 2, the more negative of the two circled values), determines the curvature risk charge. Contributions from different delta-hedged portfolios corresponding to the same curvature risk factor are then netted. Across different risk factors and risk buckets, correlation matrices apply for diversification. As positive values of the delta-hedged portfolio do not add further risk, they neither count towards the aggregation at risk factor or bucket level nor contribute to any increase in the capital charge through correlation.
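In code, this scenario logic might look as follows – a simplified sketch in the spirit of the sensitivities-based method rather than a verbatim implementation of the final rules, reusing the same illustrative Black-Scholes setup as above (here the curvature shift is taken to equal the 50% risk weight):

```python
from math import log, sqrt, exp, erf

N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))   # standard normal CDF

def bs_call(S, K=100.0, T=1.0, r=0.01, sigma=0.25):
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return S * N(d1) - K * exp(-r * T) * N(d1 - sigma * sqrt(T))

def call_delta(S, K=100.0, T=1.0, r=0.01, sigma=0.25):
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return N(d1)

S0, RW = 100.0, 0.50        # spot and curvature shift (risk weight)

def V(S):                   # long one stock, short two calls
    return S - 2.0 * bs_call(S)

delta = 1.0 - 2.0 * call_delta(S0)

# Delta-hedged P&L in each scenario: full revaluation minus delta term.
up   = (V(S0 * (1 + RW)) - V(S0)) - delta * S0 * RW
down = (V(S0 * (1 - RW)) - V(S0)) + delta * S0 * RW

# Only negative values add risk: take the worse scenario, floored at zero.
cvr = max(0.0, -min(up, down))
print(f"up: {up:+.2f}, down: {down:+.2f}, curvature risk charge: {cvr:.2f}")
```

For this short gamma portfolio both delta-hedged values are negative, and the more negative one (the down scenario) sets the charge.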

As a rule of thumb, long gamma strategies do not raise the curvature risk charge, whereas short gamma strategies increase it – but the curvature risk approach adds more complexity than a gamma-based approach.

There are three important things to notice when putting this concept of curvature risk into practice.

First, in contrast to gamma risk, curvature risk is not a second-order approximation but requires a full revaluation of every instrument affected. This means banks have to be ready in terms of infrastructure, data availability and (IT) capacity to run the revaluation for all products with optionality.

Second, the separate treatment of curvature and delta risk can lead to exaggerated capital charges. This happens especially, but not only, with short gamma strategies. Figure 3 shows that the delta risk charge (red circle) stems from the negative shift, whereas the curvature risk charge (green circle) stems from the positive shift. Two values from mutually exclusive events are thus added in the aggregated risk charge, and their sum can be significantly higher than the worst-case loss (blue circle)!

Figure 3

Third, as mentioned before, in long gamma strategies a positive value of the delta-hedged portfolio cannot offset the delta capital charge. This leads to high capital charges for an imperfectly hedged long gamma strategy, even though such a strategy should have a very low worst-case loss. Figure 4 depicts the P&L of a sample long gamma portfolio consisting of two long European vanilla calls almost at the money and one short stock. As can be seen, the delta capital charge is not offset, so the total charge is too high.

Figure 4
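To put numbers on points two and three, here is a sketch that computes the delta charge, the curvature charge and the worst-case loss for both the short gamma portfolio of figure 3 and the long gamma portfolio of figure 4. For simplicity it applies the same 50% weight to the delta and the curvature scenario, which is an assumption of this sketch; the actual rules prescribe separate risk weights.

```python
from math import log, sqrt, exp, erf

N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))   # standard normal CDF

def bs_call(S, K=100.0, T=1.0, r=0.01, sigma=0.25):
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return S * N(d1) - K * exp(-r * T) * N(d1 - sigma * sqrt(T))

def charges(V, delta, S0=100.0, RW=0.5):
    """Delta charge, curvature charge and worst-case loss for portfolio V."""
    dV_up   = V(S0 * (1 + RW)) - V(S0)
    dV_down = V(S0 * (1 - RW)) - V(S0)
    delta_charge = abs(delta) * S0 * RW
    cvr = max(0.0, -min(dV_up - delta * S0 * RW, dV_down + delta * S0 * RW))
    worst_case = max(0.0, -min(dV_up, dV_down))
    return delta_charge, cvr, worst_case

d1 = (0.01 + 0.5 * 0.25**2) / 0.25      # d1 at the money (log term is zero)
call_d = N(d1)                          # delta of one call

cases = {
    "short gamma (fig. 3)": charges(lambda S: S - 2 * bs_call(S), 1 - 2 * call_d),
    "long gamma (fig. 4)":  charges(lambda S: 2 * bs_call(S) - S, 2 * call_d - 1),
}
for name, (dc, cv, wc) in cases.items():
    print(f"{name}: delta {dc:.2f} + curvature {cv:.2f} = {dc + cv:.2f}, "
          f"worst-case loss {wc:.2f}")
```

In this setup the short gamma portfolio's aggregated charge exceeds its worst-case loss, while the long gamma portfolio is charged on delta alone even though neither stress scenario produces a loss.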

To summarise, computing curvature risk is a complex task requiring full revaluation of all affected financial products. Moreover, because of the separate treatment of curvature and delta risk, the approach can in some cases lead to higher aggregated capital charges than expected.

If you have any questions, do not hesitate to contact us; we would be happy to help you. Why not check out our past blog entry on the effect of overcapitalisation due to data insufficiency and the opening post on the FRTB, including the flyer.

[1] This example is inspired by a very similar example given by KBC Bank in its comments to the consultative document “FRTB: outstanding issues”.

The Liechtenstein common-benefit foundation

A mainstay of the Principality’s financial centre strategy


People are increasingly adopting sustainability as a basis for their actions and investments, and the number of charitable foundations and organisations is growing all around the world – not least in the Principality of Liechtenstein. In 2008, Liechtenstein completely revised its foundation law, an integral component of the Principality’s financial centre strategy. Since then, many things have changed.

Read more here.

FRTB Case Study: Overcapitalisation Due to Data Insufficiencies

Under the upcoming regulation discussed in a series of consultative papers titled “Fundamental review of the trading book”, the methodology for calculating capital requirements is going to change significantly. The required capital will be driven by the risk factors the trading book is exposed to. One interesting aspect of this new method is that risk factors are associated with certain buckets, and diversification effects between those buckets are taken into account. However, positions that cannot be allocated to a bucket due to insufficient data are mapped to the so-called residual bucket, which is assigned the maximum risk weight and for which no diversification effects are recognised.

Here we are going to look at a sample equity portfolio to see how data quality influences the amount of required capital. Let us start with a Swiss market portfolio whose asset allocation is based on the SMI. The required capital due to equity risk for this sample is 18.7% of the portfolio value. We additionally determined the degree of overcapitalisation, i.e. the percentage increase in required capital, for all scenarios of insufficient data. The results can be seen in the histogram below.

Histogram of overcapitalisation due to data insufficiency for an SMI stock portfolio


Note that most scenarios yield an overcapitalisation of more than 50%. In fact, assuming data error scenarios are uniformly distributed, the overcapitalisation will be above 50% at a confidence level of 99%! And the expected value of overcapitalisation is 167%, i.e., insufficient data can be expected to more than double the capital requirements in this sample.
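To illustrate the mechanics behind such a histogram (not to reproduce the SMI figures above), consider this toy version with a hypothetical five-stock portfolio. The weights, risk weights, residual-bucket risk weight and single intra-bucket correlation are all made-up assumptions, and the whole equity risk class is collapsed into one bucket for brevity.

```python
from itertools import product

weights      = [0.30, 0.25, 0.20, 0.15, 0.10]   # hypothetical weights
risk_weights = [0.30, 0.30, 0.35, 0.30, 0.40]   # hypothetical risk weights
RW_RESIDUAL  = 0.70    # residual bucket: maximum risk weight
rho          = 0.25    # hypothetical intra-bucket correlation

def capital(errors):
    """Required capital: positions with a data error go to the residual
    bucket (maximum risk weight, no diversification); the rest diversify."""
    ok  = [w * rw for w, rw, e in zip(weights, risk_weights, errors) if not e]
    bad = sum(w * RW_RESIDUAL for w, e in zip(weights, errors) if e)
    diversified = sum(a * b * (1.0 if i == j else rho)
                      for i, a in enumerate(ok)
                      for j, b in enumerate(ok)) ** 0.5
    return diversified + bad

base = capital((0,) * 5)                 # capital with perfect data
overcap = [capital(e) / base - 1.0       # all non-trivial error scenarios
           for e in product([0, 1], repeat=5) if any(e)]

print(f"mean overcapitalisation: {sum(overcap) / len(overcap):.0%}")
print(f"share of scenarios above 50%: "
      f"{sum(oc > 0.5 for oc in overcap) / len(overcap):.0%}")
```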

Of course, one could argue that single data errors are quite common, while it is rather improbable that they cluster. So instead of taking the data error scenarios to be uniformly distributed, we should assume an exponential decay in probability with the number of data errors in the scenario. The decay rate is then a measure of data quality, directly associated with the expected number of data errors. The graphic below illustrates how a decrease in data quality yields an increase in required capital for our sample SMI portfolio.

Expected overcapitalisation by data quality for the SMI portfolio


It is interesting to see that even a modest data error ratio of 12.5%, i.e. 2.5 expected data errors for our sample portfolio, already doubles the required capital.
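A sketch of this exponentially weighted expectation, reusing the same hypothetical five-stock setup as above: a scenario with k data errors receives a probability weight proportional to exp(−λk), where a larger decay rate λ means better data quality.

```python
from itertools import product
from math import exp

weights      = [0.30, 0.25, 0.20, 0.15, 0.10]   # hypothetical weights
risk_weights = [0.30, 0.30, 0.35, 0.30, 0.40]   # hypothetical risk weights
RW_RESIDUAL, rho = 0.70, 0.25

def capital(errors):
    ok  = [w * rw for w, rw, e in zip(weights, risk_weights, errors) if not e]
    bad = sum(w * RW_RESIDUAL for w, e in zip(weights, errors) if e)
    div = sum(a * b * (1.0 if i == j else rho)
              for i, a in enumerate(ok) for j, b in enumerate(ok)) ** 0.5
    return div + bad

base = capital((0,) * 5)
scenarios = list(product([0, 1], repeat=5))

def expected_overcap(lam):
    """Expected overcapitalisation when a scenario with k data errors
    has weight exp(-lam * k)."""
    w  = [exp(-lam * sum(e)) for e in scenarios]
    oc = [capital(e) / base - 1.0 for e in scenarios]
    return sum(wi * oi for wi, oi in zip(w, oc)) / sum(w)

for lam in (4.0, 2.0, 1.0, 0.5):   # decreasing data quality
    print(f"decay rate {lam}: expected overcapitalisation "
          f"{expected_overcap(lam):.0%}")
```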

Now let us look at a bigger sample equity portfolio based on the EuroStoxx 50. Apart from 15% required capital for FX risk (assuming the reporting currency is CHF), the capital charge for equity risk is 19.5% of the portfolio value. Since the portfolio contains more than twice as many assets, it is infeasible to consider every data error scenario. But we can specify the data quality (an expected data error ratio of 10%) and perform a Monte Carlo simulation. The results are shown below:

Overcapitalisation of a portfolio based on the EuroStoxx 50 allocation due to data insufficiencies, assuming an expected data error ratio of 10%.


The expected overcapitalisation for this sample portfolio and data quality is 48% with a standard deviation of 20%.
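A sketch of such a Monte Carlo simulation: each position independently suffers a data error with probability 10%, and positions with errors are mapped to the residual bucket. The portfolio composition is randomly generated here, since the point is the simulation mechanics rather than the actual EuroStoxx 50 weights.

```python
import random
from statistics import mean, stdev

random.seed(7)                       # reproducible illustration
n, p_err = 50, 0.10                  # number of positions, error probability
raw = [random.random() for _ in range(n)]
total = sum(raw)
weights = [x / total for x in raw]   # normalised hypothetical weights
risk_weights = [random.choice([0.30, 0.35, 0.40]) for _ in range(n)]
RW_RESIDUAL, rho = 0.70, 0.25        # residual risk weight, correlation

def capital(errors):
    """Positions with a data error go to the residual bucket (maximum risk
    weight, no diversification); the rest diversify with correlation rho."""
    ws  = [w * rw for w, rw, e in zip(weights, risk_weights, errors) if not e]
    bad = sum(w * RW_RESIDUAL for w, e in zip(weights, errors) if e)
    s, sq = sum(ws), sum(x * x for x in ws)
    return ((1 - rho) * sq + rho * s * s) ** 0.5 + bad

base = capital([False] * n)
samples = [capital([random.random() < p_err for _ in range(n)]) / base - 1.0
           for _ in range(10_000)]

print(f"expected overcapitalisation: {mean(samples):.0%}, "
      f"standard deviation: {stdev(samples):.0%}")
```

The closed form `((1 - rho) * sq + rho * s * s) ** 0.5` is simply the pairwise correlation sum from the previous sketches, collapsed for speed.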

What do we get from this? Due to the structure of the new approach, where assets that cannot be assigned to a specific bucket are mapped to the residual bucket, required capital can increase substantially as a result of insufficient data quality. This is mainly because no diversification effects are taken into account in the residual bucket. Additionally, the risk weight of the residual bucket coincides with the maximum risk weight of the standard buckets. Since all companies represented in the SMI are located in an advanced economy, their risk weights are usually low; being forced to map them to the residual bucket thus increases the assumed stress significantly. It is therefore important to check and update your data infrastructure early on to avoid unnecessarily tying up capital.

Any questions? Do not hesitate to contact us, and check out our previous blog entry.

Fundamental Review of the Trading Book

It is getting serious…

Fundamental changes are coming up for the banking industry. In pursuit of its aim to increase the stability of the financial system, the Basel Committee is going to make substantial changes to the way required capital for market risk is calculated – and after the recently published consultative paper “Fundamental Review of the Trading Book: Open Issues”, it is becoming quite clear what form these changes will take.

Not just about the trading book

The Basel Committee aims to specify the boundary between the trading and the banking book more clearly and unambiguously. While still keeping the intention to trade, hedge and profit from short-term price fluctuations as a guiding principle, the Committee gives strict rules and assumptions about the book to which certain positions must be allocated. In particular, positions which can nowadays safely be held in the banking book might have to be switched to the trading book under the new setup. Even where no trading book is currently needed, the need for one might arise under the new guidelines.

The new standard approach – taking sensitivities into account

While the Basel Committee originally suggested calculating required capital based on future cash flows, it now favours a sensitivity-based approach (SBA) due to feasibility concerns raised by the financial industry. This new approach uses risk measures – the sensitivities – directly to compute the required capital, so the required capital is directly connected to the risk of the portfolio. However, this also poses challenges regarding data availability and quality, as all the sensitivities, as well as the fundamental information needed to bucket each instrument correctly, must be available. In particular, missing fundamental data could lead to instruments being mapped to the residual bucket, which would result in a significant increase in the required capital.
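To give a flavour of the sensitivity-based idea, the following sketch aggregates weighted delta sensitivities within a single bucket via a correlation parameter. The sensitivities, risk weights and correlation below are hypothetical; the actual SBA prescribes them per risk class and bucket.

```python
from math import sqrt

sensitivities = [1000.0, -400.0, 250.0]   # hypothetical delta sensitivities
risk_weights  = [0.30, 0.35, 0.30]        # hypothetical risk weights
rho = 0.25                                # hypothetical intra-bucket corr.

# Weighted sensitivities: WS_k = RW_k * s_k
ws = [s * rw for s, rw in zip(sensitivities, risk_weights)]

# Bucket-level charge: K_b = sqrt(max(0, sum_kl rho_kl * WS_k * WS_l))
k_b = sqrt(max(0.0, sum(ws[i] * ws[j] * (1.0 if i == j else rho)
                        for i in range(len(ws))
                        for j in range(len(ws)))))
print(f"bucket-level capital charge: {k_b:.2f}")
```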

Though challenging, the new approach has the potential to increase the stability not only of the financial system as a whole but also of each individual bank. Have a look at the attached flyer for more detailed information, or contact us directly to discuss the upcoming challenges for your business.