Margining Methodology

Customized and designed specifically for the different asset classes


To enable a trustworthy clearing operation, reasonably conservative margins are required to avoid the risk of the clearinghouse incurring a loss in a default situation.

The margin requirement should theoretically be the market value of the account. However, under normal conditions an account cannot be closed at the prevailing market prices the instant a participant defaults. It typically takes time to neutralize the account, and the value of the account can change during this period, which must be catered for in the margining methodology.

Nasdaq Clearing uses a risk system called Genium Risk. The purpose of a margin requirement system is to calculate accurate risk-based margin requirements for each counterparty account. The margin methodologies are customized and designed specifically for the different asset classes in question to generate accurate valuation and capital efficient calculations to optimize the use of members’ collateral.

Calculation of Risk Parameters for Underlying Instruments

Since neutralizing an account in a default situation can take time, there is a lead-time from the moment a default occurs to the time at which Nasdaq Clearing is able to close the participant’s positions and when necessary, liquidate the collateral that has been pledged.

For most financial products it is assumed that closing a counterparty’s positions and liquidating the related collateral in the event of a default takes two days on average, but up to five days for more illiquid products. For commodity products the liquidation period ranges between two and five days. Hence, the margin parameters are calculated with consideration given to the assumed liquidation period.

An investigation of the underlying instruments for financial products shows that historical price movements are not normally distributed. This matters because accurate confidence intervals for the parameters in the margin calculation depend on the distribution of price movements. To avoid the errors that would follow from assuming a normal distribution, Nasdaq Clearing instead uses a numerical method to calculate its risk parameters. This method uses data over a one-year historical period to establish an approximation of the cumulative distribution for equity products. From this distribution the third largest movement is used to determine the appropriate level of the risk parameters, equivalent to a 99.2 percent confidence interval for the one-year price history. For fixed income products, Nasdaq Clearing also uses data over a longer, 10-year historical period to establish an approximation of the cumulative distribution. The movement corresponding to the 99.2 percentile (or the 99.5 percentile for OTC products) is chosen for the 10-year price history. The worse of the risk parameters based on the one-year and 10-year histories is applied.
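The numerical method described above can be sketched as follows. This is an illustrative reconstruction, not Nasdaq Clearing’s implementation: it takes a year of daily prices, forms overlapping moves over the assumed liquidation period, and picks the n-th largest absolute move. Over roughly 250 daily observations, the third largest move sits near the 99.2th percentile of the empirical distribution.

```python
def risk_parameter(prices, holding_days=2, rank=3):
    """Empirical risk parameter: the rank-th largest absolute relative
    price move over the holding period in the given price history."""
    moves = [
        abs(prices[i + holding_days] / prices[i] - 1.0)
        for i in range(len(prices) - holding_days)
    ]
    moves.sort(reverse=True)  # largest moves first
    return moves[rank - 1]
```

For the 10-year history the percentile itself (99.2, or 99.5 for OTC products) would be read off the larger sample in the same way, and the worse of the two parameters applied.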

Prices on commodities derivatives markets possess the same general properties as other financial instruments, including heavy tails and high excess kurtosis. For margin calculations Nasdaq Clearing applies contemporary statistical methods, using stable distributions as a model for the tails of the distribution of relative price increments. Most parameters are estimated over a one-year data horizon. The parameters of the margining model are set so that possible losses will be covered by the member’s margin with 99.2 percent confidence.

Genium Risk – The Margining Subsystem

Genium Risk is a subsystem integrated within the Genium INET clearing system, which is used to generate the daily counterparty margin requirements and intraday counterparty margin calculations.

In the Genium INET Clearing system Nasdaq Clearing achieves full integration of the Commodity and Financial markets. Genium Risk is a multi-asset risk management system that integrates OTC and standardized products, hence generating risk offsets and efficiency for members of the clearinghouse. Risk models and risk parameters are customized for different types of asset classes and credit risks. The following sections describe the margin methodologies used for Equities, Fixed Income and Commodities.

OMS-II – Equity Derivatives Margining

OMS-II examines the portfolio as a whole to see how an adverse movement in the value of an underlying instrument would affect the value of the entire portfolio of a given counterparty. It uses a range of inputs in the margin calculation, some of which are specified below.

Market Price Models for Option Contracts

The market value of an option contract at each valuation point is calculated using industry standard valuation models. The market price models used by Nasdaq Clearing to calculate margin requirements are based on the Black-Scholes or the binomial option valuation model for stock options, and Black-76 for index and interest rate options.

Valuation Interval

Different values for the account must be calculated, since the market typically moves between the time collateral is pledged and the time Nasdaq Clearing can close a position in the event of a default. To do this, OMS-II varies the price of the underlying security for each series to calculate the neutralization cost. In this way, OMS-II creates a “valuation interval” for each underlying security. The size of the valuation interval depends on the length of the liquidation period and the size of the historical price fluctuations over this liquidation period.

Valuation Points

The upper and lower limits of the valuation interval represent the worst expected movement (during the lead-time) for the margin calculation. However, the worst-case scenario for a portfolio with different options and forwards/futures based on the same underlying instrument can occur anywhere in the valuation interval. In order to reflect this, the valuation interval is divided into 31 valuation points for equity products.

OMS-II calculates the neutralization cost for each series with the same underlying security at each valuation point; the actual margin requirement is then based on the valuation point that renders the highest margin, i.e. the worst-case scenario. This means that a portfolio containing series whose margins would otherwise be calculated at different ends of the valuation interval is evaluated at a single valuation point. This methodology is justified since the market can only move in one direction at a time (see Correlation Between Instruments below).

Implied Volatility and Volatility Shifts

The risk of a change in implied volatility is taken into account by calculating the neutralization cost of an account at both a higher and a lower implied volatility than the market implied volatility. The neutralization cost is therefore calculated at each of the valuation points for three different implied volatility levels: low, market and high. A typical valuation interval thus consists of 3x31 valuation points.
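Scanning this 3x31 grid can be sketched as below. The `portfolio_value(price, vol)` callable is an assumed stand-in for the full option valuation models (Black-Scholes, binomial, Black-76), and the symmetric grid construction is illustrative.

```python
def worst_case_value(portfolio_value, spot, interval, vol, vol_shift, n_points=31):
    """Evaluate the portfolio at `n_points` equally spaced underlying
    prices within +/- `interval` of spot, each at a low, market and high
    implied volatility, and return the lowest (worst-case) value."""
    prices = [spot * (1.0 + interval * (2.0 * i / (n_points - 1) - 1.0))
              for i in range(n_points)]
    vols = (vol * (1.0 - vol_shift), vol, vol * (1.0 + vol_shift))
    # 3 x 31 = 93 scenario values; the minimum drives the margin
    return min(portfolio_value(p, v) for p in prices for v in vols)
```

For a portfolio that is simply long one unit of the underlying, the worst case is the bottom of the price interval; for a short position it is the top.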

Vector Files

OMS-II produces a vector file for each contract cleared. A vector file consists of a data series that is shared by all positions in the series. There are primarily two reasons to produce a vector file. The first is to achieve computational efficiency and the second is that the vector files can be distributed externally so that members can replicate the margin calculations in their own systems.

CFM – Fixed Income Derivatives Margining

Desirable properties of a margin methodology are that it should mirror realistic circumstances, and at the same time be capital efficient. When margining fixed income derivatives, it appears natural to utilize the correlation between different maturities along a yield curve. CFM (short for Cash Flow Margin) is a yield curve-based margin methodology that captures this correlation of fixed income instruments priced against the same curve. Instead of stressing each instrument’s individual price, yield curves are stressed using their first three principal components. All instruments in an account are then evaluated against each stressed yield curve and the margin requirement is given as the combined value of these instruments calculated with the worst of the stressed yield curves. Each principal component (PC) explains part of the historical changes in yield curves. Nasdaq Clearing will use the first three PCs that explain around 90% – 95% of all historical changes to the yield curve.

PC1 is a parallel shift of the yield curve; PC2 is a change in slope; PC3 is a change in curvature. The main steps of the margin model are:

  • Bootstrap individual credit curves from prices of selected instruments
  • Change the entire yield curve to simulate possible future movements of the curve
  • Apply changes in the first three principal components (PCs) as independent changes to the yield curve
  • Construct a set of hypothetical future yield curves (the number depends on how many sub-intervals each PC is divided into; with PC1 divided into 9 sub-intervals and PC2 and PC3 into 5 each, 9*5*5 = 225 hypothetical future yield curves are constructed)
  • For each hypothetical future yield curve, calculate the market value of the portfolio
  • The hypothetical future yield curve that leads to the lowest market value for a portfolio is the “margin curve” for that portfolio
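The steps above can be sketched as follows. The PC loading vectors, step grids and portfolio valuation function are illustrative stand-ins, not Nasdaq Clearing’s actual parameters.

```python
from itertools import product

def cfm_margin_curve(base_curve, pcs, steps, portfolio_value):
    """base_curve: zero rates per tenor. pcs: three PC loading vectors of
    the same length as the curve. steps: three lists of shift sizes
    (e.g. 9, 5 and 5 entries giving 9*5*5 = 225 hypothetical curves).
    Returns the worst portfolio value and the "margin curve" behind it."""
    worst_value, margin_curve = None, None
    for a, b, c in product(*steps):  # every PC-shift combination
        curve = [r + a * p1 + b * p2 + c * p3
                 for r, p1, p2, p3 in zip(base_curve, *pcs)]
        value = portfolio_value(curve)
        if worst_value is None or value < worst_value:
            worst_value, margin_curve = value, curve
    return worst_value, margin_curve
```

Note that a single stressed curve is chosen for the whole portfolio, which is what produces the offsets between long and short positions priced against the same curve.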


SPAN® - Commodities Derivatives Margining

The commodities derivatives margin model is a modification of CME SPAN® that takes into account special properties of the traded commodities and features of the delivery procedures. For some products the model allows 100% margin netting within the same product group, depending on the delivery terms and volumes.

Volatility Curves

Price volatilities on the commodities derivatives markets depend crucially on the time to delivery. This volatility structure is formalized as a volatility curve, which consists of calculated volatilities as a function of time to delivery for a given market. The volatility of a contract with a given time to delivery and delivery period is then represented as an integral of the volatility curve over the corresponding time period. Volatility curves for the commodity markets are published on the website and in a parameter file (SPAN® file) on a daily basis.
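As a rough illustration of this construction (the actual curve shapes and conventions live in the published SPAN® file and are not reproduced here), a contract-level volatility can be obtained by integrating the curve over the delivery period, here with the trapezoidal rule and normalized to an average level:

```python
def contract_volatility(vol_curve, t_start, t_end, n=100):
    """Average of a volatility curve over a delivery period
    [t_start, t_end] (times to delivery in years), approximated
    with the trapezoidal rule on n sub-intervals."""
    h = (t_end - t_start) / n
    total = 0.5 * (vol_curve(t_start) + vol_curve(t_end))
    total += sum(vol_curve(t_start + i * h) for i in range(1, n))
    return total * h / (t_end - t_start)  # normalized to an average
```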

Scenario Analysis

The margin model possesses the major properties of the generic SPAN® model and uses a scenario approach to determine margin levels sufficient to cover the corresponding portfolio risks. The model takes into account possible price movements as well as changes in volatility, which are formalized in 16 standard scenarios. Features of the commodities derivatives markets and the physical properties of the delivery procedures impose special requirements on portfolio netting and create dependencies between market prices, which have required specific modifications of the margining system.
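Nasdaq Clearing’s exact scenario definitions are not given here, but a generic SPAN-style grid of 16 scenarios can be sketched as follows. The extreme-move multiplier and loss weight are assumed placeholder values, not published parameters.

```python
def span_style_scenarios(scan_range, vol_range,
                         extreme_mult=2.0, extreme_weight=0.35):
    """16 (price move, volatility move, loss weight) triples: price at
    0, +-1/3, +-2/3 and +-1 of the scanning range combined with
    volatility up or down (14 scenarios), plus two extreme price moves
    whose loss is only partially weighted."""
    scenarios = [(frac * scan_range, direction * vol_range, 1.0)
                 for frac in (0.0, 1 / 3, -1 / 3, 2 / 3, -2 / 3, 1.0, -1.0)
                 for direction in (1, -1)]
    for sign in (1, -1):
        scenarios.append((sign * extreme_mult * scan_range, 0.0, extreme_weight))
    return scenarios
```

The portfolio is revalued under every scenario, and the scenario with the largest weighted loss sets the scanning risk for the combined commodity.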

SPAN is a registered trademark of Chicago Mercantile Exchange Inc., used herein under license. Chicago Mercantile Exchange Inc. assumes no liability in connection with the use of SPAN by any person or entity.

Correlation Between Instruments

For different instruments that show a high historical price dependency with each other, there is a need for a method that takes this into consideration with respect to margining calculations.

Equities (OMS-II)

For OMS-II this is called the “window method”. In this method, the scanning range limits the individual movement of each series, but there is also a maximum allowed difference between the scanning points of the two series. This range can be represented as a window, hence the name. The size of this window is estimated roughly by the same method that is used to estimate scanning ranges: daily differences between the movements of the series are calculated using one year of data, and these values are then used to build a numerical cumulative distribution from which a 99.2 percent confidence interval is applied.

For a given covariance, the window defines the maximum allowable difference in price variation between two different underlying securities. In a narrow window, prices cannot diverge as much as in a broad one; as a result, high covariance leads to a narrow window, and vice versa.
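The window-size estimation can be sketched as below (an illustrative function, not the production algorithm): given one year of daily moves for two underlyings, take the 99.2 percent empirical quantile of the absolute difference between their moves.

```python
def window_size(moves_a, moves_b, confidence=0.992):
    """Empirical `confidence` quantile of the absolute daily difference
    between the moves of two underlyings -- the maximum allowed spread
    between their scanning points."""
    diffs = sorted(abs(a - b) for a, b in zip(moves_a, moves_b))
    idx = min(int(confidence * len(diffs)), len(diffs) - 1)
    return diffs[idx]
```

Two perfectly correlated series yield a zero-width window, forcing their scanning points to move together in the margin calculation.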

Fixed Income (CFM)

Yield curves with different credit risks can show a historical relationship. Calculations of the allowed correlation between two or more yield curves are based on the strength of this historical relationship. Each historical curve change can be represented in terms of movements in PC1, PC2 and PC3 by applying a least-squares solution, which gives, for each curve, historical time series of daily changes in each principal component. The window size for each principal component is based on the maximum anticipated amount by which the principal components can deviate from each other, expressed as a number of valuation points in the valuation interval. Each day the maximum difference of the changes is calculated, and the results form a spread vector. The same numerical statistical approach as when calculating the risk parameters is then applied to this vector, i.e. the estimated window size is based on a 99.2% confidence interval. Instruments exposed to the same yield curve are logically correlated, since only one stressed curve per margin account may be chosen in the margin calculations.

Commodities (SPAN®)

For economic and physical reasons, many commodities derivatives are dependent and show high statistical dependence in price dynamics. This dependence can decrease portfolio risks, which should result in lower margin requirements. The commodities derivatives margin model takes into account dependencies between prices of derivatives with the same underlying (time spreads or intra-commodity spreads) as well as dependencies between different groups of products (inter-commodity spreads).

The dependence between price increments of derivatives with the same underlying can be estimated from the statistical properties of the price processes and the features of the delivery procedures. The latter place additional restrictions on the price dynamics of commodities derivatives and make the correlation coefficient a major factor in estimating the degree of dependence between the corresponding price increments. The resulting decrease in margin is defined by the “window method” (see above), where the number of steps away from the main scenario is defined by the correlation coefficient between the price increments of the corresponding derivatives (the higher the correlation, the lower the deviation from the main scenario). Correlation tables for each risk group are published in the parameter file (SPAN file) on a daily basis.

Margin Simulation

In the Genium INET Clearing back office application and over the open API, a margin simulation facility is provided.

Members can use this facility to obtain an indicative margin requirement on existing or fictitious positions. The margin simulation calculations are based on the same methodology and parameters as the official evening margin calculations. It is, however, important to note that the margin simulation is based on current real-time prices, so simulating the same position several times may give different results as market prices change. The result of the simulation should be seen as an indication of the official margin requirement.

Margin Back-testing

Comprehensive, automated margin back-testing is a key element of the validation process. Back-testing in general aims at verifying and validating that a model or method meets the requirements that were intended.

The aim of Nasdaq Clearing’s automated margining back-testing functionality is to verify that the margining methodologies that Nasdaq Clearing relies upon are adequate, or in other words, that the margining results are in line with what the margining methodologies are designed to achieve.

Margin requirements are calculated each day by Nasdaq Clearing’s risk margining system, Genium Risk. These calculations are based on a number of assumptions, and it is therefore of interest to investigate how well these assumptions hold up against actual market events. From a risk management perspective, it is crucial to verify that the margin requirements are not too low. The principle of Nasdaq Clearing’s margin back-testing is therefore that, for each account and margin date, the market value of the account is calculated on the position held on the margin date but using actual market prices over the assumed liquidation period following the margin date.

These calculations are performed automatically every day for a one-year reference period for all counterparty accounts and all instruments. If the back-tested market value of an account following the margin date is less than the margin requirement, a margin breach is understood to have occurred. It should be noted that, given the confidence level which Nasdaq Clearing applies when calculating risk interval parameters, margin breaches of counterparty portfolios are theoretically expected. But since risk interval parameters are calculated per underlying instrument, the diversity of a portfolio determines whether a margin breach occurs when a risk interval parameter is breached (a risk interval parameter breach occurs when an underlying price movement is larger than what the approved risk interval parameter accounts for). The margin back-testing data produced is analyzed more comprehensively on a monthly basis by Nasdaq Clearing, and the results are reported to the Risk Committee and the Swedish FSA.
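A simplified reading of this back-test can be sketched as below. The breach condition used here (realized loss over the liquidation window exceeding the margin held) is an illustrative interpretation of the description above, not the exact production rule.

```python
def backtest_breaches(margins, account_values, liquidation_days=2):
    """margins[t]: margin requirement held on day t. account_values[t]:
    market value of the day-t position revalued with day-t prices.
    Flags day t as a breach when the worst drop in value over the
    following liquidation period exceeds the margin held."""
    breaches = []
    for t in range(len(account_values) - liquidation_days):
        worst = min(account_values[t + 1 : t + liquidation_days + 1])
        if account_values[t] - worst > margins[t]:
            breaches.append(t)
    return breaches
```

Run daily over a one-year reference period, the breach count can then be compared against the number implied by the 99.2 percent confidence target.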


In addition, Nasdaq Clearing performs daily back testing and monitoring of risk parameters. Actual price movements of underlying instruments are compared each day to their corresponding margin parameters. Again, breaches are expected to happen. All breaches are logged and the reason behind the breach is noted. Based on the scale and number of breaches and the underlying reason, Risk Management decides if the risk parameter needs to be recalculated or not.
