Incorporating parameter risk into derivatives prices

by Karl Friedrich Bannör

Adequately specifying the parameters of a financial or actuarial model is challenging. In the case of historical estimation, uncertainty arises through the estimator's volatility and possible bias. In the case of market-implied parameters, the solution of a calibration to market data may not be unique, or the numerical routine may return a local instead of a global minimum. This paper provides a new method based on convex risk measures to quantify parameter risk and to translate it into prices, extending results in Cont (Math Finance 16(3):519–547, 2006) and Lindström (Adv Decision Sci, 2010). We introduce the notion of risk-capturing functionals and prices, provided a distribution on the parameter (or model) set is available, and present explicit examples where the Average-Value-at-Risk and the entropic risk measure are used. For some classes of risk-capturing functionals, the risk-captured price preserves weak convergence of the distributions. In particular, the risk-captured price generated by the distributions of a consistent sequence of estimators converges to the true price. For asymptotically normally distributed estimators we provide large sample approximations for risk-captured prices. Following Bion-Nadal (J Math Econ 45(11):738–750, 2009), Carr et al. (J Financ Econ 62:131–167, 2001), Cherny and Madan (Int J Theor Appl Finance 13(8):1149–1177, 2010) and Xu (Ann Finance 2:51–71, 2006), we interpret the risk-captured price as an ask price, reflecting aversion towards parameter risk. To acknowledge parameter risk in case of calibration to market prices, we create a parameter distribution from the pricing error function, allowing us to compare the intrinsic parameter risk of the stochastic volatility models of Heston and Barndorff-Nielsen and Shephard as well as the Variance Gamma option pricing model by pricing different exotics.
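
To make the construction concrete, here is a minimal Monte Carlo sketch of a risk-captured ask price in the spirit of the abstract: given a distribution on the parameter set (here a hypothetical sampling distribution for the volatility of a Black-Scholes call) and the Average-Value-at-Risk as the convex risk measure, the risk-captured price is the AVaR of the model price viewed as a function of the random parameter. The parameter distribution, contract data and risk level are illustrative assumptions, not taken from the paper; a bid price would be obtained analogously by applying the risk measure to the negative of the price.

```python
import numpy as np
from scipy.stats import norm

def bs_call(S0, K, T, r, sigma):
    """Black-Scholes call price (the pricing functional for a given parameter value)."""
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def avar(x, alpha=0.1):
    """Average-Value-at-Risk at level alpha: mean over the worst alpha-tail of the sample."""
    q = np.quantile(x, 1 - alpha)
    return x[x >= q].mean()

rng = np.random.default_rng(0)
# Hypothetical parameter distribution for sigma, e.g. from an estimator's sampling distribution
sigma_samples = np.clip(rng.normal(loc=0.25, scale=0.03, size=100_000), 1e-4, None)
prices = bs_call(S0=100.0, K=100.0, T=1.0, r=0.02, sigma=sigma_samples)

print("price at mean parameter :", bs_call(100.0, 100.0, 1.0, 0.02, 0.25))
print("risk-captured ask price :", avar(prices, alpha=0.1))  # lies above the plain price
```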

 

State-Dependent Fees for Variable Annuity Guarantees

by Prof. Carole Bernard

Currently, variable annuity management fees for guarantees are charged at a constant rate throughout the term of the policy. This can create incentives for policyholders to lapse and, possibly, effect a new policy. The lapse and reentry problem is a source of expense and lost revenue for insurers. In this paper we explore a new fee structure for variable annuities, where the fee rates supporting the cost of guarantees depend on the moneyness of those guarantees. We derive formulas for calculating the fee rates assuming fees are paid only when the guarantees are in-the-money, or are close to being in-the-money, and we illustrate with some numerical examples and sensitivity analysis.
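
As a rough numerical illustration of the idea (not the authors' formulas), the sketch below determines by Monte Carlo and root-finding the fee rate that, charged only while a simple maturity guarantee is in-the-money, makes the expected discounted fee income match the expected discounted guarantee payoff. Dynamics, guarantee design and all parameters are assumptions chosen for illustration.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative parameters (assumptions, not from the paper)
S0, G, r, sigma, T, steps, n_paths = 100.0, 100.0, 0.02, 0.2, 10.0, 120, 20_000
dt = T / steps
rng = np.random.default_rng(1)
Z = rng.standard_normal((n_paths, steps))   # common random numbers across fee levels

def net_value(c):
    """E[PV of state-dependent fees] minus E[PV of guarantee payoff] for fee rate c."""
    F = np.full(n_paths, S0)
    fee_pv = np.zeros(n_paths)
    for k in range(steps):
        itm = F < G                          # fee charged only while the guarantee is in-the-money
        fee = c * F * dt * itm
        fee_pv += np.exp(-r * (k + 1) * dt) * fee
        F = (F - fee) * np.exp((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z[:, k])
    guarantee_pv = np.exp(-r * T) * np.maximum(G - F, 0.0)
    return fee_pv.mean() - guarantee_pv.mean()

fair_fee = brentq(net_value, 1e-4, 0.5)      # net_value is negative for tiny fees, positive for large ones
print(f"fair state-dependent fee rate: {fair_fee:.4%} per annum")
```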

  

FVA - Yet another acronym with impact on derivatives pricing

by Dr. Heiko Carstens

The question whether a Funding Valuation Adjustment (FVA) needs to be incorporated in derivatives valuation has been intensively discussed over recent months, both among practitioners in the financial industry and academics in the field of mathematical finance. We show that financial institutions have to account for funding costs in the derivatives business. Further, we outline a consistent framework for handling these costs in pricing, steering and accounting. Feasible approaches for tackling the most challenging questions, such as how to avoid double counting of FVA and DVA, are introduced.

  

Deterministic optimal consumption and investment in a stochastic model with applications in insurance

by Jun.-Prof. Marcus Christiansen

We motivate and solve the classical financial optimization problem of optimizing mean-variance terminal wealth and intermediary consumption in a Black-Scholes market with the special feature that the consumption rate and the investment proportion are constrained to be deterministic processes.
Mean-variance investment is a true classic since Markowitz. However, the classical stochastic control approach leads to investment and consumption decisions that react immediately and diffusively to market movements and, therefore, does not reflect a sponsor's probable preference for smooth cash flows. Indeed, in practice, unit-linked life insurance policies often follow control strategies that are rather close to deterministic processes. In control theory, it is non-standard to control a stochastic system by an optimal deterministic control. We show how to adapt the standard theory to this situation, demonstrate the use of the moments of wealth as state variables rather than wealth itself, and derive and present the optimal consumption-investment profile.
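
To see why the moments of wealth can serve as state variables, consider a stylized special case written down here only for illustration (constant market coefficients r, mu, sigma; the paper's setting is more general). With deterministic investment proportion pi(t) and consumption rate c(t), the wealth X_t follows a linear SDE and its first two moments satisfy deterministic linear ODEs, so a mean-variance objective can be optimized over the deterministic functions (pi, c):

```latex
dX_t = \bigl[\bigl(r + \pi(t)(\mu - r)\bigr)X_t - c(t)\bigr]\,dt + \pi(t)\,\sigma\,X_t\,dW_t,
\qquad m_k(t) := \mathbb{E}\bigl[X_t^k\bigr],

\frac{dm_1}{dt} = \bigl(r + \pi(t)(\mu - r)\bigr)\,m_1(t) - c(t),

\frac{dm_2}{dt} = \Bigl[2\bigl(r + \pi(t)(\mu - r)\bigr) + \pi(t)^2\sigma^2\Bigr]\,m_2(t) - 2\,c(t)\,m_1(t),

\operatorname{Var}(X_t) = m_2(t) - m_1(t)^2 .
```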

  

Capital Requirements with Defaultable Securities

by Prof. Walter Farkas

This is a joint work with Pablo Koch-Medina (SwissRe) and Cosimo-Andrea Munari (ETH Zurich). We study capital requirements for bounded financial positions, defined as the minimum amount of capital that, invested in a chosen eligible asset, makes the position pass a pre-specified acceptability test. We allow for general acceptance sets and general eligible assets, including defaultable bonds. Since the payoff of these assets is not necessarily bounded away from zero, the resulting risk measures cannot be transformed into cash-additive risk measures by a change of numeraire. However, extending the range of eligible assets is important because, as exemplified by the recent financial crisis, assuming the existence of default-free bonds may be unrealistic. We focus on finiteness and continuity properties of these general risk measures. As an application, we discuss capital requirements based on Value-at-Risk and Tail-Value-at-Risk acceptability, the two most important acceptability criteria in practice. Finally, we prove that there is no optimal choice of the eligible asset. Our results and our examples show that a theory of capital requirements allowing for general eligible assets is richer than the standard theory of cash-additive risk measures.
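
In the notation standard for this literature, the object described above can be written compactly: with acceptance set A and eligible asset with initial price S_0 > 0 and payoff S_T >= 0 (not necessarily bounded away from zero), the capital requirement of a bounded position X is

```latex
\rho_{\mathcal{A},S}(X) \;=\; \inf\Bigl\{\, m \in \mathbb{R} \;:\; X + \frac{m}{S_0}\,S_T \in \mathcal{A} \Bigr\}.
```

For a default-free payoff S_T identically equal to one this reduces to the usual cash-additive case; the abstract's point is precisely what changes when S_T may default.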

 

Risks beyond the models – the rising importance of regulatory, political and legal risk in scenarios of increased financial stress

by Dr. Jochen Felsenheimer

In addition to model risk, arbitrage strategies executed in practice are exposed to several other risk factors that are not easily quantifiable, i.e. that are not captured or reflected in classic models. Especially in periods of increased stress – as in the European debt crisis – political factors become an important driver of the performance of arbitrage strategies, as was most evident during the Greek government-debt restructuring in 2012. Particularly in bank restructurings, another risk factor becomes more obvious: legal risk. Recent case studies that exemplify potential legal risks resulting from bank restructurings are the Irish IBRC, the Spanish Bankia, and the Dutch SNS Bank. Last but not least, one reaction of governments to financial crises is to increase regulatory pressure. This has a direct impact not only on market participants but also on financial instruments, with the latter having a major effect on arbitrage strategies. Consequently, the scope of risk management from a practitioner's perspective is more than simply accepting models; it is rather to scrutinize them!

  

Are classical option pricing models consistent with observed option second-order moments? Evidence from high-frequency data

by Prof. Matthias Fengler

We suggest a joint analysis of ex-post intra-day variability in an option and its associated underlying asset market as a novel means of validating an option pricing model. For this purpose, we introduce the notion of option realized variance, by which we mean the cumulative variance realized by the sample path of successive option price observations. In concurrently observing the realized path of the underlying asset, we contrast option realized variance with the realized variance that would be implied from the underlying asset price path under certain model assumptions. In the empirical analysis, we focus on the implied-volatility-compensated Black-Scholes model and the Heston model. We find that neither model reconciles the second-order moments observed in the option and the underlying asset market. The differences point to the existence of additional relevant pricing factors that affect option second-order moments. We thus corroborate findings made in lower-frequency option data.
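
A minimal sketch of the two quantities being contrasted, under illustrative assumptions (option realized variance taken as the sum of squared option price increments; Black-Scholes deltas evaluated at a fixed implied volatility; synthetic intra-day data, so the two numbers nearly coincide by construction, whereas with market data they need not):

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, tau, r, vol):
    d1 = (np.log(S / K) + (r + 0.5 * vol**2) * tau) / (vol * np.sqrt(tau))
    return S * norm.cdf(d1) - K * np.exp(-r * tau) * norm.cdf(d1 - vol * np.sqrt(tau))

def bs_delta(S, K, tau, r, vol):
    d1 = (np.log(S / K) + (r + 0.5 * vol**2) * tau) / (vol * np.sqrt(tau))
    return norm.cdf(d1)

# Synthetic one-minute data for one trading day (in an application: observed HF prices)
rng = np.random.default_rng(2)
n, dt = 390, 1.0 / (390 * 252)
K, r, iv, true_vol, tau0 = 100.0, 0.0, 0.20, 0.25, 30.0 / 252
S = 100.0 * np.exp(np.cumsum(-0.5 * true_vol**2 * dt + true_vol * np.sqrt(dt) * rng.standard_normal(n)))
tau = tau0 - dt * np.arange(1, n + 1)
C = bs_call(S, K, tau, r, iv)

orv = np.sum(np.diff(C) ** 2)                               # option realized variance
delta = bs_delta(S[:-1], K, tau[:-1], r, iv)
orv_model = np.sum(delta**2 * np.diff(S) ** 2)              # model-implied counterpart from the underlying path
print("option realized variance           :", orv)
print("model-implied (delta^2 x dS^2) RV  :", orv_model)
```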

 

The limits of Granularity Adjustments

by Prof. Jean-David Fermanian

We provide a rigorous proof of two granularity adjustment formulas to evaluate loss distributions and risk measures (VaR) in the case of heterogeneous portfolios and random recoveries. As a significant improvement with respect to the literature, we detail all the technical conditions of validity and provide an upper bound for the remainder term at a finite distance. Moreover, we deal explicitly with the case of discrete loss distributions and multi-factor models. For very simple portfolio models, we show empirically that the granularity adjustments do not always yield satisfactory approximations. This stresses the importance of checking our regularity conditions before relying on such techniques.
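
The paper itself is about rigorous validity conditions and error bounds; purely for orientation, the following simulation sketch (my own one-factor Gaussian example with heterogeneous exposures and random recoveries, not the paper's setting) displays the quantity a granularity adjustment approximates: the gap between the VaR of a finite portfolio loss and the VaR of its infinitely granular, conditional-expectation limit.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n_names, n_sims, q = 100, 50_000, 0.999
ead = rng.uniform(0.5, 2.0, n_names); ead /= ead.sum()   # heterogeneous exposures
pd_, rho = 0.01, 0.2                                      # default probability, asset correlation
thresh = norm.ppf(pd_)

Z = rng.standard_normal(n_sims)                           # systematic factor
eps = rng.standard_normal((n_sims, n_names))              # idiosyncratic factors
default = np.sqrt(rho) * Z[:, None] + np.sqrt(1 - rho) * eps < thresh
lgd = rng.beta(2, 2, (n_sims, n_names))                   # random loss given default (one minus recovery)
loss = (default * lgd * ead).sum(axis=1)

# Infinitely granular limit: loss conditional on the factor, with E[LGD] = 0.5
loss_asympt = 0.5 * norm.cdf((thresh - np.sqrt(rho) * Z) / np.sqrt(1 - rho))

var_finite, var_asympt = np.quantile(loss, q), np.quantile(loss_asympt, q)
print("VaR, finite portfolio     :", var_finite)
print("VaR, granular (ASRF) limit:", var_asympt)
print("gap the granularity adjustment approximates:", var_finite - var_asympt)
```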

 

Modeling and Risk Assessment of Dynamic Hybrid Products in Life Insurance

by Prof. Nadine Gatzert and Dr. Alexander Bohnert

Dynamic hybrid products are innovative life insurance products particularly offered in the German market and intended to meet new consumer needs regarding stability and upside potential. These products are characterized by a periodical rebalancing process between the policy reserves (i.e. the premium reserve stock), a guarantee fund and an equity fund. The policy reserve thereby corresponds to the one also valid for traditional participating life insurance products. Hence, funds of dynamic hybrids that are allocated to the policy reserves in times of adverse capital market environments earn the same policy interest rate determined for the participating life insurance policyholders and thus at least the guaranteed interest rate. In this paper, we consider an insurer offering both dynamic hybrid and traditional participating life insurance contracts and focus on the policyholders' perspective. Toward this end, we first conduct fair valuation and risk assessment and then assess the position of the policyholders. The results reveal considerable interaction effects between the two contract types within the portfolio that strongly depend on the portfolio composition, thereby emphasizing merits as well as risks associated with offering dynamic hybrids.

 

A General Duality Theorem for Pessimal Risk Aggregation

by Prof. Raphael Hauser

An important problem in quantitative risk management is to aggregate several individually studied types of risks into an overall position. Mathematically, this translates into studying the worst-case distribution tails of Ψ(X), where Ψ: R^n → R is a given function, and X is a random vector that takes values in R^n and whose distribution is only partially known. Typical cases include the situation in which only information about the marginals of X is available. Additionally, partial information about the covariances of X or other moments may be known. We introduce a very general modelling framework of optimization problems over measure spaces, give a duality result and discuss how particular cases translate to our setting, including the marginal problem and the moment problem. Discretizations of our models can be solved via conic programming.
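
A toy version of the discretized marginal problem (an illustration under my own assumptions, far simpler than the general framework of the talk): the worst-case probability that Ψ(X1, X2) = X1 + X2 exceeds a threshold, given fixed discretized marginals, is a linear program over the joint probability mass function.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

# Discretize two standard normal marginals on a common grid (illustrative choice)
grid = np.linspace(-4, 4, 41)
p1 = norm.pdf(grid); p1 /= p1.sum()
p2 = norm.pdf(grid); p2 /= p2.sum()
n = len(grid)

threshold = 3.0
# Decision variable: joint pmf q[i, j], flattened row-major to a vector of length n*n
psi_exceeds = (grid[:, None] + grid[None, :] >= threshold).ravel().astype(float)

# Equality constraints: row sums equal p1, column sums equal p2
A_rows = np.kron(np.eye(n), np.ones(n))      # sum_j q[i, j] = p1[i]
A_cols = np.kron(np.ones(n), np.eye(n))      # sum_i q[i, j] = p2[j]
A_eq = np.vstack([A_rows, A_cols])
b_eq = np.concatenate([p1, p2])

# Maximize P(X1 + X2 >= t)  <=>  minimize -psi_exceeds @ q
res = linprog(-psi_exceeds, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
print("worst-case P(X1 + X2 >= 3)    :", -res.fun)
print("probability under independence:", float(psi_exceeds @ np.outer(p1, p2).ravel()))
```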

 

Valuation of flexible generation assets – The shortfall of a spot volatility based real option approach

by Dr. Joachim Hermann

Power plants can be seen as real options on the spread between the electricity price and generation costs. Hence, option valuation models can be applied to evaluate the time value of the option. For this valuation, spot market price and spot volatility models are usually applied. The development of new products, and the respective liquidity of those products, makes it possible to trade the actual profile of the power plant across sequential markets. Therefore, a proper valuation of highly flexible assets has to take into account not only the spot volatility but also the spreads between the different markets.

  

An extreme value approach for modeling operational risk losses depending on covariates

by Dr. Marius Hofert

A general methodology for modeling loss data depending on covariates is developed. The parameters of the frequency and severity distributions of the losses may depend on covariates. The loss frequency over time is modeled via a non-homogeneous Poisson process with integrated rate function depending on the covariates. This corresponds to a generalized additive model which can be estimated with spline smoothing via penalized maximum likelihood estimation. The loss severity over time is modeled via a nonstationary generalized Pareto model depending on the covariates. Since spline smoothing cannot be directly applied in this case, an efficient algorithm based on orthogonal parameters is suggested. The methodology is applied to a database of (mostly banking) operational risk losses. Estimates, including confidence intervals, for risk measures such as Value-at-Risk or Expected Shortfall as required by the Basel II/III framework are computed.
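
For the severity side, a bare-bones peaks-over-threshold sketch (without covariates, hence much simpler than the nonstationary model in the talk; data and threshold are simulated assumptions): fit a generalized Pareto distribution to the exceedances and plug the fit into the standard VaR and Expected Shortfall formulas.

```python
import numpy as np
from scipy.stats import genpareto, pareto

rng = np.random.default_rng(4)
losses = pareto(b=2.5, scale=1e4).rvs(5000, random_state=rng)   # synthetic heavy-tailed loss data

u = np.quantile(losses, 0.90)                 # threshold choice (illustrative)
exceed = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceed, floc=0)   # GPD fit to exceedances, location fixed at 0

n, n_u, q = len(losses), len(exceed), 0.999
var_q = u + beta / xi * (((1 - q) * n / n_u) ** (-xi) - 1)
es_q = var_q / (1 - xi) + (beta - xi * u) / (1 - xi)   # valid for xi < 1
print(f"xi = {xi:.3f}, beta = {beta:.1f}")
print(f"VaR {q}: {var_q:,.0f}   ES {q}: {es_q:,.0f}")
```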

 

Different aspects of recovery rate modelling in credit risk

by Dr. Stephan Höcht

The recent financial crisis gave rise to a new discussion of the market-standard assumption of constant recovery rates in pricing and risk management applications. The increasing number of companies in financial distress, the large variation in realized recovery rates, the high complexity of the credit market as well as increasing regulatory requirements led to a need for a better understanding of all variables involved in the loss process. We give an overview of some basic facts about (implied and realized) recovery rates and their behaviour in different market situations and discuss various modelling approaches.

  

How to detect and model contagion between financial markets

by Prof. Piotr Jaworski

In the last decades, several financial crises have underlined that markets tend to be more dependent during crises than during calmer periods. This extra dependence during times of slump is often referred to as contagion between markets. If present, it may mitigate the benefits of diversification precisely when those benefits are needed most and may have serious consequences for investors. Thus, understanding this highly nonlinear effect is of great interest not only to financial theorists but to practitioners as well. In my talk I will follow the spatial approach introduced in Durante and Jaworski (2010), which is based on the following characterization of contagion:

"There is contagion from market X to market Y if there is more dependence between X and Y when X is doing badly than when X exhibits typical performance, that is, if the conditional copula of the market returns X and Y, when X is smaller than certain quantile, dominates the conditional copula when X is around its median."

I will present some  methods of detecting the spatial contagion and discuss the choice of the most suitable multivariate GARCH process for modelling this effect.
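
A minimal empirical version of this characterization (an illustrative sketch, not the estimator discussed in the talk): compare a dependence measure, here Spearman's rho, computed on the days when X lies below a low quantile with the same measure computed on days when X lies in a central band around its median.

```python
import numpy as np
from scipy.stats import spearmanr

def contagion_gap(x, y, alpha=0.1):
    """Spearman rho of (x, y) in the lower tail of x minus rho in a central band of x."""
    lo = x <= np.quantile(x, alpha)
    mid = (x >= np.quantile(x, 0.5 - alpha / 2)) & (x <= np.quantile(x, 0.5 + alpha / 2))
    rho_tail, _ = spearmanr(x[lo], y[lo])
    rho_mid, _ = spearmanr(x[mid], y[mid])
    return rho_tail, rho_mid, rho_tail - rho_mid

# Synthetic returns with stronger dependence in the lower tail (illustration only)
rng = np.random.default_rng(5)
n = 5000
x = rng.standard_normal(n)
y = 0.3 * x + 0.7 * rng.standard_normal(n)
crash = x < np.quantile(x, 0.1)
y[crash] = 0.9 * x[crash] + 0.1 * rng.standard_normal(crash.sum())   # contagion regime

rho_tail, rho_mid, gap = contagion_gap(x, y)
print(f"rho | X in lower tail : {rho_tail:.2f}")
print(f"rho | X around median : {rho_mid:.2f}")
print(f"contagion gap         : {gap:.2f}")
```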

 

Counterparty Risk & CVA

by Franz Lorenz

Counterparty credit risk and its reflection in the fair value of a derivative via credit and debit valuation adjustments (CVA & DVA) has been a topic of interest amongst practitioners and academics ever since the collapse of Lehman Brothers in 2008. With the new capital requirements regulation (CRR) becoming effective on 1st January 2014, comprehensive standards on how to reflect the risk of losses due to a counterparty’s default and the risk of losses due to changes in CVA in regulatory capital and what to consider when setting up a risk management framework for counterparty credit risk have been introduced. At the same time, new accounting standards on the fair value measurement of financial derivatives (IFRS 13) introduced specific requirements for the calculation of bilateral credit valuation adjustments.  We outline specific challenges that financial institutions are facing in this context and discuss measures to ensure compliance and to tackle methodological complexity.
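
For orientation, the textbook unilateral CVA approximation underlying most implementations (a generic formula, not the specific CRR or IFRS 13 prescriptions discussed in the talk): CVA is roughly (1 − R) times the discounted expected exposure weighted by the marginal default probabilities over a time grid. All inputs below are illustrative assumptions.

```python
import numpy as np

# Illustrative inputs: time grid, discounted expected exposure profile,
# flat hazard rate for the counterparty, 40% recovery
t = np.linspace(0.0, 5.0, 21)                                        # years
ee = 1_000_000 * np.sqrt(np.maximum(t, 1e-12)) * np.exp(-0.1 * t)    # stylized discounted EE profile
hazard, recovery = 0.02, 0.40

survival = np.exp(-hazard * t)
marginal_pd = survival[:-1] - survival[1:]        # P(default in (t_i, t_{i+1}])
ee_mid = 0.5 * (ee[:-1] + ee[1:])                 # exposure at interval midpoints

cva = (1 - recovery) * np.sum(ee_mid * marginal_pd)
print(f"CVA approximation: {cva:,.0f}")
```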

  

Risk management and its value in asset management

by Dr. Wolfgang Mader

The current climate of financial repression is characterized by low nominal interest rates and even negative real interest rates in major developed countries such as the United States, the United Kingdom and Germany. In addition, economists and other market participants expect that this low-rate environment will be maintained directly or indirectly for some time.
This leaves investors caught on the horns of a dilemma. They can accept the low-rate environment and – in all probability – miss their return or liability targets. Or they can strive to reach their targets by taking on higher risks, which entail a substantial danger of significant losses. The second is the only truly viable option for investors, yet it presents clear challenges. In times of restricted risk budgets and impending regulatory changes, such as Solvency II and Basel III, increasing outright portfolio risk is impossible for many investors. In addition, investors have become disillusioned with equity returns over the last ten years, often referred to as "the lost decade." How can this dilemma be solved?
From our perspective, the game plan for investing in a world of financial repression and for solving the investor's dilemma described above is a coordinated approach that involves five key elements of "smart risk taking". We show how the different actions impact portfolio characteristics and lead to improved risk and return profiles.

 

Consistent modeling of discrete cash dividends

by Dr. Jan-Frederik Mai

Based on the general idea of considering the stock price as the sum over all future expected discounted dividends, a flexible approach for the modeling of discrete cash dividends is developed. From a theoretical perspective, such a viewpoint automatically implies an arbitrage-free modeling approach and allows for embedding almost any kind of commonly applied stock price model and dividend specification. The practical implications are discussed and a tractable pricing framework is presented. Finally, the generic setup is applied to a defaultable Markov diffusion model in order to highlight the impact of different dividend specifications.
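
The starting point can be stated in one line (standard notation, with a constant interest rate r only for readability; more generally one discounts with the numéraire): the time-t stock price is the sum of all expected discounted future dividends D_i paid at dates t_i,

```latex
S_t \;=\; \sum_{t_i > t} \mathbb{E}_{\mathbb{Q}}\!\left[\, e^{-r\,(t_i - t)}\, D_i \,\middle|\, \mathcal{F}_t \right],
```

so that any specification of the dividend stream consistent with this identity is automatically arbitrage-free, which is the structural point exploited in the paper.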

 

Modeling Electricity Spot Prices

by Klaus Mayer

With the liberalization of electricity trading, the electricity market has grown rapidly over the last decade. Compared to other financial markets, the markets for electricity show some characteristics that are challenging to model. In this paper, we propose an approach to model electricity spot prices that combines mean reversion, spikes, and stochastic volatility. Thereby, we use different mean reversion rates for 'normal' and 'extreme' (spike) periods. Furthermore, all model parameters can easily be estimated from historical data. The parameters are estimated for the markets in France, Germany, Scandinavia, and the United Kingdom, which exhibit the special characteristics of electricity prices to different extents.
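
A compressed simulation sketch of the ingredients named above (my own parameter choices and a simplified spike mechanism, purely illustrative and not the model of the paper): mean reversion with separate speeds for normal and spike periods, a jump component, and stochastic volatility.

```python
import numpy as np

rng = np.random.default_rng(6)
n, dt = 365, 1.0 / 365
mu = 50.0                                   # long-run price level (EUR/MWh)
kappa_norm, kappa_spike = 5.0, 80.0         # mean-reversion speeds: normal vs. spike regime
jump_prob, jump_size = 0.02, 60.0           # spike probability per day and mean spike size
kappa_v, theta_v, xi_v = 4.0, 0.04, 0.3     # stochastic variance parameters (CIR-type)

p, v = np.empty(n), np.empty(n)
p[0], v[0] = mu, theta_v
for t in range(1, n):
    spike = rng.random() < jump_prob
    kappa = kappa_spike if p[t - 1] > mu + 30 else kappa_norm   # faster reversion after spikes
    v[t] = np.abs(v[t - 1] + kappa_v * (theta_v - v[t - 1]) * dt
                  + xi_v * np.sqrt(v[t - 1] * dt) * rng.standard_normal())
    p[t] = (p[t - 1] + kappa * (mu - p[t - 1]) * dt
            + p[t - 1] * np.sqrt(v[t] * dt) * rng.standard_normal()
            + (jump_size * rng.exponential() if spike else 0.0))

print("mean price:", round(p.mean(), 2), " max (spike):", round(p.max(), 2))
```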

  

Extreme Value Importance Sampling for Rare events and risk measurement

by Prof. Don McLeish

We suggest practical and simple methods for the Monte Carlo estimation of tail expectations using importance sampling. We argue that a simple optimal choice of importance sampling distribution is a member of the generalized extreme value family and, unlike common alternatives such as the Esscher transform, this family achieves bounded relative error in the tail. Examples of simulating rare event probabilities and conditional tail expectations are given, and very large efficiency gains are achieved.
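
A small self-contained illustration of the idea (my own target distribution and tuning choices; the talk's optimality argument is not reproduced here): estimate a far-tail probability of a standard normal by sampling from a generalized extreme value proposal located near the threshold and reweighting with likelihood ratios.

```python
import numpy as np
from scipy.stats import norm, genextreme

t = 5.0                                     # rare-event threshold: P(X > 5) for X ~ N(0, 1)
n = 100_000
rng = np.random.default_rng(7)

# GEV proposal roughly centred at the threshold (shape and scale are illustrative tuning choices)
proposal = genextreme(c=-0.1, loc=t, scale=0.5)
x = proposal.rvs(size=n, random_state=rng)

weights = np.exp(norm.logpdf(x) - proposal.logpdf(x))
est = np.mean((x > t) * weights)
se = np.std((x > t) * weights, ddof=1) / np.sqrt(n)

print(f"IS estimate of P(X > {t}) : {est:.3e}  (std. error {se:.1e})")
print(f"exact value               : {norm.sf(t):.3e}")
print(f"naive MC would need ~{1 / norm.sf(t):,.0f} draws per expected hit")
```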

 

Quantile Crash Hedging

by Dr. Olaf Menkens

The worst-case scenario portfolio problem introduced by Korn and Wilmott (2002) will be considered in this talk. Although Korn and Wilmott assume that the probability of a crash occurring is unknown, this paper analyses how the worst-case scenario portfolio problem is affected if the probability of a crash is known. The result is that this additional information is not used in the worst-case scenario. This leads to a q-quantile approach (instead of a worst-case approach), which amounts to a Value-at-Risk approach in the optimal portfolio problem.

 

Estimating Frailty Models on Mortality Surfaces: Statistical Challenges and Insights

by Jun.-Prof. Trifon Missov

Fitting mortality models to single cohorts leads to doubtful parameter estimates, as standard frailty models disregard yearly improvements in age-specific mortality rates. One option for incorporating such information is to define a two-dimensional frailty model and design an appropriate procedure for its estimation on mortality surfaces. The talk aims at presenting the advantages and limitations of this approach, as well as at suggesting possible demographic interpretations of the results.

 

The Adjoint Method in Computational Finance

by Viktor Mosenkis

Sensitivities of objectives (e.g. prices of options) with respect to (say n) financial model parameters (e.g. discrete interest rates, dividends, implied volatilities) have traditionally been approximated by finite difference quotients ("bumping"). The computational cost of this approach amounts to O(n) function evaluations, which quickly results in infeasible run times (e.g. if one function evaluation comprises computationally expensive Monte Carlo simulations or if very fine spatial and temporal discretizations are required in the context of numerical methods for the underlying partial differential equations). The adjoint method allows gradients of scalar objectives to be computed with machine accuracy (no truncation error as in bumping) at the expense of O(1) (typically between 2 and 5) function evaluations, that is, at a computational cost that is independent of n. Greeks whose evaluation has traditionally been performed overnight can thus be obtained in a few minutes in practice.

This talk introduces adjoint algorithmic differentiation as the main enabling technique for adjoint methods in finance. We discuss the basic ideas and available tool support. Examples from finance are used to illustrate the power of the adjoint method.
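
A tiny hand-written example of the cost argument (purely illustrative, not from the talk): for a discount factor built from n forward rates, bumping needs n+1 evaluations of the O(n) forward pass, i.e. O(n^2) work, while a single reverse (adjoint) sweep delivers the full gradient at a small constant multiple of one evaluation.

```python
import numpy as np

def price(fwd, dt=0.25):
    """Toy 'pricing function': discount factor from n forward rates (forward pass, O(n))."""
    df = 1.0
    for f in fwd:
        df /= (1.0 + f * dt)
    return df

def price_adjoint(fwd, dt=0.25):
    """Forward sweep storing intermediates, then one reverse sweep: value and full gradient in O(n)."""
    factors = 1.0 + fwd * dt
    df = 1.0 / np.prod(factors)
    grad = -dt * df / factors      # d df / d fwd_i = -dt * df / (1 + fwd_i * dt)
    return df, grad

def price_bumped_gradient(fwd, h=1e-6, dt=0.25):
    """Finite differences ('bumping'): n+1 forward passes, O(n^2) total work."""
    base = price(fwd, dt)
    grad = np.empty_like(fwd)
    for i in range(len(fwd)):
        bumped = fwd.copy()
        bumped[i] += h
        grad[i] = (price(bumped, dt) - base) / h
    return grad

fwd = np.full(250, 0.03)
_, grad_adj = price_adjoint(fwd)
grad_fd = price_bumped_gradient(fwd)
print("max |adjoint - bumping| :", np.max(np.abs(grad_adj - grad_fd)))   # truncation error of bumping
```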

 

Generalized Quantiles as Risk Measures

by Prof. Alfred Müller

In the statistical literature many generalizations of quantiles have been considered so far. Some of these generalized quantiles are natural candidates for risk measures that could be of interest in actuarial science as well. In this paper we investigate in particular the case of M-quantiles. It turns out that they share many properties with risk measures and that the special case of a so-called expectile leads to an interesting coherent risk measure that is investigated here in detail. Expectiles can also be seen as an unusual special case of a zero utility premium principle with a non-differentiable utility function. Robustness properties of expectiles and their relation to classical quantiles are also considered.
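
To make the expectile concrete (a standard characterization, with a simple sample-based solver as illustration): the tau-expectile e_tau of X is the unique solution of tau E[(X − e)+] = (1 − tau) E[(e − X)+]; for tau = 1/2 it is the mean, and for tau above 1/2 it yields a coherent risk measure when applied to losses.

```python
import numpy as np
from scipy.optimize import brentq

def expectile(x, tau):
    """Sample tau-expectile: root of the asymmetric first-order condition (decreasing in e)."""
    foc = lambda e: tau * np.mean(np.maximum(x - e, 0)) - (1 - tau) * np.mean(np.maximum(e - x, 0))
    return brentq(foc, x.min(), x.max())

rng = np.random.default_rng(8)
losses = rng.standard_t(df=4, size=100_000)          # heavy-tailed loss sample (illustrative)

for tau in (0.5, 0.9, 0.99):
    print(f"tau = {tau:4}:  expectile = {expectile(losses, tau):6.3f}   "
          f"quantile = {np.quantile(losses, tau):6.3f}")
```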

 

Modelling the spot price of electricity taking into account fuel prices and renewables

by Jan Müller

Models for electricity spot prices are used for a variety of valuation issues, e.g. the pricing of (real) options and pricing on the retail market. These issues require adequate stochastic price models. A major requirement is the reproduction of the stylized facts of the spot price of electricity. Furthermore, the dependencies on other commodities need to be modelled as well. In the context of risk management of energy utilities, combined models of all relevant commodities gain importance. The complexity of options and of portfolios of energy utilities increases with the number of products that involve multiple commodities. This makes the use of combined models essential.
The spot price of electricity is set by the merit order principle. The most important drivers of the merit order curve on the EEX market are the grid load, the generation of renewables and the prices of coal, natural gas, oil and CO2 emission allowances. We present a model incorporating these influencing factors in a functional approximation of the non-deterministic merit order curve. Apart from the fundamental influencing factors, the model is capable of reproducing seasonalities, mean reversion, negative prices and price spikes as observed in historical spot prices.
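
A deliberately crude sketch of the functional idea (my own toy curve and numbers, not the model of the talk): the spot price is read off a merit order curve whose level is shifted by fuel and carbon prices and which is evaluated at the residual load, i.e. grid load minus renewable generation.

```python
import numpy as np

def merit_order_price(load, renewables, coal, gas, co2):
    """Toy merit order: residual load mapped through a convex supply curve shifted by fuel/CO2 costs."""
    residual = np.maximum(load - renewables, 0.0)          # GW served by conventional plants
    fuel_floor = 0.35 * coal + 0.45 * gas + 0.9 * co2      # stylized marginal-cost shift (EUR/MWh)
    return fuel_floor + 0.002 * residual**2                # steeply increasing when capacity is scarce

rng = np.random.default_rng(9)
hours = 24
load = 60 + 10 * np.sin(np.linspace(0, 2 * np.pi, hours))        # daily load profile (GW)
wind_solar = np.clip(rng.normal(15, 8, hours), 0, None)          # renewable infeed (GW)
price = merit_order_price(load, wind_solar, coal=90.0, gas=30.0, co2=8.0)
print("hourly prices (EUR/MWh):", np.round(price, 1))
```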

   

Measuring the model risk of contingent claims

by Jun.-Prof. Natalie Packham

We devise risk measures for the potential losses arising from holding positions that can be hedged only with model-dependent hedging strategies. In a complete and frictionless market, such a measure quantifies model risk, as any observed P&L on a perfectly hedged position is due to hedging in a misspecified model. Model uncertainty is expressed by a set of models, and by weighting the models according to market information such as calibration quality, we derive the distribution of losses from model risk. On this loss distribution, we define value-at-risk and expected shortfall for model risk, and show how these risk measures fit into an existing framework of axioms for model uncertainty. As the risk measures are compatible with risk measures for other risk types, such as market risk, they are suitable for devising appropriate capital charges against model risk. We provide examples that demonstrate the magnitude of model risk.
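
Schematically, and only as a toy illustration under my own assumptions rather than the paper's construction in detail: each candidate model produces a distribution of hedging P&L on the nominally hedged position; weighting the models by calibration quality gives a mixture loss distribution, on which value-at-risk and expected shortfall for model risk are read off.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 100_000

# Hedging P&L of the (nominally hedged) position under three candidate models (synthetic)
pnl_models = {
    "model A": rng.normal(0.0, 0.5, n),
    "model B": rng.normal(-0.2, 1.0, n),
    "model C": rng.normal(0.1, 2.0, n),
}
calib_rmse = {"model A": 0.8, "model B": 1.2, "model C": 2.5}   # calibration errors (assumed)

# Weight models by calibration quality, e.g. w_i proportional to exp(-lambda * RMSE_i)
lam = 1.0
w = np.array([np.exp(-lam * calib_rmse[m]) for m in pnl_models])
w /= w.sum()

# Mixture loss distribution: draw the model per scenario, then its P&L; losses are -P&L
idx = rng.choice(len(pnl_models), size=n, p=w)
pnl_matrix = np.column_stack(list(pnl_models.values()))
losses = -pnl_matrix[np.arange(n), idx]

alpha = 0.99
var_mr = np.quantile(losses, alpha)
es_mr = losses[losses >= var_mr].mean()
print("model weights:", dict(zip(pnl_models, np.round(w, 3))))
print(f"model-risk VaR {alpha:.0%}: {var_mr:.2f}   ES: {es_mr:.2f}")
```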

 

Flexible dependence modeling of operational risk losses and its impact on total capital requirements

by Prof. Sandra Paterlini

Operational risk data, when available, are usually scarce, heavy-tailed and possibly dependent. In this work, we introduce a model that captures such real-world characteristics and explicitly deals with heterogeneous pairwise and tail dependence of losses. By considering flexible families of copulas, we can easily move beyond modeling bivariate dependence among losses and estimate the total risk capital for the seven- and eight-dimensional distributions of event types and business lines. Using real-world data, we then evaluate the impact of realistic dependence modeling on estimating the total regulatory capital, which turns out to be, as often expected, up to 38% smaller than what the standard Basel approach would prescribe.
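
To illustrate why dependence assumptions move the total capital figure, here is a two-cell toy example with a Gaussian copula, far simpler than the seven- and eight-dimensional models of the paper and with all numbers assumed: compare the 99.9% VaR of the joint loss under a moderate correlation with the Basel-style sum of stand-alone VaRs, which implicitly assumes comonotonic cells.

```python
import numpy as np
from scipy.stats import norm, lognorm

rng = np.random.default_rng(11)
n, q = 500_000, 0.999
sev1 = lognorm(s=1.0, scale=np.exp(10))       # loss in cell 1 (event type / business line)
sev2 = lognorm(s=1.4, scale=np.exp(9))        # loss in cell 2, heavier tailed

# Gaussian copula with moderate correlation (assumed dependence estimate)
rho = 0.3
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u = norm.cdf(z)
loss = sev1.ppf(u[:, 0]) + sev2.ppf(u[:, 1])

var_joint = np.quantile(loss, q)
var_sum = sev1.ppf(q) + sev2.ppf(q)           # comonotonic / simple-sum benchmark
print(f"VaR 99.9% with Gaussian copula (rho=0.3): {var_joint:,.0f}")
print(f"sum of stand-alone VaRs (Basel-style)   : {var_sum:,.0f}")
print(f"diversification relief                  : {1 - var_joint / var_sum:.1%}")
```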

 

Correlations - The importance of being stochastic

by Prof. Luis Seco

Investors wish for correlations between assets to be low, and hope they stay low to ensure proper diversification of their investments. Most periods of financial crises can be traced to a correlation breakdown event, when correlations suddenly jump. The modelling of correlations, a mathematical construct, therefore remains an elusive object of desire in the practice and theory of investing. This talk will highlight the motivation and a variety of models developed with the intent of attacking this problem.

 

Risk and Computation

by Prof. Rüdiger Seydel

The simulation of portfolios and markets is an everyday task in the financial arena. This endeavor suffers from various uncertainties. This talk will focus on two kinds of risk that are mentioned rather rarely. First, we discuss the risk of computational errors. Occasionally, computer programs give faulty results, and the user does not notice it. More effort is required to balance the computational power with the computational needs, and to control the computational error reliably. The second type of risk is systemic risk. Financial markets can be considered as dynamical systems depending on parameters. When the parameters change, the equilibria may change their structure drastically. The critical threshold values of the parameters are the "bifurcation points," and the distance in parameter space to the next bifurcation measures the systemic risk. Computational methods have been developed, but the systemic modeling is still in its infancy.

 

How does the market variance risk premium vary over time? Evidence from S&P 500 variance swap investment returns

by Prof. George Skiadopoulos

We use a dataset of over-the-counter S&P 500 variance swap quotes to explore the time variation of market variance risk premia for different maturity contracts and for different investment horizons. We consider a number of variables that are appealing determinants of the variance risk premium both from a theoretical and an empirical perspective. We find that the market variance risk premium depends on the stock market, economic, and trading activity conditions in a countercyclical way. These relationships hold both in- and out-of-sample. Trading strategies which exploit these properties outperform a naive short volatility strategy even over the 2008 crisis period.
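
For reference, the object being tracked (standard definitions, up to sign convention; notation mine): with VS_{t,T} the variance swap rate quoted at t for maturity T, which under no-arbitrage equals the risk-neutral expected realized variance, the market variance risk premium over [t, T] is

```latex
\mathrm{VRP}_{t,T} \;=\; \mathbb{E}^{\mathbb{P}}_{t}\!\left[\mathrm{RV}_{t,T}\right] \;-\; VS_{t,T},
\qquad
VS_{t,T} \;=\; \mathbb{E}^{\mathbb{Q}}_{t}\!\left[\mathrm{RV}_{t,T}\right],
```

and the payoff of a long variance swap per unit notional is RV_{t,T} − VS_{t,T}, so realized swap investment returns provide a direct read on the premium.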

 

Bubbles and Crashes: A Financial System View

by Florian Stärk

Bubbles and crashes seem to occur with rising frequency in financial markets, in some cases even producing financial crises that endanger the stability of the whole financial system. As a by-product, they produce financial time series that are difficult to model with classical stochastic models and inconsistent with a classical efficient market hypothesis. In this talk, I want to give insight into the role of procyclical mechanisms of the current financial system and of monetary policy in the genesis of bubbles and crashes. I will also give a short overview of how new economic models of the financial system try to capture these instabilities and how regulators try to remedy at least some of them.

 

The supOU stochastic volatility model and its estimation

by Prof. Robert Stelzer

The use of stochastic volatility models is very popular in finance, as they allow one to capture most of the so-called stylised features of financial price series observed at financial markets. In particular, they model the occurrence of turbulent (volatile) phases (in times of "crises") followed by calm phases, where the typical price changes are much smaller, by specifying a latent stochastic process (the volatility) which describes the time-dynamic variance of the price process.
Usually, the empirical autocovariance function of the squared returns decays rather slowly, which is sometimes interpreted as long memory. The supOU stochastic volatility model is capable of producing power-law decays in the autocovariance function and long memory.
In the talk we first introduce superpositions of Ornstein-Uhlenbeck type (supOU) processes and the supOU stochastic volatility model. Thereafter, we discuss its properties and a concrete specification capable of exhibiting long memory. Finally, we turn in detail to the estimation of the parameters using the second-order moment structure of historical data.
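
As a rough numerical intuition for the superposition idea (a finite discrete mixture of OU-type components as a crude stand-in for the supOU integral; parameters and driving noise are illustrative, not the model specification of the talk): summing components whose mean-reversion rates are drawn from a distribution concentrated near zero produces a volatility autocorrelation that decays far more slowly than any single exponential.

```python
import numpy as np

rng = np.random.default_rng(12)
n_comp, n, dt = 500, 20_000, 0.01
lam = rng.gamma(shape=0.7, scale=1.0, size=n_comp)   # mean-reversion rates concentrated near zero
w = 1.0 / n_comp                                     # equal weights in the superposition

vol2 = np.zeros(n)
x = np.zeros(n_comp)
for t in range(n):
    # Each component: OU-type recursion driven by positive shocks whose intensity scales with its rate
    jumps = rng.exponential(0.1, n_comp) * (rng.random(n_comp) < lam * dt)
    x = x * np.exp(-lam * dt) + jumps
    vol2[t] = w * x.sum()                            # superposed (squared) volatility

def acf(series, max_lag):
    s = series - series.mean()
    return np.array([np.corrcoef(s[:-k], s[k:])[0, 1] for k in range(1, max_lag + 1)])

print("volatility autocorrelation at lags 100, 500, 2000:",
      np.round(acf(vol2, 2000)[[99, 499, 1999]], 3))
```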

 

A Petrov Galerkin projection for copula density estimation

by Dr. Dana Uhlig

The reconstruction of the dependence structure of two or more random variables (d >= 2) is a big issue in finance and many other applications. Looking at samples of the random vector, neither the joint distribution nor the copula itself is observable. So the identification of the copula or the copula density can be treated as an inverse problem. In the statistical literature, usually kernel estimators or penalized maximum likelihood estimators are considered for the non-parametric estimation of the copula density from given samples of the random vector. Even though the copula itself is unobservable, we can treat the empirical copula as a noisy representation, since it is well known that the empirical copula converges to the copula for large samples. Due to the fact that we have only noisy data instead of the copula, an appropriate regularization is needed. We present a Petrov-Galerkin projection for the numerical solution of an appropriate linear integral equation and discuss the assembling algorithm for the non-sparse matrices and vectors. Furthermore we analyze the stability of the discretized linear equation and discuss regularization methods.
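
A very reduced sketch of the data side and of the need for regularization (my own discretization; the Petrov-Galerkin construction with proper trial and test spaces is not reproduced here): the empirical copula evaluated on a grid serves as the noisy right-hand side of a linear equation C = K c, where K integrates a piecewise-constant density, and a Tikhonov-regularized least-squares solve recovers a density estimate.

```python
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(13)
n, m, alpha = 2000, 20, 1e-3

# Samples with a Gaussian dependence structure; pseudo-observations via ranks
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
u = rankdata(z[:, 0]) / (n + 1)
v = rankdata(z[:, 1]) / (n + 1)

# Empirical copula on an m x m grid: the noisy "observed" right-hand side
grid = np.arange(1, m + 1) / m
C_emp = np.array([[np.mean((u <= gi) & (v <= gj)) for gj in grid] for gi in grid])

# Discretized integration operator K: C(u_i, v_j) = sum of density over cells below-left, times 1/m^2
L = np.tril(np.ones((m, m)))
K = np.kron(L, L) / m**2

# Tikhonov-regularized least squares for the piecewise-constant copula density
b = C_emp.ravel()
c_hat = np.linalg.solve(K.T @ K + alpha * np.eye(m * m), K.T @ b).reshape(m, m)
print("estimated density at the four corner cells")
print("(diagonal corners above 1, off-diagonal corners below 1 for positive dependence):")
print(np.round([[c_hat[0, 0], c_hat[0, -1]], [c_hat[-1, 0], c_hat[-1, -1]]], 2))
```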

  

Robustness study of the hedging of European claims

by Prof. Michèle Vanmaele

Models admitting jumps seem to fit realistic asset prices more accurately than continuous models. Although it is appropriate to take jumps into account, they are not easy to handle. Especially processes with infinite activity complicate simulations. Asmussen and Rosinski (2001) launched the idea of approximating a Lévy process by replacing the jump part consisting of the small jumps (with size smaller than ε < 1) by a scaled Brownian motion. The scale is given by the standard deviation of the small jumps. For ε tending to zero, this approximation clearly converges in distribution to the original Lévy process. The question arises how this approximation influences the price and hedging strategies of European options on assets modeled by exponential Lévy processes. In other words, we question the robustness of the hedging strategies towards the choice of the model and we derive an estimation of the model risk.
For the pricing we apply the Fourier approach as described in Eberlein et al. (2010). Because of the presence of jumps, the market is incomplete and there exist many martingale measures. We focus on the Esscher transform, the minimal entropy martingale measure and the minimal martingale measure. For the quadratic hedging we consider a martingale and a semimartingale setting as in Hubalek et al. (2006). In this talk we will show the convergence of the option price, the delta and the quadratic hedging strategies for ε tending to zero. Thus, it is justified to use the approximation and to facilitate numerical experiments with it. We will also present convergence rates.
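
For reference, the approximation in question in standard notation (written here for a Lévy process without its own Gaussian component; b_ε collects the drift and the compensator of the removed small jumps, and B is an independent Brownian motion): with ν the Lévy measure, the small jumps below ε are replaced by a Brownian motion scaled by their standard deviation,

```latex
\sigma^2(\varepsilon) \;=\; \int_{|x| < \varepsilon} x^{2}\, \nu(dx),
\qquad
X^{\varepsilon}_t \;=\; b_\varepsilon\, t \;+\; \sigma(\varepsilon)\, B_t \;+\; \sum_{s \le t} \Delta X_s\, \mathbf{1}_{\{|\Delta X_s| \ge \varepsilon\}},
```

and the robustness question of the talk is how prices and quadratic hedging strategies computed in X^ε behave as ε tends to zero.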

 

Capital projection for counterparty credit risk

by Prof. Ralf Werner

Nowadays, risk capital has become a scarce resource in the banking business. This is especially true for uncollateralized OTC derivatives which are not cleared via a central counterparty. In this case it is important to understand the future capital need for these types of transactions, both in terms of regulatory and economic capital. For this purpose, we present a methodology for efficiently projecting regulatory and economic capital over the lifetime of a transaction. It is shown how this can be exploited for risk management and RoRaC optimization.

 

Optimal dual martingales and new algorithms for Bermudan products

by Dr. Jianing Zhang

In this talk we introduce and study the concept of optimal and surely optimal dual martingales in the context of dual valuation of Bermudan options, and outline the development of new algorithms in this context. We provide a characterization theorem, a theorem which gives conditions for a martingale to be surely optimal, and a stability theorem concerning martingales which are close to being surely optimal in a certain sense. Guided by these results we develop a framework of backward algorithms for constructing such a martingale which can be utilized for computing an upper bound of the Bermudan product. The methodology is purely dual in the sense that it does not require input approximations to the Snell envelope. In an Itô-Lévy environment we outline a particular regression-based backward algorithm which allows for computing dual upper bounds without nested Monte Carlo simulation. Moreover, as a by-product this algorithm also provides approximations to the continuation values of the product, which in turn determine a stopping policy. We hence obtain lower bounds at the same time. We supplement our presentation with several benchmark numerical experiments. This is a joint work with John Schoenmakers (Weierstrass Institute Berlin) and Junbo Huang (Galaxy Asset Management).
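
The dual valuation these algorithms build on is the additive duality of Rogers (2002) and Haugh and Kogan (2004), recalled here for context: with Z_{t_j} the discounted exercise payoff at the Bermudan dates and M ranging over martingales started at zero, the price satisfies

```latex
Y_0 \;=\; \sup_{\tau} \mathbb{E}\bigl[ Z_\tau \bigr]
\;=\; \inf_{M:\,M_0 = 0} \mathbb{E}\Bigl[ \max_{j}\bigl( Z_{t_j} - M_{t_j} \bigr) \Bigr],
```

and any martingale attaining the infimum is an optimal dual martingale; evaluating the right-hand side along simulated paths for a candidate M yields the upper bound referred to in the abstract.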

  

A hot-potato game under transient price impact and some effects of a transaction tax

by Tao Zhang

Building on observations by Schöneborn (2008), we consider a Nash equilibrium between two high-frequency traders in a simple market impact model with transient price impact and additional quadratic transaction costs. We show that for small transaction costs the high-frequency traders engage in a "hot-potato game", in which the same asset position is sold back and forth. We then identify a critical value for the size of the transaction costs above which all oscillations disappear and strategies become buy-only or sell-only. Numerical simulations show that for both traders the expected costs can be lower with transaction costs than without. Moreover, the costs can increase with the trading frequency when there are no transaction costs, but decrease with the trading frequency when transaction costs are sufficiently high. We argue that these effects occur due to the need for protection against predatory trading in the regime of low transaction costs.