Unnamed: 0 | title | category | summary | theme
---|---|---|---|---
4,200 |
A Stochastic Processes Toolkit for Risk Management
|
q-fin.RM
|
In risk management it is desirable to grasp the essential statistical
features of a time series representing a risk factor. This tutorial aims to
introduce a number of different stochastic processes that can help in grasping
the essential features of risk factors describing different asset classes or
behaviors. This paper does not aim to be exhaustive, but gives examples and
a feeling for practically implementable models allowing for stylised features
in the data. The reader may also use these models as building blocks to build
more complex models, although for a number of risk management applications the
models developed here suffice for the first step in the quantitative analysis.
The broad qualitative features addressed here are fat tails and mean
reversion. We give some orientation on the initial choice of a suitable
stochastic process and then explain how the process parameters can be estimated
based on historical data. Once the process has been calibrated, typically
through maximum likelihood estimation, one may simulate the risk factor and
build future scenarios for the risky portfolio. On the terminal simulated
distribution of the portfolio one may then single out several risk measures,
although here we focus on the stochastic processes estimation preceding the
simulation of the risk factors. Finally, this first survey report focuses on
single time series. Correlation or more generally dependence across risk
factors, leading to multivariate processes modeling, will be addressed in
future work.
|
finance
|
4,201 |
Partial Equilibria with Convex Capital Requirements: Existence, Uniqueness and Stability
|
q-fin.RM
|
In an incomplete semimartingale model of a financial market, we consider
several risk-averse financial agents who negotiate the price of a bundle of
contingent claims. Assuming that the agents' risk preferences are modelled by
convex capital requirements, we define and analyze their demand functions and
propose a notion of a partial equilibrium price. In addition to sufficient
conditions for the existence and uniqueness, we also show that the equilibrium
prices are stable with respect to misspecifications of agents' risk
preferences.
|
finance
|
4,202 |
Monitoring dates of maximal risk
|
q-fin.RM
|
Monitoring means to observe a system for any changes which may occur over
time, using a monitor or measuring device of some sort. In this paper we
formulate a problem of monitoring dates of maximal risk of a financial
position. Thus, the systems we are going to observe arise from situations in
finance. The measuring device we are going to use is a time-consistent measure
of risk.
In the first part of the paper we discuss the numerical representation of
conditional convex risk measures which are defined in a space $L^p(\mathcal{F},\mathbb{R})$ and take
values in $L^1(\mathcal{G},\mathbb{R})$. This will allow us to consider time-consistent convex risk
measures in $L^1(\mathbb{R})$.
In the second part of the paper we use a time-consistent convex risk measure
in order to define an abstract problem of monitoring stopping times of maximal
risk. The penalty function involved in the robust representation qualitatively
changes the time when maximal risk is identified for the first time, a
phenomenon which we discuss from the point of view of robust statistics.
|
finance
|
4,203 |
The Structural Modelling of Operational Risk via Bayesian inference: Combining Loss Data with Expert Opinions
|
q-fin.RM
|
To meet the Basel II regulatory requirements for the Advanced Measurement
Approaches, the bank's internal model must include the use of internal data,
relevant external data, scenario analysis and factors reflecting the business
environment and internal control systems. Quantification of operational risk
cannot be based only on historical data but should involve scenario analysis.
Historical internal operational risk loss data have limited ability to predict
future behaviour; moreover, banks do not have enough internal data to estimate
low frequency high impact events adequately. Historical external data are
difficult to use due to different volumes and other factors. In addition,
internal and external data have a survival bias, since typically one does not
have data of all collapsed companies. The idea of scenario analysis is to
estimate frequency and severity of risk events via expert opinions taking into
account bank environment factors with reference to events that have occurred
(or may have occurred) in other banks. Scenario analysis is forward looking and
can reflect changes in the banking environment. It is important to not only
quantify the operational risk capital but also provide incentives to business
units to improve their risk management policies, which can be accomplished
through scenario analysis. By itself, scenario analysis is very subjective but
combined with loss data it is a powerful tool to estimate operational risk
losses. Bayesian inference is a statistical technique well suited for combining
expert opinions and historical data. In this paper, we present examples of the
Bayesian inference methods for operational risk quantification.
|
finance
|
4,204 |
Estimation of Operational Risk Capital Charge under Parameter Uncertainty
|
q-fin.RM
|
Many banks adopt the Loss Distribution Approach to quantify the operational
risk capital charge under Basel II requirements. It is common practice to
estimate the capital charge using the 0.999 quantile of the annual loss
distribution, calculated using point estimators of the frequency and severity
distribution parameters. The uncertainty of the parameter estimates is
typically ignored. One of the unpleasant consequences for the banks accounting
for parameter uncertainty is an increase in the capital requirement. This paper
demonstrates how the parameter uncertainty can be taken into account using a
Bayesian framework that also allows for incorporation of expert opinions and
external data into the estimation procedure.
|
finance
|
4,205 |
A "Toy" Model for Operational Risk Quantification using Credibility Theory
|
q-fin.RM
|
To meet the Basel II regulatory requirements for the Advanced Measurement
Approaches in operational risk, the bank's internal model should make use of
the internal data, relevant external data, scenario analysis and factors
reflecting the business environment and internal control systems. One of the
unresolved challenges in operational risk is how to combine these data sources
appropriately. In this paper we focus on quantification of the low frequency
high impact losses exceeding some high threshold. We suggest a full credibility
theory approach to estimate frequency and severity distributions of these
losses by taking into account bank internal data, expert opinions and industry
data.
|
finance
|
4,206 |
Implementing Loss Distribution Approach for Operational Risk
|
q-fin.RM
|
To quantify the operational risk capital charge under the current regulatory
framework for banking supervision, referred to as Basel II, many banks adopt
the Loss Distribution Approach. There are many modeling issues that should be
resolved to use the approach in practice. In this paper we review the
quantitative methods suggested in the literature for implementation of the
approach. In particular, we discuss the use of Bayesian inference, which
allows expert judgement and parameter uncertainty to be taken into account,
the modeling of dependence, and the inclusion of insurance.
|
finance
|
4,207 |
Collective firm bankruptcies and phase transition in rating dynamics
|
q-fin.RM
|
We present a simple model of firm rating evolution. We consider two sources
of defaults: individual dynamics of economic development and Potts-like
interactions between firms. We show that the model so defined leads to a phase
transition, which results in collective defaults. The existence of the
collective phase depends on the mean interaction strength. For small
interaction strength parameters, there are many independent bankruptcies of
individual companies. For large parameters, there are giant collective defaults
of firm clusters. In the case when the individual firm dynamics favors damping
of rating changes, there is an optimal strength of the firms' interactions from
the systemic risk point of view.
|
finance
|
4,208 |
Conditional Value-at-Risk Constraint and Loss Aversion Utility Functions
|
q-fin.RM
|
We provide an economic interpretation of the practice of incorporating
risk measures as constraints in a classic expected return
maximization problem. For what we call the infimum of expectations class of
risk measures, we show that if the decision maker (DM) maximizes the
expectation of a random return under constraint that the risk measure is
bounded above, he then behaves as a "generalized expected utility maximizer"
in the following sense. The DM exhibits ambiguity with respect to a family of
utility functions defined on a larger set of decisions than the original one;
he adopts pessimism and performs first a minimization of expected utility over
this family, then performs a maximization over a new decision set. This
economic behaviour, called "Maxmin under risk", was studied by Maccheroni
(2002). This economic interpretation allows us to exhibit a loss aversion
factor when the risk measure is the Conditional Value-at-Risk.
|
finance
|
4,209 |
A Bayesian Networks Approach to Operational Risk
|
q-fin.RM
|
A system for Operational Risk management based on the computational paradigm
of Bayesian Networks is presented. The algorithm allows the construction of a
Bayesian Network targeted for each bank using only internal loss data, and
takes into account in a simple and realistic way the correlations among
different processes of the bank. The internal losses are averaged over a
variable time horizon, so that the correlations at different times are removed,
while the correlations at the same time are kept: the averaged losses are thus
suitable to perform the learning of the network topology and parameters. The
algorithm has been validated on synthetic time series. It should be stressed
that the practical implementation of the proposed algorithm has a small impact
on the organizational structure of a bank and requires an investment in human
resources limited to the computational area.
|
finance
|
4,210 |
Preferences Yielding the "Precautionary Effect"
|
q-fin.RM
|
Consider an agent taking two successive decisions to maximize his expected
utility under uncertainty. After his first decision, a signal is revealed that
provides information about the state of nature. The observation of the signal
allows the decision-maker to revise his prior and the second decision is taken
accordingly. Assuming that the first decision is a scalar representing
consumption, the precautionary effect holds when initial consumption is
less in the prospect of future information than without (no signal).
Epstein (1980) has provided the most operative tool to exhibit the
precautionary effect. Epstein's Theorem
holds true when the difference of two convex functions is either convex or
concave, which is not a straightforward property, and which is difficult to
connect to the primitives of the economic model. Our main contribution consists
in giving a geometric characterization of when the difference of two convex
functions is convex, then in relating this to the primitive utility model. With
this tool, we are able to study and unite a large body of the literature on the
precautionary effect.
|
finance
|
4,211 |
Les Générateurs de Scénarios Économiques : quelle utilisation en assurance?
|
q-fin.RM
|
In this paper, we present the principal components of an economic scenario
generator (ESG), both for the theoretical design and for practical
implementation. The choice of these components should be linked to the ultimate
vocation of the economic scenario generator, which can be either a tool for
pricing financial products or a tool for projection and risk management. We
then develop a study on some performance indicators of the ESG as an
input for the decision-making process, namely indicators of stability and
absence of bias. Finally, a numerical application illustrates the main ideas of
the paper.
|
finance
|
4,212 |
Allocation d'actifs selon le critère de maximisation des fonds propres économiques en assurance non-vie
|
q-fin.RM
|
The economic equity maximization criterion (MFPE) leads to the choice of the
financial portfolio that maximizes the ratio of the expected value of the
insurance company to its capital. This criterion is presented in the framework
of a non-life insurance company and is applied both under French legislation
and in a regulatory context inspired by the work in progress on the European
Solvency 2 project. Under French regulation, the required solvency margin does
not depend on the asset allocation. The situation is quite different in the
Solvency 2 framework, because the target capital has to control the global
risk of the company, and financial risk is part of this global risk. The
economic equity maximization criterion thus leads to searching for an asset
allocation / equity pair that solves a stochastic program. A numerical
illustration makes it possible to analyze the consequences of introducing a
Solvency 2 framework on the technical reserves and the equity of a non-life
insurance company, and on the optimal allocation implied by the criterion.
Finally, the impact of a misspecification of the risky asset model on the
optimal allocation is illustrated.
|
finance
|
4,213 |
Mesure des risques de marché et de souscription vie en situation d'information incomplète pour un portefeuille de prévoyance
|
q-fin.RM
|
In the framework of the new Embedded Value standards, namely the MCEV norms,
the latest principles published in June 2008 address the measurement of market
and underwriting risks by using stochastic models for projection and
valuation. Since stochastic models are particularly data-consuming, the
question that arises is how to treat insurance portfolios for which only
aggregate data are available, i.e., portfolios in a situation of incomplete
information. The aim of this article is to propose a pragmatic model of these
risks, tied to the death covers of individual protection products, in such
situations.
|
finance
|
4,214 |
Optimal Reversible Annuities to Minimize the Probability of Lifetime Ruin
|
q-fin.RM
|
We find the minimum probability of lifetime ruin of an investor who can
invest in a market with a risky and a riskless asset and who can purchase a
reversible life annuity. The surrender charge of a life annuity is a proportion
of its value. Ruin occurs when the total of the value of the risky and riskless
assets and the surrender value of the life annuity reaches zero. We find the
optimal investment strategy and optimal annuity purchase and surrender
strategies in two situations: (i) the value of the risky and riskless assets is
allowed to be negative, with the imputed surrender value of the life annuity
keeping the total positive; or (ii) the value of the risky and riskless assets
is required to be non-negative. In the first case, although the individual has
the flexibility to buy or sell at any time, we find that the individual will not
buy a life annuity unless she can cover all her consumption via the annuity and
she will never sell her annuity. In the second case, the individual surrenders
just enough annuity income to keep her total assets positive. However, in this
second case, the individual's annuity purchasing strategy depends on the size
of the proportional surrender charge. When the charge is large enough, the
individual will not buy a life annuity unless she can cover all her
consumption, the so-called safe level. When the charge is small enough, the
individual will buy a life annuity at a wealth lower than this safe level.
|
finance
|
4,215 |
The two defaults scenario for stressing credit portfolio loss distributions
|
q-fin.RM
|
The impact of a stress scenario of default events on the loss distribution of
a credit portfolio can be assessed by determining the loss distribution
conditional on these events. While it is conceptually easy to estimate loss
distributions conditional on default events by means of Monte Carlo simulation,
it becomes impractical for two or more simultaneous defaults as then the
conditioning event is extremely rare. We provide an analytical approach to the
calculation of the conditional loss distribution for the CreditRisk+ portfolio
model with independent random loss given default distributions. The analytical
solution for this case can be used to check the accuracy of an approximation to
the conditional loss distribution whereby the unconditional model is run with
stressed input probabilities of default (PDs). It turns out that this
approximation is unbiased. Numerical examples, however, suggest that the
approximation may be seriously inaccurate but that the inaccuracy leads to
overestimation of tail losses and hence the approach errs on the conservative
side.
|
finance
|
4,216 |
Tracking errors from discrete hedging in exponential Lévy models
|
q-fin.RM
|
We analyze the errors arising from discrete readjustment of the hedging
portfolio when hedging options in exponential Lévy models, and establish the
rate at which the expected squared error goes to zero when the readjustment
frequency increases. We compare the quadratic hedging strategy with the common
market practice of delta hedging, and show that for discontinuous option
pay-offs the latter strategy may suffer from very large discretization errors.
For options with discontinuous pay-offs, the convergence rate depends on the
underlying Lévy process, and we give an explicit relation between the rate and
the Blumenthal-Getoor index of the process.
|
finance
|
4,217 |
Multivariate heavy-tailed models for Value-at-Risk estimation
|
q-fin.RM
|
For purposes of Value-at-Risk estimation, we consider several multivariate
families of heavy-tailed distributions, which can be seen as multidimensional
versions of Paretian stable and Student's t distributions allowing different
marginals to have different tail thickness. After a discussion of relevant
estimation and simulation issues, we conduct a backtesting study on a set of
portfolios containing derivative instruments, using historical US stock price
data.
|
finance
|
4,218 |
Recent progress in random metric theory and its applications to conditional risk measures
|
q-fin.RM
|
The purpose of this paper is to give a selective survey on recent progress in
random metric theory and its applications to conditional risk measures. This
paper includes eight sections. Section 1 is a longer introduction, which gives
a brief introduction to random metric theory, risk measures and conditional
risk measures. Section 2 gives the central framework in random metric theory,
topological structures, important examples, the notions of a random conjugate
space and the Hahn-Banach theorems for random linear functionals. Section 3
gives several important representation theorems for random conjugate spaces.
Section 4 gives characterizations for a complete random normed module to be
random reflexive. Section 5 gives hyperplane separation theorems currently
available in random locally convex modules. Section 6 gives the theory of
random duality with respect to the locally $L^{0}-$convex topology and in
particular a characterization for a locally $L^{0}-$convex module to be
$L^{0}-$pre$-$barreled. Section 7 gives some basic results on $L^{0}-$convex
analysis together with some applications to conditional risk measures. Finally,
Section 8 is devoted to extensions of conditional convex risk measures, which
shows that every representable $L^{\infty}-$type of conditional convex risk
measure and every continuous $L^{p}-$type of convex conditional risk measure
($1\leq p<+\infty$) can be extended to an $L^{\infty}_{\cal F}({\cal E})-$type
of $\sigma_{\epsilon,\lambda}(L^{\infty}_{\cal F}({\cal E}), L^{1}_{\cal
F}({\cal E}))-$lower semicontinuous conditional convex risk measure and an
$L^{p}_{\cal F}({\cal E})-$type of ${\cal T}_{\epsilon,\lambda}-$continuous
conditional convex risk measure ($1\leq p<+\infty$), respectively.
|
finance
|
4,219 |
A Loan Portfolio Model Subject to Random Liabilities and Systemic Jump Risk
|
q-fin.RM
|
We extend the Vasiček loan portfolio model to a setting where liabilities
fluctuate randomly and asset values may be subject to systemic jump risk. We
derive the probability distribution of the percentage loss of a uniform
portfolio and analyze its properties. We find that the impact of liability risk
is ambiguous and depends on the correlation between the continuous aggregate
factor and the asset-liability ratio as well as on the default intensity. We
also find that systemic jump risk has a significant impact on the upper
percentiles of the loss distribution and, therefore, on both the VaR measure
and the expected shortfall.
|
finance
|
4,220 |
Alarm System for Insurance Companies: A Strategy for Capital Allocation
|
q-fin.RM
|
One possible way of risk management for an insurance company is to develop an
early and appropriate alarm system before the possible ruin. The ruin is
defined through the status of the aggregate risk process, which in turn is
determined by premium accumulation as well as claim settlement outgo for the
insurance company. The main purpose of this work is to design an effective
alarm system, i.e. to define alarm times and to recommend augmentation of
capital of suitable magnitude at those points to prevent or reduce the chance
of ruin. To measure the effectiveness of an alarm system fairly, a comparison is
drawn between an alarm system, with capital being added at the sound of every
alarm, and the corresponding system without any alarm but with an equivalently
higher initial capital. Analytical results are obtained in a general setup and
are backed up by simulated performances with various types of loss severity
distributions. This provides a strategy for suitably spreading out the capital
while addressing survivability concerns at a satisfactory level.
|
finance
|
4,221 |
A Dynamical Model for Forecasting Operational Losses
|
q-fin.RM
|
A novel dynamical model for the study of operational risk in banks and
suitable for the calculation of the Value at Risk (VaR) is proposed. The
equation of motion takes into account the interactions among a bank's different
processes, the spontaneous generation of losses via a noise term and the
efforts made by the bank to avoid their occurrence. Since the model is very
general, it can be tailored to the internal organizational structure of a
specific bank by estimating some of its parameters from historical operational
losses. The model is solved exactly in the case in which there are no causal
loops in the matrix of couplings, and it is shown how the solution can also be
exploited to estimate the parameters of the noise. The forecasting power
of the model is investigated by using a fraction $f$ of simulated data to
estimate the parameters, showing that for $f = 0.75$ the VaR can be forecast
with an error $\simeq 10^{-3}$.
|
finance
|
4,222 |
Target market risk evaluation
|
q-fin.RM
|
After the shocking series of bankruptcies that started in 2008, the public no
longer trusts the classical methods of assessing business risks. The severe
global economic downturn caused demand for both developed and emerging
economies' exports to drop, and the crisis became truly global. However, the
current crisis offers opportunities for those companies able to play their
cards well. Entering new markets has always been a hazardous entrepreneurial
attempt, but also a rewarding one in the case of success. The paper presents a
new indicator meant for assessing the prospects of success or failure for a
company trying to enter a new market using an associative strategy. In order
to make the right decision concerning the optimal market entry strategy,
marketers may use a software application, "AnBilant", created by a research
team from Hyperion University.
|
finance
|
4,223 |
Liquidity-adjusted Market Risk Measures with Stochastic Holding Period
|
q-fin.RM
|
Within the context of risk integration, we introduce in risk measurement
stochastic holding period (SHP) models. This is done in order to obtain a
`liquidity-adjusted risk measure' characterized by the absence of a fixed time
horizon. The underlying assumption is that - due to changes in market liquidity
conditions - one operates along an `operational time' to which the P&L process
of liquidating a market portfolio is referred. This framework leads to a
mixture of distributions for the portfolio returns, potentially allowing for
skewness, heavy tails and extreme scenarios. We analyze the impact of possible
distributional choices for the SHP. In a multivariate setting, we hint at the
possible introduction of dependent SHP processes, which potentially lead to
nonlinear dependence among the P&L processes and therefore to tail dependence
across assets in the portfolio, although this may require drastic choices on
the SHP distributions. We also find that increasing dependence as measured by
Kendall's tau through common SHP's appears to be unfeasible. We finally discuss
potential developments following future availability of market data.
|
finance
|
4,224 |
Capital allocation for credit portfolios under normal and stressed market conditions
|
q-fin.RM
|
If the probability of default parameters (PDs) fed as input into a credit
portfolio model are estimated as through-the-cycle (TTC) PDs, stressed market
conditions have little impact on the results of the capital calculations
conducted with the model. At first glance, this is totally different if the PDs
are estimated as point-in-time (PIT) PDs. However, it can be argued that the
reflection of stressed market conditions in input PDs should correspond to the
use of reduced correlation parameters or even the removal of correlations in
the model. Additionally, the confidence levels applied for the capital
calculations might be made reflective of the changing market conditions. We
investigate the interplay of PIT PDs, correlations, and confidence levels in a
credit portfolio model in more detail and analyse possible designs of
capital-levelling policies. Our findings may be of interest to banks that want to
combine their approaches to capital measurement and allocation with active
portfolio management that, by its nature, needs to be reflective of current
market conditions.
|
finance
|
4,225 |
Quantile hedging for basket derivatives
|
q-fin.RM
|
The problem of quantile hedging for basket derivatives in the Black-Scholes
model with correlation is considered. Explicit formulas for the probability
maximizing function and the cost reduction function are derived. Applicability
of the results to such widely traded derivatives as digital, quanto,
outperformance and spread options is shown.
|
finance
|
4,226 |
Markov chain Monte Carlo estimation of default and recovery: dependent via the latent systematic factor
|
q-fin.RM
|
It is a well known fact that recovery rates tend to go down when the number
of defaults goes up in economic downturns. We demonstrate how the loss given
default model with the default and recovery dependent via the latent systematic
risk factor can be estimated using Bayesian inference methodology and Markov
chain Monte Carlo method. This approach is very convenient for joint estimation
of all model parameters and latent systematic factors. Moreover, all relevant
uncertainties are easily quantified. The typically available data are annual
averages of defaults and recoveries, and thus the datasets are small and
parameter uncertainty is significant. In this case the Bayesian approach is
superior to the maximum likelihood method, which relies on a large-sample
Gaussian approximation for the parameter uncertainty. As an example, we
consider a homogeneous portfolio with one latent factor. However, the approach
can be easily extended to deal with non-homogenous portfolios and several
latent factors.
|
finance
|
4,227 |
Set-valued risk measures for conical market models
|
q-fin.RM
|
Set-valued risk measures on $L^p_d$ with $0 \leq p \leq \infty$ for conical
market models are defined, and primal and dual representation results are given.
The collections of initial endowments which allow one to super-hedge a multivariate
claim are shown to form the values of a set-valued sublinear (coherent) risk
measure. Scalar risk measures with multiple eligible assets also turn out to be
a special case within the set-valued framework.
|
finance
|
4,228 |
An Active Margin System and its Application in Chinese Margin Lending Market
|
q-fin.RM
|
In order to protect brokers from customer defaults in a volatile market, an
active margin system is proposed for the transactions of margin lending in
China. The probability of negative return under the condition that collaterals
are liquidated in a falling market is used to measure the risk associated with
margin loans, and a recursive algorithm is proposed to calculate this
probability under a Markov chain model. The optimal maintenance margin ratio
can be given under the constraint of the proposed risk measurement for a
specified amount of initial margin. An example of such a margin system is
constructed and applied to $26,800$ margin loans of 134 stocks traded on the
Shanghai Stock Exchange. The empirical results indicate that the proposed
method is an operational method for brokers to set margin system with a clearly
specified target of risk control.
|
finance
|
4,229 |
Dependence of defaults and recoveries in structural credit risk models
|
q-fin.RM
|
The current research on credit risk is primarily focused on modeling default
probabilities. Recovery rates are often treated as an afterthought; they are
modeled independently; in many cases they are even assumed constant. This is
despite their pronounced effect on the tail of the loss distribution. Here,
we take a step back, historically, and start again from the Merton model, where
defaults and recoveries are both determined by an underlying process. Hence,
they are intrinsically connected. For the diffusion process, we can derive the
functional relation between expected recovery rate and default probability.
This relation depends on a single parameter only. In Monte Carlo simulations we
find that the same functional dependence also holds for jump-diffusion and
GARCH processes. We discuss how to incorporate this structural recovery rate
into reduced form models, in order to restore essential structural information
which is usually neglected in the reduced-form approach.
|
finance
|
4,230 |
A Random Matrix Approach to Credit Risk
|
q-fin.RM
|
We estimate generic statistical properties of a structural credit risk model
by considering an ensemble of correlation matrices. This ensemble is set up by
Random Matrix Theory. We demonstrate analytically that the presence of
correlations severely limits the effect of diversification in a credit
portfolio if the correlations are not identically zero. The existence of
correlations alters the tails of the loss distribution considerably, even if
their average is zero. Under the assumption of randomly fluctuating
correlations, a lower bound for the estimation of the loss distribution is
provided.
|
finance
|
4,231 |
Portfolio Insurance under a risk-measure constraint
|
q-fin.RM
|
We study the problem of portfolio insurance from the point of view of a fund
manager, who guarantees to the investor that the portfolio value at maturity
will be above a fixed threshold. If, at maturity, the portfolio value is below
the guaranteed level, a third party will refund the investor up to the
guarantee. In exchange for this protection, the third party imposes a limit on
the risk exposure of the fund manager, in the form of a convex monetary risk
measure. The fund manager therefore tries to maximize the investor's utility
function subject to the risk measure constraint. We give a full solution to this
nonconvex optimization problem in the complete market setting and show in
particular that the choice of the risk measure is crucial for the optimal
portfolio to exist. Explicit results are provided for the entropic risk measure
(for which the optimal portfolio always exists) and for the class of spectral
risk measures (for which the optimal portfolio may fail to exist in some
cases).
|
finance
|
4,232 |
Calibration of structural and reduced-form recovery models
|
q-fin.RM
|
In recent years research on credit risk modelling has mainly focused on
default probabilities. Recovery rates are usually modelled independently, quite
often they are even assumed constant. Then, however, the structural connection
between recovery rates and default probabilities is lost and the tails of the
loss distribution can be underestimated considerably. The problem of
underestimating tail losses becomes even more severe, when calibration issues
are taken into account. To demonstrate this we choose a Merton-type structural
model as our reference system. Diffusion and jump-diffusion are considered as
underlying processes. We run Monte Carlo simulations of this model and
calibrate different recovery models to the simulation data. For simplicity, we
take the default probabilities directly from the simulation data. We compare a
reduced-form model for recoveries with a constant recovery approach. In
addition, we consider a functional dependence between recovery rates and
default probabilities. This dependence can be derived analytically for the
diffusion case. We find that the constant recovery approach drastically and
systematically underestimates the tail of the loss distribution. The
reduced-form recovery model shows better results, when all simulation data is
used for calibration. However, if we restrict the simulation data used for
calibration, the results for the reduced-form model deteriorate. We find the
most reliable and stable results, when we make use of the functional dependence
between recovery rates and default probabilities.
|
finance
|
4,233 |
The dynamics of financial stability in complex networks
|
q-fin.RM
|
We address the problem of banking system resilience by applying
off-equilibrium statistical physics to a system of particles, representing the
economic agents, modelled according to the theoretical foundation of the
current banking regulation, the so-called Merton-Vasicek model. Economic agents
are attracted to each other to exchange `economic energy', forming a network of
trades. When the capital level of one economic agent drops below a minimum, the
economic agent becomes insolvent. The insolvency of one single economic agent
affects the economic energy of all its neighbours which thus become susceptible
to insolvency, being able to trigger a chain of insolvencies (avalanche). We
show that the distribution of avalanche sizes follows a power-law whose
exponent depends on the minimum capital level. Furthermore, we present evidence
that under an increase in the minimum capital level, large crashes will be
avoided only if one assumes that agents will accept a drop in business levels,
while keeping their trading attitudes and policies unchanged. The alternative
assumption, that agents will try to restore their business levels, may lead to
the unexpected consequence that large crises occur with higher probability.
|
finance
|
4,234 |
Spectral Risk Measures: Properties and Limitations
|
q-fin.RM
|
Spectral risk measures (SRMs) are risk measures that take account of user
risk aversion, but to date there has been little guidance on the choice of
utility function underlying them. This paper addresses this issue by examining
alternative approaches based on exponential and power utility functions. A
number of problems are identified with both types of spectral risk measure. The
general lesson is that users of spectral risk measures must be careful to
select utility functions that fit the features of the particular problems they
are dealing with, and should be especially careful when using power SRMs.
|
finance
|
4,235 |
Extreme Measures of Agricultural Financial Risk
|
q-fin.RM
|
Risk is an inherent feature of agricultural production and marketing and
accurate measurement of it helps inform more efficient use of resources. This
paper examines three tail quantile-based risk measures applied to the
estimation of extreme agricultural financial risk for corn and soybean
production in the US: Value at Risk (VaR), Expected Shortfall (ES) and Spectral
Risk Measures (SRMs). We use Extreme Value Theory (EVT) to model the tail
returns and present results for these three different risk measures using
agricultural futures market data. We compare the estimated risk measures in
terms of their size and precision, and find that they are all considerably
higher than normal estimates; they are also quite uncertain, and become more
uncertain as the risks involved become more extreme.
|
finance
|
4,236 |
Concave Distortion Semigroups
|
q-fin.RM
|
The problem behind this paper is the proper measurement of the degree of
quality/acceptability/distance to arbitrage of trades. We are narrowing the
class of coherent acceptability indices introduced by Cherny and Madan (2007)
by imposing an additional mathematical property. For this, we introduce the
notion of a concave distortion semigroup as a family $(\Psi_t)_{t\ge0}$ of
concave increasing functions $[0,1]\to[0,1]$ satisfying the semigroup property
$$ \Psi_s\circ\Psi_t=\Psi_{s+t},\quad s,t\ge0. $$ The goal of the paper is the
investigation of these semigroups with regard to the following aspects:
representation of distortion semigroups; properties of distortion semigroups
desirable from the economical or mathematical perspective; determining which
concave distortions belong to some distortion semigroup.
|
finance
|
4,237 |
Banking retail consumer finance data generator - credit scoring data repository
|
q-fin.RM
|
This paper presents two cases of random banking data generators based on
migration matrices and scoring rules. The banking data generator offers new
hope in the search for a sound method of comparing various credit scoring
techniques. The influence of one cyclic macro-economic variable on the
stability over time of account and client characteristics is analyzed. The
data are very useful for various analyses aimed at a better understanding of
the complexity of banking processes, and also for students and their research.
Interesting conclusions are presented concerning crisis behavior: if a crisis
is driven by many factors, including both application and behavioral customer
characteristics, then it is very difficult to identify these factors in a
typical scoring analysis, and the crisis appears everywhere, in every kind of
risk report.
|
finance
|
4,238 |
A Stochastic Model for the Analysis of Demographic Risk in Pay-As-You-Go Pension Funds
|
q-fin.RM
|
This research presents an analysis of the demographic risk related to future
membership patterns in pension funds with restricted entrance, financed under a
pay-as-you-go scheme. The paper, therefore, proposes a stochastic model for
investigating the behaviour of the demographic variable "new entrants" and the
influence it exerts on the financial dynamics of such funds. Further
information on pension funds of Italian professional categories and an
application to the Cassa Nazionale di Previdenza e Assistenza dei Dottori
Commercialisti (CNPADC) are then provided.
|
finance
|
4,239 |
One-year reserve risk including a tail factor: closed formula and bootstrap approaches
|
q-fin.RM
|
In this paper, we detail the main simulation methods used in practice to
measure one-year reserve risk, and describe the bootstrap method providing an
empirical distribution of the Claims Development Result (CDR) whose variance is
identical to the closed-form expression of the prediction error proposed by
W\"uthrich et al. (2008). In particular, we integrate the stochastic modeling
of a tail factor in the bootstrap procedure. We demonstrate the equivalence
with existing analytical results and develop closed-form expressions for the
error of prediction including a tail factor. A numerical example is given at
the end of this study.
|
finance
|
4,240 |
Quantifying mortality risk in small defined-benefit pension schemes
|
q-fin.RM
|
A risk of small defined-benefit pension schemes is that there are too few
members to eliminate idiosyncratic mortality risk, that is, there are too few
members to pool mortality risk effectively. This means that when there are few
members in the scheme, there is an increased risk of the liability value
deviating significantly from the expected liability value, as compared to a
large scheme.
We quantify this risk through examining the coefficient of variation of a
scheme's liability value relative to its expected value. We examine how the
coefficient of variation varies with the number of members and find that, even
with a few hundred members in the scheme, idiosyncratic mortality risk may
still be significant. Using a stochastic mortality model reduces the
idiosyncratic mortality risk but at the cost of increasing the overall
mortality risk in the scheme.
Next we quantify the amount of the mortality risk concentrated in the
executive section of the scheme, where the executives receive a benefit that is
higher than the non-executive benefit. We use the Euler capital allocation
principle to allocate the total standard deviation of the liability value
between the executive and non-executive sections. We find that the proportion
of the standard deviation allocated to the executive section is higher than is
suggested by an allocation based on the members' benefit amounts. While the
results are sensitive to the choice of mortality model, they do suggest that
the mortality risk of the scheme should be monitored and managed within the
sections of a scheme, and not only on a scheme-wide basis.
|
finance
|
4,241 |
Losing money with a high Sharpe ratio
|
q-fin.RM
|
A simple example shows that losing all money is compatible with a very high
Sharpe ratio (as computed after losing all money). However, the only way that
the Sharpe ratio can be high while losing money is that there is a period in
which all or almost all money is lost. This note explores the best achievable
Sharpe and Sortino ratios for investors who lose money but whose one-period
returns are bounded below (or both below and above) by a known constant.
|
finance
|
4,242 |
Hedging strategies with a put option and their failure rates
|
q-fin.RM
|
The problem of stock hedging is reconsidered in this paper, where a put
option is chosen from a set of available put options to hedge the market risk
of a stock. A formula is proposed to determine the probability that the
potential loss exceeds a predetermined level of Value-at-Risk, which is used to
find the optimal strike price and optimal hedge ratio. The assumptions that the
chosen put option finishes in-the-money and that the hedging budget constraint is
binding are relaxed in this paper. A hypothesis test is proposed to determine
whether the failure rate of the hedging strategy is greater than the predetermined
level of risk. The performances of the proposed method and the method with
those two assumptions are compared through simulations. The results of
simulated investigations indicate that the proposed method is much more prudent
than the method with those two assumptions.
|
finance
|
4,243 |
Menger 1934 revisited
|
q-fin.RM
|
Karl Menger's 1934 paper on the St. Petersburg paradox contains mathematical
errors that invalidate his conclusion that unbounded utility functions,
specifically Bernoulli's logarithmic utility, fail to resolve modified versions
of the St. Petersburg paradox.
|
finance
|
4,244 |
Historical risk measures on stock market indices and energy markets
|
q-fin.RM
|
In this paper we look at the efficacy of different risk measures on energy
markets and across several different stock market indices. We use both the
Value at Risk and the Tail Conditional Expectation on each of these data sets.
We also consider several different durations and levels for historical risk
measures. Through our results we make some recommendations for a robust risk
management strategy that involves historical risk measures.
|
finance
|
4,245 |
A Mathematical Method for Deriving the Relative Effect of Serviceability on Default Risk
|
q-fin.RM
|
The writers propose a mathematical Method for deriving risk weights which
describe how a borrower's income, relative to their debt service obligations
(serviceability), affects the probability of default of the loan.
The Method considers the borrower's income not simply as a known quantity at
the time the loan is made, but as an uncertain quantity following a statistical
distribution at some later point in the life of the loan. This allows a
probability to be associated with an income level leading to default, so that
the relative risk associated with different serviceability levels can be
quantified. In a sense, the Method can be thought of as an extension of the
Merton Model to quantities that fail to satisfy Merton's 'critical' assumptions
relating to the efficient markets hypothesis.
A set of numerical examples of risk weights derived using the Method suggest
that serviceability may be under-represented as a risk factor in many mortgage
credit risk models.
|
finance
|
4,246 |
Restructuring Counterparty Credit Risk
|
q-fin.RM
|
We introduce an innovative theoretical framework to model derivative
transactions between defaultable entities based on the principle of arbitrage
freedom. Our framework extends the traditional formulations based on Credit and
Debit Valuation Adjustments (CVA and DVA). Depending on how the default
contingency is accounted for, we list a total of ten different structuring
styles. These include bipartite structures between a bank and a counterparty,
tri-partite structures with one margin lender in addition, quadri-partite
structures with two margin lenders and, most importantly, configurations where
all derivative transactions are cleared through a Central Counterparty (CCP).
We compare the various structuring styles under a number of criteria including
consistency from an accounting standpoint, counterparty risk hedgeability,
numerical complexity, transaction portability upon default, induced behaviour
and macro-economic impact of the implied wealth allocation.
|
finance
|
4,247 |
Bayesian estimation of probabilities of default for low default portfolios
|
q-fin.RM
|
The estimation of probabilities of default (PDs) for low default portfolios
by means of upper confidence bounds is a well established procedure in many
financial institutions. However, there are often discussions within the
institutions or between institutions and supervisors about which confidence
level to use for the estimation. The Bayesian estimator for the PD based on the
uninformed, uniform prior distribution is an obvious alternative that avoids
the choice of a confidence level. In this paper, we demonstrate that in the
case of independent default events the upper confidence bounds can be
represented as quantiles of a Bayesian posterior distribution based on a prior
that is slightly more conservative than the uninformed prior. We then describe
how to implement the uninformed and conservative Bayesian estimators in the
dependent one- and multi-period default data cases and compare their estimates
to the upper confidence bound estimates. The comparison leads us to suggest a
constrained version of the uninformed (neutral) Bayesian estimator as an
alternative to the upper confidence bound estimators.
|
finance
|
4,248 |
Real Output Costs of Financial Crises: A Loss Distribution Approach
|
q-fin.RM
|
We study cross-country GDP losses due to financial crises in terms of
frequency (number of loss events per period) and severity (loss per
occurrence). We apply the Loss Distribution Approach (LDA) to estimate a
multi-country aggregate GDP loss probability density function and the
percentiles associated with extreme events due to financial crises.
We find that output losses arising from financial crises are strongly
heterogeneous and that currency crises lead to smaller output losses than debt
and banking crises.
Extreme global financial crises episodes, occurring with a one percent
probability every five years, lead to losses between 2.95% and 4.54% of world
GDP.
|
finance
|
4,249 |
Time consistency of dynamic risk measures in markets with transaction costs
|
q-fin.RM
|
The paper concerns primal and dual representations as well as time
consistency of set-valued dynamic risk measures. Set-valued risk measures
appear naturally when markets with transaction costs are considered and capital
requirements can be made in a basket of currencies or assets. Time consistency
of scalar risk measures can be generalized to set-valued risk measures in
different ways. The most intuitive generalization is called time consistency.
We will show that the equivalence between a recursive form of the risk measure
and time consistency, which is a central result in the scalar case, does not
hold in the set-valued framework. Instead, we propose an alternative
generalization, which we will call multi-portfolio time consistency and show in
the main result of the paper that this property is indeed equivalent to the
recursive form as well as to an additive property for the acceptance sets.
Multi-portfolio time consistency is a stronger property than time consistency.
In the scalar case, both notions coincide.
|
finance
|
4,250 |
A Dynamical Approach to Operational Risk Measurement
|
q-fin.RM
|
We propose a dynamical model for the estimation of Operational Risk in
banking institutions. Operational Risk is the risk that a financial loss occurs
as the result of failed processes. Examples of operational losses are the ones
generated by internal frauds, human errors or failed transactions. In order to
encompass the most heterogeneous set of processes, in our approach the losses
of each process are generated by the interplay among random noise, interactions
with other processes and the efforts the bank makes to avoid losses. We show
how some relevant parameters of the model can be estimated from a database of
historical operational losses, validate the estimation procedure and test the
forecasting power of the model. Some advantages of our approach over the
traditional statistical techniques are that it allows one to follow the whole time
evolution of the losses and to take into account correlations among the
processes at different times.
|
finance
|
4,251 |
Derivatives and Credit Contagion in Interconnected Networks
|
q-fin.RM
|
The importance of adequately modeling credit risk has once again been
highlighted in the recent financial crisis. Defaults tend to cluster around
times of economic stress due to poor macro-economic conditions, but also
by directly triggering each other through contagion. Although credit default
swaps have radically altered the dynamics of contagion for more than a decade,
models quantifying their impact on systemic risk are still missing. Here, we
examine contagion through credit default swaps in a stylized economic network
of corporates and financial institutions. We analyse such a system using a
stochastic setting, which allows us to exploit limit theorems to exactly solve
the contagion dynamics for the entire system. Our analysis shows that, by
creating additional contagion channels, CDS can actually lead to greater
instability of the entire network in times of economic stress. This is
particularly pronounced when CDS are used by banks to expand their loan books
(arguing that CDS would offload the additional risks from their balance
sheets). Thus, even with complete hedging through CDS, a significant loan book
expansion can lead to considerably enhanced probabilities for the occurrence of
very large losses and very high default rates in the system. Our approach adds
a new dimension to research on credit contagion, and could feed into a rational
underpinning of an improved regulatory framework for credit derivatives.
|
finance
|
4,252 |
Active margin system for margin loans and its application in Chinese market: using cash and randomly selected stock as collateral
|
q-fin.RM
|
An active margin system for margin loans, which uses cash and randomly
selected stock as collateral, is proposed for the Chinese margin lending
market. The conditional probability of negative return (CPNR) after a forced
sale of securities from an under-margined account in a falling market is used
to measure the risk faced by the brokers, and the margin system is chosen
under the constraint of the risk measure. In order to calculate CPNR, a
recursive algorithm is proposed under a Markov chain model, which is
constructed by a sample learning method. The resulting margin system is an
active system, which is able to adjust with respect to changes in stock prices
and in the collateral. The resulting margin system is applied to 30,000 margin
loans of 150 stocks listed on the Shanghai Stock Exchange. The empirical
results show that the number of margin calls and the average costs of the
loans under the proposed margin system are lower than their counterparts under
the systems required by the SSE and SZSE.
|
finance
|
4,253 |
Active margin system for margin loans using cash and stock as collateral and its application in Chinese market
|
q-fin.RM
|
A margin system for margin loans using cash and stock as collateral is
considered in this paper; this system is the brokers' line of defence against
the risk associated with margin trading. The conditional probability of
negative return is used as the risk measure, and a recursive algorithm is
proposed to realize this measure under a Markov chain model. The optimal
margin system is chosen from those systems which satisfy the constraint of the
risk measure. The resulting margin system is able to adjust actively to
changes in stock prices.
The margin system required by the Shanghai Stock Exchange is compared with the
proposed system, where 25,200 margin loans of 126 stocks listed on the SSE are
investigated. It is found that the number of margin calls under the proposed
margin system is significantly less than its counterpart under the required
system for the same level of risk, and the average costs of the loans are
similar under the two types of margin systems.
|
finance
|
4,254 |
Using Decision Tree Learner to Classify Solvency Position for Thai Non-life Insurance Companies
|
q-fin.RM
|
This paper introduces a Decision Tree Learner as an early warning system for
classifying non-life insurance companies according to their financial
solidity as strong, moderate, weak, or insolvent. In this study, we ran several
experiments to show that the proposed model can achieve good results using
standard 10-fold cross-validation, a train/test split, and a separate
test set. The results show that the method is effective and can accurately
classify the solvency position.
|
finance
|
4,255 |
Empirical Evidence for the Structural Recovery Model
|
q-fin.RM
|
While defaults are rare events, losses can be substantial even for credit
portfolios with a large number of contracts. Therefore, not only a good
evaluation of the probability of default is crucial, but also the severity of
losses needs to be estimated. The recovery rate is often modeled independently
of the default probability, whereas the Merton model yields a
functional dependence of both variables. We use Moody's Default and Recovery
Database in order to investigate the relationship of default probability and
recovery rate for senior secured bonds. The assumptions in the Merton model do
not seem justified by the empirical situation. Yet the empirical dependence of
default probability and recovery rate is well described by the functional
dependence found in the Merton model.
|
finance
|
4,256 |
Ordinal Classification Method for the Evaluation Of Thai Non-life Insurance Companies
|
q-fin.RM
|
This paper proposes the use of an ordinal classifier to evaluate the financial
solidity of non-life insurance companies as strong, moderate, weak, or
insolvent. This study constructed an efficient classification model that can
be used by regulators to evaluate the financial solidity and to determine the
priority of further examination as an early warning system. The proposed model
is beneficial to policy-makers to create guidelines for the solvency
regulations and roles of the government in protecting the public against
insolvency.
|
finance
|
4,257 |
Systemic losses in banking networks: indirect interaction of nodes via asset prices
|
q-fin.RM
|
A simple banking network model is proposed which features multiple waves of
bank defaults and is analytically solvable in the limiting case of an
infinitely large homogeneous network. The model is a collection of nodes
representing individual banks; associated with each node is a balance sheet
consisting of assets and liabilities. Initial node failures are triggered by
external correlated shocks applied to the asset sides of the balance sheets.
These defaults lead to further reductions in asset values of all nodes which in
turn produce additional failures, and so on. This mechanism induces indirect
interactions between the nodes and leads to a cascade of defaults. There are no
interbank links, and therefore no direct interactions, between the nodes. The
resulting probability distribution for the total (direct plus systemic) network
loss can be viewed as a modification of the well-known Vasicek distribution.
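The link to the Vasicek distribution can be made concrete with a short sketch. The code below covers only the direct-loss part of the mechanism: it simulates initial defaults in a one-factor homogeneous pool and compares the empirical loss distribution with the Vasicek large-pool CDF; the paper's modification would add the subsequent contagion rounds. All parameters are illustrative.

```python
import numpy as np
from scipy.stats import norm

def vasicek_cdf(x, p, rho):
    """Vasicek large-homogeneous-pool CDF of the loss fraction x,
    for default probability p and asset correlation rho."""
    return norm.cdf((np.sqrt(1 - rho) * norm.ppf(x) - norm.ppf(p)) / np.sqrt(rho))

rng = np.random.default_rng(1)
N, p, rho, n_sim = 500, 0.02, 0.2, 10_000    # illustrative parameters
z = rng.standard_normal(n_sim)               # common (correlated) shock
eps = rng.standard_normal((n_sim, N))        # idiosyncratic shocks
assets = np.sqrt(rho) * z[:, None] + np.sqrt(1 - rho) * eps
loss = (assets < norm.ppf(p)).mean(axis=1)   # fraction of initial defaults

print((loss <= 0.05).mean(), vasicek_cdf(0.05, p, rho))  # should be close
```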
|
finance
|
4,258 |
From Risk Measures to Research Measures
|
q-fin.RM
|
In order to evaluate the quality of the scientific research, we introduce a
new family of scientific performance measures, called Scientific Research
Measures (SRM). Our proposal originates from recent developments in the theory
of risk measures and is an attempt to resolve the many problems of the
existing bibliometric indices. The SRM that we introduce are based on the
scientist's whole citation record and are: coherent, as they share the same
structural properties; flexible, as they fit peculiarities of different areas
and seniorities; granular, as they allow a more precise comparison between
scientists; and inclusive, as they comprehend several popular indices. Another
key feature of our SRM is that they are planned to be calibrated to the
particular scientific community. We also propose a dual formulation of this
problem and explain its relevance in this context.
|
finance
|
4,259 |
Optimal retirement consumption with a stochastic force of mortality
|
q-fin.RM
|
We extend the lifecycle model (LCM) of consumption over a random horizon
(a.k.a. the Yaari model) to a world in which (i) the force of mortality obeys
a diffusion process, as opposed to being deterministic, and (ii) a consumer can
adapt their consumption strategy to new information about their mortality rate
(a.k.a. health status) as it becomes available. In particular, we derive the
optimal consumption rate and focus on the impact of mortality rate uncertainty
vs. simple lifetime uncertainty -- assuming the actuarial survival curves are
initially identical -- in the retirement phase where this risk plays a greater
role.
In addition to deriving and numerically solving the PDE for the optimal
consumption rate, our main general result is that when utility preferences are
logarithmic the initial consumption rates are identical. But, in a CRRA
framework in which the coefficient of relative risk aversion is greater
(smaller) than one, the consumption rate is higher (lower) and a stochastic
force of mortality does make a difference.
That said, numerical experiments indicate that even for non-logarithmic
preferences, the stochastic mortality effect is relatively minor from the
individual's perspective. Our results should be relevant to researchers
interested in calibrating the lifecycle model as well as those who provide
normative guidance (a.k.a. financial advice) to retirees.
|
finance
|
4,260 |
A different perspective on retirement income sustainability: the blueprint for a ruin contingent life annuity (RCLA)
|
q-fin.RM
|
The purpose of this article is twofold. First, we motivate the need for a new
type of stand-alone retirement income insurance product that would help
individuals protect against personal longevity risk and possible "retirement
ruin" in an economically efficient manner. We label this product a
ruin-contingent life annuity (RCLA), which we elaborate on and explain with
various numerical examples and a basic pricing model. Second, we argue that
with the proper perspective a similar product actually exists, albeit not on a
stand-alone basis: it is fused and embedded within modern variable annuity
(VA) policies with guaranteed living income benefit (GLiB) riders. Indeed, the
popularity of GLiB riders on VA policies points towards the potential
commercial success of such a stand-alone vehicle.
|
finance
|
4,261 |
Beyond cash-additive risk measures: when changing the numéraire fails
|
q-fin.RM
|
We discuss risk measures representing the minimum amount of capital a
financial institution needs to raise and invest in a pre-specified eligible
asset to ensure it is adequately capitalized. Most of the literature has
focused on cash-additive risk measures, for which the eligible asset is a
risk-free bond, on the grounds that the general case can be reduced to the
cash-additive case by a change of numeraire. However, discounting does not work
in all financially relevant situations, typically when the eligible asset is a
defaultable bond. In this paper we fill this gap allowing for general eligible
assets. We provide a variety of finiteness and continuity results for the
corresponding risk measures and apply them to risk measures based on
Value-at-Risk and Tail Value-at-Risk on $L^p$ spaces, as well as to shortfall
risk measures on Orlicz spaces. We pay special attention to the property of
cash subadditivity, which has been recently proposed as an alternative to cash
additivity to deal with defaultable bonds. For important examples, we provide
characterizations of cash subadditivity and show that, when the eligible asset
is a defaultable bond, cash subadditivity is the exception rather than the
rule. Finally, we consider the situation where the eligible asset is not
liquidly traded and the pricing rule is no longer linear. We establish when the
resulting risk measures are quasiconvex and show that cash subadditivity is
only compatible with continuous pricing rules.
|
finance
|
4,262 |
Forecasting Value-at-Risk with Time-Varying Variance, Skewness and Kurtosis in an Exponential Weighted Moving Average Framework
|
q-fin.RM
|
This paper provides insight into the time-varying dynamics of the shape of
the distribution of financial return series by proposing an exponential
weighted moving average model that jointly estimates volatility, skewness and
kurtosis over time using a modified form of the Gram-Charlier density in which
skewness and kurtosis appear directly in the functional form of this density.
In this setting VaR can be described as a function of the time-varying higher
moments by applying the Cornish-Fisher expansion series of the first four
moments. An evaluation of the predictive performance of the proposed model in
the estimation of 1-day and 10-day VaR forecasts is performed in comparison
with the historical simulation, filtered historical simulation and GARCH model.
The adequacy of the VaR forecasts is evaluated under the unconditional,
independence and conditional likelihood ratio tests as well as Basel II
regulatory tests. The results presented have significant implications for risk
management, trading and hedging activities as well as in the pricing of equity
derivatives.
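As a minimal sketch of the VaR step, the function below applies the fourth-order Cornish-Fisher adjustment to the Gaussian quantile; the moment values in the example are illustrative, and in the model of the paper they would come from the EWMA recursions rather than being fixed.

```python
from scipy.stats import norm

def cornish_fisher_var(sigma, skew, exkurt, alpha=0.01, mu=0.0):
    """VaR from the Cornish-Fisher adjustment of the alpha-quantile.
    sigma: volatility, skew: skewness, exkurt: excess kurtosis.
    Returned as a positive loss number."""
    z = norm.ppf(alpha)                       # Gaussian quantile, e.g. -2.326
    z_cf = (z
            + (z**2 - 1) * skew / 6
            + (z**3 - 3 * z) * exkurt / 24
            - (2 * z**3 - 5 * z) * skew**2 / 36)
    return -(mu + sigma * z_cf)

# Illustrative 1-day 99% VaR with negative skew and fat tails
print(cornish_fisher_var(sigma=0.012, skew=-0.4, exkurt=3.0, alpha=0.01))
```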
|
finance
|
4,263 |
Interest Rate Risk of Bond Prices on Macedonian Stock Exchange - Empirical Test of the Duration, Modified Duration and Convexity and Bonds Valuation
|
q-fin.RM
|
This article presents the valuation of Treasury Bonds (T-Bonds) on the
Macedonian Stock Exchange (MSE) and an empirical test of duration, modified
duration and convexity of the T-Bonds at the MSE, in order to determine the
sensitivity of bond prices to interest rate changes. The main goal of this
study is to determine how well standard valuation models fit in the case of
T-Bonds traded on the MSE and to verify whether they offer reliable results
compared with average bond prices on the MSE. We test the sensitivity of
T-Bonds on the MSE to interest rate changes and find that convexity is a more
accurate measure, as an approximation of bond price changes, than duration.
The final conclusion is that T-Bonds traded at the MSE are not sensitive to
interest rate changes, due to institutional investors' persistently higher
demand and, at the same time, the market's limited supply of risk-free
instruments.
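A minimal sketch of the quantities being tested, for a plain-vanilla fixed-coupon bond; the coupon, yield and maturity below are illustrative and are not actual MSE T-Bond data.

```python
def bond_metrics(face, coupon_rate, ytm, years, freq=1):
    """Price, Macaulay duration, modified duration and convexity
    of a fixed-coupon bond with `freq` payments per year."""
    y, n = ytm / freq, years * freq
    c = coupon_rate * face / freq
    cfs = [c] * (n - 1) + [c + face]
    pv = [cf / (1 + y) ** t for t, cf in enumerate(cfs, start=1)]
    price = sum(pv)
    macaulay = sum(t * v for t, v in enumerate(pv, start=1)) / price / freq
    modified = macaulay / (1 + y)
    convexity = sum(t * (t + 1) * v for t, v in enumerate(pv, start=1)) \
        / (price * (1 + y) ** 2 * freq ** 2)
    return price, macaulay, modified, convexity

price, D, Dmod, C = bond_metrics(face=100, coupon_rate=0.08, ytm=0.065, years=5)
dy = 0.01                                 # +100 bp yield shift (illustrative)
print(-Dmod * dy)                         # 1st-order (duration) price change
print(-Dmod * dy + 0.5 * C * dy ** 2)     # 2nd-order (with convexity)
```

The convexity term corrects the duration approximation, which is exactly the comparison the article performs on MSE data.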
|
finance
|
4,264 |
A Dynamical Model for Operational Risk in Banks
|
q-fin.RM
|
Operational risk is the risk of monetary losses caused by failures of a
bank's internal processes due to heterogeneous causes. A dynamical model
including both spontaneous generation of losses and generation via interactions
between different processes is presented; the efforts made by the bank to avoid
the occurrence of losses are also taken into account. Under certain hypotheses,
the model can be solved exactly and, in principle, the solution can be
exploited to estimate most of the model parameters from real data. The
forecasting power of the model is also investigated and proves to be
surprisingly good.
|
finance
|
4,265 |
Mathematical Definition, Mapping, and Detection of (Anti)Fragility
|
q-fin.RM
|
We provide a mathematical definition of fragility and antifragility as
negative or positive sensitivity to a semi-measure of dispersion and volatility
(a variant of negative or positive "vega") and examine the link to nonlinear
effects. We integrate model error (and biases) into the fragile or antifragile
context. Unlike risk, which is linked to psychological notions such as
subjective preferences (hence cannot apply to a coffee cup), we offer a measure
that is universal and concerns any object that has a probability distribution
(whether such distribution is known or, critically, unknown). We propose a
detection of fragility, robustness, and antifragility using a single
"fast-and-frugal", model-free, probability-free heuristic that also picks up
exposure to model error. The heuristic lends itself to immediate
implementation, and uncovers hidden risks related to company size, forecasting
problems, and bank tail exposures (it explains the forecasting biases). While
simple to implement, it outperforms stress testing and other methods such
as Value-at-Risk.
|
finance
|
4,266 |
Hedging Swing contract on gas markets
|
q-fin.RM
|
Swing options on the gas market are American-style options in which the
quantities exercised daily are constrained, and the total quantity exercised
each year is constrained as well. The option holder has to decide each day how
much to consume within these constraints, and seeks a strategy that maximizes
the expected profit. The payoff function is a spread between the spot gas
price and the value of an index composed of past averages of certain commodity
spot or futures prices. We study the valuation and the effectiveness of the
dynamic hedging of such a contract.
|
finance
|
4,267 |
Scenarios and their Aggregation in the Regulatory Risk Measurement Environment
|
q-fin.RM
|
We define scenarios, propose different methods of aggregating them, discuss
their properties and benchmark them against quadrant requirements.
|
finance
|
4,268 |
Funding Liquidity, Debt Tenor Structure, and Creditor's Belief: An Exogenous Dynamic Debt Run Model
|
q-fin.RM
|
We propose a unified structural credit risk model incorporating both
insolvency and illiquidity risks, in order to investigate how a firm's default
probability depends on the liquidity risk associated with its financing
structure. We assume the firm finances its risky assets by mainly issuing
short- and long-term debt. Short-term debt can have either a discrete or a more
realistic staggered tenor structure. At rollover dates of short-term debt,
creditors face a dynamic coordination problem. We show that a unique threshold
strategy (i.e., a debt run barrier) exists for short-term creditors to decide
when to withdraw their funding, and this strategy is closely related to the
solution of a non-standard optimal stopping time problem with control
constraints. We decompose the total credit risk into an insolvency component
and an illiquidity component based on such an endogenous debt run barrier
together with an exogenous insolvency barrier.
|
finance
|
4,269 |
Fostering Project Scheduling and Controlling Risk Management
|
q-fin.RM
|
The deployment of emerging technologies and rapid change in industries have
created substantial risk for initiating new projects. Many techniques and
suggestions have been introduced, but gaps remain from various perspectives.
This paper proposes a reliable project scheduling approach. The objectives of
the approach are to focus on critical chain scheduling and risk management.
Several risks and uncertainties exist in projects. These critical
uncertainties may not only prevent projects from being finished within the
time limit and budget, but also degrade quality and operational processes. In
the proposed approach, the potential risks of a project are critically
analyzed. To address these potential risks, fuzzy failure mode and effect
analysis (FMEA) is introduced. In addition, the effects of each risk on each
activity are evaluated. We use Monte Carlo simulation to estimate the total
duration of the project. Our approach helps to control risk mitigation, which
is determined using event tree analysis and fault tree analysis. We also
implement a distributed critical chain schedule for reliable scheduling, so
that the project is carried out within the defined plan and schedule. Finally,
an adaptive procedure with density (APD) is deployed to obtain reasonable
feeding buffer times and project buffer time.
|
finance
|
4,270 |
Russian interbank networks: main characteristics and stability with respect to contagion
|
q-fin.RM
|
Systemic risks characterizing the Russian overnight interbank market are
analyzed from the network point of view.
|
finance
|
4,271 |
Measuring and Analysing Marginal Systemic Risk Contribution using CoVaR: A Copula Approach
|
q-fin.RM
|
This paper is devoted to the quantification and analysis of marginal risk
contribution of a given single financial institution i to the risk of a
financial system s. Our work expands on the CoVaR concept proposed by Adrian
and Brunnermeier as a tool for the measurement of marginal systemic risk
contribution. We first give a mathematical definition of
CoVaR_{\alpha}^{s|L^i=l}. Our definition improves the CoVaR concept by
expressing CoVaR_{\alpha}^{s|L^i=l} as a function of a state l and of a given
probability level \alpha relative to i and s respectively. Based on copula
theory, we connect CoVaR_{\alpha}^{s|L^i=l} to the partial derivatives of the
copula through their probabilistic interpretation as conditional
probabilities. Using this, we provide a closed formula for the calculation of
CoVaR_{\alpha}^{s|L^i=l} for a large class of (marginal) distributions and
dependence structures (linear and non-linear). Our formula allows a better
analysis of systemic risk using CoVaR, in the sense that it expresses
CoVaR_{\alpha}^{s|L^i=l} in terms of the marginal distributions of the losses
of i and s respectively and the copula between L^i and L^s. We discuss the
implications of this in the context of the quantification and analysis of
systemic risk contributions; for example, we analyse the marginal effects of
L^i, L^s and the copula C on the risk contribution of i.
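The closed formula is derived in the paper for a large class of copulas; the sketch below works out only the Gaussian-copula special case, using the fact that under a Gaussian copula with correlation rho the alpha-quantile of V given U = u is Phi(rho*Phi^{-1}(u) + sqrt(1 - rho^2)*Phi^{-1}(alpha)). The function name and the standard-normal system loss are illustrative choices.

```python
from scipy.stats import norm

def covar_gaussian_copula(alpha, u_i, rho, ppf_s=norm.ppf):
    """CoVaR_alpha^{s | L^i = l} under a Gaussian copula with correlation rho.
    u_i = F_i(l) is the probability level of the conditioning loss l;
    ppf_s is the quantile function of the system loss L^s."""
    z = rho * norm.ppf(u_i) + (1 - rho ** 2) ** 0.5 * norm.ppf(alpha)
    return ppf_s(norm.cdf(z))

# Marginal effect of the dependence parameter: stress institution i at its
# own 95% level and vary rho (system loss taken standard normal, illustrative).
for rho in (0.0, 0.3, 0.6):
    print(rho, covar_gaussian_copula(alpha=0.95, u_i=0.95, rho=rho))
```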
|
finance
|
4,272 |
Solvency assessment within the ORSA framework: issues and quantitative methodologies
|
q-fin.RM
|
The implementation of the Own Risk and Solvency Assessment is a critical
issue raised by Pillar II of the Solvency II framework. In particular, the
Overall Solvency Needs calculation leaves insurance companies free to define
an optimal entity-specific solvency constraint on a multi-year time horizon.
In a life insurance framework, the intuitive approaches to this problem can
lead to new implementation issues linked to the highly stochastic nature of
the methodologies used to project a company's Net Asset Value over several
years. One alternative approach is the use of polynomial proxies to replicate
the outcomes of this variable throughout the time horizon. Polynomial
functions are already considered efficient replication methodologies for the
Net Asset Value over 1 year; the Curve Fitting and Least Squares Monte Carlo
procedures are the best-known examples. In this article we introduce a
possible adaptation of these methodologies for use on a multi-year time
horizon, in order to assess the Overall Solvency Needs.
|
finance
|
4,273 |
The role of the Model Validation function to manage and mitigate model risk
|
q-fin.RM
|
This paper describes the current taxonomy of model risk, ways for its
mitigation and management and the importance of the model validation function
in collaboration with other departments to design and implement them.
|
finance
|
4,274 |
Optimal portfolio for a robust financial system
|
q-fin.RM
|
This study presents an ANWSER model (asset network systemic risk model) to
quantify the risk of financial contagion which manifests itself in a financial
crisis. The transmission of financial distress is governed by a heterogeneous
bank credit network and an investment portfolio of banks. Bankruptcy
reproductive ratio of a financial system is computed as a function of the
diversity and risk exposure of an investment portfolio of banks, and the
denseness and concentration of a heterogeneous bank credit network. An analytic
solution of the bankruptcy reproductive ratio for a small financial system is
derived and a numerical solution for a large financial system is obtained. For
a large financial system, large diversity among banks in the investment
portfolio makes financial contagion more damaging on average, but large
diversity is essentially effective in eliminating the risk of financial
contagion in the worst-case financial crisis scenarios. A bank-unique
specialization portfolio is more suitable than a uniform diversification
portfolio and a system-wide specialization portfolio in strengthening the
robustness of a financial system.
|
finance
|
4,275 |
Optimal portfolio model based on WVAR
|
q-fin.RM
|
This article uses a new risk measure, Weighted Value at Risk (WVaR), to
develop a new method of portfolio construction. Starting from the problem of
solving for TVaR and building on the results of previous studies, we study the
U.S. Nasdaq Composite Index using the historical simulation method in MATLAB
(thereby avoiding the assumption that the return distribution is normal), and
apply the Simpson formula to solve for TVaR in depth. We then discuss the
representation of the WVaR formula and carry out the necessary analysis;
again using the Simpson formula and numerical calculation, we perform an
empirical analysis and back-test. Because WVaR possesses better properties, we
take the portfolio idea into a multi-index comprehensive evaluation and build
an innovative theoretical model of WVaR-based portfolio selection. In this
framework, risk is described by WVaR, whose advantages are that it is not
affected by the shape of the return distribution and that the associated
optimization problems have unique solutions. We then use AHP to assign weights
to the different indicators, put forward a nonlinear satisfaction portfolio
selection model, and test it empirically. Finally, we use a weighted linear
approach to convert the portfolio model into a single-objective problem, which
is easier to solve, construct portfolios from the data of two ETFs, and
compare the performance of the portfolios constructed by Mean-Weighted V@R and
by Mean-Variance.
|
finance
|
4,276 |
Parameter estimation of a Levy copula of a discretely observed bivariate compound Poisson process with an application to operational risk modelling
|
q-fin.RM
|
A method is developed to estimate the parameters of a Levy copula of a
discretely observed bivariate compound Poisson process without knowledge of
common shocks. The method is tested in a small-sample simulation study. The
method is also applied to a real data set, and a goodness-of-fit test is
developed. With the methodology of this work, the Levy copula becomes a
realistic tool for the advanced measurement approach to operational risk.
|
finance
|
4,277 |
Risk Measures in a Regime Switching Model Capturing Stylized Facts
|
q-fin.RM
|
We pick up the regime switching model for asset returns introduced by Rogers
and Zhang. The calibration involves various markets including implied
volatility in order to gain additional predictive power. We focus on the
calculation of risk measures by Fourier methods that have successfully been
applied to option pricing and analyze the accuracy of the results.
|
finance
|
4,278 |
Multiportfolio time consistency for set-valued convex and coherent risk measures
|
q-fin.RM
|
Equivalent characterizations of multiportfolio time consistency are deduced
for closed convex and coherent set-valued risk measures on $L^p(\Omega,\mathcal
F, P; R^d)$ with image space in the power set of $L^p(\Omega,\mathcal
F_t,P;R^d)$. In the convex case, multiportfolio time consistency is equivalent
to a cocycle condition on the sum of minimal penalty functions. In the coherent
case, multiportfolio time consistency is equivalent to a generalized version of
stability of the dual variables. As examples, the set-valued entropic risk
measure with constant risk aversion coefficient is shown to satisfy the cocycle
condition for its minimal penalty functions, the set of superhedging portfolios
in markets with proportional transaction costs is shown to have the stability
property and in markets with convex transaction costs is shown to satisfy the
composed cocycle condition, and a multiportfolio time consistent version of the
set-valued average value at risk, the composed AV@R, is given and its dual
representation deduced.
|
finance
|
4,279 |
The Foster-Hart Measure of Riskiness for General Gambles
|
q-fin.RM
|
Foster and Hart proposed an operational measure of riskiness for discrete
random variables. We show that their defining equation has no solution for
many common continuous distributions, including, for example, many uniform
distributions. We show how to extend the definition of riskiness consistently
to continuous random variables. For many continuous random variables, the risk
measure is equal to the worst-case risk measure, i.e. the maximal possible
loss incurred by that gamble. We also extend the Foster-Hart risk measure to
dynamic environments for general distributions and probability spaces, and we
show that the extended measure avoids bankruptcy in infinitely repeated
gambles.
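In the discrete case, the Foster-Hart riskiness R of a gamble g with positive expectation and possible losses is the unique root of E[log(1 + g/R)] = 0 above the maximal loss L. A minimal root-finding sketch (the continuous and dynamic extensions of the paper are not covered):

```python
import numpy as np
from scipy.optimize import brentq

def foster_hart_riskiness(outcomes, probs):
    """Solve E[log(1 + g/R)] = 0 for R > L, where L is the maximal loss.
    Assumes E[g] > 0 and a positive probability of loss."""
    g, p = np.asarray(outcomes, float), np.asarray(probs, float)
    L = -g.min()                              # maximal loss
    f = lambda R: np.dot(p, np.log1p(g / R))  # -inf near L, ~E[g]/R > 0 for large R
    return brentq(f, L * (1 + 1e-9), L * 1e6)

# Classic example: win 120 or lose 100 with equal probability => R = 600
print(foster_hart_riskiness([120, -100], [0.5, 0.5]))
```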
|
finance
|
4,280 |
A new approach for an unitary risk theory
|
q-fin.RM
|
The work deals with risk assessment theory. A unitary risk algorithm is
elaborated. The algorithm is based on parallel curves. The basic risk curve is
a hyperbolic curve, obtained as the product of the probability of occurrence
of a certain event and its impact. Section 1 contains the problem
formulation. Section 2 contains some specific notations and the mathematical
background of the risk algorithm. A numerical application based on the risk
algorithm is the content of Section 3. Section 4 contains several conclusions.
|
finance
|
4,281 |
Quantifying the Impact of Leveraging and Diversification on Systemic Risk
|
q-fin.RM
|
Excessive leverage, i.e. the abuse of debt financing, is considered one of
the primary factors in the default of financial institutions. Systemic risk
results from correlations between individual default probabilities that cannot
be considered independent. Based on the structural framework by Merton (1974),
we discuss a model in which these correlations arise from overlaps in banks'
portfolios. Portfolio diversification is used as a strategy to mitigate losses
from investments in risky projects. We calculate an optimal level of
diversification that has to be reached for a given level of excessive leverage
to still mitigate an increase in systemic risk. In our model, this optimal
diversification further depends on the market size and the market conditions
(e.g. volatility). It allows one to distinguish between a safe regime, in
which excessive leverage does not result in an increase of systemic risk, and
a risky regime, in which excessive leverage cannot be mitigated, leading to
increased systemic risk. Our results are of relevance for financial regulators.
|
finance
|
4,282 |
Premiums And Reserves, Adjusted By Distortions
|
q-fin.RM
|
The net-premium principle is considered to be the most genuine and fair
premium principle in actuarial applications. However, an insurance company,
applying the net-premium principle, goes bankrupt with probability one in the
long run, even if the company covers its entire costs by collecting the
respective fees from its customers. It is therefore an intrinsic necessity for
the insurance industry to apply premium principles, which guarantee at least
further existence of the company itself; otherwise, the company naturally could
not insure its clients to cover their potential future claims. Besides this
intriguing fact, the underlying loss distribution typically is not known
precisely. Hence alternative premium principles have been developed. A simple
principle, ensuring risk-adjusted credibility premiums, is the distorted
premium principle. This principle is convenient in insurance companies, as the
actuary does not have to change his or her tools to compute the premiums or
reserves. This paper addresses the distorted premium principle from various
angles. First, dual characterizations are developed. Next, distorted premiums
are typically computed by under-weighting or ignoring low, but over-weighting
high losses. It is demonstrated here that there is an alternative, opposite
point of view, which consists in leaving the probability measure unchanged, but
increasing the outcomes instead. It turns out that this new point of view is
natural in actuarial practice, as it can be used for premium calculations, as
well as to determine the reserves of subsequent years in a time consistent way.
|
finance
|
4,283 |
Measuring the default risk of sovereign debt from the perspective of network
|
q-fin.RM
|
Recently, there has been growing interest in network research, especially
in the fields of biology, computer science, and sociology. It is natural to
address complex financial issues such as the European sovereign debt crisis
from a network perspective. In this article, we construct a network model
according to the debt-credit relations, instead of using the conventional
methodology, to measure the default risk. Based on the model, a risk index is
examined using the quarterly report of consolidated foreign claims from the
Bank for International Settlements (BIS) and debt/GDP ratios among these
reporting countries. The empirical results show that this index can help the
regulators and practitioners not only to determine the status of
interconnectivity but also to point out the degree of the sovereign debt
default risk. Our approach sheds new light on the investigation of quantifying
the systemic risk.
|
finance
|
4,284 |
Central Clearing of OTC Derivatives: bilateral vs multilateral netting
|
q-fin.RM
|
We study the impact of central clearing of over-the-counter (OTC)
transactions on counterparty exposures in a market with OTC transactions across
several asset classes with heterogeneous characteristics. The impact of
introducing a central counterparty (CCP) on expected interdealer exposure is
determined by the tradeoff between multilateral netting across dealers on one
hand and bilateral netting across asset classes on the other hand. We find this
tradeoff to be sensitive to assumptions on the heterogeneity of asset classes
in terms of the riskiness of the asset classes as well as the correlation of
exposures across asset classes. In particular, while an analysis assuming
independent, homogeneous exposures suggests that central clearing is efficient
only if one has an unrealistically high number of participants, the opposite
conclusion is reached if differences in riskiness and correlation across asset
classes are realistically taken into account. We argue that empirically
plausible specifications of model parameters lead to the conclusion that
central clearing does reduce interdealer exposures: the gain from multilateral
netting in a CCP outweighs the loss of netting across asset classes in bilateral netting
agreements. When a CCP exists for interest rate derivatives, adding a CCP for
credit derivatives is shown to decrease overall exposures. These findings are
shown to be robust to the statistical assumptions of the model as well as the
choice of risk measure used to quantify exposures.
|
finance
|
4,285 |
Mean-Variance Asset-Liability Management with State-Dependent Risk Aversion
|
q-fin.RM
|
In this paper, we consider the asset-liability management under the
mean-variance criterion. The financial market consists of a risk-free bond and
a stock whose price process is modeled by a geometric Brownian motion. The
liability of the investor is uncontrollable and is modeled by another geometric
Brownian motion. We consider a specific state-dependent risk aversion which
depends on a power function of the liability. By solving a flow of FBSDEs with
a bivariate state process, we obtain the equilibrium strategy among all
open-loop controls for this time-inconsistent control problem. It turns out
that the equilibrium strategy is a feedback control of the liability.
|
finance
|
4,286 |
A comparison of techniques for dynamic multivariate risk measures
|
q-fin.RM
|
This paper contains an overview of results for dynamic multivariate risk
measures. We provide the main results of four different approaches, prove
under which assumptions results within these approaches coincide, and show how
properties like primal and dual representations and time consistency compare
across the different approaches.
|
finance
|
4,287 |
Consistent iterated simulation of multi-variate default times: a Markovian indicators characterization
|
q-fin.RM
|
We investigate under which conditions a single simulation of joint default
times at a final time horizon can be decomposed into a set of simulations of
joint defaults on subsequent adjacent sub-periods leading to that final
horizon. Besides the theoretical interest, this is also a practical problem as
part of the industry has been working under the misleading assumption that the
two approaches are equivalent for practical purposes. As a reasonable trade-off
between realistic stylized facts, practical demands, and mathematical
tractability, we propose models leading to a Markovian multi-variate
survival--indicator process, and we investigate two instances of static models
for the vector of default times from the statistical literature that fall into
this class. On the one hand, the "looping default" case is known to be equipped
with this property, and we point out that it coincides with the classical
"Freund distribution" in the bivariate case. On the other hand, if all
sub-vectors of the survival indicator process are Markovian, this constitutes a
new characterization of the Marshall--Olkin distribution, and hence of
multi-variate lack-of-memory. A paramount property of the resulting model is
stability of the type of multi-variate distribution with respect to elimination
or insertion of a new marginal component with marginal distribution from the
same family. The practical implications of this "nested margining" property are
enormous. To implement this distribution we present an efficient and unbiased
simulation algorithm based on the Lévy-frailty construction. We highlight
different pitfalls in the simulation of dependent default times and examine,
within a numerical case study, the effect of inadequate simulation practices.
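For orientation, a minimal sketch of the bivariate Marshall-Olkin distribution via the classical exogenous shock construction; the paper's simulation algorithm uses the more general Lévy-frailty construction, and the intensities below are illustrative.

```python
import numpy as np

def marshall_olkin_bivariate(lam1, lam2, lam12, n, rng=None):
    """Draw n samples of (tau_1, tau_2) with tau_i = min(E_i, E_12),
    where E_1, E_2, E_12 are independent exponential shock times."""
    rng = rng or np.random.default_rng()
    e1 = rng.exponential(1 / lam1, n)
    e2 = rng.exponential(1 / lam2, n)
    e12 = rng.exponential(1 / lam12, n)   # common shock kills both names
    return np.minimum(e1, e12), np.minimum(e2, e12)

t1, t2 = marshall_olkin_bivariate(0.05, 0.08, 0.02, n=100_000,
                                  rng=np.random.default_rng(2))
# Positive probability of simultaneous defaults, lam12/(lam1+lam2+lam12):
print((t1 == t2).mean())  # ~ 0.133 for these illustrative intensities
```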
|
finance
|
4,288 |
A Financial Risk Analysis: Does the 2008 Financial Crisis Give Impact on Weekends Returns of the U.S. Movie Box Office?
|
q-fin.RM
|
The financial crisis of 2008 was a worldwide crisis that caused the most
severe global economic decline since the 1930s. According to the International
Monetary Fund (IMF), the crisis caused losses of about USD 3.4 trillion at
financial institutions around the world between 2007 and 2010. Did the crisis
affect the returns of the U.S. movie box office? We answer this question by
analysing a financial risk model based on Extreme Value Theory (EVT) and by
calculating Value at Risk (VaR) and Expected Shortfall (ES). The values of VaR
and ES from two periods, 1982 to 1995 and 1996 to 2010, are compared. Results
show that the possibility of loss for an investment in the movie industry is
relatively lower than the possibility of gain for both periods. The values of
VaR and ES for the second period are higher than for the first period. We
conclude that the 2008 financial crisis had no significant effect on these
measures in the second period. This result indicates the high potential
opportunity in investing in U.S. movie making.
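As a rough sketch of the EVT machinery (peaks over threshold, not necessarily the exact variant used in the paper): fit a generalized Pareto distribution to losses above a high threshold and plug the fit into the standard VaR/ES formulas. The threshold choice and the simulated fat-tailed data are illustrative assumptions.

```python
import numpy as np
from scipy.stats import genpareto

def evt_var_es(losses, u_quantile=0.95, alpha=0.99):
    """Peaks-over-threshold VaR and ES (valid for fitted shape 0 < xi < 1)."""
    losses = np.asarray(losses, float)
    u = np.quantile(losses, u_quantile)        # high threshold
    exc = losses[losses > u] - u               # exceedances over u
    xi, _, beta = genpareto.fit(exc, floc=0)   # shape xi, scale beta
    n, nu = losses.size, exc.size
    var = u + beta / xi * ((n / nu * (1 - alpha)) ** (-xi) - 1)
    es = var / (1 - xi) + (beta - xi * u) / (1 - xi)
    return var, es

rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=5000)       # fat-tailed stand-in for losses
print(evt_var_es(losses))
```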
|
finance
|
4,289 |
Ruin probability of a discrete-time risk process with proportional reinsurance and investment for exponential and Pareto distributions
|
q-fin.RM
|
This paper focuses on a quantitative analysis of the finite-time ruin
probability of a discrete risk process with proportional reinsurance and
investment of the financial surplus. It is assumed that the total loss on a
unit interval has either a light-tailed distribution (the exponential
distribution) or a heavy-tailed distribution (the Pareto distribution). The
ruin probability for finite horizons of 5 and 10 periods was determined from
recurrence equations. Moreover, for the exponential distribution an upper
bound on the ruin probability via the Lundberg adjustment coefficient is
given. For the Pareto distribution the adjustment coefficient does not exist;
hence an asymptotic approximation of the ruin probability as the initial
capital tends to infinity is given. The numerical results obtained are
presented in tables and illustrated with graphs.
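For the exponential case, the Lundberg bound psi(u) <= exp(-R*u) requires the adjustment coefficient R solving E[exp(R(X - c))] = 1, with c the premium income per period and X the per-period loss. A minimal sketch with illustrative parameters:

```python
import numpy as np
from scipy.optimize import brentq

def lundberg_coefficient(c, mu):
    """Adjustment coefficient R for per-period exponential losses with mean mu
    and premium c (net profit condition c > mu): exp(-R*c)/(1 - mu*R) = 1."""
    f = lambda R: -R * c - np.log(1 - mu * R)   # log E[exp(R(X - c))]
    return brentq(f, 1e-12, (1 - 1e-9) / mu)    # unique root in (0, 1/mu)

c, mu, u = 1.2, 1.0, 10.0   # premium, mean loss, initial capital (illustrative)
R = lundberg_coefficient(c, mu)
print(R, np.exp(-R * u))    # Lundberg upper bound on the ruin probability
```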
|
finance
|
4,290 |
Computational Dynamic Market Risk Measures in Discrete Time Setting
|
q-fin.RM
|
Different approaches to defining dynamic market risk measures are available
in the literature. Most are based on, or derived from, probability theory,
economic behavior or dynamic programming. Here, we propose an approach to
defining and implementing dynamic market risk measures based on recursion and
a state-economy representation. The proposed approach is designed to be
implementable and to inherit properties from static market risk measures.
|
finance
|
4,291 |
The Meaning of Probability of Default for Asset-backed Loans
|
q-fin.RM
|
The authors examine the concept of probability of default for asset-backed
loans. In contrast to unsecured loans it is shown that probability of default
can be defined as either a measure of the likelihood of the borrower failing to
make required payments, or as the likelihood of an insufficiency of collateral
value on foreclosure. Assuming expected loss is identical under either
definition, this implies a corresponding pair of definitions for loss given
default. Industry treatment of probability of default for asset-backed loans
appears to inconsistently blend the two types of definition.
The authors develop a mathematical treatment of asset-backed loans which
consistently applies each type of definition in a framework to produce the same
expected loss and allows translation between the two frameworks.
|
finance
|
4,292 |
Assessing Financial Model Risk
|
q-fin.RM
|
Model risk has a huge impact on any risk measurement procedure and its
quantification is therefore a crucial step. In this paper, we introduce three
quantitative measures of model risk when choosing a particular reference model
within a given class: the absolute measure of model risk, the relative measure
of model risk and the local measure of model risk. Each of the measures has a
specific purpose and so allows for flexibility. We illustrate the various
notions by studying some relevant examples, so as to emphasize the
practicability and tractability of our approach.
|
finance
|
4,293 |
Contraction or steady state? An analysis of credit risk management in Italy in the period 2008-2012
|
q-fin.RM
|
Credit risk management in Italy is characterized, in the period June 2008 to
June 2012, by frequent (frequency=0.5 cycles per year) and intense (peak
amplitude: mean=39.2 billion Euros, s.e.=2.83 billion Euros) quarterly
contractions and expansions around the mean (915.4 billion Euros, s.e.=3.59
billion Euros) of the nominal total credit used by non-financial corporations.
Such frequent and intense fluctuations are often ascribed to exogenous
Basel II procyclical effects on credit flow into the economy and, consequently,
Basel III output based point in time Credit to GDP countercyclical buffering
advocated. We have tested the opposite null hypotheses that such variation is
significantly correlated to actual default rates, and that such correlation is
explained by fluctuations of credit supply around a steady state. We have found
that, in the period June 2008 to June 2012 (n=17), linear regression of credit
growth rates on default rates reveals a negative correlation of r = -0.6903
with R squared = 0.4765, and that credit supply fluctuates steadily around the
default rate with an Internal Steady State Parameter SSP=0.00245 with chi
squared=37.47 (v=16, P<.005). We conclude that fluctuations of the total credit
used by non-financial corporations are exhaustively explained by variation of
the independent variable default rate, and that credit variation fluctuates
around a steady state. We conclude that credit risk management in Italy has
been effective in parameterizing credit supply variation to default rates
within the Basel II operating framework. Basel III prospective countercyclical
point in time output buffers based on filtered Credit to GDP ratios and dynamic
provisioning proposals should take into account this underlying steady state
statistical pattern.
|
finance
|
4,294 |
Efficient immunization strategies to prevent financial contagion
|
q-fin.RM
|
Many immunization strategies have been proposed to prevent infectious viruses
from spreading through a network. In this study, we propose efficient
immunization strategies to prevent a default contagion that might occur in a
financial network. An essential difference from the previous studies on
immunization strategy is that we take into account the possibility of serious
side effects. Uniform immunization refers to a situation in which banks are
"vaccinated" with a common low-risk asset. The riskiness of immunized banks
will decrease significantly, but the level of systemic risk may increase due to
the de-diversification effect. To overcome this side effect, we propose another
immunization strategy, counteractive immunization, which prevents pairs of
banks from failing simultaneously. We find that counteractive immunization can
efficiently reduce systemic risk without altering the riskiness of individual
banks.
|
finance
|
4,295 |
Network versus portfolio structure in financial systems
|
q-fin.RM
|
The question of how to stabilize financial systems has attracted considerable
attention since the global financial crisis of 2007-2009. Recently, Beale et
al. ("Individual versus systemic risk and the regulator's dilemma", Proc Natl
Acad Sci USA 108: 12647-12652, 2011) demonstrated that higher portfolio
diversity among banks would reduce systemic risk by decreasing the risk of
simultaneous defaults at the expense of a higher likelihood of individual
defaults. In practice, however, a bank default has an externality in that it
undermines other banks' balance sheets. This paper explores how each of these
different sources of risk, simultaneity risk and externality, contributes to
systemic risk. The results show that the allocation of external assets that
minimizes systemic risk varies with the topology of the financial network as
long as asset returns have negative correlations. In the model, a well-known
centrality measure, PageRank, reflects an appropriately defined "infectiveness"
of a bank. An important result is that the most infective bank need not always
be the safest bank. Under certain circumstances, the most infective node should
act as a firewall to prevent large-scale collective defaults. The introduction
of a counteractive portfolio structure will significantly reduce systemic risk.
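As a rough illustration of the centrality computation on a toy interbank network; the adjacency matrix and damping factor below are illustrative, and the paper's "infectiveness" is an appropriately defined variant of this score rather than plain PageRank.

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10):
    """PageRank by power iteration; adj[i, j] > 0 encodes an edge i -> j
    (bank i's failure impairs bank j's balance sheet)."""
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    P = np.divide(adj, out, out=np.zeros_like(adj), where=out > 0)
    P[out.ravel() == 0] = 1.0 / n            # dangling nodes spread uniformly
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - d) / n + d * P.T @ r
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

A = np.array([[0, 1, 1, 0],     # toy 4-bank exposure network (illustrative)
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(pagerank(A))
```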
|
finance
|
4,296 |
Measuring risk with multiple eligible assets
|
q-fin.RM
|
The risk of financial positions is measured by the minimum amount of capital
to raise and invest in eligible portfolios of traded assets in order to meet a
prescribed acceptability constraint. We investigate nondegeneracy, finiteness
and continuity properties of these risk measures with respect to multiple
eligible assets. Our finiteness and continuity results highlight the interplay
between the acceptance set and the class of eligible portfolios. We present a
simple, alternative approach to the dual representation of convex risk measures
by directly applying to the acceptance set the external characterization of
closed, convex sets. We prove that risk measures are nondegenerate if and only
if the pricing functional admits a positive extension which is a supporting
functional for the underlying acceptance set, and provide a characterization of
when such extensions exist. Finally, we discuss applications to set-valued risk
measures, superhedging with shortfall risk, and optimal risk sharing.
|
finance
|
4,297 |
Analytical models of operational risk and new results on the correlation problem
|
q-fin.RM
|
We propose a portfolio approach for operational risk quantification based on
a class of analytical models from which we derive new results on the
correlation problem. In particular, we show that uniform correlation is a
robust assumption for measuring capital charges in these models.
|
finance
|
4,298 |
Credit Risk and the Instability of the Financial System: an Ensemble Approach
|
q-fin.RM
|
The instability of the financial system as experienced in recent years and in
previous periods is often linked to credit defaults, i.e., to the failure of
obligors to make promised payments. Given the large number of credit contracts,
this problem is amenable to treatment with approaches developed in statistical
physics. We introduce the idea of ensemble averaging and thereby uncover
generic features of credit risk. We then show that the often advertised concept
of diversification, i.e., reducing the risk by distributing it, is deeply
flawed when it comes to credit risk. The risk of extreme losses remains due to
the ever-present correlations, implying a substantial and persistent intrinsic
danger to the financial system.
|
finance
|
4,299 |
Continuous compliance: a proxy-based monitoring framework
|
q-fin.RM
|
Within the Own Risk and Solvency Assessment framework, the Solvency II
directive introduces the need for insurance undertakings to have efficient
tools enabling the companies to assess the continuous compliance with
regulatory solvency requirements. Because of the great operational complexity
resulting from each complete evaluation of the Solvency Ratio, this monitoring
is often complicated to implement in practice. This issue is particularly
important for life insurance companies due to the high complexity of
projecting life insurance liabilities. It appears relevant in such a context
to use parametric tools, such as Curve Fitting and Least Squares Monte Carlo,
in order to estimate, on a regular basis, the impact on the economic own funds
and on the regulatory capital of the company of any change over time in its
underlying risk factors. In this article, we first outline the principles of
the continuous compliance requirement, then we propose and implement a
possible monitoring tool enabling one to approximate the eligible elements and
the regulatory capital over time. In a final section we compare the use of the
Curve Fitting and Least Squares Monte Carlo methodologies in a standard
empirical finite-sample framework, and offer advice for future users of such
proxies.
|
finance
|