Optimizing Optimization: The Next Generation of Optimization Applications and Theory (Quantitative Finance)

requirements for what they call a coherent risk measure. VaR and volatility are
known to violate the coherency assumptions. A popular risk measure that does
meet all requirements is expected shortfall.
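As a point of reference, both VaR and expected shortfall are straightforward to estimate from a sample of returns. The sketch below uses simulated Student-t returns and standard historical estimators; the data, confidence level, and sign convention are assumptions of the example rather than anything prescribed here.

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=10_000) * 0.01  # hypothetical daily returns

alpha = 0.95          # confidence level (assumed for illustration)
losses = -returns     # work with losses, so large positive values are bad

# Historical VaR: the loss level exceeded with probability 1 - alpha
var = np.quantile(losses, alpha)

# Expected shortfall: the average loss, conditional on exceeding VaR
es = losses[losses >= var].mean()

print(f"95% VaR: {var:.4f}   95% expected shortfall: {es:.4f}")
```

Unlike VaR, expected shortfall averages over the whole tail beyond the quantile, which is the feature that allows it to satisfy the coherence requirements mentioned above.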
Intuitively, the practical consequences of a risk measure's departure from these basic properties are likely to be exacerbated by the need to track risk over time. For example, it is widely accepted that sudden increases in market volatility tend to be driven by the downside, rather than the upside, of the return distribution. In fact, it is well known that surges in market volatility coincide with economic downturns.1 In a dynamic model, a symmetric measure like volatility may therefore overstate or understate risk in crucial phases of the business cycle.
Another example is provided by the analysis of comovements in asset returns when extreme events occur. The academic literature has often argued that asset returns are more strongly dependent in the lower tail of the distribution than a Gaussian model implicitly assumes.2 In other words, when assets experience heavy losses, the covariance matrix is of little help in determining the likely loss for the whole portfolio. The practical implications of these findings were felt by portfolio managers in July 2007, when traditional quantitative factors seemed to stop working all at once and markets suddenly seemed to offer no place to hide.
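To make the tail-dependence point concrete, the short simulation below compares the frequency of joint extreme losses under a bivariate Gaussian model and under a heavier-tailed Student-t alternative sharing the same correlation. All parameters are hypothetical; the sketch only illustrates how a Gaussian model understates the probability of assets crashing together.

```python
import numpy as np

rng = np.random.default_rng(1)
n, rho, q = 200_000, 0.5, 0.01          # sample size, correlation, tail quantile
cov = [[1.0, rho], [rho, 1.0]]

# Bivariate Gaussian returns with correlation rho
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Bivariate Student-t (3 dof) with the same correlation structure
chi = rng.chisquare(3, size=n) / 3
t = z / np.sqrt(chi)[:, None]

def joint_tail_ratio(x, q):
    """P(both series below their q-quantile) divided by q: roughly the chance
    that asset 2 crashes given that asset 1 has crashed."""
    qx, qy = np.quantile(x[:, 0], q), np.quantile(x[:, 1], q)
    return np.mean((x[:, 0] <= qx) & (x[:, 1] <= qy)) / q

print("Gaussian joint-tail ratio: ", joint_tail_ratio(z, q))
print("Student-t joint-tail ratio:", joint_tail_ratio(t, q))
```

The Student-t sample produces a markedly higher ratio, even though both models report the same correlation matrix.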
Countless academic papers have highlighted the limitations of the Markowitz approach. Nevertheless, a standard approach that can supersede mean–variance is yet to emerge. Madan (2006) proposed a non-Gaussian factor model in which the factor dynamics are tracked by independent component analysis and the portfolio is constructed by maximizing a utility function. Mencía and Sentana (2008) and Jondeau and Rockinger (2006) are recent examples of portfolio allocation with higher moments; this approach faces formidable challenges because the multivariate density is difficult to model. Rockafellar and Uryasev (2000) developed a method to optimize a portfolio's CVaR, and a similar approach, cast in a utility maximization framework, is taken by Bassett, Koenker, and Kordas (2004). Finally, Meucci (2006) adopted a Bayesian approach, extending the Black–Litterman model to a non-Gaussian context and deriving simulation-based methods to obtain the optimal allocation.
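For concreteness, the Rockafellar–Uryasev formulation minimizes, over the portfolio weights and a VaR-like threshold, the threshold plus the average scenario loss in excess of it. The sketch below expresses this as a convex program with cvxpy on simulated scenario returns; the data, constraints, and confidence level are illustrative assumptions, not the original authors' setup.

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(2)
S, n_assets = 5_000, 4
R = rng.standard_t(df=5, size=(S, n_assets)) * 0.01   # hypothetical scenario returns

beta = 0.95
w = cp.Variable(n_assets)        # portfolio weights
alpha = cp.Variable()            # auxiliary variable; approximately VaR at the optimum

scenario_losses = -R @ w
# CVaR objective: threshold plus expected excess loss beyond the threshold
cvar = alpha + cp.sum(cp.pos(scenario_losses - alpha)) / (S * (1 - beta))

problem = cp.Problem(cp.Minimize(cvar),
                     [cp.sum(w) == 1, w >= 0])         # fully invested, long-only
problem.solve()

print("weights:", np.round(w.value, 3))
print("minimized CVaR:", round(problem.value, 5))
```

Because the objective is piecewise linear in the weights and the threshold, the same problem can be recast as a linear program, which is what makes the approach attractive at scale.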
This chapter presents a dynamic model centered on a risk measure that has recently been advocated in the academic literature: expectile value at risk (EVaR). Intuitively, EVaR is closely related to VaR in that it can be interpreted as an optimal level of reserve capital. Both minimize an asymmetric cost function that strikes a balance between the expected opportunity cost of holding excessive reserves and the expected cost of losses that exceed the reserve capital. The advantages of measuring risk through EVaR are detailed in the next section. In short, EVaR satisfies all the basic properties of a coherent measure of risk, is mathematically tractable, and lends itself to a simple interpretation in a utility maximization framework.
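As a minimal illustration of this interpretation, the sketch below computes a sample expectile by iterating the first-order condition of the asymmetric quadratic cost: returns above the candidate reserve level are weighted by tau, returns below it by 1 - tau, and the reserve level is updated to the weighted mean. The simulated data, the choice of tau, and the sign convention (reporting EVaR as a positive reserve figure) are assumptions of the example.

```python
import numpy as np

def expectile(x, tau, tol=1e-10, max_iter=1_000):
    """Sample tau-expectile: the level m at which tau-weighted deviations
    above m exactly offset (1 - tau)-weighted deviations below m."""
    m = x.mean()
    for _ in range(max_iter):
        w = np.where(x > m, tau, 1.0 - tau)      # asymmetric weights
        m_new = np.sum(w * x) / np.sum(w)        # weighted-mean fixed point
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

rng = np.random.default_rng(3)
returns = rng.standard_t(df=4, size=50_000) * 0.01   # hypothetical return series

tau = 0.01                        # small tau penalizes shortfalls heavily
evar = -expectile(returns, tau)   # report as a positive reserve level
print(f"EVaR at tau={tau}: {evar:.4f}")
```

A small tau makes losses beyond the reserve level far more costly than idle capital, so the optimal reserve moves deep into the left tail, exactly the trade-off described above.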


1 See, among others, Schwert (1989) and Engle and Rangel (2008).
2 See Patton (2004) and references therein.
