Optimizing Optimization: The Next Generation of Optimization Applications and Theory (Quantitative Finance)




together” using copula functions. Owing to the sophistication this approach
requires, almost all CVaR models used in practice work with the unconditional
distribution. The year 2008 surely taught us that this is not a good idea.
Finally, users of CVaR should also be aware that there is no established lit-
erature on building multivariate nonnormal predictive distributions.15 This is a
major disadvantage relative to the well-developed literature on Bayesian meth-
ods combined with multivariate normal distributions, particularly since CVaR
is even more sensitive to estimation error than variance.


12.6 Axiomatic difficulties: Who has CVaR preferences anyway?


Let us first recall the definition of CVaR. For a confidence level of, for exam-
ple, 95%, we simply average the 5% worst-case returns for a given portfolio
and scenario matrix to arrive at CVaR. However, averaging worst-case returns
(i.e., giving them equal weights) essentially assumes that an investor is risk neu-
tral in the tail of the distribution of future wealth. The fact that CVaR attaches
equal weight to extreme losses is inconsistent with the most basic economic
axiom from our very first (micro)economics class: investors prefer more to
less at a decreasing rate. As a corollary, they certainly do not place the same
weight (disutility) on a large loss as on total ruin. Although CVaR is a coherent
risk measure (it ticks all the boxes on statistical axioms), it fails this well-
accepted economic axiom. We could, of course (and some have done so16),
introduce utility functions that are linear below a particular threshold value so
that CVaR technically conforms to expected utility maximization.
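The tail-averaging definition above can be made concrete in a few lines. This is a minimal sketch, not the book's implementation; the function name `cvar` and the simulated normal scenarios are illustrative assumptions standing in for a real scenario matrix:

```python
import numpy as np

def cvar(returns, alpha=0.95):
    """Average of the (1 - alpha) worst-case returns, reported as a loss."""
    losses = np.sort(-np.asarray(returns))  # sign-flip so positive = loss
    var = np.quantile(losses, alpha)        # VaR at confidence level alpha
    tail = losses[losses >= var]            # the worst (1 - alpha) scenarios
    return float(tail.mean())               # equal weight on each tail loss

# Simulated daily-return scenarios (a stand-in for one portfolio's column
# of a scenario matrix).
rng = np.random.default_rng(0)
scenarios = rng.normal(0.0005, 0.01, size=10_000)
print(f"95% CVaR: {cvar(scenarios, 0.95):.4f}")
```

The equal weighting in `tail.mean()` is exactly the point criticized above: every tail scenario, from a moderate loss to near-ruin, receives the same weight.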
Where do we go from here? Are we stuck in a dead end? Recognizing the
shortcomings of VaR and CVaR, Acerbi (2004) introduced the so-called spec-
tral risk measures as the newest addition to the risk manager’s toolbox.
Spectral risk measures attach different weights to the quantiles of a distribu-
tion. They are coherent risk measures as long as the weight each quantile
receives is a nondecreasing function of the quantile. In other words, the 96%
quantile must receive at least the same weight as the 95% quantile. This is in
stark contrast to VaR, where, for example, the 95% quantile is assigned a
weight of 1, whereas all other quantiles get a weight of 0. CVaR, by contrast,
is coherent: it attaches equal (and hence nondecreasing) weight to all quantiles
above VaR.
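A short sketch can make the weighting schemes comparable. Assuming sorted scenario losses as a discrete stand-in for the quantiles, the function `spectral_risk` below (an illustrative name, not from the source) computes a weighted average of ordered losses; the CVaR weights and the exponential spectrum are both nondecreasing, hence coherent:

```python
import numpy as np

def spectral_risk(returns, weights):
    """Weighted average of sorted losses; coherent iff weights are
    nondecreasing from best to worst outcome."""
    losses = np.sort(-np.asarray(returns))   # ascending: best to worst loss
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                          # normalize to sum to one
    return float(losses @ w)

n = 10_000
rng = np.random.default_rng(1)
r = rng.normal(0.0005, 0.01, size=n)

# CVaR at 95%: zero weight below the 95th percentile, equal weight above.
cvar_w = np.where(np.arange(n) >= int(0.95 * n), 1.0, 0.0)

# Exponential spectrum: weight rises smoothly toward the worst quantiles,
# reflecting increasing risk aversion in the tail (k is an aversion parameter).
k = 20.0
u = (np.arange(n) + 0.5) / n
exp_w = k * np.exp(k * (u - 1.0)) / (1.0 - np.exp(-k))

print(spectral_risk(r, cvar_w), spectral_risk(r, exp_w))
```

With `cvar_w` this reproduces CVaR exactly; the exponential spectrum instead weights a near-ruin loss more than a moderate tail loss, which is the behavior the equal-weight criticism above calls for.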
It still has the unpleasant property that investors evaluate losses larger than


15 With the exception of multivariate mixtures of normal distributions, which are difficult to esti-
mate and even more demanding to put informative priors on (we need priors for two covariance
matrices and two return vectors).


16 Kahneman and Tversky (1979) and their Nobel prize-winning work focused on deriving utility
functions from experiments. This is somewhat odd, as the scientist’s role is to provide normative
advice and guide individuals to better decision making rather than “cultivating” their biases.
