Handbook of Corporate Finance Empirical Corporate Finance Volume 1


Ch. 2: Self-Selection Models in Corporate Finance



  6. Bayesian self-selection models


Thus far, our discussion has covered inference via classical statistical methods. An alternative approach to estimating selection models involves Bayesian methods. These techniques often represent an elegant way of handling selection models that are computationally too burdensome to be practical for classical methods. We review the Bayesian approach briefly and illustrate its potential value by discussing a class of selection models based on Markov Chain Monte Carlo (MCMC) simulations (see Poirier (1995) for a more in-depth comparison between Bayesian and classical statistical inference).


6.1. Bayesian methods


The Bayesian approach begins by specifying a prior distribution over the parameters that must be estimated. The prior reflects the information known to the researcher without reference to the dataset on which the model is estimated. In a time-series context, a prior can be formed by looking at out-of-sample historical data. In most empirical corporate finance applications, which are cross-sectional in nature, researchers tend to be agnostic and use non-informative diffuse priors.
Denote the parameters to be estimated by θ and the prior beliefs about these parameters by the density p(θ). If the observed sample is y, the posterior density of θ given the sample can be written as

p(θ|y) = p(y|θ) p(θ) / p(y),   (27)

where p(y|θ) denotes the likelihood function of the econometric model being estimated. Given the prior and the econometric model, equation (27) employs Bayes' rule to generate the posterior distribution p(θ|y) about the parameter θ. The posterior density p(θ|y) summarizes what one learns about θ after seeing the data. It is the central object of interest that Bayesian approaches wish to estimate.
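As a concrete illustration of equation (27), consider the textbook conjugate case in which the posterior is available in closed form: a normal likelihood with known variance and a normal prior on the unknown mean. The specific prior values and the simulated sample below are hypothetical choices for illustration only; with a nearly diffuse prior, the posterior mean essentially reproduces the sample mean, mirroring the agnostic-prior practice described above. A minimal sketch in Python:

```python
import numpy as np

# Conjugate normal illustration of equation (27): posterior ∝ likelihood × prior.
# Hypothetical setup: theta is an unknown mean with prior N(mu0, tau0^2),
# and each observation y_i ~ N(theta, sigma^2) with sigma known.
mu0, tau0 = 0.0, 10.0      # large tau0 makes the prior nearly flat (diffuse)
sigma = 1.0                # known sampling standard deviation

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=sigma, size=50)   # simulated sample

# Closed-form posterior for the normal-normal pair: precisions add, and the
# posterior mean is a precision-weighted average of prior mean and data.
n = len(y)
post_prec = 1.0 / tau0**2 + n / sigma**2
post_var = 1.0 / post_prec
post_mean = post_var * (mu0 / tau0**2 + y.sum() / sigma**2)

print(post_mean, post_var)  # with a diffuse prior, post_mean is close to y.mean()
```

Outside such conjugate pairs the denominator p(y) has no closed form, which is exactly the computational difficulty discussed next.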
A key difficulty in implementing Bayesian methods is the computation of the posterior. Except for a limited class of priors and models, posteriors do not have closed-form analytic expressions, which poses computational difficulties in implementing Bayesian models. However, recent advances in computational technology and, more importantly, the advent of the Gibbs sampler and the Metropolis–Hastings algorithm, which are specific implementations of MCMC methods, simplify the implementation of fairly complex Bayesian models. In some cases, they even provide a viable route for model estimation where classical methods prove to be computationally intractable. Chib and Greenberg (1996) and Koop (2003) provide more detailed discussions of these issues.
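To make the Gibbs sampler concrete, the sketch below draws from the posterior of a normal model with unknown mean and precision by alternating between the two full conditional distributions, each of which is a standard density. The data, priors, chain length, and burn-in below are hypothetical choices, not a prescription from this chapter:

```python
import numpy as np

# Minimal Gibbs sampler sketch for y_i ~ N(mu, 1/tau) with unknown mean mu and
# precision tau. Hypothetical priors: mu ~ N(m0, 1/p0) and tau ~ Gamma(a0, b0)
# (rate parameterization), so both full conditionals are available exactly.
rng = np.random.default_rng(1)
y = rng.normal(loc=3.0, scale=2.0, size=200)
n, ybar = len(y), y.mean()

m0, p0 = 0.0, 1e-4          # nearly diffuse prior on mu
a0, b0 = 0.01, 0.01         # vague prior on tau

mu, tau = 0.0, 1.0          # arbitrary starting values
draws = []
for it in range(5000):
    # mu | tau, y ~ Normal: precision-weighted average of prior and data
    prec = p0 + n * tau
    mean = (p0 * m0 + tau * n * ybar) / prec
    mu = rng.normal(mean, np.sqrt(1.0 / prec))
    # tau | mu, y ~ Gamma: shape grows with n, rate with the sum of squares
    a = a0 + n / 2.0
    b = b0 + 0.5 * np.sum((y - mu) ** 2)
    tau = rng.gamma(shape=a, scale=1.0 / b)  # numpy uses shape/scale, not rate
    if it >= 1000:           # discard burn-in draws before the chain settles
        draws.append((mu, tau))

mus = np.array([d[0] for d in draws])
print(mus.mean())            # posterior mean of mu, close to the sample mean
```

The same alternating-conditional logic extends to selection models, where latent selection variables are drawn alongside the parameters; when a full conditional is not a standard density, a Metropolis–Hastings step replaces the exact draw.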


R^2 (see, e.g., Campa and Kedia, 2002, and Villalonga, 2004, for interesting illustrations of this point). The high R^2 should not be misattributed to the explanatory power of the included variables, because it often arises due to the (ultimately unexplained) fixed effects.
