

\pi_{failure}^{t_2} = \pi_{failure}^{t_1}\, e^{-r(t_2 - t_1)} \frac{S_{t_2}}{S_{t_1}} \qquad (11.7)


Taking logarithms on both sides, we now have


\log \pi_{failure}^{t_2} - \log \pi_{failure}^{t_1} = -r(t_2 - t_1) + \log S_{t_2} - \log S_{t_1} \qquad (11.8)


Let us see how this equation may be interpreted. First, when t_2 − t_1 is a small quantity, the above becomes a difference equation. The left-hand side is the difference in the logarithms of the probabilities. The negative logarithm of a probability can be interpreted as a measure of information content. Therefore, the left-hand side may be interpreted as the change in information content between the times t_1 and t_2. The right-hand side has a term consisting of the difference in the logarithms of the spreads. This is, in fact, the unrealized profit per target share. The other term on the right-hand side represents the risk-free return. Therefore, the equation may be interpreted as saying that any return in excess of the risk-free rate is equal to the change in the information content, that is, the reduction in the uncertainty of deal break.
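The relationship between Equations 11.7 and 11.8 can be verified numerically. The figures below are purely illustrative (they do not come from the text): a spread narrowing from $2.00 to $1.50 over one week, with a 5 percent risk-free rate.

```python
import math

# Illustrative values (not from the text).
r = 0.05                 # risk-free rate, per year
dt = 7 / 365             # t2 - t1, one week in years
pi_t1 = 0.30             # deal-break probability at t1
S_t1, S_t2 = 2.00, 1.50  # spreads at t1 and t2

# Equation 11.7: propagate the probability forward.
pi_t2 = pi_t1 * math.exp(-r * dt) * (S_t2 / S_t1)

# Equation 11.8: the same relation in log form.
lhs = math.log(pi_t2) - math.log(pi_t1)
rhs = -r * dt + math.log(S_t2) - math.log(S_t1)

# The two sides agree; the narrowing spread lowers the
# implied deal-break probability.
print(pi_t2, lhs, rhs)
```

As the spread narrows (S_{t_2} < S_{t_1}), the implied break probability falls, and the drop in −log π is the reduction in uncertainty described above.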
Once we have the initial probability value, that is, \pi_{failure}^{0}, all the probabilities in the interval [0, T] can be evaluated using the difference Equation 11.8. We have thus eliminated the requirement to make a guess at the spread value in case of deal break. This is replaced with the assessment of the initial deal-break probability. As a practical matter, the risk-free rate values in Equations 11.7 and 11.8 are rather negligible. We can therefore set r = 0 in our calculations.
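The procedure just described can be sketched in a few lines. This is a minimal sketch, not the book's implementation: the function name and the spread series are hypothetical, and r = 0 is assumed as suggested above, so the recursion telescopes to \pi_t = \pi_0 S_t / S_0.

```python
import math

def implied_failure_probabilities(spreads, pi0):
    """Propagate the initial deal-break probability pi0 through a
    series of observed spreads via Equation 11.8 with r = 0:
        log pi[t] - log pi[t-1] = log S[t] - log S[t-1]
    which telescopes to pi[t] = pi0 * S[t] / S[0].
    """
    probs = [pi0]
    for t in range(1, len(spreads)):
        log_pi = (math.log(probs[-1])
                  + math.log(spreads[t]) - math.log(spreads[t - 1]))
        probs.append(math.exp(log_pi))
    return probs

# Hypothetical narrowing spread series for a pending deal,
# with an assessed initial break probability of 30 percent.
spreads = [2.00, 1.80, 1.50, 1.00, 0.40]
probs = implied_failure_probabilities(spreads, pi0=0.30)
```

Only the single input \pi_{failure}^{0} is needed; every later probability follows from the observed spreads alone.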



The Market Implied Merger Probability 179


LOGARITHMS AND INFORMATION THEORY


The interpretation of the negative logarithm of probabilities as the information content was originally proposed by Claude E. Shannon in his groundbreaking article "A Mathematical Theory of Communication," published in 1948.^2 Shannon was then working at Bell Laboratories. The powerful ideas describing the ways to measure rates of information flow very soon became a discipline in its own right, called information theory. As a matter of fact, the word bit (as in bits per second) that is so commonly used today is attributed to Shannon.

^2 "A Mathematical Theory of Communication," Bell System Technical Journal 27 (July and October 1948): 379–423, 623–656.
