Advances in Risk Management

OLHA BODNAR 261

$t_{ij}\sim N(0,1)$, $i=1,\dots,q$, $j=1,\dots,i$, and the $t_{ii}$ and $t_{ij}$ are mutually independently distributed.
Hence, the expectation of the second term is equal to

$$E\left(\sqrt{n-1}\,\tilde H_{22}^{(-)}\,\hat{\tilde b}^{-\frac{1}{2}}(\hat{\tilde w}_{M;q}-w_{M;q})\right)=\sqrt{n-1}\,\tilde H_{22}^{(-)}\,\tilde b^{-\frac{1}{2}}\begin{pmatrix}E(t_{11})&&\\&\ddots&\\&&E(t_{qq})\end{pmatrix}(\tilde w_{M;q}-w_{M;q})$$

Denote $A=\operatorname{diag}(a_{11},\dots,a_{qq})$ with

$$a_{ii}=\frac{B\left(\frac{n-p-q-i+2}{2},\,\frac{n-p-1}{2}\right)}{B\left(\frac{n-p-q-i+1}{2},\,\frac{n-p}{2}\right)},\qquad i=1,\dots,q.$$

The first part of the theorem is proved.
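As a quick numerical sketch, the Beta-function ratio $a_{ii}$ can be evaluated with Python's standard library in log-space for stability; the sample sizes $n$, $p$, $q$ below are illustrative values chosen here, not taken from the text:

```python
import math

def log_beta(a, b):
    # log B(a, b) = log Gamma(a) + log Gamma(b) - log Gamma(a + b)
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def a_ii(n, p, q, i):
    # Ratio of Beta functions from the theorem:
    # B((n-p-q-i+2)/2, (n-p-1)/2) / B((n-p-q-i+1)/2, (n-p)/2)
    num = log_beta((n - p - q - i + 2) / 2, (n - p - 1) / 2)
    den = log_beta((n - p - q - i + 1) / 2, (n - p) / 2)
    return math.exp(num - den)

# Hypothetical sample sizes, for demonstration only
n, p, q = 120, 5, 4
diag_A = [a_ii(n, p, q, i) for i in range(1, q + 1)]
print(diag_A)
```

For moderate $n$ each $a_{ii}$ lies just below one and decreases in $i$, since the first Beta argument shrinks with $i$.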


(b) Here we denote $c=\sqrt{n-1}\,\tilde H_{22}^{(-)}\,\hat{\tilde b}^{-\frac{1}{2}}(\hat{\tilde w}_{M;q}-w_{M;q})$. Then it holds that:

$$E(\hat{\tilde v}\hat{\tilde v}')=E\left(\frac{(n-p)\,\hat{\tilde H}_{22}^{(-)}}{(n-1)\,\tilde H_{22}^{(-)}}\right)E(cc')=\frac{n-p}{2}\,\frac{\Gamma^{2}\left(\frac{n-p-1}{2}\right)}{\Gamma^{2}\left(\frac{n-p}{2}\right)}\left(\operatorname{Var}(c)+E(c)E(c)'\right)$$
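The scalar factor in front of $\operatorname{Var}(c)+E(c)E(c)'$ can be read as a squared inverse-chi moment: for $X\sim\chi^2_\nu$ one has $E(X^{-1/2})=\Gamma((\nu-1)/2)/(\sqrt2\,\Gamma(\nu/2))$, so the factor equals $(n-p)\,[E(X^{-1/2})]^2$ with $\nu=n-p$. A Monte Carlo sketch of this reading, with an arbitrary illustrative $\nu$:

```python
import math
import random

def chi2_factor(nu):
    # (nu/2) * Gamma((nu-1)/2)^2 / Gamma(nu/2)^2, the scalar factor with nu = n - p
    return (nu / 2) * math.exp(2 * (math.lgamma((nu - 1) / 2) - math.lgamma(nu / 2)))

random.seed(42)
nu = 10           # hypothetical value of n - p, for demonstration only
n_samples = 200_000
# Draw X ~ chi^2_nu via the Gamma(nu/2, scale=2) representation
mean_inv_sqrt = sum(random.gammavariate(nu / 2, 2.0) ** -0.5
                    for _ in range(n_samples)) / n_samples
mc_factor = nu * mean_inv_sqrt ** 2   # (n-p) * [E(X^{-1/2})]^2, estimated
print(chi2_factor(nu), mc_factor)
```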

To evaluate the covariance matrix of the random vector $c$ we apply the following formula of conditional variance:

$$\operatorname{Var}(c)=E\left(\operatorname{Var}(c\,|\,(n-1)^{-1}\hat{\tilde b})\right)+\operatorname{Var}\left(E(c\,|\,(n-1)^{-1}\hat{\tilde b})\right)\qquad(13.21)$$

From equation (13.20) it follows that $\operatorname{Var}(c\,|\,(n-1)^{-1}\hat{\tilde b})=I$ and, respectively,
$$E\left(\operatorname{Var}(c\,|\,(n-1)^{-1}\hat{\tilde b})\right)=I\qquad(13.22)$$
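Equation (13.21) is the standard law of total variance. A self-contained numeric illustration on a toy scalar model (hypothetical distributions, unrelated to the estimators above) shows the two decompositions agree:

```python
import random
import statistics

random.seed(0)
N = 100_000
# Toy model: B ~ Exp(1); given B, c ~ N(2B, 1).
# Then Var(c) = E(Var(c|B)) + Var(E(c|B)) = 1 + 4*Var(B) = 1 + 4 = 5.
bs = [random.expovariate(1.0) for _ in range(N)]
cs = [random.gauss(2 * b, 1.0) for b in bs]
total_var = statistics.pvariance(cs)
e_cond_var = 1.0                                   # Var(c|B) = 1 for every B
var_cond_mean = statistics.pvariance([2 * b for b in bs])
print(total_var, e_cond_var + var_cond_mean)
```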

Let us now consider the second term in equation (13.21). It follows from the proof of part (a)
that


$$E(c\,|\,(n-1)^{-1}\hat{\tilde b})=\sqrt{n-1}\,\tilde H_{22}^{(-)}\,\tilde b^{-\frac{1}{2}}\,T\,(\tilde w_{M;q}-w_{M;q})$$

where $T$ is a lower triangular random matrix.
Let us denote $W=(\tilde w_{M;q}-w_{M;q})(\tilde w_{M;q}-w_{M;q})'=(w_{ij})_{i,j=1,\dots,q}$. Then


$$\operatorname{Var}\left(E(c\,|\,(n-1)^{-1}\hat{\tilde b})\right)=\tilde H_{22}^{(-)}\,\tilde b^{-\frac{1}{2}}E(TWT')\,\tilde b^{-\frac{1}{2}\prime}-\tilde H_{22}^{(-)}\,\tilde b^{-\frac{1}{2}}E(T)\,W\,E(T')\,\tilde b^{-\frac{1}{2}\prime}=\tilde H_{22}^{(-)}\,\tilde b^{-\frac{1}{2}}(G-F)\,\tilde b^{-\frac{1}{2}\prime}$$
where $G=(g_{ij})_{i,j=1,\dots,q}$ and $F=(f_{ij})_{i,j=1,\dots,q}$ with

$$g_{ij}=E\left(\sum_{l=1}^{i}\sum_{k=1}^{j}t_{il}w_{lk}t_{jk}\right)=w_{ij}E(t_{ii}t_{jj})=\begin{cases}2w_{ii}\,\dfrac{\Gamma\left(\frac{n-p+q-i+1}{2}+1\right)}{\Gamma\left(\frac{n-p+q-i+1}{2}\right)}, & \text{if } i=j,\\[3ex]2w_{ij}\,\dfrac{\Gamma\left(\frac{n-p+q-i+2}{2}\right)}{\Gamma\left(\frac{n-p+q-i+1}{2}\right)}\,\dfrac{\Gamma\left(\frac{n-p+q-j+2}{2}\right)}{\Gamma\left(\frac{n-p+q-j+1}{2}\right)}, & \text{otherwise.}\end{cases}$$
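In the $i=j$ case the Gamma ratio collapses via the recurrence $\Gamma(x+1)=x\Gamma(x)$: with $\nu=n-p+q-i+1$ one gets $2\,\Gamma(\nu/2+1)/\Gamma(\nu/2)=\nu$, so the diagonal entry is simply $g_{ii}=w_{ii}\,(n-p+q-i+1)$. A one-line numerical check of this identity:

```python
import math

def gamma_ratio(nu):
    # 2 * Gamma(nu/2 + 1) / Gamma(nu/2), the i = j factor with nu = n - p + q - i + 1
    return 2 * math.exp(math.lgamma(nu / 2 + 1) - math.lgamma(nu / 2))

# Gamma(x+1) = x * Gamma(x) implies the ratio equals nu exactly
for nu in (5, 17, 111):
    print(nu, gamma_ratio(nu))
```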
