$$
E[u] = \min_i E\big[u(a_i)\big] = \min_i \sum_j P'(\theta_j \mid a_i)\, u(a_i, \theta_j)
$$
$$
= \min\big\{\, P'(\theta_1 \mid a_0)\cdot 0 + P'(\theta_2 \mid a_0)\cdot 100000 ,\;\; P'(\theta_1 \mid a_1)\cdot 1000 + P'(\theta_2 \mid a_1)\cdot 101000 \,\big\}
$$
$$
= \min\big\{\, (1 - 1.15\cdot 10^{-2})\cdot 0 + 1.15\cdot 10^{-2}\cdot 100000 ,\;\; (1 - 1.33\cdot 10^{-4})\cdot 1000 + 1.33\cdot 10^{-4}\cdot 101000 \,\big\} = \min\{1150,\, 1013\} = 1013
$$
 

   


     








The decision tree is illustrated in Figure 12.6 together with the utilities:


Figure 12.6: Simple decision problem with assigned prior probabilities and utility (costs).


The expected costs are shown in Figure 12.6 in boxes. It is seen that the action alternative $a_1$ yields the largest expected utility (smallest cost) and therefore this action alternative is the optimal decision.
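To make the computation concrete, the following Python sketch reproduces the prior decision analysis. The action labels a0 and a1, the helper name expected_cost and the reading of the adverse event as "failure" are illustrative assumptions; the probabilities ($1.15\cdot 10^{-2}$, $1.33\cdot 10^{-4}$) and the costs (0 $, 1000 $, 100 000 $) are those of the example above.

```python
# Minimal sketch of the prior decision analysis (illustrative names and layout).
# Probabilities and costs are taken from the worked example; reading the adverse
# event as "failure" is an assumption.

P_FAILURE = {"a0": 1.15e-2, "a1": 1.33e-4}   # probability of the adverse event per action
ACTION_COST = {"a0": 0.0, "a1": 1000.0}      # direct cost of the action itself
FAILURE_COST = 100000.0                      # additional cost if the adverse event occurs


def expected_cost(action: str) -> float:
    """Expected cost = sum over the two branches of probability times consequence."""
    p = P_FAILURE[action]
    return (1.0 - p) * ACTION_COST[action] + p * (ACTION_COST[action] + FAILURE_COST)


if __name__ == "__main__":
    costs = {a: expected_cost(a) for a in P_FAILURE}
    optimal = min(costs, key=costs.get)
    print(costs)                       # approximately {'a0': 1150.0, 'a1': 1013.3}
    print("optimal action:", optimal)  # a1, the smallest expected cost
```

Rounding the second value gives the 1013 $ shown at the corresponding branch of the decision tree.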

Decision analysis with new information


When additional information becomes available, the probability model underlying the decision problem may be updated. Having updated the probability structure, the subsequent decision analysis is unchanged in comparison to the situation with given prior information.


Given an observation, or the result of an experiment, $\hat{x}$, the updated probability structure (or simply the posterior probability) is denoted $P''(\theta_i \mid \hat{x})$ and may be evaluated by use of Bayes's rule, see e.g. Lindley (1976).









$$
P''(\theta_i \mid \hat{x}) = \frac{P(\hat{x} \mid \theta_i)\, P'(\theta_i)}{\sum_j P(\hat{x} \mid \theta_j)\, P'(\theta_j)}
\qquad (12.6)
$$


which may be explained as:


Posterior probability of $\theta_i$ with given sample outcome $=$ Normalising constant $\times$ Sample likelihood given $\theta_i$ $\times$ Prior probability of $\theta_i$


The normalising factor ensures that $P''(\theta_i)$ forms a proper probability. The mixing of new and prior information appears through the sample likelihood and the prior probabilities.
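The updating step in Equation (12.6), and the reuse of the otherwise unchanged decision analysis, can be sketched in the same way. The likelihood values below are hypothetical and only illustrate the mechanics; the state labels and the function name posterior are likewise illustrative.

```python
# Sketch of Bayesian updating following Equation (12.6); the likelihood values
# are hypothetical and serve only to illustrate the mechanics of the update.

def posterior(prior: dict, likelihood: dict) -> dict:
    """P''(theta_i | x_hat) = P(x_hat | theta_i) P'(theta_i) / sum_j P(x_hat | theta_j) P'(theta_j)."""
    unnormalised = {th: likelihood[th] * prior[th] for th in prior}
    c = sum(unnormalised.values())        # normalising constant
    return {th: v / c for th, v in unnormalised.items()}


if __name__ == "__main__":
    # Prior probabilities of the two states under action a0 (numbers from the example above).
    prior_a0 = {"adverse": 1.15e-2, "benign": 1.0 - 1.15e-2}
    # Hypothetical sample likelihoods P(x_hat | theta_i) for some indicative observation x_hat.
    likelihood = {"adverse": 0.8, "benign": 0.1}

    post = posterior(prior_a0, likelihood)
    print(post)  # posterior probabilities; they sum to one by construction

    # The decision analysis itself is unchanged: the posterior probabilities simply
    # replace the prior ones when the expected cost of a0 is recomputed.
    print(post["benign"] * 0.0 + post["adverse"] * 100000.0)
```

With these hypothetical numbers the observation is unfavourable, so the expected cost of $a_0$ rises sharply; this is how new information can change which action alternative is optimal.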


[Figure 12.6, decision tree: Decision / Event / Consequence columns; alternative $a_0$ leads to the consequences 0 $ and 100 000 $ with expected cost 1150 $ shown in a box, alternative $a_1$ leads to the consequences 1000 $ and 101 000 $ with expected cost 1013 $ shown in a box.]