

outlook; the right gives a probability for each value of temperature. For example, the first number (0.143) is the probability of temperature taking on the value hot, given that play and outlook have values yes and sunny, respectively.

How are the tables used to predict the probability of each class value for a given instance? This turns out to be very easy, because we are assuming that there are no missing values. The instance specifies a value for each attribute. For each node in the network, look up the probability of the node's attribute value based on the row determined by its parents' attribute values. Then just multiply all these probabilities together.
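This lookup-and-multiply step is easy to sketch in code. The listing below is only an illustration, not Weka's implementation: it hard-codes the conditional probability tables of Figure 6.20 (where play is the sole parent of every attribute node), scores one made-up instance, and normalizes the two products so they sum to one, a closing step assumed here for readability. The class name SimpleBayesNetScorer and the example instance are invented for the illustration.

import java.util.Map;

public class SimpleBayesNetScorer {

    // P(play): index 0 = yes, index 1 = no
    static final double[] PLAY = {0.633, 0.367};

    // Each child table maps an attribute value to {P(value | play=yes), P(value | play=no)},
    // using the numbers from Figure 6.20.
    static final Map<String, double[]> OUTLOOK = Map.of(
            "sunny",    new double[]{0.238, 0.538},
            "overcast", new double[]{0.429, 0.077},
            "rainy",    new double[]{0.333, 0.385});
    static final Map<String, double[]> TEMPERATURE = Map.of(
            "hot",  new double[]{0.238, 0.385},
            "mild", new double[]{0.429, 0.385},
            "cool", new double[]{0.333, 0.231});
    static final Map<String, double[]> WINDY = Map.of(
            "false", new double[]{0.650, 0.417},
            "true",  new double[]{0.350, 0.583});
    static final Map<String, double[]> HUMIDITY = Map.of(
            "high",   new double[]{0.350, 0.750},
            "normal", new double[]{0.650, 0.250});

    // For each class value, multiply one probability per node, each read off the
    // row (here, the column index) selected by the parent's value, then normalize.
    static double[] score(String outlook, String temp, String windy, String humidity) {
        double[] p = new double[2];
        for (int c = 0; c < 2; c++) {            // c = 0 for play=yes, c = 1 for play=no
            p[c] = PLAY[c]
                 * OUTLOOK.get(outlook)[c]
                 * TEMPERATURE.get(temp)[c]
                 * WINDY.get(windy)[c]
                 * HUMIDITY.get(humidity)[c];
        }
        double sum = p[0] + p[1];                // normalize so the class probabilities sum to 1
        p[0] /= sum;
        p[1] /= sum;
        return p;
    }

    public static void main(String[] args) {
        // Hypothetical instance: outlook=rainy, temperature=cool, windy=false, humidity=high.
        double[] p = score("rainy", "cool", "false", "high");
        System.out.printf("P(play=yes) = %.3f, P(play=no) = %.3f%n", p[0], p[1]);
    }
}

For the richer network of Figure 6.21, the only change to this sketch would be that each lookup indexes the table row by the values of all of the node's parents, not just play.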


play
              yes      no
             .633     .367

outlook             play = yes   play = no
  sunny                .238         .538
  overcast             .429         .077
  rainy                .333         .385

temperature         play = yes   play = no
  hot                  .238         .385
  mild                 .429         .385
  cool                 .333         .231

windy               play = yes   play = no
  false                .650         .417
  true                 .350         .583

humidity            play = yes   play = no
  high                 .350         .750
  normal               .650         .250

Figure 6.20 A simple Bayesian network for the weather data. The play node is the parent of each of the four attribute nodes, and each table gives the conditional probability of an attribute value given play = yes and play = no.
