the confidence threshold for pruning (default 0.25), and the minimum number of instances permissible at a leaf (default 2). Instead of standard C4.5 pruning you can choose reduced-error pruning (Section 6.2). The numFolds parameter (default 3) determines the size of the pruning set: the data is divided equally into that number of parts and the last one is used for pruning. When visualizing the tree (pages 377–378) it is nice to be able to consult the original data points, which you can do if saveInstanceData has been turned on (it is off, or False, by default to reduce memory requirements). You can suppress subtree raising, yielding a more efficient algorithm; force the algorithm to use the unpruned tree instead of the pruned one; or use Laplace smoothing for predicted probabilities (Section 4.2).
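
For readers driving Weka from Java rather than from the Explorer, here is a minimal sketch of setting these same J4.8 parameters through the weka.classifiers.trees.J48 class. It assumes weka.jar is on the classpath; the file name weather.arff and the class name J48ParameterDemo are placeholders, not part of Weka.

    import java.io.BufferedReader;
    import java.io.FileReader;

    import weka.classifiers.trees.J48;
    import weka.core.Instances;

    public class J48ParameterDemo {
        public static void main(String[] args) throws Exception {
            // Load an ARFF dataset; assume the last attribute is the class.
            Instances data =
                new Instances(new BufferedReader(new FileReader("weather.arff")));
            data.setClassIndex(data.numAttributes() - 1);

            J48 tree = new J48();
            tree.setConfidenceFactor(0.25f);   // C4.5 pruning threshold (default 0.25;
                                               // ignored once reduced-error pruning is on)
            tree.setMinNumObj(2);              // minimum instances at a leaf (default 2)
            tree.setReducedErrorPruning(true); // reduced-error pruning instead of C4.5 pruning
            tree.setNumFolds(3);               // split into 3 parts; last one is the pruning set
            tree.setSaveInstanceData(false);   // leave off to reduce memory requirements
            tree.setSubtreeRaising(false);     // suppress subtree raising for efficiency
            tree.setUseLaplace(true);          // Laplace smoothing for predicted probabilities

            tree.buildClassifier(data);
            System.out.println(tree);          // print the resulting tree in text form
        }
    }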
Table 10.5 shows many other decision tree methods. Id3 is the basic algorithm explained in Chapter 4. DecisionStump, designed for use with the boosting methods described later, builds one-level binary decision trees for datasets with a categorical or numeric class, dealing with missing values by treating them as a separate value and extending a third branch from the stump. RandomTree builds a tree that chooses a test based on a given number of random features at each node, performing no pruning. RandomForest constructs random forests by bagging ensembles of random trees (Section 7.5, pages 320–321).
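
As a rough illustration, the following sketch grows a single random tree and then a bagged forest of such trees through the Weka API. The setter names (setKValue, setNumTrees, setNumFeatures) are those of the Weka releases this edition describes; later versions rename some of them. The dataset file and the class name RandomForestDemo are placeholders, as before.

    import java.io.BufferedReader;
    import java.io.FileReader;

    import weka.classifiers.trees.RandomForest;
    import weka.classifiers.trees.RandomTree;
    import weka.core.Instances;

    public class RandomForestDemo {
        public static void main(String[] args) throws Exception {
            Instances data =
                new Instances(new BufferedReader(new FileReader("weather.arff")));
            data.setClassIndex(data.numAttributes() - 1);

            // One unpruned tree, testing K randomly chosen features at each node.
            RandomTree randomTree = new RandomTree();
            randomTree.setKValue(3);    // number of random features per node
            randomTree.buildClassifier(data);

            // A bagged ensemble of such trees.
            RandomForest forest = new RandomForest();
            forest.setNumTrees(10);     // size of the ensemble
            forest.setNumFeatures(0);   // 0 = Weka's default number of random features
            forest.setSeed(1);          // reproducible resampling
            forest.buildClassifier(data);
            System.out.println(forest);
        }
    }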
REPTree builds a decision or regression tree using information gain/variance reduction and prunes it using reduced-error pruning (Section 6.2, page 203). Optimized for speed, it only sorts values for numeric attributes once.
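
Again as a sketch only, using the same placeholder dataset: REPTree exposes the size of the pruning set through setNumFolds, mirroring the numFolds parameter of J4.8.

    import java.io.BufferedReader;
    import java.io.FileReader;

    import weka.classifiers.trees.REPTree;
    import weka.core.Instances;

    public class REPTreeDemo {
        public static void main(String[] args) throws Exception {
            // A nominal class yields a decision tree, a numeric class a regression tree.
            Instances data =
                new Instances(new BufferedReader(new FileReader("weather.arff")));
            data.setClassIndex(data.numAttributes() - 1);

            REPTree tree = new REPTree();
            tree.setNumFolds(3);   // one of the 3 parts is held out for reduced-error pruning
            tree.setMinNum(2.0);   // minimum total weight of instances at a leaf
            tree.setMaxDepth(-1);  // -1 = no limit on tree depth
            tree.buildClassifier(data);
            System.out.println(tree);
        }
    }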


Figure 10.19 Changing the parameters for J4.8.
