Microsoft® SQL Server® 2012 Bible



Part IX: Business Intelligence


In addition to the optional model filter, each mining model has both properties and algorithm
parameters. Select a model (column) to view and change the properties common to
all algorithms, including Name, Description, and AllowDrillThrough. Right-click a model, and
choose Set Algorithm Parameters to change an algorithm's default settings.

When both the structure and model definitions are in place, you must deploy the structure
to the target server to process and train the models. Deployment consists of two parts:
first, the structure or changes are sent to the server in the build phase; then
Analysis Services caches data and trains the models using the data not held back for testing.

For information on setting up an Analysis Services project for deployment, see Chapter 53.

After processing, the Mining Model Viewer tab contains processing results; here one or
more viewers are available depending on which models are included in the structure. The
algorithm-specific viewers assist in understanding the rules and relationships discovered by
the models (see the “Algorithms” section in this chapter).

Model Evaluation
Evaluate the trained models to determine which model most reliably predicts the outcome
and to decide whether the accuracy is adequate to meet business goals. The mining
accuracy chart view provides tools for performing the evaluation.

You can enable the charts visible within this view by supplying data for testing under the
Input Selection tab. Choose one of three sources:

■ Use mining model test cases: Uses test data held out in the mining structure, but
applies model filters.
■ Use mining structure test cases: Uses test data held out in the mining structure,
ignoring any model filters.
■ Specify a different data set: Enables the selection and mapping of an external
table to supply test data. After selecting this option, click the ellipsis button to display
the Specify Column Mapping dialog, and select the table with the test data by
clicking Select Case Table.
If the value predicted is discrete, the Input Selection tab also enables choosing a particular
outcome for evaluation; otherwise, all outcomes are evaluated.
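The difference between the first two test-case sources can be sketched in a few lines of Python. This is an illustration only, not the Analysis Services API; the 30 percent holdout figure, the region column, and the filter condition are all invented for the example:

```python
# Illustrative sketch only -- not the Analysis Services API.
# A mining structure holds back a test set; a model filter narrows
# which of those held-out cases a particular model sees.

# Ten cases; assume the structure is configured to hold out the
# last 30 percent for testing.
cases = [{"id": i, "region": "N" if i % 2 == 0 else "S"} for i in range(10)]
structure_test = cases[7:]  # "Use mining structure test cases"

# A hypothetical model filter: this model covers only region N.
model_filter = lambda c: c["region"] == "N"
model_test = [c for c in structure_test if model_filter(c)]  # "Use mining model test cases"

print(len(structure_test))  # 3 held-out cases in the structure
print(len(model_test))      # only 1 of them passes the model filter
```

A filtered model is trained on a subset of the data, so testing it against its own filtered holdout is usually the fairer comparison; testing against the full structure holdout shows how it behaves on cases outside its filter.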

Lift Charts and Scatter Plots
When the source data and any Predict Value have been specified, switch to the Lift Chart
tab, and select Lift Chart from the Chart Type list, as shown in Figure 57-2. The lift
chart compares the predicted against the actual outcomes by showing the percentage
of correct cases (Target Population percent) versus the percentage of cases tested (Overall
Population percent). Ideal Model and Random Guess lines plot the best possible and
random outcomes, respectively.
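The mechanics behind the curve can be sketched in Python. This is a hedged illustration of the lift-chart idea, not SSAS's internal computation, and the sample labels are invented:

```python
def lift_points(labels):
    """Given actual 1/0 outcomes sorted best-predicted first, return
    (Overall Population %, Target Population %) pairs for a lift chart."""
    total_targets = sum(labels)
    n = len(labels)
    points, captured = [], 0
    for i, actual in enumerate(labels, start=1):
        captured += actual
        points.append((100.0 * i / n,                      # % of cases tested
                       100.0 * captured / total_targets))  # % of targets found
    return points

# Invented sample: 3 target cases among 8, ranked by a model that
# scores most of the targets near the top.
points = lift_points([1, 1, 0, 1, 0, 0, 0, 0])
print(points[3])   # (50.0, 100.0): half the cases capture all the targets
print(points[-1])  # (100.0, 100.0): every curve ends at the top-right
```

A Random Guess line follows the diagonal (testing x percent of cases finds x percent of targets), while the Ideal Model line rises as steeply as the target count allows; a useful model's curve falls between the two.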