which yields, upon replacing σ by its estimator, that

$$\frac{Y(\mathbf{x}) - \sum_{i=0}^{k} B_i x_i}{\sqrt{\dfrac{SS_R}{n-k-1}}\,\sqrt{1 + \mathbf{x}'(\mathbf{X}'\mathbf{X})^{-1}\mathbf{x}}} \sim t_{n-k-1}$$

We thus have:
Prediction Interval for Y(x)

With 100(1 − α) percent confidence, Y(x) will lie between

$$\sum_{i=0}^{k} x_i b_i \pm \sqrt{\frac{ssr}{n-k-1}}\,\sqrt{1 + \mathbf{x}'(\mathbf{X}'\mathbf{X})^{-1}\mathbf{x}}\; t_{\alpha/2,\,n-k-1}$$

where $b_0,\dots,b_k$ are the values of the least squares estimators $B_0, B_1,\dots,B_k$, and ssr is the value of $SS_R$.
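To make the computation concrete, here is a minimal numerical sketch in Python (NumPy/SciPy). The function name prediction_interval and the synthetic data are illustrative assumptions, not taken from the chapter's examples; the design matrix X is assumed to already contain a column of 1s for the intercept.

```python
import numpy as np
from scipy import stats

def prediction_interval(X, Y, x0, alpha=0.05):
    """Prediction interval for a new response Y(x0) in the multiple linear
    regression model, following the boxed formula above."""
    n, p = X.shape                        # p = k + 1 regression parameters
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ Y                 # least squares estimates b0, ..., bk
    ssr = np.sum((Y - X @ b) ** 2)        # residual sum of squares, ssr
    center = x0 @ b                       # point prediction: sum_i x_i b_i
    half = (stats.t.ppf(1 - alpha / 2, n - p)            # t_{alpha/2, n-k-1}
            * np.sqrt(ssr / (n - p))
            * np.sqrt(1 + x0 @ XtX_inv @ x0))
    return center - half, center + half

# Illustrative use on made-up data (not the data of Example 9.10d).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(12),
                     rng.uniform(0.1, 0.3, 12),        # e.g., carbon content
                     rng.uniform(1000, 1200, 12)])     # e.g., annealing temperature
Y = X @ np.array([50.0, 40.0, 0.01]) + rng.normal(0, 2, 12)
print(prediction_interval(X, Y, np.array([1.0, 0.15, 1150.0])))
```

Note that the term √(1 + x′(X′X)⁻¹x) is what distinguishes the prediction interval from the confidence interval for the mean response, which uses √(x′(X′X)⁻¹x) instead.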
EXAMPLE 9.10e If in Example 9.10d we were interested in determining an interval in which a single steel sheet, produced with a carbon content of .15 percent and at an annealing temperature of 1,150°F, would lie, then the midpoint of the prediction interval would be as given before. However, the half-length of this prediction interval would differ from that of the confidence interval for the mean value by the factor $\sqrt{1.313}/\sqrt{.313}$ (since $\mathbf{x}'(\mathbf{X}'\mathbf{X})^{-1}\mathbf{x} = .313$ in that example). That is, the 95 percent prediction interval is

$$69.862 \pm 8.363 \qquad \blacksquare$$
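As a quick numerical check of this factor (the 4.08 figure below is simply 8.363 divided by the factor; it is not a value quoted from Example 9.10d):

$$\sqrt{\frac{1.313}{0.313}} \approx 2.05, \qquad \frac{8.363}{2.05} \approx 4.08,$$

so the prediction interval is roughly twice as wide as the corresponding 95 percent confidence interval for the mean response.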
9.11 Logistic Regression Models for Binary Output Data
In this section we consider experiments that result in either a success or a failure. We will suppose that these experiments can be performed at various levels, and that an experiment performed at level x will result in a success with probability p(x), −∞ < x < ∞. If p(x) is of the form

$$p(x) = \frac{e^{a+bx}}{1 + e^{a+bx}}$$

then the experiments are said to come from a logistic regression model and p(x) is called the logistic regression function. If b > 0, then $p(x) = 1/[e^{-(a+bx)} + 1]$ is an increasing function that converges to 1 as x → ∞; if b < 0, then p(x) is a decreasing function that converges to 0 as x → ∞. (When b = 0, p(x) is constant.) Plots of logistic regression functions are given in Figure 9.21. Notice the s-shape of these curves.
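As a small illustration of these monotonicity properties, the following Python sketch evaluates the logistic regression function for a few parameter choices; the values of a and b are made up for illustration.

```python
import numpy as np

def logistic(x, a, b):
    """Logistic regression function p(x) = e^(a+bx) / (1 + e^(a+bx))."""
    # Written as 1 / (1 + e^-(a+bx)), which is algebraically identical.
    return 1.0 / (1.0 + np.exp(-(a + b * x)))

xs = np.linspace(-6.0, 6.0, 5)
print(logistic(xs, a=0.0, b=1.0))   # b > 0: increasing, approaches 1 as x grows
print(logistic(xs, a=0.0, b=-1.0))  # b < 0: decreasing, approaches 0 as x grows
print(logistic(xs, a=0.5, b=0.0))   # b = 0: constant in x
```

Evaluating logistic over a fine grid and plotting the result reproduces the s-shaped curves of Figure 9.21.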