8.7 Exercises
assume that such a hypothesis can be calculated given these O(n) examples
in time O(n), and that the empirical risk of each such hypothesis can be
evaluated in time O(mn). For example, if H_n is the class of axis-aligned
rectangles in R^n, we saw that it is possible to find an ERM hypothesis in the
realizable case that is defined by at most 2n examples. Prove that in such
cases, it is possible to find an ERM hypothesis for H_n in the unrealizable case
in time O(mn · m^{O(n)}).
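One way to see where the O(mn · m^{O(n)}) bound comes from is to enumerate every subset of at most 2n sample points, take its bounding box as a candidate rectangle, and keep the candidate with the lowest empirical error. The sketch below (function names and the toy sample are illustrative, not from the text) implements this brute-force search for small n:

```python
from itertools import combinations

def rect_predict(lo, hi, x):
    """Label +1 iff x lies inside the axis-aligned rectangle [lo, hi]."""
    return 1 if all(l <= xi <= h for l, xi, h in zip(lo, x, hi)) else -1

def erm_rectangle(S, n):
    """Brute-force ERM for axis-aligned rectangles in the unrealizable case.

    Enumerates every subset of at most 2n sample points, takes its bounding
    box as a candidate rectangle, and returns the candidate with the lowest
    empirical error: m^{O(n)} candidates, each evaluated in O(mn) time.
    """
    m = len(S)
    # The "empty" rectangle (labels every point -1) is also a candidate.
    candidates = [([float("inf")] * n, [float("-inf")] * n)]
    for k in range(1, min(2 * n, m) + 1):
        for subset in combinations(S, k):
            pts = [x for x, _ in subset]
            lo = [min(p[i] for p in pts) for i in range(n)]
            hi = [max(p[i] for p in pts) for i in range(n)]
            candidates.append((lo, hi))
    best = None  # (error, lo, hi)
    for lo, hi in candidates:
        err = sum(1 for x, y in S if rect_predict(lo, hi, x) != y) / m
        if best is None or err < best[0]:
            best = (err, lo, hi)
    return best

# A 1-dimensional, non-realizable toy sample: no interval fits perfectly,
# since a negative point (0.5) sits between the two positive points.
S = [((0.0,), 1), ((1.0,), 1), ((0.5,), -1), ((2.0,), -1)]
err, lo, hi = erm_rectangle(S, n=1)  # best achievable error here is 1/4
```

The optimality of the search rests on the fact from the realizable analysis that some ERM rectangle has each face supported by a sample point, so restricting to bounding boxes of ≤ 2n examples loses nothing.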
- In this exercise, we present several classes for which finding an ERM
classifier is computationally hard. First, we introduce the class of n-dimensional
halfspaces, HS_n, for the domain X = R^n. This is the class of all functions of
the form h_{w,b}(x) = sign(⟨w, x⟩ + b) where w, x ∈ R^n, ⟨w, x⟩ is their inner
product, and b ∈ R. See a detailed description in Chapter 9.
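In code, a halfspace predictor is just the sign of an affine function of x. A minimal sketch (the tie-breaking of sign(0) to −1 is a convention assumed here, not fixed by the text):

```python
def halfspace(w, b):
    """Return the halfspace predictor h_{w,b}(x) = sign(<w, x> + b)."""
    def h(x):
        s = sum(wi * xi for wi, xi in zip(w, x)) + b
        return 1 if s > 0 else -1  # break the tie at 0 toward -1
    return h

# A halfspace in R^2: points strictly above the line x1 + x2 = 1 get +1.
h = halfspace(w=(1.0, 1.0), b=-1.0)
```

For example, h((1.0, 1.0)) evaluates 1 + 1 − 1 = 1 > 0 and returns +1, while h((0.0, 0.0)) evaluates −1 and returns −1.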
- Show that ERM_H over the class H = HS_n of linear predictors is compu-
tationally hard. More precisely, we consider the sequence of problems in
which the dimension n grows linearly and the number of examples m is set
to be some constant times n.
Hint: You can prove the hardness by a reduction from the following prob-
lem:
Max FS: Given a system of linear inequalities, Ax > b with A ∈ R^{m×n} and b ∈
R^m (that is, a system of m linear inequalities in n variables, x = (x_1, ..., x_n)),
find a subsystem containing as many inequalities as possible that has a solution
(such a subsystem is called feasible).
It has been shown (Sankaran 1993) that the problem Max FS is NP-hard.
Show that any algorithm that finds an ERM_{HS_n} hypothesis for any training
sample S ∈ (R^n × {+1, −1})^m can be used to solve the Max FS problem of
size m, n. Hint: Define a mapping that transforms linear inequalities in n
variables into labeled points in R^n, and a mapping that transforms vectors
in R^n to halfspaces, such that a vector w satisfies an inequality q if and
only if the labeled point that corresponds to q is classified correctly by the
halfspace corresponding to w. Conclude that the problem of empirical risk
minimization for halfspaces is also NP-hard (that is, if it could be solved in
time polynomial in the sample size, m, and the Euclidean dimension, n,
then every problem in the class NP could be solved in polynomial time).
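One convenient way to build such mappings (a sketch of one possible choice, not the unique solution the exercise asks for) is to absorb each inequality's constant term into an extra coordinate: the inequality ⟨a_i, x⟩ > b_i becomes the point (a_i, −b_i) with label +1, and a candidate vector w becomes the homogeneous halfspace with weight (w, 1), since ⟨(w, 1), (a_i, −b_i)⟩ = ⟨a_i, w⟩ − b_i. The code below checks this equivalence numerically on a toy system:

```python
def ineq_to_point(a, b):
    """Map the inequality <a, x> > b to a labeled point (a, -b) with label +1."""
    return tuple(a) + (-b,), 1

def vec_to_halfspace(w):
    """Map a candidate solution w to the halfspace z -> sign(<(w, 1), z>)."""
    wext = tuple(w) + (1.0,)
    return lambda z: 1 if sum(wi * zi for wi, zi in zip(wext, z)) > 0 else -1

def satisfies(a, b, w):
    """Does w satisfy the strict inequality <a, w> > b?"""
    return sum(ai * wi for ai, wi in zip(a, w)) > b

# Check: w satisfies <a, w> > b  iff  the point for (a, b) is classified +1.
system = [((1.0, 2.0), 1.0), ((-1.0, 0.0), 0.5), ((0.0, 1.0), -2.0)]
for w in [(1.0, 1.0), (0.0, 0.0), (-2.0, 0.5)]:
    h = vec_to_halfspace(w)
    for a, b in system:
        z, label = ineq_to_point(a, b)
        assert satisfies(a, b, w) == (h(z) == label)
```

Under this correspondence, the inequalities satisfied by w are exactly the examples classified correctly by its halfspace, so maximizing the feasible subsystem is the same as minimizing the empirical risk.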
- Let X = R^n and let H^k_n be the class of all intersections of k-many linear
halfspaces in R^n. In this exercise, we wish to show that ERM_{H^k_n} is com-
putationally hard for every k ≥ 3. Precisely, we consider a sequence of
problems where k ≥ 3 is a constant and n grows linearly. The training set
size, m, also grows linearly with n.
Toward this goal, consider the k-coloring problem for graphs, defined as
follows:
Given a graph G = (V, E) and a number k, determine whether there exists a
function f : V → {1, ..., k} so that for every (u, v) ∈ E, f(u) ≠ f(v).
The k-coloring problem is known to be NP-hard for every k ≥ 3 (Karp
1972).
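To make the decision problem concrete, here is a brute-force k-coloring checker (an illustrative sketch; its exponential k^|V| search is consistent with the problem being NP-hard, and the function name is ours):

```python
from itertools import product

def is_k_colorable(vertices, edges, k):
    """Decide the k-coloring problem by exhaustive search: try all k^|V|
    assignments f and accept if some f has f(u) != f(v) on every edge."""
    for coloring in product(range(k), repeat=len(vertices)):
        f = dict(zip(vertices, coloring))
        if all(f[u] != f[v] for u, v in edges):
            return True
    return False

# K4, the complete graph on 4 vertices: every pair of vertices is adjacent,
# so all four vertices need distinct colors.
V = [0, 1, 2, 3]
E = [(u, v) for u in V for v in V if u < v]
```

On this example, is_k_colorable(V, E, 3) is False while is_k_colorable(V, E, 4) is True, matching the chromatic number 4 of K4.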