In closing this chapter, let me briefly return to the issue of lexical decomposition raised in section 11.2. It should be
clear by now that generalizations about word meaning cannot be studied without a theory of lexical decomposition.
The kind of decomposition required, however, is not a simple list of necessary and sufficient features of the sort
envisioned by Tarski and Fodor. Rather, it is a richly textured system whose subtleties we are only beginning to
appreciate: division into CS and SpS components, conditions that shade away from focal values, conditions that
operate in a cluster system, features that cut across semantic fields, function-argument structure, qualia structure, and
dot-object structure. It does remain to be seen whether all this richness eventually boils down to a system built from
primitives, or if not, what alternative there may be. And it does remain to be seen whether lexical meaning can be
exhaustively captured by the techniques discussed here. But even if the ultimate answers are not in sight, there is
certainly a sense of progress since the primitive approaches of the 1960s.
It should be recognized that there are fundamental methodological and expository difficulties in doing lexical
semantics. What does it take to be properly systematic? It is all too easy to build an industry on the endless details of a
single word; good examples are 'belief' and 'knowledge' in the traditional philosophical literature and 'over' in the cognitive
linguistics tradition. The unfortunate result is that one loses sight of the goal of a systematic account of the patterns of
meaning. Alternatively, one can look for the patterns by covering a broad range of words somewhat less deeply; but the
result is all too often a tiring list, impossible for any but the most dedicated reader to assimilate. Alas, Pinker's (1989)
study of verb frames and much of Anna Wierzbicka's work (e.g. 1987), although amazingly clever and insightful, tend
to fall prey to this problem; the present chapter may as well. Perhaps there is no way out: there are just so many
goddamned words, and so many parts to them.
On the other hand, these difficulties in themselves point out one of the fundamental messages of generative linguistics:
We language users know so much. And hence as children we learned so much, starting with some innate conceptual basis of
unknown richness. Next to lexical semantics, the acquisition problem for grammar pales.