

CHAPTER 7 Implications for Processing


Chapter 2 touched on the potential connection and interpenetration between theories of competence—linguistic
structure—and performance—language processing. In the last chapter we began exploring this issue, concentrating on
the storage of linguistic elements. This chapter goes further toward a rapprochement of theoretical linguistics and
psycholinguistics. It concerns itself with the job of the language processor: how stored pieces are used online to build
combinatorial linguistic structures in working memory during speech perception and production.


We begin by showing how the architecture proposed in Chapter 5 translates into a processing model, with the interface
components playing a crucial role. In particular, the treatment of the lexicon in the parallel architecture turns out to fit
nicely into analyses of lexical access in perception and production. We then take up some more general questions about
the role of modularity in processing. The overall goal is to show that the parallel architecture offers a theoretical
perspective that unifies linguistics with psycholinguistics more satisfactorily than has been previously possible.


7.1 The parallel competence architecture forms a basis for a processing architecture


To review the situation so far: The standard architecture for generative grammar, from Aspects to the Minimalist
Program, conceives of grammar as syntactocentric and derivational. The generative capacity of language comes entirely
from syntax. Linguistic structure, complete with lexical items, is built by an initial stage of syntactic derivation—
D(eep)-structure in Aspects through GB, and the operation Merge in the Minimalist Program. Then derivational rules
of movement and deletion produce levels of syntactic structure that are subjected to phonological and semantic
interpretation. Phonological and semantic structures are outputs of a syntactic derivation, with no significant generative
capacities of their own.
