Foundations of Cognitive Psychology: Preface


suppose that they were alive (even though the basis of their life was not DNA,
but some functionally similar self-replicating molecule) and that they even look
like people. And suppose further not only that their brains were constructed of
elements that are functionally similar to neurons, but also that these elements
were interconnected in just the way that neurons in our brains are. Indeed,
their brains would be functionally isomorphic to ours, even though they were
made of physically different stuff.
Functionalists then claim that these alien creatures would have the same
mental states as we do—that is, the same perceptions, pains, desires, beliefs,
and so on that populate our own conscious mental lives—provided that their
internal states were analogously related to each other, to the external world,
and to their behavior. This same approach can be generalized to argue for the
possibility that computers and robots of the appropriate sort would also be
conscious.Suppose,forexample,thateachneuroninabrainwasreplacedwith
a microcomputer chip that exactly simulated its firing patterns in response to
all the neuron chips that provide its input. The computer that was thus
constructed would fulfill the functionalist requirements for having the same mental
states as the person whose brain was "electronically cloned." You should
decide for yourself whether you believe that such a computer would actually
have mental states or would merely act as though it had mental states. Once
you have done so, try to figure out what criteria you used to decide. (For two
contradictory philosophical views of this thought experiment, the reader is
referred to Dennett (1991) and Searle (1993).)
Multiple realizability is closely related to differences between the algorithmic
and implementation levels. The algorithmic level corresponds roughly to the
functional description of the organism in terms of the relations among its
internal states, its input information, and its output behavior. The
implementation level corresponds to its actual physical construction. The functionalist
notion of multiple realizability thus implies that there could be many different
kinds of creatures that would have the same mental states as people do, at least
defined in this way. If true, this would undercut identity theory, since mental
events could not then be simply equated with particular neurological events;
they would have to be equated with some more general class of physical events
that would include, among others, silicon-based aliens and electronic brains.
The argument from multiple realizability is crucial to the functionalist theory
of mind. Before we get carried away with the implications of multiple
realizability, though, we must ask ourselves whether it is true or even remotely likely
to be true. There is not much point in basing our understanding of conscious-
ness on a functionalist foundation unless that foundation is well grounded. Is
it? More important, how would we know if it were? We will address this topic
shortly when we consider the problem of other minds.


Supervenience There is certainly some logical relation between brain activity
and mental states such as consciousness, but precisely what it is has obviously
been difficult to determine. Philosophers of mind have spent hundreds of years
trying to figure out what it is and have spilled oceans of ink attacking and
defending different positions. Recently, however, philosopher Jaegwon Kim
(1978, 1993) has formulated a position with which most philosophers of mind


10 Stephen E. Palmer
