The Turing Guide


302 | 28 TURING’S CONCEPT OF INTELLIGENCE


of behavioural patterns’ according to which a machine can think if and only if its behaviour is
‘indistinguishable from that of a human being’.^4
Behaviourism was popular in the 1950s, but even then faced objections. A 1950s philosophical
joke went like this: ‘One behaviourist meeting another on the street said “You feel fine!
How do I feel?” ’.^5 Surely, the joke implies, I do not learn how I feel only from third-person
observations of my behaviour (Fig. 28.1). There is an array of such difficulties for crude behav-
iourist theories. If you don’t say anything, does it follow that you don’t feel anything? How is the
behaviourist to explain mental images, voices in the head, pains, or tastes? What causes ‘thinking’
behaviour, if not an inner ghost? Mays criticized behaviourism for ignoring ‘the evidence
of my own introspections, as well as those of other people, that there are such things as private
psychological events, however heretical such a view may seem to-day’. He said:


[T]he machine analogy, with its emphasis on overt behaviour and abnegation of private
experience may . . . lead [human beings] to be regarded, more than ever before, as if we were
mechanical objects. It is not such a far cry from Aristotle’s view that slaves were just human tools, to some
future benevolent dictatorship of the Orwell 1984 type, where men may be seen as little else
but inefficient digital computors and God as the Master Programmer.


According to Mays, on Turing’s criterion of thinking, ‘the meaning of the word “thinking” has
changed to such an extent that it has little in common with what we normally mean by it’.^6
Mays acknowledged that a machine ‘could be constructed with the facility for performing
intelligence tests’; but on the crucial question of whether the machine is intelligent, he said
‘What is important is not what it does, but how it does it’.^7 His assumption—that Turing’s
imitation game tests a machine’s behaviour—still underlies influential objections to the Turing
test. Critics have invented numerous (imaginary) counter-examples; for example, a program
that functions just by means of a huge lookup table or a human ‘zombie’ that is behaviourally
indistinguishable from other human beings but lacks consciousness. These entities would pass
the Turing test, but how they do so is not what we mean by ‘thinking’, critics claim.


figure 28.1 A recent version of the 1940s behaviourist joke.
Reprinted with permission of Stephen L. Campbell.