desperately to create situations that would make Deep Junior uncomfortable (whatever that means!). The human player would anticipate future occurrences and get surprised or feel push-backs (i.e., counter-moves), but the machine, like an autistic savant, was totally immersed in its own monologue of calculation. Kasparov got tired; Deep Junior never did.
Despite the marvelous achievements of artificial intelligence in the second half of the 20th century, several limitations of Deep Junior are quite striking. The programmers of Deep Junior still felt that they had to intervene when Kasparov offered a draw, instead of allowing the machine to make the decision on its own (e.g., by setting a fixed evaluation threshold for accepting or rejecting a draw offer). The learning ability of Deep Junior, if any, was very limited. After each game, the programmers had to serve as a metalevel control and fine-tune the machine based on information from the previous games. When all is said and done, Deep Junior was still a data-crunching program, executing instructions as it had been programmed to do.
What lessons can we learn from this human–machine comparison? For decades in the early 20th century, we did not have a proper language to describe what goes on inside the black box of the human mind. The emergence of the computer changed things, giving rise to the metaphor of the mind as an information-processing device (Baars, 1986). The computer metaphor has given us a powerful language to describe how the mind might work. Ironically, a half century later, the unfolding of artificial intelligence gave us a new window through which to look back at the human mind and human intelligence. It became clear, based on the previous comparisons, that human intellectual functioning and development^1 are subject to a different set of constraints than those governing machine intelligence.
Limitations of Cognitivism
The computer metaphor provides an approximation of the mind only up to a point. After all, the designers of the standard computer clearly attempted to mimic the way humans process information (von Neumann, 1958). However, when the mind is reduced to merely a symbolic processing device, we get
1. The term intellectual functioning is often used to refer to complex, higher-order forms of cognition such as reasoning, problem solving, and decision making. We use the term to denote: (a) any act of generating or utilizing knowledge or strategies, or both, for practical or purely intellectual purposes by an intentional system; and (b) the effectiveness of such an act in achieving specific desired outcomes. Defined as such, it distinguishes itself from mere cognitive operations. In other words, intellectual functioning and cognitive functioning belong to two levels of analysis; the former is at the intentional level and the latter at the operational level, to use the terminology of activity theory (Leont'ev, 1978; see also Oerter, 2000). So defined, intellectual functioning subsumes, but cannot be reduced to, cognitive functioning.