In particular (and this comes back to a major theme of Chapters XI
and XII), the representation of the real world in the brain, although
rooted in isomorphism to some extent, involves some elements which have
no counterparts at all in the outer world. That is, there is much more to it
than simple mental structures representing "dog", "broom", etc. All of
these symbols exist, to be sure, but their internal structures are extremely
complex and to a large degree are unavailable for conscious inspection.
Moreover, one would hunt in vain to map each aspect of a symbol's internal
structure onto some specific feature of the real world.
Processes That Are Not So Skimmable
For this reason, the brain begins to look like a very peculiar formal system,
for on its bottom level, the neural level, where the "rules" operate and
change the state, there may be no interpretation of the primitive elements
(neural firings, or perhaps even lower-level events). Yet on the top level,
there emerges a meaningful interpretation: a mapping from the large
"clouds" of neural activity which we have been calling "symbols", onto the
real world. There is some resemblance to the Gödel construction, in that a
high-level isomorphism allows a high level of meaning to be read into
strings; but in the Gödel construction, the higher-level meaning "rides" on
the lower level; that is, it is derived from the lower level, once the notion of
Gödel-numbering has been introduced. But in the brain, the events on the
neural level are not subject to real-world interpretation; they are simply not
imitating anything. They are there purely as the substrate to support the
higher level, much as transistors in a pocket calculator are there purely to
support its number-mirroring activity. And the implication is that there is
no way to skim off just the highest level and make an isomorphic copy in a
program; if one is to mirror the brain processes which allow real-world
understanding, then one must mirror some of the lower-level things which
are taking place: the "languages of the brain". This doesn't necessarily
mean that one must go all the way down to the level of the hardware,
though that may turn out to be the case.
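
The Gödel construction just alluded to can even be caricatured in a few
lines of code. The sketch below is an illustration only: the digit codes are
modeled from memory on the codon table given earlier in the book, and the
tiny symbol set and the function name are mere conveniences of the example.

    # A minimal sketch of Gödel numbering (Python). Each symbol of a
    # formal string gets a fixed digit code; concatenating the codes turns
    # the whole string into a single integer, so that facts about strings
    # become facts about numbers. The particular codes are illustrative.
    SYMBOL_CODES = {"0": "666", "S": "123", "=": "111", "+": "112"}

    def goedel_number(formula: str) -> int:
        """Map a string of formal symbols onto one integer."""
        return int("".join(SYMBOL_CODES[ch] for ch in formula))

    print(goedel_number("S0=S0"))  # prints 123666111123666

Notice that nothing at the level of digits "imitates" strings; the
string-level reading of the number exists only by virtue of the convention
embodied in the mapping, which is exactly the sense in which the higher
meaning is derived from the lower level.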
In the course of developing a program with the aim of achieving an
"intelligent" (viz., human-like) internal representation of what is "out
there", at some point one will probably be forced into using structures and
processes which do not admit of any straightforward interpretations; that
is, which cannot be directly mapped onto elements of reality. These lower
layers of the program can be understood only by virtue of their
catalytic relation to layers above them, rather than because of some direct
connection they have to the outer world. (A concrete image of this idea was
suggested by the Anteater in the Ant Fugue: the "indescribably boring
nightmare" of trying to understand a book on the letter level.)
Personally, I would guess that such multilevel architecture of
concept-handling systems becomes necessary just when processes involving
images and analogies become significant elements of the program, in