Unless a person designed himself and chose his own wants (as well
as choosing to choose his own wants, etc.), he cannot be said to
have a will of his own.
It makes you pause to think where your sense of having a will comes
from. Unless you are a soulist, you'll probably say that it comes from your
brain, a piece of hardware which you did not design or choose. And yet
that doesn't diminish your sense that you want certain things, and not
others. You aren't a "self-programmed object" (whatever that would be),
but you still do have a sense of desires, and it springs from the physical
substrate of your mentality. Likewise, machines may someday have wills
despite the fact that no magic program spontaneously appears in memory
from out of nowhere (a "self-programmed program"). They will have wills
for much the same reason as you do: by reason of organization and
structure on many levels of hardware and software. Moral: The Samuel
argument doesn't say anything about the differences between people and
machines, after all. (And indeed, will will be mechanized.)
Below Every Tangled Hierarchy Lies An Inviolate Level
Right after the Two-Part Invention, I wrote that a central issue of this book
would be: "Do words and thoughts follow formal rules?" One major thrust
of the book has been to point out the many-leveledness of the mind/brain,
and I have tried to show why the ultimate answer to the question is,
"Yes, provided that you go down to the lowest level, the hardware, to
find the rules."
Now Samuel's statement brought up a concept which I want to pursue.
It is this: When we humans think, we certainly do change our own mental
rules, and we change the rules that change the rules, and on and on-but
these are, so to speak, "software rules". However, the rules at bottom do not
change. Neurons run in the same simple way the whole time. You can't
"think" your neurons into running some nonneural way, although you can
make your mind change style or subject of thought. Like Achilles in the
Prelude, Ant Fugue, you have access to your thoughts but not to your
neurons. Software rules on various levels can change; hardware rules
cannot; in fact, the software's flexibility is due precisely to their rigidity! Not a
paradox at all, but a fundamental, simple fact about the mechanisms of
intelligence.
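The distinction can be made concrete with a toy sketch (an illustrative analogy of my own, not from the text; all names are invented for the example). The interpreter loop below plays the role of the inviolate hardware: it never changes. The rules it applies are mere data, and one of them, a meta-rule, can rewrite the rule set itself, rules changing rules, while the loop runs on unperturbed.

```python
# Toy analogy: a fixed "hardware" loop running self-modifying
# "software" rules. The loop itself never changes; the rules,
# stored as data, can be rewritten by other rules.

def run(state, rules, steps):
    # The inviolate level: this loop is the unchanging "hardware".
    for _ in range(steps):
        for rule in list(rules):       # snapshot, so rules may change mid-step
            state, rules = rule(state, rules)
    return state, rules

def double(state, rules):
    # An ordinary "software" rule: transforms the state.
    return state * 2, rules

def meta(state, rules):
    # A meta-rule: once the state passes a threshold, it rewrites
    # the rule set, swapping 'double' out for 'increment'.
    if state > 10:
        return state, [increment if r is double else r for r in rules]
    return state, rules

def increment(state, rules):
    return state + 1, rules

state, rules = run(1, [double, meta], 5)
# state is 17: doubling (1→2→4→8→16) until meta fires, then incrementing.
```

Note that nothing "magic" happens at the bottom: the `run` loop executes the same simple way throughout, just as neurons do, yet the behavior of the whole changes because the rules it interprets have rewritten themselves.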
This distinction between self-modifiable software and inviolate
hardware is what I wish to pursue in this final Chapter, developing it into a
set of variations on a theme. Some of the variations may seem to be quite
far-fetched, but I hope that by the time I close the loop by returning to
brains, minds, and the sensation of consciousness, you will have found an
invariant core in all the variations.
My main aim in this Chapter is to communicate some of the images
which help me to visualize how consciousness rises out of the jungle of
neurons; to communicate a set of intangible intuitions, in the hope that