This required a re-entrant mapping in which the output from speech production was internally streamed as input to understanding. Steels (2003) argues that this not only parallels re-entrant systems in the human brain but also explains why we have such persistent inner voices chattering away to ourselves. This ‘inner voice’, he suggests, contributes to our self-model and is part of our conscious experience.
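As an illustrative aside, such a re-entrant mapping amounts to a feedback loop in which an agent’s own speech output is streamed back into its comprehension channel. The following minimal Python sketch makes the architecture concrete; the module names (produce, comprehend, self_model) are hypothetical stand-ins chosen for illustration, not Steels’ implementation.

import random  # not needed here, but typical agents would sample utterances

def produce(intent):
    # Speech production: map an internal intent to an utterance (stub).
    return f"word-for-{intent}"

def comprehend(utterance):
    # Speech understanding: map an utterance back to a meaning (stub).
    return utterance.removeprefix("word-for-")

self_model = []  # record of 'inner speech' the agent hears itself say

for intent in ["red", "square", "left"]:
    utterance = produce(intent)    # speech production
    heard = comprehend(utterance)  # re-entrant: own output streamed back as input
    self_model.append(heard)       # the 'inner voice' feeds the self-model
    assert heard == intent         # production and understanding stay calibrated

print(self_model)                  # ['red', 'square', 'left']

The loop is the whole point: nothing external is heard, yet the agent accumulates a running commentary of its own utterances, which is one way to read Steels’ suggestion that inner speech feeds the self-model.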
Would imitating robots, or artificial meme machines, then invent self-reference, with words for ‘I’, ‘me’, and ‘mine’? If so, a centre of narrative gravity would form (Dennett, 1991), and the machines would become deluded into thinking they were experiencing selves. Similarly, the memes they copied might gain a replication advantage by being associated with the words ‘I’, ‘me’, and ‘mine’, and so a selfplex would form, with beliefs, opinions, desires, and possessions all attributed to a non-existent inner self.
This approach implies that machines capable of imitation would be qualitatively different from all other machines, in the same way that humans differ from most other biological species. Not only would they be capable of language, but their ability to imitate would set off a new evolutionary process – a new machine culture. Early research with groups of very simple imitating robots is already exploring the emergence of artificial culture in robot societies (Winfield and Griffiths, 2010). One question for the future is whether we and the new imitation machines would share a common expanded culture or whether they would imitate in ways that we could not follow. Either way, they would be conscious for the same reason we are: because, like us, they would have constructed a false notion of self as the subject experiencing a stream of consciousness. They would become deluded machines believing there was something it was like to be them.
I’M SURE IT LIKES ME
When Tamagotchis hit the playgrounds in the mid-1990s, children all over the world started caring for mindless little virtual animals, portrayed on tiny, low-resolution screens in little hand-held plastic boxes. These young carers took time to ‘wash’ and ‘feed’ their virtual pets, and cried when they ‘died’. Soon the craze was over. The Tamagotchi meme had thrived on children’s caring natures, but then largely fizzled out, perhaps because the target hosts quickly became immune to such a simple trick. More recently, people have got just as hooked on using their phones to find and fight battles with 3D animals lurking in real environments, with stories of players falling off cliffs and wandering into former concentration camps in search of the Pokémon GO creatures.
We humans seem to adopt the intentional stance towards other people, animals, toys, machines, and digital entities on the flimsiest of pretexts. This tactic of attributing mental states to other systems can be valuable for understanding or interacting appropriately with them, but it is not an accurate guide to how those systems actually work.
‘Robots that imitated humans would acquire an illusion of self and consciousness just as we do’ (Blackmore, 2003, p. 19)
FIGURE 12.16 • The ‘talking heads’ are robots that imitate each other’s sounds while looking at the same object. From this interaction, words and meanings spontaneously emerge. Could human language have emerged the same way? Does the robots’ use of meaningful sounds imply consciousness?
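The dynamics behind the ‘talking heads’ can be sketched with a minimal naming game in Python. This is a toy simulation in the spirit of Steels’ language games, not his actual implementation, and it makes simplifying assumptions: a single shared object, random pairings, and invented letter strings standing in for novel sounds. It shows how repeated imitation alone can drive a population to converge on one shared word.

import random
import string

def new_word():
    # Invent a random four-letter string: a stand-in for a novel sound.
    return ''.join(random.choices(string.ascii_lowercase, k=4))

class Agent:
    def __init__(self):
        self.words = []  # candidate names for the (single) shared object

    def speak(self):
        if not self.words:
            self.words.append(new_word())  # invent a name if none is known
        return random.choice(self.words)

    def hear(self, word):
        if word in self.words:
            self.words = [word]   # success: align on this word, drop rivals
            return True
        self.words.append(word)   # failure: adopt the unfamiliar word
        return False

agents = [Agent() for _ in range(20)]
for game in range(5000):
    speaker, hearer = random.sample(agents, 2)
    word = speaker.speak()
    if hearer.hear(word):
        speaker.words = [word]    # speaker also drops competing words

# After enough games the population typically converges on a single word.
print({w for agent in agents for w in agent.words})

Nothing in the code labels any string as ‘correct’; a shared vocabulary emerges purely from local imitation and alignment, which is the sense in which words and meanings arise spontaneously in the robot experiments.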