the species as a whole. By contrast, computer system monoculture (i.e., lack of diversity) implies that
systems share vulnerabilities, and a successful attack on one system is likely to succeed on other
systems as well.^46
- Autonomous, in the sense that it classifies and eliminates pathogens and repairs itself by replacing
damaged cells without the benefit of any centralized control mechanism. Given the growing security
burden placed on today’s computer systems and networks, it will be increasingly desirable for these
systems and networks to manage security problems with minimal human intervention.
- Tolerant of error, in the sense that some mistakes in identification of pathogens (false positives or
false negatives) are not generally fatal and do not cause immune system collapse, although they can
cause lingering autoimmune disease. Such tolerance is in part the result of a multilayered design of the
immune system, in which multiple, independently architected layers of defense (“defense in depth”)
operate to provide levels of protection that are not achievable by any single mechanism.^47 Computer
systems are often not so tolerant, and small errors or problems in some part of a system can lead to
significant malfunctions.
- Dynamic, in the sense that pathogen detectors are continually being produced to replace those
that are (routinely) destroyed. These detectors, circulated through the body, provide whole-body pro-
tection and may be somewhat different in each new generation (in that they respond to different
pathogens). Because these detectors turn over, the immune system has greater potential coverage. By
contrast, protection against computer viruses, for example, is based on the notion that all threatening
viruses are already known, and most antiviral systems are unable to cope with a new virus for which no
signature is available (a simple sketch of this detector-generation idea follows this list).
- Capable of remembering (adaptable), in the sense that the immune system can learn about new
pathogens and “remember” how it coped with one pathogen in order to respond more effectively to a
future encounter with the same or a similar pathogen. It can also “forget” about nonself entities that are
incorporated into the body (e.g., food gets turned into body parts). Computer systems must also adapt
to new environments, for example when new software is added legitimately, while also identifying new
threats.
- Imperfect, in the sense that individual pathogen detectors do not identify pathogens perfectly, but
rather respond to a variety of pathogens. Greater specificity is obtained through redundant detection of
a pathogen using different detector types. By contrast, computer security systems that look for precise
signatures of intruders (e.g., viruses) are easily circumvented.
- Redundant, in the sense that multiple and different immune system detectors can recognize a
pathogen. Pathogens generally contain many parts, called epitopes, that are recognized by immune
system detectors; thus, failure to recognize one epitope is not fatal because many others are available for
recognition.
- Homeostatic, in the sense that the immune system can be regarded as one mechanism through
which the human body seeks to maintain a stable internal state despite a changing environment. A
computer system can be designed to autonomously monitor its own activities, routinely making small
corrections to maintain itself in a “normal” state, even in the face of wide variations in inputs, such as
those caused by intruders (a simple sketch of this idea appears at the end of this section).^48
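To make the contrast with signature matching concrete, the following sketch illustrates one way the dynamic, imperfect, and redundant properties above might be mimicked in software: detectors are generated at random, censored against a sample of normal ("self") strings, and then used to flag anomalies through approximate matching (here, agreement in r contiguous positions), so that each detector covers a family of strings rather than a single signature. The bit-string encoding, the matching rule, and all parameter values are illustrative assumptions rather than a description of any particular system.

    import random

    L = 12           # assumed length of the bit strings standing in for behavior samples
    R = 6            # assumed matching threshold: r contiguous equal bits
    N_DETECTORS = 50

    def matches(detector: str, sample: str, r: int = R) -> bool:
        """Approximate match: true if the two strings agree in r contiguous positions."""
        run = 0
        for d, s in zip(detector, sample):
            run = run + 1 if d == s else 0
            if run >= r:
                return True
        return False

    def random_bits(rng: random.Random, length: int = L) -> str:
        return "".join(rng.choice("01") for _ in range(length))

    def generate_detectors(self_set, n: int, rng: random.Random):
        """Negative selection: keep only random candidates that match nothing in 'self'."""
        detectors = []
        while len(detectors) < n:
            candidate = random_bits(rng)
            if not any(matches(candidate, s) for s in self_set):
                detectors.append(candidate)
        return detectors

    def is_anomalous(sample: str, detectors) -> bool:
        """Flag a sample if any one of the (imperfect, redundant) detectors matches it."""
        return any(matches(d, sample) for d in detectors)

    if __name__ == "__main__":
        rng = random.Random(1)
        # Hypothetical "self": normal samples always begin with six 0 bits.
        self_set = ["000000" + random_bits(rng, 6) for _ in range(20)]
        detectors = generate_detectors(self_set, N_DETECTORS, rng)
        print(is_anomalous(self_set[0], detectors))                     # known-normal: False
        print(is_anomalous("111111" + random_bits(rng, 6), detectors))  # novel: typically True

Because the detectors are censored only against observed normal behavior, a genuinely novel sample is likely to be matched by at least one of them, whereas the training samples are, by construction, never flagged.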
At a deeper level, it is instructive to ask whether the particular methods by which the immune
system achieves these characteristics (implements these design principles) have potential relevance to
computer security. To address this issue, deeper and more detailed immunological knowledge is neces-
sary, but some work has been done in this area and is described below.
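As one simple illustration in this spirit, the automated-response idea cited in note 48 can be caricatured as homeostasis through slowdown: a monitor learns which short sequences of system calls are normal for a process and then delays execution in proportion to how anomalous recent behavior looks, rather than killing the process outright. The window length, delay schedule, and the toy trace below are illustrative assumptions, not the mechanism of the published system.

    import time
    from collections import deque

    WINDOW = 4          # assumed length of the system-call sequences in the normal profile
    DECAY = 0.95        # assumed decay applied to the anomaly count after every call
    MAX_DELAY = 1.0     # assumed cap on the per-call delay, in seconds

    class HomeostaticMonitor:
        """Slows a process down in proportion to how far its recent system-call
        behavior deviates from a previously learned profile of normal sequences."""

        def __init__(self):
            self.normal_profile = set()    # set of WINDOW-length tuples of call names
            self.recent = deque(maxlen=WINDOW)
            self.anomalies = 0.0           # exponentially decayed count of recent anomalies

        def learn(self, trace):
            """Record every WINDOW-length sequence seen in a trace of normal behavior."""
            for i in range(len(trace) - WINDOW + 1):
                self.normal_profile.add(tuple(trace[i:i + WINDOW]))

        def observe(self, call: str) -> float:
            """Return the delay (in seconds) to impose before executing this call."""
            self.recent.append(call)
            self.anomalies *= DECAY
            if len(self.recent) == WINDOW and tuple(self.recent) not in self.normal_profile:
                self.anomalies += 1.0
            # The delay grows exponentially with the recent anomaly count, up to a cap.
            return min(MAX_DELAY, 0.01 * (2.0 ** self.anomalies - 1.0))

    if __name__ == "__main__":
        monitor = HomeostaticMonitor()
        monitor.learn(["open", "read", "read", "write", "close"] * 3)   # hypothetical normal trace
        for call in ["open", "read", "read", "write", "exec", "socket", "exec"]:
            delay = monitor.observe(call)
            time.sleep(delay)               # the "small correction": slow down, do not kill
            print(f"{call:>7}  delayed {delay:.3f}s")

Delaying rather than terminating is the homeostatic "small correction": normal behavior proceeds essentially unhindered, while a burst of unfamiliar call sequences slows the process enough to blunt an attack and give other defenses (or an administrator) time to respond.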
(^46) For more discussion of this point, see Computer Science and Telecommunications Board, National Research Council, Computers at Risk: Safe Computing in the Information Age, National Academy Press, Washington, DC, 1991.
(^47) This point suggests that detection mechanisms are biased to be more tolerant of false negatives than false positives, because
threats that are unaffected by one layer (i.e., false negatives) might well be intercepted by another.
(^48) A. Somayaji and S. Forrest, “Automated Response Using System Call Delays,” Journal of Computer Security 6:151-180, 1998.