of fashion in today's world of large data sets. That is even though it is embedded in one of our most powerful tools for dealing with them, one that is increasingly seen as fundamental to data-driven science: Bayesian inference.

Invented by Thomas Bayes – another religious man, this time a Presbyterian minister – in the 18th century, Bayesian inference is a tool that allows us to update a prior belief in a model, theory or explanation as new information comes in. Think of two dice, one six-sided and the other 60-sided. Suppose I throw one of these. I don't tell you which, but reveal that it landed on a 4. You still don't know which die I threw, but Bayesian inference provides a mathematical framework through which you can state that it is far more likely to have been the six-sided die (10 times as likely, in fact) – purely because there are so many more numbers that the 60-sided die could have produced.

Simple theories or models, such as Copernicus's heliocentric solar system, are like the six-sided die: they make sharp predictions. Complex theories or models, such as Ptolemy's model of everything orbiting our planet, are like the 60-sided die, making looser predictions that can fit a wider range of data. When we acquire information that fits with both a simple and a complex model, Bayesian inference, a mathematical embodiment of Occam's razor, urges us to accept the simpler option, because it is more likely to be the source of the data.

That goes against the grain in my own field of systems biology, which deals with modelling complex biological systems. The discipline has come of age since 2000, when the first draft of the human genome sequence was unveiled. At first, the promised new era of medicine informed by knowledge of our genome seemed slow in coming. The finger of blame was pointed at the way biologists treated genes in isolation, rather than as components of complex, dynamic systems. My field galloped to the rescue by providing complex mathematical models of multiple genes and their innumerable interactions. But a problem then arose: where do you stop? Should models include 10 genes, 100, 1000, 10,000 – or the entire human genome?

My own interest in Occam's razor was piqued about 10 years ago, when one of the founders of systems biology, my colleague and friend Hans Westerhoff, presented a seminar at Surrey entitled "No Occam's razor for systems biology". He argued that models of life's workings needed to be as complex as possible to capture the high-level emergent properties that depend on interactions between genes and their products.

I have come to disagree. Although complex pathways and interactions certainly exist in living cells, unless we have evidence that their presence is needed to account for the data we see, we should eliminate them from our models: otherwise, we risk filling them with experimental noise. Together with my colleagues Katharina Nöh and Axel Theorell at the Jülich Research Centre in Germany, I am part of a team developing tools that apply the razor to slice through millions of candidate models of metabolism to find the simplest that work.

Truth in simplicity

My debate with Westerhoff and others continues, but my exploration of the impact of William's logic has convinced me that Occam's razor isn't just a tool of science – it is science. Whether we are building bridges using Newtonian mechanics or employing our understanding of the genetic code to make covid-19 vaccines, science is essentially the search for the simplest models. To find them, and to develop concepts and technologies from them, we use additional tools, such as experimentation, mathematics and logic.

But none of these tools is unique to science. Cooks experiment with new recipes, just as musicians experiment with harmonies, while mathematics and logic are as essential to accountants as they are to physicists. And using the tools doesn't make something a science. Despite centuries of experimentation, alchemy didn't develop into a science, because its "theories" were junkyards of entities beyond necessity. Astrologers have wasted centuries using mathematics to make useless predictions.

Some people cite Karl Popper's "falsifiability" criterion – that scientific theories can be disproved – as what distinguishes science from, say, religion. But as well as being equally applicable to many human activities, such as law, falsifiability doesn't work: it is as impossible to disprove a hypothesis as it is to prove one. The best we can do is compare the probabilities of rival hypotheses. And it is in that comparison that simplicity, as embodied in Occam's razor, has always provided the best guide. ❚
Johnjoe McFadden is author of
Life Is Simple, a book on Occam’s
razor. He is still cutting edge
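The two-dice example discussed above can be made concrete with a short calculation. The following Python sketch is purely illustrative – it is not from the article – and simply applies Bayes' rule with an equal prior belief in each die:

```python
# Bayes' rule for the two-dice example: one 6-sided die, one 60-sided die.
# One of them (we are not told which) is thrown and lands on a 4.
# All names and the equal-prior assumption here are illustrative.

# Equal prior belief in each die before the throw is revealed.
prior_d6, prior_d60 = 0.5, 0.5

# Likelihood of seeing a 4 from each die.
like_d6 = 1 / 6     # a fair 6-sided die
like_d60 = 1 / 60   # a fair 60-sided die

# Posterior is proportional to prior times likelihood (Bayes' rule).
unnorm_d6 = prior_d6 * like_d6
unnorm_d60 = prior_d60 * like_d60
total = unnorm_d6 + unnorm_d60

post_d6 = unnorm_d6 / total    # ~0.909: probability it was the 6-sided die
post_d60 = unnorm_d60 / total  # ~0.091: probability it was the 60-sided die

print(round(post_d6 / post_d60))  # the sharper "model" is 10 times as likely
```

With equal priors, the posterior odds reduce to the ratio of the likelihoods, 1/6 against 1/60 – which is why the six-sided die comes out 10 times as likely, mirroring the razor's preference for the model that makes the sharper prediction.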
18/25 December 2021 | New Scientist | 71