have always been the same, and
therefore the clues to the past lie
in the present. However, while
Hutton’s insights concerning the
antiquity of the planet rang true
to geologists, there was still no
satisfactory method of determining
just how old the planet was.
An experimental approach
Since the end of the 18th century,
scientists had recognized that
Earth’s crust comprises successive
layers of sedimentary strata.
Geological mapping of these strata
revealed that, cumulatively, they are
very thick and that many contain the
fossil remains of the organisms
that lived in their respective
depositional environments. By
the 1850s, the geological column
of strata (also known as the
stratigraphic column) had been
more or less carved up into some
eight named systems of strata and
fossils, each of which represented
a period of geological time.
Geologists were impressed by
the overall thickness of the strata,
estimated at 16–70 miles
(25–112 km). They had
observed that the processes of
erosion and deposition of the rock
materials that make up such strata
were very slow—estimated to be
a few inches (centimeters) every
100 years. In 1858, Charles Darwin
made a somewhat ill-judged foray
into the debate when he estimated
that it had taken some 300 million
years for erosion to cut through the
Tertiary and Cretaceous period
rocks of the Weald in southern
England. In 1860, John Phillips,
a geologist at Oxford University,
estimated that Earth is about
96 million years old.
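The reasoning behind such figures can be sketched with a back-of-the-envelope sum (an illustration, not any particular geologist’s calculation): dividing the estimated thickness of the column by the observed deposition rate gives a minimum span of time. The 5-centimeters-per-century rate below is one assumed reading of the “few inches (centimeters) every 100 years” quoted above.

```python
# Back-of-the-envelope sum behind the sediment-based age estimates.
# The deposition rate is an assumed illustrative value; the figures
# geologists actually used varied widely.
thickness_km = (25, 112)        # estimated cumulative thickness of the strata
rate_cm_per_year = 5 / 100      # roughly 5 cm per century

for t_km in thickness_km:
    years = (t_km * 100_000) / rate_cm_per_year   # convert km to cm, then divide
    print(f"{t_km} km of strata: about {years / 1e6:.0f} million years")

# Prints roughly 50 and 224 million years: the same order of magnitude
# as Darwin's 300-million-year Weald figure and Phillips's 96 million.
```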
But in 1862, such geological
calculations were scorned by the
eminent Scottish physicist William
Thomson (Lord Kelvin) for being
unscientific. Kelvin was a strict
empiricist and argued that he
could use physics to determine
an accurate age for Earth, which
he thought was constrained by
the age of the Sun. Understanding
of Earth’s rocks, their melting
points and conductivity, had vastly
improved since Buffon’s day.

Lord Kelvin pronounced the world to
be 40 million years old in 1897, the year
in which radioactivity was discovered.
He did not know that radioactive decay
in Earth’s crust provides heat that
greatly slows the rate of cooling.

Kelvin took Earth’s initial temperature at
7,000°F (3,900°C) and applied the
observation that temperature
increases as you go downward from
the surface—by about 1°F (0.5°C)
over every 50 ft (15 m) or so. From
this, Kelvin calculated that it had
taken 98 million years for Earth to
cool to its present state, a figure he
later reduced to 40 million years.
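A minimal sketch of Kelvin’s style of argument, using the figures quoted above plus an assumed thermal diffusivity for rock (the text does not give the value Kelvin actually used): for a uniform body cooling by conduction, the surface temperature gradient shrinks with time, so the observed gradient fixes the age.

```python
# Sketch of a conductive-cooling age in Kelvin's manner (illustrative,
# not his exact working). For a half-space cooling from an initial
# temperature T0 with surface gradient G: t = T0**2 / (pi * kappa * G**2).
import math

T0 = 7000 * 5 / 9                  # initial temperature excess in deg C (7,000 deg F)
G = (5 / 9) / (50 * 0.3048)        # gradient in deg C per meter (1 deg F per 50 ft)
kappa = 1.2e-6                     # thermal diffusivity of rock, m^2/s (assumed value)

t_seconds = T0 ** 2 / (math.pi * kappa * G ** 2)
t_years = t_seconds / (365.25 * 24 * 3600)
print(f"Cooling age: about {t_years / 1e6:.0f} million years")
# About 96 million years with these inputs, close to Kelvin's 98 million.
```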
A radioactive “clock”
Such was Kelvin’s prestige that
his measure was accepted by most
scientists. Geologists, however,
were left feeling that 40 million
years was simply not long enough
for the observed rates of geological
processes, accumulated deposits,
and history. Yet they
had no scientific method with
which to contradict Kelvin.
In the 1890s, the discovery
of naturally occurring radioactive
elements in some of Earth’s
minerals and rocks provided the
key that would resolve the impasse
between Kelvin and the geologists,
since the rate at which atoms
decay makes a reliable timer.
In 1903, Ernest Rutherford
predicted rates of radioactive
decay and suggested that
radioactivity might be used as
a “clock” to date minerals and
the rocks that contain them.
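The principle of such a clock can be sketched with the standard decay equation (a simplified illustration, not necessarily the procedure Rutherford followed, and the measured ratio below is a made-up value): parent atoms decay exponentially, so the ratio of accumulated daughter atoms to surviving parent atoms gives the time since the mineral formed.

```python
# Minimal decay-clock sketch: if a mineral starts with only parent atoms,
# N(t) = N0 * exp(-lambda * t), so t = ln(1 + daughter/parent) / lambda.
# The half-life is that of uranium-238; the measured ratio is hypothetical.
import math

half_life_years = 4.47e9                      # uranium-238 half-life
decay_constant = math.log(2) / half_life_years
daughter_per_parent = 0.08                    # hypothetical measured ratio

age_years = math.log(1 + daughter_per_parent) / decay_constant
print(f"Mineral age: about {age_years / 1e6:.0f} million years")
# About 496 million years for this ratio, the same order as Rutherford's
# Glastonbury dates.
```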
In 1905, Rutherford obtained
the very first radiometric dates
of formation for a mineral from
Glastonbury, Connecticut: 497–500
million years. He warned that these
were minimum dates. In 1907,
American radiochemist Bertram
Boltwood improved on Rutherford’s
technique to produce the first
radiometric dates of minerals in
rocks with a known geological
context. These included a
2.2-billion-year-old rock from
Sri Lanka, whose age increased
previous estimates by an order
of magnitude.
“The mind seemed to grow giddy by looking so far into the abyss of time.”
John Playfair