New Scientist - USA (2019-06-15)

Ecology

Salamander-eating plants

Pitcher plants in Canada seem to regularly dine on young amphibians

Michael Le Page

Photo: Patrick D. Moldowan




THE salamanders of Canada have
an unlikely major predator: the
pitcher plant. A survey between late
August and mid-September 2018
revealed that a fifth of the pitchers
in one bog in Ontario’s Algonquin
Provincial Park had caught at
least one juvenile salamander.
“That was a WTF moment,”
says Alex Smith at the University
of Guelph in Canada. His team
estimates that the pitchers may
kill up to 5 per cent of the area’s
juvenile salamanders.
Pitcher plants are famous for
feeding on insects. The juvenile
salamanders might be falling in
by accident or entering to feed
on trapped insects (The Scientific
Naturalist, doi.org/c63x).
The team now plans to confirm
whether the plants actually digest
the salamanders, rather than being
overwhelmed by the glut of
nutrients. Large tropical
pitchers occasionally catch rodents
and birds, but this seems rare. ❚

Technology

AI needs more energy than five cars

Donna Lu

ARTIFICIAL intelligence is an
energy-intensive technology.
New estimates suggest that the
carbon footprint of creating a
single AI is equivalent to as much
as 284 tonnes of carbon dioxide,
five times the lifetime emissions
of an average car.
Emma Strubell at the University
of Massachusetts Amherst and
her colleagues have assessed the
energy consumption required
to train four large neural networks
used for processing language.
Language-processing AIs
underpin the algorithms that
power Google Translate as well as
text generators, which can write
fake news articles when given a
few lines of text (see page 15).

These AIs are trained via
deep learning, which involves
processing vast amounts of data.
“In order to learn something as
complex as language, the models
have to be large,” says Strubell.
A common approach involves
giving an AI billions of written
articles to show it the meaning
of words and how sentences are
constructed. To measure the
environmental impact of this
approach, Strubell’s team trained
four different AIs – Transformer,
ELMo, BERT and GPT-2 – for
a day each, and sampled their
power consumption.
They calculated the total energy
required to train each AI by
multiplying this figure by the total
training time reported by
the developers of each model.
A carbon footprint was then
estimated based on the average
carbon emissions of power
production in the US.
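The estimate described above reduces to one line of arithmetic: sampled average power, times reported training time, times a grid emission factor. A minimal sketch follows; the 1.5 kW power draw and the 0.433 kg of CO2 per kWh grid factor are illustrative assumptions, not figures from the study, while the 84-hour Transformer training time is the one reported in the article.

```python
def training_footprint_kg(avg_power_kw: float,
                          training_hours: float,
                          kg_co2_per_kwh: float = 0.433) -> float:
    """Estimate training CO2 in kg: energy (kWh) x grid carbon intensity.

    The default 0.433 kg CO2 per kWh is an assumed US-average grid
    figure, not the value used in the study.
    """
    energy_kwh = avg_power_kw * training_hours
    return energy_kwh * kg_co2_per_kwh

# Hypothetical rig drawing 1.5 kW on average for the 84 hours the
# article reports for training Transformer without architecture search.
print(round(training_footprint_kg(1.5, 84), 1))
```

Swapping in a region-specific emission factor, or a cloud provider's actual energy mix, changes only the last multiplication, which is the point the researchers make about accuracy at the end of the piece.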
A process called neural
architecture search, which
produces accurate AIs by
automating the design, was
particularly energy-intensive
and time-consuming. Training
Transformer without this process
takes 84 hours, but more than
270,000 hours with it, requiring
3000 times the amount of energy.
Such training is split over dozens
of chips, so takes months to
complete rather than years.
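The gap between that 270,000-hour total and the actual wall-clock time comes from parallelism. A quick sketch of the arithmetic; the 64-chip count is an assumption standing in for the article's "dozens of chips":

```python
# Total training time is accumulated across chips running in
# parallel, so elapsed time is the total divided by the chip count.
total_chip_hours = 270_000   # training time with architecture search
chips = 64                   # assumed degree of parallelism

wall_clock_hours = total_chip_hours / chips
wall_clock_days = wall_clock_hours / 24

print(round(wall_clock_days, 1))
```

With these assumed numbers the job finishes in roughly six months, matching the article's "months rather than years".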
The inefficiency stems from
the need to fine-tune the model
for very specific tasks, such as
translating from one language
to another, says Strubell.
Big tech firms such as Amazon
and Google offer cloud-based
platforms that researchers can
pay to use remotely for training
AIs. To get a more accurate
picture of the associated carbon
footprint, the analysis would have
to account for the actual energy
mix these companies use. ❚
