New Scientist - USA (2022-03-19)


Technology

A simple maths trick makes training AI more efficient

Matthew Sparkes

ARTIFICIAL intelligence is growing ever more capable of handling increasingly complex tasks, but it is computationally intensive to develop. A more efficient technique could save up to half the time, energy and computing power needed to train an AI model.
Deep learning models are
typically composed of a huge
grid of artificial neurons linked
by “weights” – computer code
that takes an input and passes
on a changed output – that
represent synapses linking
real neurons. By tinkering with
these weights over thousands or
millions of trials, it is possible to
gradually train a model to carry
out a task, such as identifying
a person from a picture of
their face or digitising text
from an image of handwriting.
This training usually relies on
an iterative process of passing
data in, assessing the quality of
the output and then calculating
a gradient that informs how
the weights should be altered
to improve performance. This
involves passing data from one
side of the neural network to the
other, via every link in the chain
of artificial neurons, and then working back to the beginning to calculate the gradient.
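
As an illustration of this loop, here is a minimal sketch of a single back-propagation training step, written in JAX. The tiny linear model, the data shapes and the learning rate are illustrative assumptions, not details of the systems described in this article.

import jax
import jax.numpy as jnp

def loss(w, x, y):
    # A tiny linear model standing in for a deep network.
    return jnp.mean((x @ w - y) ** 2)

# jax.grad runs the forward pass, then works back through every
# link in the chain to compute the gradient (reverse-mode autodiff).
grad_fn = jax.grad(loss)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 4))  # a batch of input data
y = jnp.ones(32)                     # target outputs
w = jnp.zeros(4)                     # the model's weights

for step in range(100):
    g = grad_fn(w, x, y)  # forward pass, then backward pass
    w = w - 0.1 * g       # nudge the weights to improve performance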
Atılım Güneş Baydin at the
University of Oxford and his
colleagues have now taken
this two-stage process, known
as back-propagation, and
reduced it to just one, where an
approximation of the gradient
close enough to be effective is
calculated during the first pass,
making the second redundant.
In theory, it could slash the time needed to train AI models in half. The team ran numerous
tests with back-propagation and
their new approach, each for the
same number of iterations, and
found that the performance
of the AI was comparable
(arxiv.org/abs/2202.08587).
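
To make the idea concrete, here is a minimal sketch of that one-pass gradient estimator in JAX, applied to the same toy model as above. Sampling a random direction and scaling it by the directional derivative follows the approach in the paper; the model, names and learning rate are illustrative assumptions.

import jax
import jax.numpy as jnp

def loss(w, x, y):
    # The same tiny linear model as in the earlier sketch.
    return jnp.mean((x @ w - y) ** 2)

def forward_gradient(w, x, y, key):
    # Pick a random direction v in weight space.
    v = jax.random.normal(key, w.shape)
    # A single forward pass yields both the loss and its directional
    # derivative along v (a Jacobian-vector product); no backward pass.
    value, dir_deriv = jax.jvp(lambda w_: loss(w_, x, y), (w,), (v,))
    # Scaling v by that derivative gives an unbiased approximation
    # of the true gradient, close enough to train on.
    return value, dir_deriv * v

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 4))
y = jnp.ones(32)
w = jnp.zeros(4)

for step in range(100):
    key, sub = jax.random.split(key)
    value, g = forward_gradient(w, x, y, sub)
    w = w - 0.1 * g  # same update rule, cheaper gradient estimate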
Andrew Corbett at the
University of Exeter, UK, says
that calculating the gradient
in the forward pass is “a simple
mathematical trick” but has
the potential to solve one of
the largest problems facing AI researchers: the increasingly high demands of computation.
“It’s a very, very important
thing to solve, because it’s the
bottleneck of machine learning
algorithms,” says Corbett.
Cutting-edge AI research
relies on vast models with
hundreds of billions of
parameters. Training these can
occupy huge supercomputers
for weeks or months at a time.
One of the largest neural
networks currently operating,
the Megatron-Turing Natural
Language Generation model,
has 530 billion parameters and
was trained on Nvidia’s Selene
supercomputer, which has
560 powerful servers and 4480
high-end graphics cards, each
costing thousands of pounds
when bought commercially.
Despite the huge power of
that machine, it took more than 
a month to train the model.
Güneş Baydin says the best-
case scenario is that this new
approach slashes the time taken
to train AI models in half, but
that is far from guaranteed. He
says time will tell what results
other researchers see when it is
tested across a range of models.
“You can run one iteration
of optimisation faster with
our algorithm, but it doesn’t
automatically mean you can
get the same result twice as fast,
because there are other things
involved,” he says. “It might
do a worse job than the
back-propagation algorithm
in some cases, and it might need
more iterations to achieve the
same quality of training. And
if that happens, maybe it can
end up like losing all your
competition advantage.”  ❚

YUICHIRO CHINO/GETTY IMAGES

Back-propagation is
an intensive technique
used to train AI models

50%
Potential savings in time, energy
and computing power to train AI



Animal behaviour

Common toads surprise biologists by climbing trees

Michael Le Page

THE common toad doesn’t look like
a good climber, yet citizen surveys
suggest that the amphibians often
climb trees to hide in hollows.
“The people who do surveying
for bats were like, ‘Oh yeah, we do
find toads from time to time’. But nobody working with toads knows this,” says Silviu Petrovan at the University of Cambridge.
The finding emerged from
a dormouse monitoring scheme
run by the People’s Trust for
Endangered Species (PTES) in the
UK. The nesting boxes are typically
placed at least a metre above
the ground on tree trunks, so
small animals can only get into
them by climbing the trees. In
2016, a volunteer monitoring the
nesting boxes found a toad in one
and asked why it was there.
Petrovan and his team couldn’t
find any published reports of toads
climbing trees, so they asked other
volunteers with the PTES dormouse
scheme if they had seen any
amphibians. Sure enough, some
had kept records of finding toads,
even though they hadn’t been
asked to (bioRxiv, doi.org/hkqg).
Petrovan also looked at the
animals found in tree hollows as
recorded by another UK initiative,
the Bat Tree Habitat Key project.
Altogether, his team has now
found around 50 reports of
amphibians in trees, almost all of
them common toads (Bufo bufo).
Why toads climb trees isn’t
clear, but it may help them
avoid predators and parasites,
says Petrovan.  ❚


HENRY ANDREWS

The European
common toad
(Bufo bufo)