BBC Knowledge AUGUST 2017


in the kitchen. Their robots can watch videos
of people preparing and cooking food, and,
by doing so, learn to perform similar actions.
“We use neural networks to acquire knowledge
for our robots by learning the functionality of
objects,” says Prof Yiannis Aloimonos.
“Can this tool be used for scooping; can this object
be used as a container? Our neural networks look
at many examples and they have been taught to
make geometric calculations. The combination of
deep learning with geometry leads to recognition
of the action being performed.”
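The idea in the quote — a learned affordance score combined with an explicit geometric check — can be sketched in miniature. Everything below is a hypothetical illustration: the function names, features, weights and thresholds are invented for this sketch and are not the team's actual system, and a single logistic unit stands in for a full neural network.

```python
import math

def geometric_container_test(point_heights):
    """Crude geometry check: treat an object as able to hold liquid
    if its rim is higher than its interior (a concave profile)."""
    rim = max(point_heights[0], point_heights[-1])
    interior = min(point_heights[1:-1])
    return rim - interior > 0.5  # depth threshold in arbitrary units

def learned_affordance_score(features, weights, bias):
    """Stand-in for a neural network's output: one linear unit
    squashed to [0, 1] with the logistic function."""
    z = sum(f * w for f, w in zip(features, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def is_container(point_heights, features, weights, bias):
    """Combine the learned score with the geometric test, in the
    spirit of the quote: both must agree before the robot treats
    the object as a container."""
    return (learned_affordance_score(features, weights, bias) > 0.5
            and geometric_container_test(point_heights))

# A bowl-like height profile: high rim, deep interior.
bowl = [3.0, 0.2, 0.1, 0.2, 3.0]
print(is_container(bowl, features=[1.0, 0.8], weights=[2.0, 1.5], bias=-1.0))
```

The point of the combination is that neither signal alone suffices: the learned score generalises from examples, while the geometry rules out objects that merely look like containers in the training data.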
These AIs learn the underlying ‘grammar’
rules of action so that they can achieve their
intended goal without necessarily needing to
perform identical motions. For example,
the rules of stirring using a spoon to repeatedly
mix a liquid in a pot apply to any liquid and
any pot. A simpler AI might only learn how
to use one specific spoon for one specific pot,
containing one specific kind of soup. This higher-
level thinking using such grammar rules is then
combined with a large number of processes that
track and monitor the hands, objects, tools
and their movements, all continuously running
in the background. “All of this implemented in
a robot gives rise to the robots of the future
that ‘understand’ the humans around them,
and learn from them,” explains Aloimonos.
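The action-'grammar' idea described above can be sketched as a rule defined over abstract roles rather than specific objects, so one stirring rule covers any spoon, pot and liquid. The rule format, role names and helper below are illustrative assumptions, not the researchers' actual representation.

```python
# Hypothetical action-'grammar' rule: STIR is written over role
# variables (tool, container, contents), not over specific objects.
STIR_RULE = {
    "action": "stir",
    "roles": {"tool": "scooping-capable",
              "container": "container",
              "contents": "liquid"},
    "body": ["grasp(tool)",
             "insert(tool, container)",
             "repeat(circular_motion(tool))"],
}

def instantiate(rule, bindings):
    """Bind concrete objects to the rule's abstract roles, producing
    a concrete motion plan. Raises KeyError if a role is unbound."""
    plan = []
    for step in rule["body"]:
        for role in rule["roles"]:
            step = step.replace(role, bindings[role])
        plan.append(step)
    return plan

# The same grammar rule yields plans for entirely different objects:
print(instantiate(STIR_RULE, {"tool": "wooden_spoon",
                              "container": "soup_pot",
                              "contents": "tomato_soup"}))
print(instantiate(STIR_RULE, {"tool": "ladle",
                              "container": "saucepan",
                              "contents": "gravy"}))
```

This is what separates the grammar-based learner from the simpler AI in the text: the simpler system would store only one fully bound plan, while the rule above generalises across bindings.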
Baomar thinks this form of robotic learning
can find countless practical applications.
“I believe that, if we give robots the ability to learn
from humans or even from other systems,
the outcome should be intelligent robots that
are capable of learning a wide spectrum of skills,
ranging from domestic chores to performing
surgery and flying complex machines,” he says.

COME WITH ME IF YOU WANT TO LOVE
So the robots of the future are likely to be capable
of learning and performing complex, highly
skilled tasks. But how about emotions? Humans
are complex creatures, unpredictable and often
not entirely rational. Our emotions are just as
important as our intellect in driving our actions.
Affective computing – software that recognises
and interprets our emotions – and human-computer
interaction have started to enable
AIs to detect emotions.
“We know from years of research that emotional
intelligence is a crucial component of human
intelligence,” says Dr Rana el Kaliouby, CEO of
artificial intelligence company Affectiva. “People
who have a higher Emotional Quotient [EQ] lead
more successful professional and personal lives,
are healthier, and even live longer.”
Affectiva is using deep learning, a special kind
of neural network containing many layers of
neurons, to enable computers to detect our

ABOVE: Robot chefs could spell the end of sweating over a hot stove

FACING PAGE: Dr Rana el Kaliouby demonstrates emotion-sensing technology used by her company's artificial intelligence