New Scientist - USA (2020-04-04)

News

Artificial intelligence

Mind-reading AI turns thoughts
into words using a brain implant

Jason Arunn Murugesu

AN ARTIFICIAL intelligence can
accurately translate thoughts
into sentences, at least for a
limited vocabulary of 250 words.
The system may bring us a step
closer to restoring speech to
people who have lost the ability
because of paralysis.
Joseph Makin at the University
of California, San Francisco, and
his colleagues used deep learning
algorithms to study the brain
signals of four women as they
spoke. The women, who all
have epilepsy, already had
electrodes attached to their
brains to monitor seizures.
Each woman was asked to read
aloud from a set of sentences as
the team measured brain activity.
The largest group of sentences
contained 250 unique words.
The team fed this brain activity
to a neural network algorithm,
training it to identify regularly
occurring patterns that could
be linked to repeated aspects
of speech, such as vowels or
consonants. These patterns
were then fed to a second neural
network, which tried to turn them
into words to form a sentence.
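This two-stage design is essentially an encoder-decoder network. The sketch below illustrates the idea in PyTorch; the channel count, layer sizes and variable names are illustrative assumptions, not details of the team's model.

```python
# Minimal sketch of a two-stage (encoder-decoder) speech decoder.
# Shapes and hyperparameters are illustrative assumptions, not the
# published model's.
import torch
import torch.nn as nn

VOCAB_SIZE = 250      # unique words in the largest sentence set
N_ELECTRODES = 64     # assumed number of recording channels
HIDDEN = 128          # assumed hidden size

class BrainToText(nn.Module):
    def __init__(self):
        super().__init__()
        # Stage 1: find regularly occurring patterns in the neural signal
        self.encoder = nn.GRU(N_ELECTRODES, HIDDEN, batch_first=True)
        # Stage 2: turn those patterns into a sequence of word IDs
        self.decoder = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.to_word = nn.Linear(HIDDEN, VOCAB_SIZE)

    def forward(self, neural, max_words=10):
        # neural: (batch, time_steps, N_ELECTRODES)
        features, state = self.encoder(neural)
        # Feed the encoder's summary state into the decoder and
        # read out one word score vector per decoder step.
        steps = torch.zeros(neural.size(0), max_words, HIDDEN)
        out, _ = self.decoder(steps, state)
        return self.to_word(out)   # (batch, max_words, VOCAB_SIZE)

model = BrainToText()
fake_recording = torch.randn(1, 500, N_ELECTRODES)  # 500 time steps
word_logits = model(fake_recording)
print(word_logits.shape)  # torch.Size([1, 10, 250])
```

In the real system the decoder is trained against the transcripts of the spoken sentences; the sketch only shows how multi-channel recordings become a sequence of word scores.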

Each woman repeated the
sentences at least twice, and the
final repetition didn’t form part
of the training data, allowing the
researchers to test the system.
Each time a person speaks the
same sentence, the associated brain
activity will be similar but not
identical. “Memorising the brain
activity of these sentences
wouldn’t help, so the network
instead has to learn what’s similar
about them so that it can generalise
to this final example,” says Makin.
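That evaluation scheme, holding out the final repetition of every sentence and training on the rest, can be written in a few lines. The sketch below assumes the trials are simply an ordered list of (sentence, recording) pairs; it is an illustration, not the team's pipeline.

```python
# Sketch of the evaluation split described above: the final repetition
# of each sentence is held out for testing, so the network must
# generalise across repetitions rather than memorise one recording.
# `trials` is an assumed list of (sentence_id, brain_activity) pairs
# in the order they were recorded.

def split_by_last_repetition(trials):
    train, test, last_index = [], [], {}
    # Find the index of the final repetition of each sentence.
    for i, (sentence_id, _) in enumerate(trials):
        last_index[sentence_id] = i
    held_out = set(last_index.values())
    for i, trial in enumerate(trials):
        (test if i in held_out else train).append(trial)
    return train, test

trials = [("sentence_a", "recording_1"), ("sentence_a", "recording_2"),
          ("sentence_b", "recording_3"), ("sentence_b", "recording_4")]
train, test = split_by_last_repetition(trials)
print(len(train), len(test))  # 2 2
```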
Across the four women, the AI’s
best performance was an average
translation error rate of 3 per
cent (Nature Neuroscience, DOI:
10.1038/s41593-020-0608-8).
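An error rate like this is normally a word error rate: the number of word insertions, deletions and substitutions needed to turn the decoded sentence into the spoken one, divided by the spoken sentence's length. The function below is the standard textbook calculation of that measure, on the assumption that the paper uses the usual definition.

```python
# Sketch of a word error rate, the usual way a "translation error rate"
# like the 3 per cent figure is computed: count the word insertions,
# deletions and substitutions needed to turn the decoded sentence into
# the spoken one, then divide by the spoken sentence's length.

def word_error_rate(spoken, decoded):
    ref, hyp = spoken.split(), decoded.split()
    # Standard edit-distance (Levenshtein) table over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("tina turner is a singer",
                      "tina turner was a singer"))  # 0.2
```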
Makin says that using a small
number of sentences made it
easier for the AI to learn which
words tend to follow others.
For example, the AI was able to
decode that the word “Turner”
was always likely to follow the
word “Tina” in this set of sentences,
from brain activity alone.
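The regularity Makin describes is the kind that a simple word-pair count captures. The toy example below shows how a small, fixed sentence set makes the next word highly predictable; the sentences are invented for illustration and are not the study's actual set.

```python
# Toy illustration of how a small, fixed sentence set makes the next
# word highly predictable: count which word follows which across the
# training sentences, then look up the most likely successor.
from collections import Counter, defaultdict

sentences = [
    "tina turner is a pop singer",       # illustrative sentences, not
    "the thieves stole thirty jewels",   # the study's actual set
    "tina turner sang last night",
]

follow_counts = defaultdict(Counter)
for sentence in sentences:
    words = sentence.split()
    for first, second in zip(words, words[1:]):
        follow_counts[first][second] += 1

# Within this set, "turner" always follows "tina".
print(follow_counts["tina"].most_common(1))  # [('turner', 2)]
```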

The team tried decoding the
brain signal data one word at a
time, rather than as whole
sentences, but this increased the
error rate to 38 per cent even for
the best performance. “So the
network clearly is learning facts
about which words go together,
and not just which neural activity
maps to which words,” says Makin.
This will make it hard to scale up
the system to a larger vocabulary
because each new word increases
the number of possible sentences,
reducing accuracy.
Makin says 250 words could
still be useful for people who can’t
talk. “We want to deploy this in
a patient with an actual speech
disability,” he says, although it
is possible their brain activity
may be different from that of
the women in this study, making
this more difficult.
Sophie Scott at University
College London says we are a long
way from being able to translate
brain signal data comprehensively.
“You probably know around
350,000 words, so it’s still an
incredibly restricted set of speech
that they’re using,” she says. ❚

An AI is using brain activity
patterns to predict the words
we are thinking of saying
AGEFOTOSTOCK/ALAMY

Biotechnology

Soya plus cow cells
makes artificial beef
with a meaty texture

LAB-grown “beef” is being made
by culturing cow muscle cells
within a spongy scaffold of soya
bean protein.
Prototypes of this cultured
meat have passed initial taste tests,
says developer Shulamit Levenberg
at Aleph Farms in Ashdod, Israel.
The idea behind cultured beef
is that it could be as tasty as real
meat without any animals having
to be killed. It may also be better
for the environment, although
this isn’t clear.
Cultured meat development
has taken off in the past few years,
with about 50 companies now
attempting to perfect a recipe.
A few have got to the stage
of creating prototype samples
for tasting, but nothing is yet
on offer in shops or restaurants.
Aside from the high cost of
growing biological tissue in a
dish, one problem is that meat
doesn’t just consist of muscle
cells. In animal flesh, these cells
sit within a supporting scaffold
of extracellular protein, which
has to be mimicked to give the
product a similar texture to real
beef. “You want to recreate the
tissue as it is in the animal,” says
Elliot Swartz at the Good Food
Institute in Washington DC.
At the moment, cultured meat
uses a scaffold that is often derived
from beef gelatin, a collagen protein
obtained by boiling carcasses from
slaughterhouses. This is a problem if
vegetarians are your target market.

Now Aleph Farms may have
found an alternative: textured soya
protein, which is a by-product of
soya-bean oil manufacture and is
already used in many vegetarian
substitutes for meat.
The team grew cow muscle
and blood vessel cells on a spongy
scaffold of soya protein, then baked
or fried small morsels of the fake
meat (Nature Food, DOI: 10.1038/
s43016-020-0046-5).
Three volunteers who tasted
the cultured meat said it replicated
“the sensation and texture of a
meat bite”, the researchers said. ❚
Clare Wilson
