The Economist - USA (2020-08-08)

Science & technology | Artificial intelligence

Bit-lit

A new language-generating AI can be eerily human-like—for better and for worse
The SEC said, “Musk,/your tweets are a blight./They really could cost you your job,/if you don’t stop/all this tweeting at night.”/
...Then Musk cried, “Why?/The tweets I wrote are not mean,/I don’t use all-caps/and I’m sure that my tweets are clean.”/
“But your tweets can move markets/and that’s why we’re sore./You may be a genius/and a billionaire,/but that doesn’t give you the right to be a bore!”

The preceding lines—describing Tesla and SpaceX founder Elon Musk’s run-ins with the Securities and Exchange Commission, an American financial regulator—are not the product of some aspiring 21st-century Dr Seuss. They come from a poem written by a computer running a piece of software called Generative Pre-Trained Transformer 3. GPT-3, as it is more commonly known, was developed by OpenAI, an artificial-intelligence (AI) laboratory based in San Francisco which Mr Musk helped found. It represents the latest advance in one of the most studied areas of AI: giving computers the ability to generate sophisticated, human-like text.


The software is built on the idea of a “language model”. This aims to represent a language statistically, mapping the probability with which words follow other words—for instance, how often “red” is followed by “rose”. The same sort of analysis can be performed on sentences, or even entire paragraphs. Such a model can then be given a prompt—“a poem about red roses in the style of Sylvia Plath”, say—and it will dig through its set of statistical relationships to come up with some text that matches the description.
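To make the idea concrete, the simplest possible language model, a “bigram” model that looks only one word back, can be sketched in a few lines of code. The snippet below is an illustrative toy, not how GPT-3 itself is built: it counts how often each word follows another in a tiny stand-in corpus, then generates text from a one-word prompt by repeatedly sampling a likely next word.

    import random
    from collections import Counter, defaultdict

    # A tiny stand-in corpus; GPT-3's equivalent runs to billions of pages.
    corpus = "the red rose is red and the red rose is sweet".split()

    # Count how often each word is followed by each other word.
    follows = defaultdict(Counter)
    for current, nxt in zip(corpus, corpus[1:]):
        follows[current][nxt] += 1

    def next_word(word):
        """Sample a next word in proportion to how often it followed `word`."""
        counts = follows[word]
        if not counts:                       # dead end: nothing ever followed this word
            return random.choice(corpus)
        words, weights = zip(*counts.items())
        return random.choices(words, weights=weights)[0]

    def generate(prompt, length=8):
        """Extend a one-word prompt by sampling one likely word at a time."""
        out = [prompt]
        for _ in range(length):
            out.append(next_word(out[-1]))
        return " ".join(out)

    print(generate("red"))  # e.g. "red rose is red and the red rose is"

GPT-3 differs in degree rather than kind: it conditions on a long run of preceding words rather than just one, and its “table” of statistical relationships is implicit in billions of learned parameters rather than a dictionary of word pairs.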
Actually building such a language model, though, is a big job. This is where AI—or machine learning, a particular subfield of AI—comes in. By trawling through enormous volumes of written text, and learning by trial and error from millions of attempts at text prediction, a computer can crunch through the laborious task of mapping out those statistical relationships.
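That “trial and error” can also be sketched at toy scale. The snippet below is again a deliberate caricature rather than OpenAI’s method: it gives the model one adjustable score per pair of words, asks it over and over to predict the word that follows, measures how wrong the prediction was, and nudges the scores so the same mistake is slightly less likely next time.

    import math
    import random

    # Toy training data: (current word, next word) pairs from a tiny corpus.
    pairs = [("red", "rose"), ("red", "rose"), ("red", "car"),
             ("white", "rose"), ("white", "wine"), ("white", "wine")]

    vocab = sorted({w for pair in pairs for w in pair})
    # One adjustable parameter (a score) per (current word, next word) combination.
    params = {(c, n): 0.0 for c in vocab for n in vocab}

    def predict(current):
        """Turn the scores for `current` into next-word probabilities (a softmax)."""
        scores = {n: params[(current, n)] for n in vocab}
        total = sum(math.exp(s) for s in scores.values())
        return {n: math.exp(s) / total for n, s in scores.items()}

    # Trial and error: predict, measure the error, nudge the parameters.
    learning_rate = 0.1
    for step in range(2000):
        current, actual_next = random.choice(pairs)
        probs = predict(current)
        for n in vocab:
            target = 1.0 if n == actual_next else 0.0
            # Gradient of the cross-entropy loss for a softmax model.
            params[(current, n)] -= learning_rate * (probs[n] - target)

    print(predict("red"))  # "rose" should now get the highest probability

Repeated across billions of words of text, broadly this loop of predicting, measuring error and adjusting parameters is what lets a model map out a language’s statistics.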
The more text to which an algorithm can be exposed, and the more complex you can make the algorithm, the better it performs. And what sets GPT-3 apart is its unprecedented scale. The model that underpins GPT-3 boasts 175bn parameters, each of which can be individually tweaked—an order of magnitude larger than any of its predecessors. It was trained on the biggest set of text ever amassed, a mixture of books, Wikipedia and Common Crawl, a set of billions of pages of text scraped from every corner of the internet.

Statistically speaking
The results can be impressive. In mid-July OpenAI gave an early version of the software to selected individuals, to allow them to explore what it could do. Arram Sabeti, an artist, demonstrated GPT-3’s ability to write short stories, including a hard-boiled detective story starring Harry Potter (“Harry Potter, in ratty tweed suit, unpressed shirt and unshined shoes, sits behind the desk looking haggard, rumpled and embittered...”), comedy sketches, and even poetry (including the poem with which this article opens, titled “Elon Musk by Dr Seuss”). Elliot Turner, an AI researcher and entrepreneur, demonstrated how the model could be used to translate rude messages into politer ones, something that might be
