The Economist - USA (2020-11-21)

Both Tempest and the Franco-German-Spanish Future Combat Air System (FCAS) are currently “optionally manned”. There are several reasons for this, explains Nick Colosimo, a lead engineer at BAE Systems, Tempest’s chief contractor.

One is that eliminating the pilot does not provide much of a saving. The cockpit plus the assorted systems needed to keep a human being alive and happy at high altitude—cabin pressure, for example—contribute only 1-2% of a plane’s weight. A second is that even AI systems of great virtuosity have shortcomings. They tend not to be able to convey how they came to a decision, which makes it harder to understand why they made a mistake. They are also narrowly trained for specific applications and thus fail badly when outside the limits of that training or in response to “spoofing” by adversaries.

An example of this inflexibility is that, at one point in the AlphaDogfight trials, the organisers threw in a cruise missile to see what would happen. Cruise missiles follow preordained flight paths, so behave more simply than piloted jets. The AI pilots struggled with this because, paradoxically, they had beaten the missile in an earlier round and were now trained for more demanding threats. “A human pilot would have had no problem,” observes Chris DeMay, who runs the APL’s part of ACE. “AI is only as smart as the training you give it.”

This matters not only in the context of immediate military success. Many people worry about handing too much autonomy to weapons of war—particularly when civilian casualties are possible. International humanitarian law requires that any civilian harm caused by an attack be no more than proportionate to the military advantage hoped for. An AI, which would be hard to imbue with relevant strategic and political knowledge, might not be able to judge for itself whether an attack was permitted.

Of course, a human being could pilot an uncrewed plane remotely, says Mr Colosimo. But he doubts that communications links will ever be sufficiently dependable, given the “contested and congested electromagnetic environment”. In some cases, losing communications is no big deal; a plane can fly home. In others, it is an unacceptable risk. For instance, FCAS aircraft intended for France’s air force will carry that country’s air-to-surface nuclear missiles.

The priority for now, therefore, is what armed forces call “manned-unmanned teaming”. In this, a pilot hands off some tasks to a computer while managing others. Today’s pilots no longer need to point their radars in the right direction manually, for instance. But they are still forced to accelerate or turn to alter the chances of a shot succeeding, says Colonel Javorsek. Those, he says, “are tasks that are very well suited to hand over”.

One example of such a handover comes from Lockheed Martin, an American aerospace giant. It is developing a missile-avoidance system that can tell which aircraft in a formation of several planes is the target of a particular missile attack, and what evasive actions are needed. At present this requires a human being to interpret several different displays of data.

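To make the idea concrete, here is a minimal sketch, in Python, of one way such a system might guess which jet a missile is homing on: pick the aircraft that the missile’s velocity vector points most directly at. The two-dimensional geometry and all names here are illustrative assumptions, not Lockheed Martin’s actual method, which would fuse far richer sensor data.

```python
# Illustrative sketch: which aircraft in a formation is a missile
# pursuing? Heuristic: the one with the smallest angle between the
# missile's velocity and the line of sight to that aircraft.
import math

def bearing_error(missile_pos, missile_vel, target_pos):
    """Angle (radians) between the missile's velocity and the line of
    sight to a candidate target; near zero suggests pursuit."""
    los = (target_pos[0] - missile_pos[0], target_pos[1] - missile_pos[1])
    dot = missile_vel[0] * los[0] + missile_vel[1] * los[1]
    norm = math.hypot(*missile_vel) * math.hypot(*los)
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def likely_target(missile_pos, missile_vel, formation):
    """Index of the aircraft with the smallest bearing error."""
    return min(range(len(formation)),
               key=lambda i: bearing_error(missile_pos, missile_vel,
                                           formation[i]))

# Example: two jets; the missile is pointed almost straight at jet 1.
formation = [(10.0, 5.0), (10.0, -1.0)]
print(likely_target((0.0, 0.0), (1.0, -0.1), formation))  # -> 1
```
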
Another example is ground-collision avoidance. In 2018 a team led by the American air force, and including Lockheed Martin, won the Collier Trophy, an award for the greatest achievement in aeronautics in America, for its Automatic Ground Collision Avoidance System, which takes control of a plane if it is about to plough into the terrain. Such accidents, which can happen if a pilot experiencing severe g-forces passes out, account for three-quarters of the deaths of F-16 pilots. So far, the system has saved the lives of ten such pilots.

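The core logic of such a system can be sketched in a few lines. What follows is an illustrative Python toy, not the actual Automatic Ground Collision Avoidance System: it assumes flat terrain, a straight-line descent and made-up thresholds, and simply checks whether the predicted time to impact has fallen below a safety margin while the pilot is unresponsive.

```python
# Illustrative sketch of ground-collision avoidance: intervene when
# impact is imminent and the pilot is not flying the plane.
from dataclasses import dataclass

@dataclass
class AircraftState:
    altitude_m: float         # height above terrain
    vertical_speed_ms: float  # negative = descending
    pilot_input: bool         # True if the pilot is actively commanding

RECOVERY_MARGIN_S = 5.0       # assumed minimum time-to-impact before acting

def time_to_impact(state: AircraftState) -> float:
    """Seconds until ground impact on the current descent, or infinity."""
    if state.vertical_speed_ms >= 0:
        return float("inf")   # level or climbing: no impact predicted
    return state.altitude_m / -state.vertical_speed_ms

def should_take_control(state: AircraftState) -> bool:
    """Intervene only when impact is imminent and the pilot is passive,
    e.g. after g-induced loss of consciousness."""
    return time_to_impact(state) < RECOVERY_MARGIN_S and not state.pilot_input

# Example: an unconscious pilot in a 200 m/s dive at 800 m altitude.
state = AircraftState(altitude_m=800, vertical_speed_ms=-200,
                      pilot_input=False)
assert should_take_control(state)  # 4 s to impact -> automatic pull-up
```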

A dog in the fight?
Eventually, DARPA plans to pit teams of two planes against each other, each team being controlled jointly by a human and an AI. Many air forces hope that, one day, a single human pilot might even orchestrate, though not micromanage, a whole fleet of accompanying unmanned planes.

For this to work, the interaction between human and machine will need to be seamless. Here, as Suzy Broadbent, a human-factors psychologist at BAE, observes, the video-game and digital-health industries both have contributions to make. Under her direction, Tempest’s engineers are working on “adaptive autonomy”, in which sensors measure a pilot’s sweat, heart rate, brain activity and eye movement in order to judge whether he or she is getting overwhelmed and needs help. This approach has been tested in light aircraft, and further tests will be conducted next year in Typhoons, fighter jets made by a European consortium that includes BAE.

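As a rough illustration of how adaptive autonomy might work, the Python sketch below fuses the kinds of signals Ms Broadbent describes into a single workload score and hands off routine tasks when it crosses a threshold. Every signal range, weight, threshold and task name here is an invented assumption for exposition, not BAE’s design.

```python
# Illustrative sketch of "adaptive autonomy": fuse pilot biometrics
# into one workload estimate and delegate routine tasks when the
# pilot looks overloaded. All numbers are assumptions.

def clamp01(x: float) -> float:
    """Clip a value into the range [0, 1]."""
    return max(0.0, min(1.0, x))

def workload_score(heart_rate_bpm: float, sweat_gsr_us: float,
                   eeg_engagement: float, pupil_dilation: float) -> float:
    """Weighted sum of normalised biometric signals, in [0, 1]."""
    hr = clamp01((heart_rate_bpm - 60) / 80)  # resting ~60, ceiling ~140 bpm
    sweat = clamp01(sweat_gsr_us / 20)        # galvanic skin response
    brain = clamp01(eeg_engagement)           # assumed pre-normalised index
    eyes = clamp01(pupil_dilation)            # assumed pre-normalised
    return 0.35 * hr + 0.25 * sweat + 0.25 * brain + 0.15 * eyes

HANDOFF_THRESHOLD = 0.7  # assumed trigger level for offering help

def tasks_to_delegate(score: float) -> list[str]:
    """Hand routine tasks to the computer once workload looks too high."""
    if score < HANDOFF_THRESHOLD:
        return []
    return ["radar steering", "defensive countermeasures"]

# Example: a pilot in a high-g engagement, clearly overloaded.
score = workload_score(heart_rate_bpm=155, sweat_gsr_us=18,
                       eeg_engagement=0.9, pupil_dilation=0.8)
print(round(score, 2), tasks_to_delegate(score))  # 0.92 -> both handed off
```
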
Ms Broadbent’s team is also experimenting with novel ways to deliver information to a pilot, from a Twitter-like feed to an anthropomorphic avatar. “People think the avatar option might be a bit ridiculous,” says Ms Broadbent, who raises the spectre of Clippy, a famously irritating talking paper clip that harangued users of Microsoft Office in the 1990s and 2000s. “Actually, think about the information we get from each other’s faces. Could a calming voice or smiling face help?”

Getting humans to trust machines is not a formality. Mr Colosimo points to the example of an automated weather-information service introduced on aircraft 25 years ago. “There was some resistance from the test pilots in terms of whether they could actually trust that information, as opposed to radioing through to air traffic control and speaking to a human.” Surrendering greater control requires breaking down such psychological barriers.

One of the aims of AlphaDogfight, says Mr DeMay, was to do just that by bringing pilots together with AI researchers, and letting them interact. Unsurprisingly, more grizzled stick-jockeys tend to be set in their ways. “The older pilots who grew up controlling the radar angle...see this sort of technology as a threat,” says Colonel Javorsek. “The younger generation, the digital natives that are coming up through the pipeline...trust these autonomous systems.” That is good news for DARPA; perhaps less so for Colonel Javorsek. “These things that I’m doing can be rather hazardous to one’s personal career”, the 43-year-old officer observes, “given that the people who make decisions on what happens to me are not the 25-year-old ones. They tend to be the 50-year-old ones.”