The Washington Post | Tuesday, November 12, 2019

What self-driving cars can’t recognize may be a matter of life and death

Engineers must program vehicles to understand differences in behavior

BY FAIZ SIDDIQUI


SAN FRANCISCO — To understand the complexity of programming self-driving cars, experts on autonomous vehicles say, consider a deer, a moose or a cow — all rust-colored, four-legged mammals you might find roaming along the side of the road.

Human drivers know they behave differently. A machine might not.

“If the system was never trained on that, it would recognize, ‘Oh, there’s something here,’ ” said Danny Shapiro, senior director of automotive for NVIDIA, which is developing self-driving technology. But “it has no idea how it’s going to behave or what it’s going to do.”


Distinguishing between animals that could run into the road is part of the constant engineering struggle to identify and teach these types of differences to vehicles powered by artificial intelligence.

Companies including Alphabet-owned Waymo, General Motors’ Cruise division and Lyft-affiliated Aptiv have been racing to train their vehicles to drive themselves in Silicon Valley, as well as Phoenix, Las Vegas and other cities nationwide. The technology is envisioned to eliminate the need for car ownership and revolutionize the way people get around — something particularly helpful with an aging population.

But the inherent risks are numerous, and the vast amount of knowledge needed to train the vehicles is daunting.
The reality of how little some of these vehicles know came to light last week as an investigation by the National Transportation Safety Board of a deadly Uber crash revealed that the car was unable to distinguish a person from a vehicle or bicycle and that it wasn’t programmed to know that pedestrians might jaywalk.

Autonomous vehicles use a combination of radar, lidar — complex sensors that use laser lights to map the environment — and high-definition cameras to map their surroundings. When the cars meet a new object, the images are rapidly processed by the car’s artificial intelligence based upon a vast trove of reference images of similarly labeled objects to figure out how to react.
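To make that idea concrete, here is a minimal, hypothetical sketch in Python of the classify-then-look-up-behavior pattern the experts describe: a label produced by perception is mapped to an expected-behavior prior, and an object the system was never trained on falls through to a generic default, which is Shapiro's point about recognizing "something" without knowing what it will do. The class names, priors and numbers below are invented for illustration, not any company's actual code.

```python
# Hypothetical sketch: map a perception label to a behavior prior.
# Everything here (labels, speeds, structure) is illustrative only.

from dataclasses import dataclass

@dataclass
class BehaviorPrior:
    top_speed_mps: float      # how fast this kind of object can plausibly move
    may_enter_roadway: bool   # whether planning should expect it in the lane
    typical_motion: str       # crude tag used to pick a prediction model

# Hand-written priors for a few labels the classifier was trained on.
BEHAVIOR_PRIORS = {
    "deer":  BehaviorPrior(13.0, True, "sudden darting"),
    "moose": BehaviorPrior(15.0, True, "unpredictable charge"),
    "cow":   BehaviorPrior(7.0,  True, "slow wandering"),
}

def behavior_for(label: str) -> BehaviorPrior:
    """Return the behavior prior for a classified object.

    If the label was never in the training set, the system only knows
    'there is something here' and falls back to a conservative default;
    it has no real idea how the object will behave.
    """
    return BEHAVIOR_PRIORS.get(
        label,
        BehaviorPrior(top_speed_mps=20.0, may_enter_roadway=True, typical_motion="unknown"),
    )

if __name__ == "__main__":
    for label in ("deer", "cow", "elk"):   # "elk" stands in for an untrained class
        print(label, behavior_for(label))
```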
But there are thousands of potential scenarios. An older person might be slower than a runner. A dark spot on the road could be a shadow, a puddle or a pothole. Reflections off buildings could confuse cars that are suddenly seeing themselves.

In some cases, engineers appear to be “programming for what should be, not what actually is,” said Sally A. Applin, an anthropologist and research fellow who studies the intersection between people, algorithms and ethics. “There just seems to be a really naive assumption about various rules — and that the world is going to be the way the rules are, not necessarily the way the world is.”
The Uber incident in particular has created frustration in the autonomous-vehicle community, with many fearing that a few similar crashes could result in tougher regulation and hinder the development of the industry.

In that instance, an Uber vehicle in Tempe, Ariz., fatally hit a pedestrian crossing outside a crosswalk with her bicycle on a darkly lit street in March 2018. The driver supervising the car was looking at her phone, authorities said. The car’s radar detected Elaine Herzberg nearly six seconds before the crash, but the self-driving system didn’t properly classify her or know how to react.

The NTSB, which has not issued a probable cause, will convene Nov. 19 to make its determination.
The safety board’s report said that “pedestrians outside a vicinity of a crosswalk” were “not assigned an explicit goal,” meaning the vehicle would not have predicted the path she might travel the way it might if she were identified as a pedestrian in a marked crosswalk. Instead, it identified her as a vehicle, bicycle and “other.”
Voluntary safety reports filed in 2018 and 2019 by companies including Waymo, Aurora, GM’s Cruise division and Ford all mention jaywalkers or jaywalking, or refer to pedestrians outside a marked crosswalk. Uber’s, filed in November 2018, does not.
Uber’s report mentions pedestrians but indicates a far more sophisticated system than what played out in Arizona.

“Actors, such as vehicles, pedestrians, bicyclists, and animals, are expected to move,” the report said. “Our software considers how and where all actors and objects may move over the next ten seconds.”

“Our self-driving vehicles will not operate in a vacuum,” it adds.
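In its simplest form, "considering how and where all actors may move over the next ten seconds" is a motion-prediction problem. The sketch below is not Uber's software; it is a deliberately simple, hypothetical constant-velocity extrapolation in Python, meant only to show what a ten-second prediction horizon means. Every name and parameter is assumed for illustration.

```python
# Illustrative only: constant-velocity motion prediction, the simplest possible
# version of predicting where an actor may be over the next ten seconds.
# Real systems use far richer models; all names here are hypothetical.

from typing import List, Tuple

def predict_path(
    position: Tuple[float, float],   # current (x, y) in meters
    velocity: Tuple[float, float],   # current (vx, vy) in meters per second
    horizon_s: float = 10.0,         # how far ahead to predict, in seconds
    step_s: float = 0.5,             # spacing between predicted points
) -> List[Tuple[float, float]]:
    """Extrapolate an actor's position assuming it keeps its current velocity."""
    x, y = position
    vx, vy = velocity
    steps = int(horizon_s / step_s)
    return [(x + vx * step_s * i, y + vy * step_s * i) for i in range(1, steps + 1)]

if __name__ == "__main__":
    # Example: a pedestrian walking across the road at about 1.4 m/s.
    path = predict_path(position=(0.0, 0.0), velocity=(0.0, 1.4))
    print(path[-1])  # roughly (0.0, 14.0): where she would be in ten seconds
```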
Uber spokeswoman Sarah Abboud said the company regrets the crash and has vastly overhauled its self-driving unit since it occurred, adding that Uber “has adopted critical program improvements to further prioritize safety.”

Still, the ripple effects of what some call a glaring programming oversight were already being felt in Silicon Valley.

“A baffling thing,” said Brad Templeton, a longtime self-driving-car developer and consultant who worked on Google’s autonomous-driving project about a decade ago. “Everyone knew that eventually there would be accidents because no one imagined perfection. This one’s worse than many people imagined.”
Many autonomous-vehicle industry insiders, some of whom spoke on the condition of anonymity out of fear of retribution, said they were surprised Uber had not accounted for such a basic expectation. They also acknowledged the timeline to roll out this technology is probably longer than many expect because of its complicated nature.

Technology powered by artificial intelligence, including facial recognition and voice assistants, has drawn criticism for issues such as built-in human bias and faulty logic.

As a result, researchers in the field are calling for more caution and diversity when it comes to training AI — particularly for vehicles.
There should be more focus groups and diverse groups of experts working to map out the scenarios, said Katina Michael, a professor in the School for the Future of Innovation in Society and School of Computing, Informatics and Decision Systems Engineering at Arizona State University. More safety and mechanical engineers should be working on the code alongside software engineers, and it should all be peer-reviewed by experts, she said.

When it comes to real-life scenarios, “the most obvious ones haven’t been addressed,” Michael said. “When we don’t do all this scenario planning and don’t do an exhaustive [job] at the front end, this is what happens.”
Some are pushing for more off-road simulation as an extra layer of precaution. At Applied Intuition, a team of 50 — including alumni from companies such as Waymo, Apple and Tesla — designs simulation software for cars to test scenarios before they are released into the real world.
A single urban intersection can have “a hundred thousand scenarios,” said Qasar Younis, the founder of Applied Intuition. Those need to be simulated and accounted for before a vehicle can be safely put into operation.
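One way such counts arise is combinatorially: a handful of scenario parameters multiplies into tens of thousands of cases at a single intersection. The short Python sketch below makes the arithmetic explicit with made-up parameter lists; it is not Applied Intuition's tooling, just an illustration of the combinatorics.

```python
# Rough sketch of how scenario counts explode at one intersection.
# The parameter lists are invented; only the counting is the point.

from itertools import product

SCENARIO_PARAMETERS = {
    "ego_maneuver":  ["straight", "left turn", "right turn"],
    "other_vehicle": ["none", "oncoming", "crossing", "turning across"],
    "pedestrian":    ["none", "in crosswalk", "jaywalking", "standing at curb"],
    "cyclist":       ["none", "in bike lane", "crossing"],
    "lighting":      ["day", "dusk", "night"],
    "weather":       ["clear", "rain", "fog"],
    "signal_state":  ["green", "yellow", "red", "flashing"],
    "occlusion":     ["none", "parked truck", "building corner"],
}

scenarios = list(product(*SCENARIO_PARAMETERS.values()))
print(f"{len(scenarios)} combinations from {len(SCENARIO_PARAMETERS)} parameters")
# 3 * 4 * 4 * 3 * 3 * 3 * 4 * 3 = 15,552 before varying speeds, timing or geometry.
```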
Younis said the simulators, who supply their software to autonomous-vehicle companies, want to test for “edge cases,” the real-world situations that might be a one-in-a-million occurrence but could be fatal when they happen.
He gives the example of an autonomous semi-truck that encounters a parked vehicle on the shoulder of a highway as it enters a crosswind. The truck doesn’t initially see the motorist emerging from behind the car on the shoulder, potentially going into the lane, as the wind strikes the trailer, which could swing in the person’s direction.
“That’s a scenario you’re not going to want to test in the real world,” he said.
[email protected]

Photo: A Mercedes-Benz Vision Urbanetic autonomous vehicle is displayed at the Frankfurt Motor Show in Germany in September. (Alex Kraus/Bloomberg News)
