Time March 2–9, 2020
INEQUALITY | THE MARCH
Tucked away in an office on a quiet Los Angeles street, past hallways chockablock with miniature props and movie posters, is a cavernous motion-capture studio. And in that studio is the National Mall in Washington, D.C., in 1963, on the day Martin Luther King Jr. delivered his "I Have a Dream" speech.
Or rather, it was inside that room that the visual-effects studio Digital Domain captured the expressions, movements and spirit of King, so that he could appear digitally in The March, a virtual reality experience that TIME has produced in partnership with the civil rights leader's estate. The experience, which is executive-produced and narrated by actor Viola Davis, draws on more than a decade of research in machine learning and human anatomy to create a visually striking re-creation of the country's National Mall circa 1963—and of King himself.
When work on the project began more than three years ago, a big question needed answering. Was the existing technology capable of accomplishing the project's goals—not just creating a stunningly realistic digital human, but doing so in a way that met the standards demanded by the subject matter? And Alton Glass, who co-created The March with TIME's Mia Tramz, points out that another goal was just as key: the creation of what Glass calls a prosthetic memory—something people can use to see a famous historic moment through a different perspective, to surround themselves with those who were willing to make sacrifices in the past for the sake of a more inclusive future. "When you watch these stories, they're more powerful," says Glass, "because you're actually experiencing them instead of reading about them."
Back in the late '90s, when Digital Domain used motion-capture footage of stunt performers falling onto airbags to create Titanic's harrowing scene of passengers jumping from the doomed ship, digitizing those stunts required covering each actor's body with colorful tape and other markers for reference. To animate faces, an actor's face would be covered with anywhere from a dozen to hundreds of marker dots, used to map their features to a digital counterpart. One by one, those points would be moved manually, frame by frame, to create expressions. That arduous task was essential to avoid falling into the so-called uncanny valley, a term referring to digital or robotic humans that look just wrong enough to be unsettling. The work has gotten easier over the years—the company turned to automation for help making the Avengers baddie Thanos—but remains far from simple.
Calling on the artists behind a fantastical being like Thanos might seem like an unusual choice for a project that needed to be closely matched to real history, but similar know-how is needed, says Peter Martin, CEO of the virtual- and experience-focused creative agency V.A.L.I.S. studio, which partnered with TIME and Digital Domain.
Re-creating the 1963 March on Washington would still stretch the bounds of that experience. For one thing, virtual reality raises its own obstacles. High-end VR headsets that fit over your face achieve their graphical quality via a wired connection to a pricey gaming computer. The March is presented in a museum with high-powered computers, but a wireless option is needed to allow users to more easily move around in that space. It took Digital Domain's technology director Lance Van Nostrand months to create a system that would solve for wirelessness without compromising quality.
Considering the difficulties of traveling back in time to August 1963, Digital Domain sent a crew to the National Mall and used photogrammetry—a method of extracting measurements and other data from photographs—to digitally
Eyes on history
INSIDE A VIRTUAL REALITY RE-CREATION OF THE 1963 MARCH ON WASHINGTON
BY PATRICK LUCAS AUSTIN