TPi Magazine – August 2019


GLASTONBURY


system. “Due to the video-led nature of the structure, it was great to be able
to blend the video content and lighting with Synergy to achieve cohesion
between the content and the lighting fixtures,” De Villiers explained.
For the kit, De Villiers chose the Avolites Arena for the lighting control.
“The Arena is my go-to desk for many reasons; it has a large number of
faders and execute buttons and the mini screen can be quite handy,” he
explained. “The native optical output is also really useful; on previous jobs
we’ve had a mile-long run of fixtures with no latency at all. The large main
screen is also great for the Pixel Mapper and NDI overlay.”
To handle the video, three Avolites Ai Q3s were brought in; one main,
one for Synergy and one as backup. A Titan Net Processor was also
installed to create a content distribution and server management network.
For software, the Arena ran a beta of Titan v12 with the Q3s running the
newly released Ai v11. The media servers used five HD outputs with the live
input running at 2,048 x 2,048 resolution.
To bring the set to life, the team needed content, and on a large scale.
Limbic Cinema was commissioned back in 2017 to curate the video content
and since then it has been building up a library of video spanning different
artistic styles, textures and colours. Once the stage was built, a team of five
designers was on site, creating more content.
Formatting video to the preferred codec of the server can be a
major issue when working with large teams of designers, but a key feature
of Ai allows content of any format to be fed into the software and the Ai
Transcoder automatically converts it into the AiM Codec. It also didn’t
matter what size or shape the content was. Once the content was uploaded

to the server, the Ai Mapping Editor could process it and map the content to
the structure with ease.
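The ingest workflow described above, accepting clips in any format and converting the rest into the server's native codec, can be sketched generically. This is a minimal illustration, not the real Ai Transcoder logic: the extension check stands in for proper codec probing, and ".aim" is simply an assumed label for an already-converted clip.

```python
import os

# Assumed markers, purely illustrative: ".aim" labels clips already in the
# target codec; the source list covers common delivery formats.
NATIVE_EXTENSIONS = {".aim"}
SUPPORTED_SOURCES = {".mov", ".mp4", ".avi", ".mxf"}

def plan_ingest(filenames):
    """Split incoming clips into (ready, to_transcode, rejected) lists."""
    ready, queue, rejected = [], [], []
    for name in filenames:
        ext = os.path.splitext(name)[1].lower()
        if ext in NATIVE_EXTENSIONS:
            ready.append(name)          # already in the native codec
        elif ext in SUPPORTED_SOURCES:
            queue.append(name)          # hand off to the transcoder
        else:
            rejected.append(name)       # not a video format we recognise
    return ready, queue, rejected

print(plan_ingest(["temple_loop.mov", "fire.aim", "notes.txt"]))
```

The point of the split is that designers never have to think about delivery format: anything recognisable is queued for conversion automatically, and only genuinely unusable files bounce back.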
The next stage was to bring the show together. Five HD projectors were
used for the mapping, and a total of 65 fixtures, including 12 Aqua beams
that surrounded the face of the Temple, were brought in. The trick was to
make it all work cohesively.
To create a fully immersive experience for the attendees, it was vital all
of the visuals told the story together; this was where Synergy took centre
stage. De Villiers used Lightmap – a key Synergy feature – which allowed him
to directly pixel map the video content to the fixtures. “The structure was
mapped to the pixel mapper on the desk and from there I could control
how much the video content affected the lighting using a mode 2 fader.
This allowed us to make a smooth transition between the video cues and
lighting,” he said.
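The principle De Villiers describes, sampling the video at each fixture's mapped position and letting a fader govern how strongly the video drives the rig, can be sketched as follows. This is a generic illustration under assumed data formats, not Avolites' actual Lightmap or mode 2 fader implementation.

```python
# Minimal sketch of pixel-mapping video onto fixtures. A frame is a 2D list
# of (r, g, b) tuples and each fixture is mapped to one pixel coordinate;
# both are illustrative assumptions for this example.

def sample_fixture_colours(frame, fixtures, fader=1.0):
    """Sample the frame at each fixture's pixel and scale by the fader.

    fader -- 0.0 means the video has no effect on the lighting;
             1.0 means the video fully drives the fixture colours.
    """
    out = []
    for x, y in fixtures:
        r, g, b = frame[y][x]
        out.append((round(r * fader), round(g * fader), round(b * fader)))
    return out

# A 2x2 test frame: red, green on the top row; blue, white below.
frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
fixtures = [(0, 0), (1, 1)]
print(sample_fixture_colours(frame, fixtures, fader=1.0))
```

Riding the fader between 0.0 and 1.0 is what makes the smooth crossfade between lighting cues and video-driven looks possible: the same fixtures simply follow the content more or less strongly.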
Getting this map right was vital. The original UV map of the stage was
complicated and didn’t match up directly with the stage itself, so a camera
was set up in Ai with the live feed going into the Synergy Q3 server. The live
output from this was then fed to the Arena, giving De Villiers the picture he
needed to design his show.
Projection mapping such a complex structure was no mean feat. Many
of the areas of the outer structure were layered and therefore regular
edge blending was not suitable. Lighting Designer Rothwell-Eyre used
the Salvation Patching in Ai to add masks to certain areas, allowing the
projectors to map out the structure accurately. Once the preparations were
done, it was time to go live. The video was operated by multiple people, all