Sound
Garry Brown, Phish's front-of-house engineer, is only the second person to use the Sphere's enormous sound system, which consists of approximately 1,600 permanently installed and 300 mobile HOLOPLOT X1 Matrix Array loudspeaker modules and includes 167,000 individually amplified (by Powersoft) loudspeaker drivers. The system utilizes HOLOPLOT's 3D Audio-beamforming and Wave Field Synthesis technology to produce a highly immersive audio environment. "My experience with it was really positive," Brown says. "We were the first to use the HOLOPLOT system in the Beacon Theatre [on New York's Upper West Side] after it was installed there." A previous demo in Burbank impressed him, too, but, he says, "Going into the Beacon and seeing the beam-steering being dealt with in a venue showed how powerful it is. It was the best-sounding show I've ever done there."
"Once the Sphere came up," he says, "I had no concerns about working with the HOLOPLOT system. There are 1,800 modules in the Sphere, so it's a hell of a lot of speakers, but, based on the Beacon Theatre, I knew the headroom would be great. Also, I had worked with a mini-Sphere system at Rock Lititz, figuring out how to get everything routed into it; the Sphere guys have it at Clair Global, so I could get up and running with it. Once we showed up, plugged in, and fired up, it sounded amazing."
Interestingly, he says, "My mix did not change. I built the Sphere file from my normal show file. The difference was in how things were routed to the PA. I had 44 drive lines going into the PA and another 20 drive lines going into moving objects in D-Mitri [Meyer Sound's digital audio processing and distribution] and Spacemap Go [Meyer's spatial sound design and mixing tool]. There are seven sections to the proscenium array, with 40 boxes per section. Then, on the same level, are another six arrays, with 20 boxes each. They're part of the immersive element. Then there's the wall, which is all over the place. I was able to address 44 arrays directly but most of my rig lived within the 13 arrays in front of me."
Thus, Brown says, "I had to start picking where I was placing things -- say, the kick drum system in array eight and the hi-hats in array nine." Such placements had to be carefully considered. "Having put my drum group in the surround element, I couldn't put it in more than one array, because I'd start to get a time issue. If they were 100 milliseconds apart in time, everybody would hear it. That's the mind-blowing thing about the Sphere: Technically, everybody can hear every array. How do they do that?"
He adds, "Working on the mini-Sphere at Clair, I was able to learn pretty quickly that our surround system wasn't going to work, and I had to start addressing things directly. And it worked; I still had a stereo image, just not a stereo image in the same way."
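To make the placement trade-off concrete, here is a minimal Python sketch of the kind of source-to-array bookkeeping Brown describes. The array names, delay values, and audibility threshold are illustrative assumptions, not the Sphere's actual drive-line configuration.

```python
# Hypothetical sketch (not the actual Sphere routing): map each source group to arrays
# and flag cases where a single group is split across arrays whose alignment delays
# differ enough to smear the image for listeners who hear both.
MAX_SKEW_MS = 10.0  # assumed audibility threshold for this sketch; 100 ms would be obvious to everyone

# array name -> alignment delay in milliseconds (illustrative values only)
array_delays = {
    "proscenium_8": 12.0,
    "proscenium_9": 12.5,
    "surround_left": 85.0,
}

# source group -> arrays it is routed to
routing = {
    "kick": ["proscenium_8"],
    "hi_hat": ["proscenium_9"],
    "drum_group_split": ["proscenium_8", "surround_left"],  # would smear
}

for source, arrays in routing.items():
    delays = [array_delays[a] for a in arrays]
    skew = max(delays) - min(delays)
    status = "OK" if skew <= MAX_SKEW_MS else "TIME SMEAR"
    print(f"{source:>18}: arrays={arrays} skew={skew:.1f} ms -> {status}")
```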
In the mini-Sphere, Brown says, "I had an iPad, which let me move around 13 locations virtually. It changed the delay times on the 44 speakers around me. Then I placed myself in section 201, which is far house left, and built my routing based on what worked there. When I went back to the center, it became better and wider. It didn't fall apart. Ultimately, it felt completely normal, but if you went to house left, it felt like the keyboard player was pushing the band; if you went to house right, it felt like he was laid back. The feel of the music changed a little bit, but it all stayed together."
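One common way to realize such a virtual listening position is to re-time every array so arrivals line up at the chosen point; the minimal sketch below illustrates that idea with invented coordinates and is not HOLOPLOT's or the Sphere's actual processing.

```python
# A minimal sketch of the virtual mix position idea: moving the listening point changes
# the distance to each array, and therefore the delay applied to it. Positions are
# made up for illustration; they are not the Sphere's geometry.
import math

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 C

# array name -> (x, y) position in meters (illustrative)
arrays = {
    "proscenium_center": (0.0, 50.0),
    "proscenium_left": (-30.0, 45.0),
    "proscenium_right": (30.0, 45.0),
}

def delays_for_listener(listener_xy):
    """Delay (ms) per array that time-aligns every arrival at the chosen listening point."""
    distances = {name: math.dist(listener_xy, pos) for name, pos in arrays.items()}
    farthest = max(distances.values())
    # Nearer arrays get more delay so all arrivals coincide at the listener.
    return {name: (farthest - d) / SPEED_OF_SOUND * 1000.0 for name, d in distances.items()}

print("Center of house:         ", delays_for_listener((0.0, 0.0)))
print("Section 201 (house left):", delays_for_listener((-35.0, 5.0)))
```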
Mixing a jam band, he says, "I could go further afield in terms of the immersive elements and movement of stuff. We have a song structure, and everything has its home base. Most of the time, things were in their home location, so the mix was coherent everywhere. When they started jamming, I'd move things around the room. But I always went back to home, so the songs stayed together. I had to decide how to do it. If Page [McConnell] was playing the piano, there was very little I could do with that; it's a percussive instrument and, [moving it around], I would start having issues. But if they were doing spacey sounds, I had more freedom. It was all busked. My console was set up so that I could assign anything to any of the moving objects. I had no scene; it was just me doing it."
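The "home base" idea can be pictured as a small bit of state per moving object, as in the hedged sketch below; the names and positions are hypothetical, and this is not Spacemap Go's actual interface.

```python
# Illustrative sketch only: each source has a home position in the room, can be busked
# along a path during a jam, and is always returned home so the mix stays coherent.
from dataclasses import dataclass

@dataclass
class MovingObject:
    name: str
    home: tuple          # (x, y) home position in the room, normalized
    position: tuple = None

    def __post_init__(self):
        self.position = self.home

    def busk_to(self, xy):
        """Sweep the object somewhere else in the room during a jam."""
        self.position = xy

    def go_home(self):
        """Return to home base so the song reads the same everywhere."""
        self.position = self.home

spacey_synth = MovingObject("spacey_synth", home=(0.0, 0.8))
spacey_synth.busk_to((0.6, -0.4))   # move it around during the jam
spacey_synth.go_home()              # back home before the next verse
print(spacey_synth)
```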
Brown's console of choice is the Lawo MC2 56, a product much better known for broadcast applications. "I went over to it about 18 months ago," he says. "HOLOPLOT brought it to the demo. It blew me away, sonically. I had a demo at Clair at Rock Lititz and decided to try it at a Trey Anastasio Band show. And it was unbelievable." The console has yet to make much headway in the live world. "People are apprehensive about them because they think they're expensive," Brown says. "But the MC2 36 is basically the same as the RIVAGE PM5. I chose the 56 because I prefer the surface and I needed more channel count. But it sounds amazing. It was a learning curve for the workflow." But, since he made the change, he says he has eliminated much of his outboard gear. "My stereo buss is a [Rupert Neve] Master Buss Processor with a Kush Audio Clariphonic. I had an Overstayer M-A-S and a Master Buss Processor on the drum buss, and a Shelford Channel and Distressor on the bass guitar. And that's it."
In addition, Mark Bradley, the band's longtime monitor engineer, uses a Yamaha RIVAGE PM10 console; Robert Caprio, who handles Anastasio, is also on a Lawo console.
The band members are on a mix of in-ears from Sensaphonics, Ultimate Ears, and JH Audio. Mics, Brown says, are mostly Sennheiser or Shure, except for a Royer SF-24 for the overheads. The guitar mic is the Royer R-122V. Vocal mics are Telefunken M81s.
Overall, Brown says, "I had a great time. The Sphere guys were amazing. Realistically, I was ready to do a show after one day with the mini-Sphere. We spent the rest of the time trying to figure out the workflow on the console, to make it do what I wanted. We had Lawo write some code so I could do different things. The biggest question was, What would it be like with the band in front of the PA? But, because of the headroom and because we're not a loud show, it was okay. I think I fed back once."
"There were little things to deal with. It was a bit of a learning experience for the guys onstage because there were artifacts with the DSP processing; the system was putting 4K out into places where it shouldn't really be, and they couldn't get rid of it. It may have been a bigger issue for the monitor guys. The PA is behind them, and, because of the processing, it was 80 or 90 milliseconds behind the acoustic input. Basically, we went to the Sphere, plugged it in, turned it on, and it sounded amazing."
Bravely going where many rock acts would fear to tread, Phish signed on as the second live attraction at Sphere Las Vegas after U2's residency there last fall. How does one follow a legendary act in a new venue unlike anything seen before? Phish did it by upping the ante, appearing for four nights only, offering a distinct show each night. This meant dedicated visuals for each show on the Sphere's 160,000-sq.-ft. LED screen. As Shirley Halperin and Noah Eckstein wrote in Los Angeles Magazine, "U2 christened the dome, but you could say Phish truly broke it in."
According to Los Angeles Magazine, "The Sphere's vast space allowed for all sorts of graphic images, from the hyper-realistic (a dog licking the audience on night two) to the surreal (night three's hieroglyphic mill) to the kitschy (night one's Max Headroom-meets-Windows 95 cartoons of cars and television sets; night four's towering neon-colored skeletal robots), and the trippy (melting motifs in colorful brushes and shapes and somewhat disorienting black-and-white patterns of stars and trails). There was also the serene: clouds morphing into doves (night three); a rustic New England barn and meadow (night one); an underwater expanse (night two)."
As Abigail Rosen Holmes, the event's co-creative director (along with Montreal-based Moment Factory), told Billboard, the creative team had a guiding principle: "If you would do this for one of the other artists you work with, it's probably not unique enough to be for Phish." Moment Factory collaborated closely with Holmes, its involvement extending to set design, video design, and production, as well as contributing to the lighting concepts in collaboration with the band's longtime lighting designer, Chris Kuroda.
It was the trickiest of projects because Phish, a jam band, never repeats itself; at any show, the music can head in all sorts of directions. However, Daniel Jean, producer for shows at Moment Factory, says the company "got its start VJ-ing in the late '90s, and we are particularly proud, nearly 25 years later, to have maintained this initial passion. We are now pushing it to an unprecedented level, creating the longest-ever real-time content show on the world's most immersive canvas at Sphere."
Jean admits that the sheer size and scope of the project was daunting, but notes, "We have such talented technical people in-house; we know where technology is going and we're happy to hijack it sometimes to push our limits and take a risk. But we had to go the extra mile, and we didn't do it alone. If it weren't for Fuse [Technical Group] and Disguise [the media server company], we would probably not have had four shows in four days."
Holmes, the shows' director, says, "The first thing we talked about was the nature of the shows, which involve improvisation. And that we were doing 68 songs over four nights." According to the Washington Post, it was she who convinced Phish's Trey Anastasio not to extend the run, saying that if the band added another four-show run, "It'll be good, but it won't be great. If you just do four nights, it's going to blow minds."
Holmes says each of the shows was based on one of four elements: solid, liquid, gas, and plasma. She also concentrated on creating room for a light rig and, with Jean-Baptiste Hardoin of Moment Factory, designed the lozenge-shaped stage. (To be sure, Holmes adds, the project was "such a team effort," with everyone contributing to every aspect.) The stage was built by TAIT, with Atomic Design contributing large props that could fold down and be easily hidden in the vast open space of the Sphere.
The time frame was not generous. "We started production in December," Jean says. "We had two months prior to that, imagining the content and stage design. The first batch of content was delivered in March. But we were still making content five days before the show and adjusting other content as well." It wasn't just a matter of generating miles of imagery: "Having the real-time content meant that we deployed a new system in the Sphere." (More about that in a moment.) "When we implement a new system, we need to stabilize it and do the right user interface."
Justin Restaino, the concerts' screen producer, says, "Moment Factory built a previsualizer in the Unreal Engine where we could put on a VR headset and feel like we were in the Sphere. It was an incredibly helpful tool."
But, of course, nobody on the creative team had ever worked on a surface like this before, which meant a certain amount of trial and error. Jean says, "We produced a large amount of content that didn't make it because in our first test in the Sphere, we realized that some content did not read well." There was also the question of matching content to songs and finding throughlines for each evening: "When you have a three-and-a-half-hour show, you build all these looks and assign concepts. Then you realize that some of it won't make sense, depending on where we are in the show. We wanted to give each show an emotional curve."
Restaino says, "For a long time, we were reheasing, playing content without the band onstage. When they got there and started jamming, the feeling of that content completely changed. Trey Anastasio made a great comment: 'We need to make sure that we're painting the same picture, from the music to the lighting, audio, and contnet.' The show worked best when we were all doing that." It was, Restaino adds, a massive project in more ways than one. 'On the last night, 'Down with Disease' went on for 32 minutes. It was incredible. The audience was loving it. But being able to develop contengt not just in terms of size but quantity, to adapt and change with the jam, was unique."
Indeed, a big challenge was being able to busk video content during the jam sessions. "We had content created for specific songs," Restaino notes. "For example, for 'Ghost,' we had this electric pole robot character. Others featured what we call 'library content,' which could be live-busked." He adds that every time one hears a Phish song performed live, "That's the first time anyone is hearing that iteration of it. We really strove to have the same effect with our VJ content; what audiences were experiencing visually for the first time, we were, too."
In an unusual, possibly groundbreaking, application, the imagery was delivered by Disguise media servers linked to Unreal Engine creation tools. "Through the Disguise server, we were running Unreal Engine within Disguise," Restaino says. "We had 32 media servers -- 16 for the show and 16 for backup -- and, for Unreal Engine, we had 32 render nodes. It's one of the first times this type of integration has happened. I can safely say it's the first time it has been done at this scale. We were able to sync all that content together because it was, essentially, 32 computers driving the same content, all synced to the same millisecond."
He adds, "The Disguise team worked incredibly hard in collaboration with us, making sure that their servers performed efficiently; ultimately, what we had was a seamless integration with 13 Unreal Engine scenes, all performing in real time and controlled through the grandMA console." This was a grandMA3 using the MA3 software. "In the case of video, we started from scratch [with no previously existing show files], so using MA3 made sense," Holmes says.
The video department's grandMA console was separate from the lighting department's controller. "We had a grandMA operated and controlled by Abby, which was hooked up, via Sockpuppet, to the Disguise server, which ran Unreal Engine," Restaino says. "Our programmer, Benjamin Roy [of the firm Earlybird], is a brilliant individual who hooked up all the parameters and linked the grandMA with Disguise."
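Conceptually, a console-to-media-server link of this kind scales raw channel values from the desk into the parameter ranges the real-time scenes expect. The sketch below is a generic illustration; the channel numbers and parameter names are hypothetical, and it is not the actual Sockpuppet binding Roy built.

```python
# Generic illustration: scale 16-bit console channel values into named scene parameters.
PARAMETER_MAP = {
    # console channel -> (scene parameter, minimum, maximum); all hypothetical
    1: ("ghost_robot.intensity", 0.0, 1.0),
    2: ("ghost_robot.hue", 0.0, 360.0),
    3: ("starfield.speed", -2.0, 2.0),
}

def map_console_values(channel_values, bit_depth=16):
    """Scale raw console channel values into named scene-parameter values."""
    full_scale = (1 << bit_depth) - 1
    scene_params = {}
    for channel, raw in channel_values.items():
        name, lo, hi = PARAMETER_MAP[channel]
        scene_params[name] = lo + (raw / full_scale) * (hi - lo)
    return scene_params

# Example: intensity fader at full, hue encoder at half, speed fader just below center.
print(map_console_values({1: 65535, 2: 32768, 3: 30000}))
```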
Holmes also notes Roy's crucial contribution, along with that of Andrew Giffin, lighting programmer for Phish, adding, "The live video playback and control were made possible thanks to the extensive custom code scripting within the console." She adds, "Jamie Van Dyke, our screens producer, is an amazing asset. Her skills are incredible." Van Dyke also called moving scenic and dolly cues in the shows.
Fuse Technical Group served as the technical integrator, carrying out the delivery of Moment Factory's content to the server at the direction of Van Dyke. Moment Factory also enlisted Fly Studio, Myreze, and Sila Sveta for screen content production; Troublemakers for audio design during Phish's stage entrance; and Picnic Dinner Studios and Totem Studio for 2D animations during the intermission.
Lighting
Lighting designer Chris Kuroda notes that the Sphere runs multiple daily screenings of Darren Aronofsky's film Postcards from Earth, which made for some scheduling difficulties. "We were allowed to work in the building only from 1am to 10am," he says; the schedule played havoc with his circadian rhythms. Then again, Kuroda has worked with Phish in all sorts of situations, including the band's enormous New Year's Eve spectacles at Madison Square Garden, so he was ready for another challenge. But the Sphere was something else altogether. "Obviously, the video wall takes precedence over everything; it's what the room is about," he says. "Eventually, I found a happy place where I wasn't interfering with the video and where I could be more present with the lighting when the moment required it." Still, he notes, "It was hard because I have a kind of muscle memory from lighting Phish and I had to hold back until the time was right."
The time was often right during the latter parts of the jam sequences, he says. "The video content tended to time out when working on a 17-minute-long jam. There was often a point where the video had had its moment and I had to bring out the lighting to keep things interesting until they were done."
Still, Kuroda says, "I had to start from scratch. The lighting rig was very, very small; it had to be, so you could see the video wall." This explains the six lighting towers placed along the stage's perimeter. "We couldn't hang anything from the ceiling," he adds, "simply because it is 385' in the air. And to hang anything up there, you'd have to pop out a video panel to hang your chain motor and truss. We didn't want to break up the picture."
The towers were packed with Robe iFORTE white-LED spot units, Robe Spiider multisource LED moving units, and GLP JDC1 strobes. "I'm a big fan of hybrid lights," Kuroda says, adding that beam units felt inappropriate because "I didn't want to make little spots all over the walls. We decided to keep it very simple." Running along the edge of the stage was an additional set of iFORTEs.
Also, he says, "The building has 108 Elation Artiste Mondrians built into what you might call the balcony rail; we used them to accomplish our side light, low light, high light, and front light, everything the towers weren't doing." The lighting was controlled by a grandMA console running MA2 software.
Kuroda did a remarkable job of carving the band out of the enormous pixel-filled space, often using saturated side washes. Key to the success of the show, he notes, was working out a common language with Holmes, who was operating the video. "We spent time creating a visual vocabulary to communicate what we thought were the right times for us to do different things. It was all on the fly -- we were feeling it out -- and we felt really good about how it went in the end. I have a feeling for how she works; we've done a few projects together. She has attended many Phish shows and has helped us out in many ways."
Brief as it was, the Phish residency provides solid evidence that the U2 run at the Sphere wasn't just a fabulous one-off, and that other acts can turn the unique venue to their creative advantage. It leaves one wondering, with anticipation, what future stars will make of it. In any case, Phish has set a high bar to meet.
This article originally appeared in Lighting&Sound America.