
In this episode (Part 1 of 2), Neil and Jason talk in depth and ‘totally technical’ to the Oscar-winning Production Sound Mixer Simon Hayes and on-set music editor Josh Winslade about their complex and unique playback and recording arrangements for capturing live vocals and authentic performances from lead actors Cynthia Erivo and Ariana Grande; as well as providing an insight into how to keep Jeff Goldblum happy in between takes (and it partially involves jazz…).
This is the most comprehensive account of the Wicked gig and rig ever recorded, and best of all, it’s in the words of Wicked’s own sound wizards.

Pictured: Wicked Production Sound Mixer Simon Hayes (left) with On Set Music Editor Josh Winslade

About the presenters:
You can find more about Simon and his work here
Josh Winslade on IMDb
Details about Neil and Jason’s work as dialogue editors and mixers and how to contact them is here
Details of our 1-to-1, training and coaching programmes for ambitious media professionals are available at:
https://www.drneilhillman.com and https://soundproducer.com.au/coaching and www.soundformovingpictures.com

Technical notes:
Written, produced and presented by Jason Nicholas and Dr Neil Hillman – IMDb
Recorded using the CleanFeed remote recording system
Programme edited by Jason Nicholas

YouTube fair use disclaimer:
Where copyrighted material appears in episodes of The Apple and Biscuit Show, it is used under the ‘fair use’ guidelines of the Copyright Act: i.e. “Use of these clips follows Fair Use laws regarding commenting and criticizing”, where Fair Use allows for the unlicensed use of copyrighted material for purposes such as Commentary, Criticism, Parody, News reporting, Teaching, Scholarship, and Research.

In instances where copyright or credit is questioned, please contact us directly to discuss receiving credit, or removing the featured content.

Transcript:

Announcer Rosie
You’re listening to the Apple and Biscuit Show with Jason Nicholas and Dr Neil Hillman.
Neil Hillman
Hello and welcome to this special edition of the show, a strand we’re calling Tech Spec, because in this episode we’re looking in detail at the technical aspects and the workflow of the production sound for the biggest musical of all time: the movie Wicked, starring Cynthia Erivo as Elphaba and Ariana Grande as Galinda, which is currently playing to packed houses in theatres worldwide. This is a privileged insight for us into the logistics of how the live vocals were captured on set, and how accurate playback was provided for the amazingly complex dance numbers with, at times, hundreds of dancers; and it’s provided by the movie’s production sound mixer Simon Hayes, a good friend of the show, and Josh Winslade, the on-set music editor. And I’m pleased to say that many of the questions that we’ll be putting to Simon and Josh have been sent to us by fellow sound professionals who listen to the show and who are keen to hear how it all worked and how it all came together. Simon has an enviable reputation as an in-demand production sound mixer, particularly when a musical requires live vocals to be captured on set. In 2013, he was awarded an Oscar for Best Achievement in Sound Mixing for the movie Les Misérables, also winning a BAFTA for his work on the film. He is also responsible for the production sound of the musicals The Little Mermaid, Cats, Aladdin, Yesterday, Mary Poppins Returns and Mamma Mia. Today we also have the opportunity to welcome Josh along, and to learn more about his work as an on-set music editor. Welcome to you both. Are you well? And where do we find you both today?
Simon Hayes
I’m at home in London, just thinking about next year and how we’re going to be putting our workflows together for a couple of movies we’ve got planned and just really enjoying a little break in between films.
Josh Winslade
Likewise, yeah. Sat in my home in Tring, just outside of London, looking forward to Christmas and the holidays, this time off that I’m enjoying at the moment, but also looking towards the next year and the work that’s coming up, getting mentally prepared for that, and getting myself in a good position for it.
Neil Hillman
Fabulous. Thanks for joining us today. Josh, if we could start with you, how did you come to be an on-set music editor? Did you start in post-production and decide to come out of working in darkened rooms, or was it a transition from already being part of the production sound department?
Josh Winslade
I actually came to be working as a Pro Tools music editor after transitioning through being a sound assistant. I’d studied music production at college and university, and in the lead-up to COVID I was working as a producer and music editor in radio. Then after COVID, I made a career change into the world of sound as a sound assistant. But through that, I found myself doing a lot of playback for mixes and excelling in that area, because of my combined knowledge of production sound as an assistant and my music production experience. I then joined Simon’s team on Snow White as a rehearsal playback operator, and after impressing him on that, he invited me to do main unit for Wicked.
Jason Nicholas
So, this is going to be a very technical episode for us, so let’s just dive right in. Josh, what is the Pro Tools rig on the set? And for that matter, why Pro Tools, as opposed to, say, Ableton Live, which is specifically designed for freely playing out pre-recorded music and incorporating live performance?
Josh Winslade
I think when it comes to which DAW you choose to use, in a lot of circumstances it can come down to personal preference, especially when you’re making music or doing a live performance. But when it comes to the film world, Pro Tools is very clearly the software of choice. It’s what every post-production house uses; it’s what people are used to seeing and doing final mixes with. Most producers, even if they produce in Ableton or Logic or Nuendo or whatever they’re using, will most of the time then go on and do a final mix in Pro Tools because of the way that it functions. Its whole basis in history is in supporting this industry. In many other vocations you can have a bit of personal preference, but for me personally, being able to hand off files to post-production that have come from Pro Tools matters, because every single session I play back is saved at the end of the day, and they can go back to it. So, for a music editor in post-production to have all of that information there, and not only have stems that I’ve bounced out, and stems that have been delivered with Simon’s rushes, but the ability to go back to my decisions and see every edit that I made on set, is just amazing for them. It’s a workflow that I think they really appreciate. So, in this specific instance, I think it’s the only choice for the job.
Jason Nicholas
And do you have any add-ons, either plugins or software within Pro Tools, or hardware?
Josh Winslade
No, the goal was as little latency as possible, as little processing, as little rendering as possible. I can be playing back over 100 stems of music at times, with over 16 different mixes going out to different people; it’s about stability, it’s about never interrupting a take because Pro Tools has frozen or has decided to have a little bit of a moment. So, no, I keep it absolutely as clean as possible. If I have time to prepare and then render out, then the main one that I’ll use is ‘Pitch ’n Time Pro’, just for any speeding up or slowing down, or any work like that; just because it’s so good and clean, I think it’s slightly less ‘artifacted’ than the built-in Pro Tools [effect]. But apart from that, I really try and keep it as clean as I can. Obviously, someone’s not going to have the exact reverb or effects that they want when they’re monitoring live – it’s not going to be a studio-record finish in terms of what they’d like to hear – but in terms of them listening to it and singing, I can create a pretty good replica with what’s available stock in Pro Tools. It just leaves it clean; it gives it the best chance to run as smoothly as we wanted it to. And I think that’s what happened. We had very, very few interruptions from Pro Tools itself. In fact, the whole rig performed fantastically throughout the whole shoot. I don’t think I can remember more than a handful of times where Pro Tools was the reason for interrupting a take.
Jason Nicholas
And I’m assuming you’re running on a fairly robust hardware system as well…
Josh Winslade
Yeah, so this was very much a personal preference choice, but I prefer Mac, genuinely because of the Mac scroll. It’s a feature that Pro Tools only has with a Mac laptop. There are ways to get around it on Windows with drivers in the background and a bit of coding, but with the scroll pad on a MacBook you can zoom in and out of the waveform in a way that you just can’t on Windows; and on a playback rig, you’ve got limited space. So rather than shift- or control-scrolling with a mouse wheel, I very much got used to using the scroll pad. So, I kind of went with Mac early doors.
Simon Hayes
I mean, it’s worth me just jumping in here just for a second. We built this rig from scratch for Wicked. We knew exactly what we wanted to achieve on the set. And we built the Pro Tools rig, specifically for this movie, from scratch.
Josh Winslade
Yeah, so it’s a top of the line, eight-grand MacBook Pro basically, with 128 gigabytes of RAM. So more than enough for what we needed. And also, like I said, there’s very few live effects. Everything is rendered out. I’m not receiving MIDI from post-production, that’s kind of a stipulation. It’s rendered out stems, as little stress on the hardware as possible, but if it needs it, the hardware is actually very capable.
Neil Hillman
Simon, before we get into the listeners’ technical questions, could you tell us how the job for Wicked came about? Had you worked with director Jon Chu before?
Simon Hayes
I hadn’t worked with John before, no. I worked with Mark Platt, the producer of Wicked. I met Mark on Rob Marshall’s movie, Mary Poppins. We got on like a house on fire. And then our paths crossed many times. We did Aladdin with Guy Ritchie. We met again on Rob Marshall’s next movie, The Little Mermaid. We were just finishing off Snow White actually when Mark asked me to meet John Chu. And that’s how it all came together.
Neil Hillman
Brilliant. Okay, let’s get started with a question. And this is from Adam Fletcher who wrote, “Hi Simon, it would be interesting to learn more about how your two rigs speak to each other, what you send Josh, and what he sends to you. Is Josh recording your production sound mix on Pro Tools as well as feeding you and the in-ear monitor [IEM] mixes? And I would love to know more about the timecodes you have. Is it time-of-day on camera, but also maybe a timecode track on an ISO [isolated] track for the music? I know from attending your chat at the Sound Devices showcase earlier this year, you touched on this idea.” And then Adam adds, “Thanks again for sharing your workflow so openly. You’re really helping the next generation of sound mixers to step up their game early in their careers” – which you are – “and you’ve certainly played a massive part in my own education, which has allowed me to implement methods you’ve proved work, on projects of my own.” So great stuff there, Adam. And in a similar vein, Max Marchman writes, “What kind of IEMs were the actors wearing and how were they best concealed? And did they have their own effects mixes just for their personal monitoring?” Over to you…
Simon Hayes
I think this is a conversation between me and Josh, really. So, Josh, please feel free to jump in whenever you can add something to this. But the first thing is, it’s great to hear from Adam. He’s an up-and-coming young mixer in the UK and I’m really impressed with his work, and it’s great to hear him ask these questions. So let me answer this question from Adam. The first thing is, how do we communicate with each other? Let’s start with timecode. There are two types of timecode on a musical. There’s time-of-day timecode, which is our master timecode that keeps the picture and the sound in sync with each other, and that is no different whether you’re making a feature film, a television show, a commercial or whatever. On the production sound cart, using an Ambient master clock, we’re generating time-of-day timecode, and we are using that timecode to feed into our Sound Devices Scorpio [recorder]; and we are also creating a Wi-Fi hub of that timecode, and every camera has an Ambient sync box on it, which is getting a constant jam from that Wi-Fi time-of-day timecode. And in the User Bits, there’s a roll number. The first day is a group of zeros and roll number one. And we’re also using a Digislate as a secondary reference. I really like to use, in fact, I should use the correct terminology, it’s a timecode slate. I really like to use a timecode slate as well, so that if, for instance, something, you know, a rogue Teradek [a wireless picture monitor source and/or a wireless camera focus / iris control] is spitting out loads of RF, or something weird has happened on the camera sync box that we don’t quite catch in time, we’ve got a secondary timecode reference, which is that timecode slate in front of the camera. You can’t beat that, so I always try and persuade productions to have a timecode slate as well. But then, of course, the real question Adam’s asking is, how do we deal with the music timecode? And that’s very, very simple. Each music track on the show is assigned a timecode, and that timecode will be striped into Josh’s Pro Tools session. And when Josh is playing back his track to the actors, and also to me for my dailies, he’s also giving me that timecode. So, I’m assigning that timecode to a specific, discrete ISO track, or stem you could call it, I guess. I prefer the term ISO track when we’re talking about production sound because I think it’s a more widely used term. So, I’m putting that mono timecode onto an ISO track of my Sound Devices Scorpio, so that throughout the post chain, whether we’re talking about picture editing, music editing or post sound editing, they have all got access to that timecode, on that ISO track, which links them indelibly to the Pro Tools session that Josh has played back. Regarding how we’re linking our carts together, we’re using MADI, and the reason we’re using MADI is very simple: it goes further than Dante. So, we’re using Dante to get into MADI. And I think Josh can touch more on the exact hows and wheres of how we do that. But before we go down that route, let me explain why MADI. It could be that I need to have Josh over 100 metres away from me. I try to never let that be the case. Really, where I want Josh is four foot to my left, so that I can have instant chats with him, face to face, in between takes. But there could be situations, for instance, on The Little Mermaid, where Pro Tools needed to be down a cliff, and I think we went about 300, 350, 400 metres with a run.
And so that’s why we chose MADI initially. It was very, very helpful in some instances on Wicked. I think it’s a more robust communication system, although obviously Dante is fantastic and I use Dante for other things, which we’ll talk about later. But for Pro Tools, I think it’s great to have MADI as our primary system. So, Josh, do you want to tell us how we get into that MADI system?
Josh Winslade
Yeah, so everything revolves around the DirectOut Prodigy MC, which is a modular audio converter that Simon actually picked up on a job beforehand. So, when we got into pre-production on Wicked, me and Simon got together, and Simon was very passionate about building a rig specifically for Wicked. We knew the kind of challenge it was going to be. We knew there were going to be huge outdoor sets, and we knew the challenge that was going to pose. So, when Simon first came to me, it seemed so simple, but the first thing he thought and the first thing he said was, “I want it to be built on an Ursa cart.” And it really does sound so simple, but it just absolutely changed the game in terms of portability and being able to move around set quickly and with speed.
Simon Hayes
Can I just interrupt you there, Josh, just for a second? Let me just interrupt you for a second, just for the listeners here. I’ve done a bunch of musicals with a bunch of really, really good Pro Tools music editors, but one thing that always frustrated me was the portability of their carts. A Magliner [camera department trolley] with a load of Pro Tools equipment on it is not a one-person move. Now, one of the things that I’ve always done, and for American listeners, you will recognise this term, you guys call it a Euro cart. Well, a Euro cart is how UK production sound mixers have always worked. And I’ve always used a Euro cart. I don’t like everything to be boxed on my cart. I don’t want it to be so big that it’s a four-person lift to get it up a set of stairs. I want to be able to get my cart up a small set of stairs on my own, and I want a big set of stairs to be navigated just by me and one other person; and I’ll tell you why, guys, because if I’ve got a cart which is all boxed with transit cases, it looks fantastic, but when it comes to getting up those stairs quickly, I’ll tell you what the answer will be – we can’t get up the stairs, we have to stay here – which will potentially put me further away from the actors, further away from the director. And so that’s why I’ve always gone with a quite stripped-down Euro cart that’s reasonably light. And what I wanted to do when Josh and I started talking about building the Pro Tools rig for Wicked was to have him on exactly the same cart as me, so that it was portable, so that when we needed to move fast, Josh could move that cart on his own, and I wasn’t having to say to a bunch of second and third ASs, can you stop moving that JBL PA rig and help Josh with his Pro Tools cart? Really, I wanted him to be as self-sufficient as possible.
Josh Winslade
Also, a lot of audio equipment, especially the equipment we’re using, is rack-mountable. So, the Ursa cart kind of lends itself to that as well. It’s very easy to mod it in a way that I can slot in the different pieces of audio equipment that we’re using, and it’s safe, it’s secure, it’s fixed in the way that it needs to be. So that was the very first thought of the build. The next bit was whether we were staying in MADI, and Dante, and this world that they’d built on Aladdin and The Little Mermaid. And Simon had two Prodigy MCs sat there, basically, and they’d used them in a bit of a different way, with one on each end, to send a feed between two locations on the massively long runs that they were doing in places like Sardinia. And we decided to use it completely differently and just use it as a converter. So, the Prodigy MC, it’s modular. You can get different modules for it. And basically, we put two MADI modules in it. We basically butchered both: I took all the modules from the second one and stuck them in the first one. And that gave me 64 analogue ins and outs, it gave me 128 MADI ins and outs, and it gave me that conversion between MADI and analogue as well. It gave me the ability to use MADI runs. That cable – Simon had two 500-metre drums and a 400-metre drum of military-grade MADI fibre-optic cable. And you’re not just going to go down a different route when you have all of that available to you for a job. So, everything kind of centres around the Prodigy. This thing is so stable. Its normal job is concerts. So, you know, massive, massive, massive concerts with audio feeds going everywhere and the quickest kind of conversion you can think of. We then also had Tom Barrow. Tom Barrow is Simon’s second unit production sound mixer, but he’s so much more than that. He has a massive history as a sound engineer, and as a musician himself. His ability is phenomenal. He wired me patch bays, custom-built patch bays, for the front, for all of our analogue ins and outs. And so, yeah, at the centre, you have the Prodigy MC. We go from the laptop; we’ve got Pro Tools. In terms of software, we’re running Pro Tools. Pro Tools then goes to an RME Babyface Pro. So, what’s directly interfacing with the computer is a Babyface Pro. Now, that is a very small interface. It’s got two analogue ins, two analogue outs. It’s got a USB slot on it. It’s class-compliant, so it can be run on USB. It doesn’t need a power source. And then it’s got headphones in and out. And that’s it. A tiny little interface that sits on the rig. It also, however, has MADI in and out. So, then we connect via MADI to the Prodigy. The Prodigy handles all the conversion between analogue and MADI, so that gives me all those analogue ins and outs. It also has a second MADI module in it that is then connected to Simon’s cart. So, it runs from that to a DirectOut box, which is a MADI-to-Dante converter. You can get a module for the Prodigy that does Dante directly, so we could have got rid of that box; but at the time when we were building it, when we went to look to purchase it, it was during the great chip shortage, so it was absolutely impossible to find. Simon already had a MADI-to-Dante conversion box sat on his cart anyway, so rather than trying to streamline that process, we just stayed with what Simon already had.
So that box then converts 64 channels of MADI to Dante, and Simon, with the laptop Dante controller he has at the top of his cart, can control what goes in on what channel on his cart. Receiving back from Simon, I take all 32 of his tracks, all of them. So, they come back down through the MADI-to-Dante converter box, through the Prodigy, through the RME Babyface that sat on my rig, into my computer, which was phenomenal. That meant at any time I could PFL a boom or a radio track, put a radio mic into Ari’s ear, stick a bit of reverb on it so she could hear herself while she was singing. I could send anything, anywhere basically, at any point. Simon could go, “Hey look, don’t send her her radio feed, because at this point there’s gonna be big wind and that will blow her ear up. We don’t want that, so tell you what, take my left mix instead.” And then I know that what’s going into Ari’s ear is being mixed by Simon as well. So, Simon personally mixed what she’s hearing in her ear. So that would be a scenario that could come up sometimes as well. So having all of those tracks come back to me was extremely useful. Even, you know, say we’re on a scene where there wasn’t necessarily any playback: Simon’s got 12 radio mics and three booms that he’s recording, and Simon might say to me, “Hey, Josh, do you mind listening in to Ari’s second radio?”, because he’s got so much going on, and just saying, “Can you tell me at the end of the take if that’s clean?” And at the end of the take, I’d be able to say, “Hey, no, that radio was really clean.” Very rarely, rarely happened. But for example, that is something that could be done with the setup that we had. Nothing that I mixed was mixed in Pro Tools. Everything was mixed in TotalMix. So that’s the other benefit of going with RME [the Babyface Pro]: you get access to their hardware-enabled TotalMix software, which is the most amazing piece of software I’ve ever used in terms of mixing a live event. I worked in live events as well for two years, and TotalMix was just absolutely phenomenal. Going back again to being able to keep Pro Tools to a minimum, and everything in TotalMix. The way to think about it really is that Pro Tools was automation. If I was coming out of speakers on set for two bars, and then going into Ari’s earpiece for three bars, then back out of the speakers, and then, on a complicated take, jumping between various different playback methods, all that automation happened in Pro Tools. All the mixing happened in TotalMix. So, what TotalMix does is it gives you all of your inputs, everything from the Prodigy; it reads them, so it gives me every single input, whether it’s MADI or an analogue input, as an option. It then gives me all of my outputs as software outputs, virtual outputs so to speak, and then it gives me all my physical outputs. So, what that gives me is a secondary sub-mix. And then it has a lot of features in terms of snapshots, where I can save mixes and jump between different mixes live during a take, seamlessly. It gives me the ability to add compression, to add two iterations of reverb if I wanted, which was just enough for cast one and two, without having to do it in Pro Tools; and it gives me the ability to play back from any other piece of software on the MacBook that isn’t Pro Tools, and mix that into playback as well.
So, for example, Jeff [Goldblum], in between takes, would like music in his ear, and I’d be able to have a music playback software open in the background. He would give me a playlist and say, “Look, in between takes, I want to hear this”; so I would bring that in as an audio playback source within TotalMix.
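[Editor’s note: TotalMix is RME’s proprietary mixer software, so what follows is only a conceptual Python sketch of the snapshot-based monitoring Josh describes – a mixer state maps input/output routes to gains, and recalling a snapshot switches every route at once. All route names and gain values are invented for illustration.]

```python
# Conceptual sketch of snapshot-based monitor mixing (not the TotalMix API).
# A route maps (source, destination) to a gain in dB; a snapshot is a saved
# copy of the whole routing state, recalled in one move mid-take.
import copy

class MonitorMixer:
    def __init__(self) -> None:
        self.routes: dict[tuple[str, str], float] = {}
        self.snapshots: dict[str, dict[tuple[str, str], float]] = {}

    def set_gain(self, src: str, dst: str, gain_db: float) -> None:
        self.routes[(src, dst)] = gain_db

    def save_snapshot(self, name: str) -> None:
        self.snapshots[name] = copy.deepcopy(self.routes)

    def recall_snapshot(self, name: str) -> None:
        # Every route changes at once: the seamless jump between mixes.
        self.routes = copy.deepcopy(self.snapshots[name])

mixer = MonitorMixer()

# Bars 1-2: playback through the on-set speakers.
mixer.set_gain("protools_stems", "set_speakers", 0.0)
mixer.save_snapshot("speakers_on_set")

# Bars 3-5: speakers muted; the singer's IEM gets playback plus her own
# radio mic (with reverb on a return, in practice) so she can hear herself.
mixer.set_gain("protools_stems", "set_speakers", -90.0)
mixer.set_gain("protools_stems", "singer_iem", -6.0)
mixer.set_gain("singer_radio_mic", "singer_iem", -3.0)
mixer.save_snapshot("singer_iem_only")

mixer.recall_snapshot("speakers_on_set")
mixer.recall_snapshot("singer_iem_only")  # jump between mixes live in a take
```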
Jason Nicholas
What kind of music does Jeff Goldblum like?
Josh Winslade
He likes a lot of jazz in between takes. Lots of Aretha Franklin as well. He’s a very old soul and an amazing, amazing person to work with. I introduced myself to him, or he, that’s a lie, he introduced himself to me; he came around and introduced himself to me and Simon – we’re always next to each other, me and Simon, if we can help it. At the beginning he came to me and said, “Look, I want to be able to talk to you and request things in between takes.” And I was like, “That’s okay, I have the ability to put your radio into my mix.” As a rule, we don’t do that at all; you know, a lot of the time the only moment you can hear an artist is when Simon fades them up, and that’s a respect thing, it’s a professional thing. That’s something that, you know, most sound mixers and sound assistants are aware of: it’s respecting that privacy when someone has an audio device that’s transmitting sound attached to them constantly, and Simon is very big on the respect side of that as well. So I said to him [Jeff Goldblum], “I’m able to do that. As long as you’re comfortable with that, I can have your radio up in my headphone feed, and I can listen out, and you can talk to me in between takes and tell me what you want to listen to.” And he’s like, “I would love that.” And in between takes, I’d get, “Hey, Josh, play some of that sweet, sweet nectar for me. I want to get in the zone, give me some of that.” And, you know, I can see him on camera, on my monitor on the playback, bopping his head, listening away to his music. And that, again, is just another thing that this rig allowed us to do. Any request that anybody had, we could do, and we could do it within 30 seconds or less. And that was the main goal at the beginning: being able to cater the rig to that. You know, I made very few changes to that rig throughout the whole production, because it was very good to go almost from the outset. So, at its heart is that Prodigy. We have complete communication back and forth between me and Simon, including comms as well. So, we integrated my talkback system, which is built into TotalMix. I have a pair of HMD 25s (sixes?) [Sennheiser headsets] – I’m using them now – just because I thought it would be nice to be able to move my head around whilst I’m talking and not worry about going off-mic. And the talkback button on my desk, everything, functioned through TotalMix, and Simon’s able to take that, pull it from the Dante network, and integrate it into his communications system with his team, which is so important. The person that I interact with the most on a daily basis, apart from Simon, is Arthur, Arthur Fenn, Simon’s key first assistant sound; because the way that Simon likes to run playback on a set is not me sat on the end of a radio, listening to a first assistant director on a channel that has three million things going on, and someone’s coffee’s arrived, and missing something; or, you know, it’s a radio, so it’s not quite the clearest, and depending on triggers and cues from that. Everything went through Arthur Fenn.
Simon Hayes
Yeah, Arthur runs the floor. Arthur runs the floor, guys; he’s there, shoulder to shoulder with the director and the first assistant. He knows exactly what’s achievable and what can be delivered from a sound perspective. So rather than having Josh interacting on Channel 1 of the Motorolas [on-set Production walkie-talkies] where, you know, the whole communication system for the movie is happening, we’ve got Arthur filtering out stuff that Josh doesn’t need to hear and doesn’t need to be involved in, and having a direct conduit of information and dialogue between myself, Josh, the first assistant, Jon Chu, and Arthur on comms. And that’s really… You know, we call Arthur a key first assistant. That’s key; it’s absolutely key that that communication system is immediate; and also that if a first assistant director asks for something, Arthur can basically translate it into more audio-friendly language, more music-friendly language, and he will do that, just to make sure that we can have a more immediate response to what’s being requested.
Jason Nicholas
Josh, what’s your control surface for TotalMix?
Josh Winslade
Yeah, so I use the Icon Pro, the G2. It has a bunch of different protocols; Mackie is one of them, and TotalMix will take Mackie. And I decided, because I knew I was mixing in TotalMix instead of Pro Tools, and leaving Pro Tools to do the automation of things, to have a control surface for TotalMix, where the mix was happening. And the integration out of the box was about 800% better than I thought it was going to be. I was genuinely blown away. I thought I was gonna be sat there for a couple of days doing custom macros and getting it to work, but the integration between the Icon G2 and TotalMix was 90% there, straight out of the box. And the amazing thing about it is, I use a few other different pieces of software on set as well, one of which is kind of like a MIDI-triggered sound-effects playback software that you can buy. The nice thing about it is that it runs in the background; it automatically boots on set. So, if I’m trying to hype dancers up and I want to throw a bit of General Levy’s Wicked through the speakers, because I can hear the first AD going, “Come on guys, a bit more amped!”, I can do a few DJ horns and play a bit of Wicked. I don’t want any of that to run through Pro Tools or anything like that at all. So that’s running in the background, MIDI-locked to a bank of six buttons I’ve got on the Icon G2 as well. So, it can simultaneously control TotalMix, with those six buttons being independent for another piece of software. So that was amazing. I was able to live-mix and almost DJ three different pieces of software with the one control surface, which I really wasn’t expecting it to be able to do as well as it did, and it was absolutely amazing. And then the other piece of software that I use is a piece of software called, and just bear with me, Keyboard Maestro. And Keyboard Maestro would just handle everything else, everything else between any other piece of software; it can run custom scripts, it can do anything I want. So, for example, my playback markers in Pro Tools: you know, the shortcut for [memory location] number one is dot one dot, and that will jump you to number one, and then enter, and then it will play back. So, dot one dot, enter. So, I could write that macro into Keyboard Maestro and assign it to a button on the mixing desk. So, the first 10 playback markers within Pro Tools were reserved, and I asked the music editor in the studios – that was the post-production music editor and audio engineer, Robin Bainton – you know, “Please don’t use the first 10 playback markers”, because I had 10 keys on my G2 that would specifically trigger those. And so, throughout the day, I’d assign those playback markers to those different buttons, so I didn’t even have to hit the spacebar. So that came in really handy on a number of songs, but in particular, Popular. I mean, every song was complicated, in terms of whether it’s a range issue or, you know, the scale and size of the set or the technical music-playback side of it. But Popular was particularly…
Jason Nicholas
With just the two… That’s just with the two of them, isn’t it?
Josh Winslade
Popular is Ari. And again, yeah, interestingly, I didn’t expect that either. But there were eight different contraptions – mechanical contraptions that were designed to go off, and that specifically had to be triggered at certain points. And there are dialogue breaks in Popular. And so, I think the intro alone had six different playback cues, just on the intro to Popular. That’s me, Arthur Fenn and Ari working together to kind of figure out, okay, does an action Ari does cue that? And then she gets a couple of clicks. Because it’s about, first of all, pulling me into time to play back, and then, second of all, pulling Ari into time – someone has to follow someone. I either follow her or she follows me. It would work a number of different ways on that song in particular, just for different reasons: a mechanical effect, or the director saying, “I don’t want that note to play until this machine hits the top”, or something along those lines. So having that controller there, and knowing where those first six cues were, staring at a monitor, waiting for feeds, and being able to just cue each one at the right moment, was really, really useful.
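[Editor’s note: Keyboard Maestro and the Icon G2’s Mackie integration are off-the-shelf products; purely to illustrate the mechanism Josh describes, here is a sketch of the same idea in Python, using the mido MIDI library and macOS AppleScript to fire the Pro Tools memory-location keystrokes (‘.N.’ then Enter). The port name and note numbers are assumptions, and Pro Tools actually expects the numeric-keypad variants of those keys, as noted in the code.]

```python
# Sketch of the macro idea: a control-surface button fires a Pro Tools
# memory-location recall ('.N.' then Enter). Requires macOS with
# Accessibility permission, plus `pip install mido python-rtmidi`.
import subprocess
import mido

# Hypothetical mapping: MIDI note number -> Pro Tools memory location 1-10.
BUTTON_TO_CUE = {60 + i: i + 1 for i in range(10)}

def recall_memory_location(n: int) -> None:
    """Type '.n.' then Return into the frontmost app (assumed: Pro Tools).

    Note: Pro Tools expects the numeric-keypad keys for this shortcut; a
    real macro (as Keyboard Maestro can) would send keypad key codes
    rather than plain keystrokes.
    """
    script = (
        'tell application "System Events"\n'
        f'  keystroke ".{n}."\n'
        "  key code 36\n"  # 36 = Return
        "end tell"
    )
    subprocess.run(["osascript", "-e", script], check=True)

# Port name is an assumption; list real ports with mido.get_input_names().
with mido.open_input("iCON G2") as port:
    for msg in port:  # blocks, handling one button press at a time
        if msg.type == "note_on" and msg.velocity > 0:
            cue = BUTTON_TO_CUE.get(msg.note)
            if cue is not None:
                recall_memory_location(cue)
```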
Simon Hayes
Um, Josh, can I stop you there for a second? Because this is all really exciting and interesting stuff, but I just want to take a moment where we can all ruminate on something important. This is why, firstly, in my Beyond the Sound podcast on YouTube, I talked about your cart; and because we were talking about the creative aspect of the film, I just said Josh’s cart was turbocharged. And that was it. Now, it’s just taken you probably 20, 25 minutes to talk about how we turbocharged that cart. So, you know, what I’m trying to say, guys, when you’re listening to this is: this is not a normal Pro Tools rig. This isn’t a Mac sitting on top of a Magliner. You know, this is a cart which has been specifically built for the challenge of Wicked. And the way that we informed ourselves on how to build this cart was through a multitude of musical films. We’ve basically taken elements that we’ve learnt from a bunch of absolutely fantastic music editors along the way. And just to talk about that for a second: I really dislike the term ‘Pro Tools operator’. And I think when you listen to what Josh has just been talking about, you can understand why I far prefer the term ‘music editor’; and if we must talk about the software that he’s using, a Pro Tools music editor. This is not an operator. ‘Operator’ minimises the creative input that Josh has on the movie set. It almost insinuates that it’s a purely technical role. It’s not. It’s an incredibly creative role and an incredibly important role. And throughout my time in musicals, I’ve always recognised this. From the very start, when I did Mamma Mia, which was my first big musical, I set out to find the very best music editor that I could get on the set. And the place where I kept on going back to, to find music editors, was Abbey Road Studios. Rob Houston, who did Mamma Mia with me, came from Abbey Road. He then came on to Les Misérables with me. After that we came across Victor Chagger, again, you know, a stalwart from Abbey Road Studios, and Victor’s now a music supervisor in his own right. But a lot of what we’re talking about here, about the rig, are ideas that we’ve learned from Victor Chagger and from Rob Houston previously. And we’ve basically taken all of those ideas and, in a no-expense-spared fashion, built a cart which is able to do whatever we require. But that cart is just a musical instrument. It needs a musician to play it. And that musician is not someone that knows how Pro Tools runs because they’ve made a few tunes in their bedroom, or they’ve done a little bit of playback of dialogue on a film set at some point using the Pro Tools software. This is someone that can use Pro Tools as if it’s a musical instrument, who is absolutely au fait with this software and can use it as if it’s second nature, without thinking about it. Literally, as a request comes in, it’s immediately just dealt with, fast. And I think as an industry, we need to recognise that these individuals have a huge creative input on what we’re doing, and we shouldn’t minimise this job title by calling these people playback operators. Back to you, Josh.
Josh Winslade
I really appreciate that, Simon. And I think that comes more into play as well. We are talking quite technically at the moment, but when you get into the music side of it: when the director, when Jon, turns to Arthur and goes, “This is too heavy, I want it to sound a bit softer” – especially with something like Wicked, where the music already exists and, you know, you’re matching one-for-one exactly what already exists, but you’re trying to create a vibe on set and the director has a vision – how do you edit that music so it stays true, but also gives the director what he wants? And that would happen a couple of times. But also, in a timing sense: this person’s landed here and they’ve started singing, and Jon will go, “No, that’s not right. She needs to be here by the time she sings, but we can’t jump and da-da-da-da-da. Josh, I need 30 seconds of fill here.”
Simon Hayes
Just like Popular, Josh, where Ari’s coming in and out of spoken dialogue and back into singing. That’s not something that was rehearsed, or that we knew was gonna happen. That’s Josh and Ari and Jon Chu and Arthur riffing on the set, and working out, as Ari makes her way around that room, how she’s going to react to the music and where she’s going to want to take a moment. The Pro Tools playback is the other character in the room. That Pro Tools playback is creating the timing, and if it can’t be fluid and intuitive, then it’s actually restricting the actors’ ability to perform and act based on what they want to do, and actually forcing them into a tempo that they don’t want to be locked into. Although we must respect the tempo, we need to give the actors room to act.
Josh Winslade
Yeah, absolutely. And at the end of the day, my job is to support those actors to the best of my ability, to give them what they need to be able to perform, because what they’re doing is extremely difficult. And especially when it comes to Ariana and Cynthia: they’re both professional musicians and they absolutely know what’s achievable. There was no version of this where we could just say to them, “Here’s the playback”, and them not be aware. “I want four clicks going into this”, or “I want to slow down here”, or “I want it to speed up here” – you know, they’re gonna notice those things and want those things straight away. Like, “Why can’t I have reverb in my ear? I do when I’m on stage, and that’s how I’m used to singing. That’s how I need to be supported.” These aren’t things they said; these are things that are assumed, because we want to support them in the best way.
Simon Hayes
Neil and Jason, there’s one person, there’s one person on that set who’s as good on Pro Tools as Josh – and that’s Arianna Grande.
Neil Hillman
Yeah, we’d heard that! That’s the question that we were going to ask! So, you had to be on your top game there…
Josh Winslade
Yeah. Day one on set during rehearsals, we’re actually rehearsing Popular, funnily enough. And it’s the first time I’d met Ari, and the first time the rig had been to a rehearsal. So I turn up, set up, and even on a rehearsal we’ve got four massive JBLs, one on each corner of the room, so they can rehearse in a way that’s going to be similar on the day. I sit down and Ari’s rehearsing; I’ve not been introduced to her yet. And then she’s going through it. It comes to the first time we do playback, and I play back, and it comes out and it’s very present. It’s impressive – we always want to impress – and she turns around. She’s like, “Oh my god! Where did that come from?” And then she turns around and she sees the rig and she comes over. She’s like, “Oh my god! This is like a portable Pro Tools studio, this is absolutely amazing. Can I have a look?” And she sits down next to me and, my god, she just starts flying around this session, going all over the project, looking at things, looking at her stems, looking at the music stems. I’m just like, wow! So I kind of knew then how capable she was. And then she’s asking questions; she’s like, “Oh, how do you do this, and how have you done this? And what am I going to be able to hear?” And she’s like, “Oh, that’s amazing. Oh, yeah, I like it like that.” And so, yeah, from day one, we knew that she was very aware.
Jason Nicholas
So had you fallen ill one day, she could have stepped in and…
Josh Winslade
Yeah, but I’m not allowed to be ill…
Simon Hayes
In fact, I’m going to take her on to the next project…
Josh Winslade
I think I absolutely was not allowed to be ill, ever. If it was a playback day, I had to be there. You don’t want to let down production in that way.
Jason Nicholas
Well, I mean, speaking of playback, you provided a low-frequency ‘thump-track’ for people on set, and that gives artists the tempo, but it’s sufficiently low-frequency that it could be filtered out in post-production. How did you arrange PA for the artists?
Simon Hayes
So, despite the fact that Wicked is sung live, we also have to have a huge PA at all times. And that’s because we’re weaving in and out of live vocals. You know, when we go into heavy, heavy choreography, then we’re supporting that choreography with a massive amount of firepower out of our PA. So, there’s a couple of different things to talk about here, and again this kind of comes from experience of working on musicals for many years, across many different genres. One of the things that we do is we have a separate PA for music and Voice of God [speakers to send instructions to the set]. What we discovered very early on is that we need to get the PA speakers which are supplying music for the dancers as close as possible, from a rhythm perspective. We don’t want any delay caused by the speakers being too far away from them; but that also means that those speakers are going to be close to our actors, close to our dancers. And when we have a first assistant who is geeing the crew up and asking how long something needs to take before we can shoot, those artists do not need to have that voice hitting them as they’re preparing themselves for a take creatively, and getting in the right mindset to perform. And so, we kind of have two layers of PA. We’ve got the PA which is going to play back the music for our performers, and then we have the PA which is going to be firing at the grips, the camera team, the sound team, for the first assistant to be giving them instructions. And also, you know, if Jon Chu wants to talk to them, we don’t want that voice to be as absolutely loud as the music will be when they start performing. We want there to be a little bit of distance between that instruction and the artists themselves. So, you know, we have push-to-talk Sennheiser microphones, which Jon [Chu] had and Jack Ravenscroft, our first AD, had. We’d also give one to Chris Scott, the choreographer. You know, any sort of HOD [head of department] who’s having a creative input on what’s going on on-set is going to have access to a push-to-talk. So, what are we doing? I guess we were probably putting on average about 8 kW of sound through the Voice of God, and on average about 24 kW through the PA for the dancers.
Josh Winslade
It would be one line from me that would be run out by Taz, Taz Fairbanks, the second assistant sound, to a mixer [desk] that was on a VOG [Voice of God] trolley – essentially, that’s what we called it; that’s why Simon’s making the distinction between the VOG speakers and the music playback speakers – and she would run two different systems and put those speakers where she wanted and needed them, but they were completely separate. And then, also, she had a final master control for the level of the playback, so that if micro-adjustments needed to be made without coming to me, and needing it to be done on the fly, you also had a secondary mixer there that was able to do that. And, you know, we had a bunch of subwoofers as well.
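[Editor’s note: Simon’s insistence on getting the music PA close to the dancers is simple acoustics: sound travels at roughly 343 m/s, so every metre of speaker-to-dancer distance adds about 2.9 ms of delay against the click. A quick back-of-envelope calculation in Python, with generic figures rather than production measurements:]

```python
# Rough acoustic-delay arithmetic: why the music PA sits close to the dancers.
SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C

def delay_ms(distance_m: float) -> float:
    """Time for sound to travel distance_m, in milliseconds."""
    return distance_m / SPEED_OF_SOUND * 1000.0

FRAME_MS = 1000.0 / 24  # one film frame at 24 fps

for d in (3, 10, 20, 40):
    print(f"{d:>3} m -> {delay_ms(d):5.1f} ms "
          f"(~{delay_ms(d) / FRAME_MS:.2f} film frames)")

# 3 m is under 9 ms and essentially imperceptible; 40 m is ~117 ms, close
# to a sixteenth note at 120 bpm, which would audibly drag dancers behind
# the click.
```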
Simon Hayes
We used thumper tracks reasonably regularly on Wicked, and the reason why we did that was because we had 75 earwigs [small earpieces] and sometimes we had 140, 150 background villagers, encompassing dancers and supporting artists, and we needed to keep them in time. And so what we would do with those guys was, during rehearsals, Ben Holder, who’s our wonderful music associate, who was very much a part of our wider sound and music team on the set, would run those rehearsals. He would work out who the best singers were, and we would give the best singers the earwigs; and we’d use an induction-loop system for those earwigs, which we’ll talk more about later when we get onto the subject of IEMs generally. But we’d give 75 earwigs out to the best singers, and then, let’s say we’ve got 150 people total, the other 75 would be kept in time by two factors. The first factor would be the best singers singing. So, the other 75 would join in with that chant being led by the 75 that are wearing earwigs; but we would also produce a tempo through a thumper track using subwoofers, which would basically be that driving bass. And what we do with those subwoofers is we turn them up absolutely as loud as we possibly can before the set starts rattling. Because the moment we start to get secondary rattles, we know that when our colleagues in Sound Post – when Nancy Nugent-Title, John Marquis and the whole team over in Sound Post – start to try and remove those sub frequencies, it’s no good if we’ve also got mid-range frequencies from bits of plywood rattling on the set, because we’ve turned it up too loud. So generally, what happens is I’ll take my headphones off and walk to the set and say on the comms, you know, “Josh, turn it up, turn it up, turn it up”. And the moment I start to hear secondary rattles, I say, “Right, back it off 6 dB”. You know, that’s where we are. That’s as loud as we can go with the sub. And I think, Josh, what was I asking the sub to be at? I think about 45 Hertz or something. Where was it exactly that I asked you to put it?
Josh Winslade
At 37 Hertz. Because anything higher would resonate too hard on the other speakers – not the subwoofers; the subwoofers could handle it – but this gave us the ability to use the other speakers as well, if necessary, for the thump track. But no, 37 [Hertz].
Simon Hayes
And the reason why we’re at that 37 Hertz is, it’s very, very simple. It’s as low as we can possibly go to give Nancy and John Marcus in sound post the ability to filter those frequencies out really easily. We don’t want anything up at 50 Hertz, which is going to start having secondary resonance into the vocal range. So, we go as low as we possibly can, but not so low that people can’t hear it or that our speakers struggle to reproduce it. And so that’s the sweet spot for the JBL rig that we use. And as Josh says, we’ve got a whole bunch of JBL subwoofers, but as well as the subwoofers, we’re also putting this thump of track through our full range speakers. And the reason why we can do that is because our full range speakers are proper full range speakers, that have got huge, huge bass drivers in them and they will reproduce, you know, down to, they’re 20Hz to 20 kHz. We’re not using some kind of, you know, smaller speaker. Over the fullness of time, what I’ve learned is that there’s no shortcuts here; and that my PA system is very, very important. And so, we’re using a JBL PA, where the subwoofers can have support from the full-range speakers. And 37Hz just seems to be the sweet spot, doesn’t it, Josh?
Josh Winslade
Yeah, absolutely. And technically, the way that was produced was just a side chain to the click track. So, then any edits I made during the session, the thump track was just always there, always available. And we could go at a moment’s notice…
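[Editor’s note: as a rough illustration of the thump-track idea – not the production tooling – the sketch below renders a decaying 37 Hz pulse on every beat of a click at an assumed tempo, then shows that an ordinary high-pass filter removes it, which is exactly why Simon keeps it well below the vocal range. Requires numpy and scipy; tempo and length are assumptions.]

```python
# Illustrative thump track: a decaying 37 Hz burst on every click beat,
# easily removed later with a high-pass filter (tempo and length assumed).
import numpy as np
from scipy.signal import butter, sosfilt

SR, BPM, BARS = 48_000, 120, 4          # sample rate, tempo, length
beat = int(SR * 60 / BPM)               # samples per beat
track = np.zeros(beat * 4 * BARS)       # 4/4 bars of silence

# One thump: a 37 Hz sine with a fast exponential decay (~50 ms constant)
t = np.arange(int(0.25 * SR)) / SR
thump = np.sin(2 * np.pi * 37.0 * t) * np.exp(-t / 0.05)

# "Side-chained to the click": drop a thump on every beat position
for b in range(0, len(track) - len(thump), beat):
    track[b:b + len(thump)] += thump

# Post's side of the bargain: an 8th-order Butterworth high-pass at 60 Hz
# knocks the 37 Hz fundamental down by around 30 dB while leaving
# dialogue fundamentals (about 80 Hz and up) essentially untouched.
sos = butter(8, 60.0, btype="highpass", fs=SR, output="sos")
residue = sosfilt(sos, track)

print(f"thump peak before filtering: {np.max(np.abs(track)):.3f}")
print(f"thump peak after filtering:  {np.max(np.abs(residue)):.3f}")
```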
Simon Hayes
And what that means, Jason and Neil, for the listeners, is that we can basically record live chants of choruses, and also the movement of the people, and that we’re not always having to rely on a playback track – that we can basically get elements live. Now, I’m not saying that those live recordings were used purely on Wicked, because what Jon Chu wants is for those choruses to sound absolutely fantastic, and I’m sure that you’ve all seen footage of ensembles in the studio singing those choruses. But what we’ve got for Nancy Nugent-Title and John Marquis [dialogue editors] and Jack Dolman, our fantastic music supervisor – who, you know, we integrated with; we were all one team on this show – what we’ve got for them is a base layer of recordings that they can choose to use or not use, and they can add the ensemble loop-group recordings; but if they want the reality of that chant, or those footsteps, from the set, what we’ve provided is a clean base layer for them to create a blend. And what this is all about – this is not about saying, okay, we were 100% live, or all the choruses were 100% re-recorded or pre-recorded. What we’re doing is we’re basically creating choices for a blend to be provided that gives Jon Chu the very, very best soundtrack possible, which creates the best immersive and believable reality for the cinema audience in the theatre. And there wasn’t one workflow across the whole of Wicked. It was more a case of, you know, giving Andy Nelson and John Marquis [re-recording mixers], when they’re sitting at the re-recording desk in the final mix, the ability to really find that sweet spot on every track, without needing to manipulate it too much, as well.
Josh Winslade
So, to give a very specific example: there’s a sequence during What Is This Feeling? where the dancers are using old-school desks, you know, school desks with the wooden lids that open. And they have a twofold system that’s built into the desk. And Chris, the choreographer, did this amazing routine; there are, you know, 80 desks set out with 80 dancers, and they jump up on the desks, and it’s like Stomp, if you’ve ever seen Stomp – the performance is slamming the desks, stamping feet, and all of this. And, you know, it’s one thing to be able to go there and get a wild track of that and say to post, “Here you go, here’s a wild track”, that, you know, 60 seconds in is off-tempo, and you’re going to have to slip and slide and manipulate and fix. It’s another thing to have a thump track going at the same time, so you’ve kept them in rhythm. So, you’re able to give post-production the whole thing; it’s in sync at the beginning and it’s in sync at the end. And where the tempo changes, it changes at the same time, because those dancers have had that thump running in the background for them. It just gives post a few less steps to have to do – to go back and fix and sync and slide and do all those bits. It’s just ready for them to use if they want to, to get that authentic sound of what it sounded like on set. And even if it is just a sync track that they’ve used it for, which I don’t believe it was, it’s a damn good sync track, because it’s in time. It’s actually all in time.
Jason Nicholas
I was just going to ask, with the amount of footfall and practical percussion that there was in Wicked, how did you mic up those ensemble or sort of choral groups and not just be overwhelmed by feet?
Simon Hayes
Whenever possible, we had carpet out. I’ve got a whole bunch of photographs, actually, on my social media – massive amounts, hundreds and hundreds of pieces of carpet out. And we’ve got Taz Fairbanks, our second assistant sound / sound co-ordinator. You know, look, if I brought Taz on to do a podcast with you guys, it would sound just as complex as Josh talking about Pro Tools. These are all people that are really at the top of their game. You can’t take this conversation and this crew and liken it to anything that happens on a normal movie set. What we did, it wasn’t a normal movie set, and I think that everyone really, really did bring their ‘A’ game. And Taz certainly had a bunch of people helping her, and they’re basically watching monitors on every take. And the moment they go, “Okay, hang on, camera C has gone onto a tight lens here; that means we can get another 16 bits of carpet in to reduce this footfall”, we were doing that. So, one of the best ways of reducing footfall is by dealing with it on the set, with carpet. And so we did that at all times. And then they’d remove it for the next take, and put it back for the take after that, depending on what the cameras could see. And, you know, this is the kind of thing where, when people are watching us on a film set and they don’t know about sound, they think that we’ve lost our minds; but when our colleagues in Sound Post hear what we’ve done, it’s very, very clear that it’s had a tangible effect on how much usable sound there is, and how good that sounds. And to talk about how we mic: you know, we mic everything appropriately for what we see on screen, and that means using multiple booms, stereo booms, lavs – there isn’t a one-size-fits-all. All I can tell you is that the Sound Devices Scorpio has 32 tracks, and I’ve got 30-plus years’ worth of microphones in my box; and whatever I’m presented with visually, I’m going to do the best mix that I possibly can, using the correct mics for the job.
Neil Hillman
There’s a question that Simon Norman has written in with. I think the nub of it is about logging what you’ve done and sharing that information with post-production. So, his question is, “Did you guys have to handle any requests to change timing for music, for instance, if there was a pause in the music for a dialogue line and the actors wanted to add an extra beat for a look or an ad lib”… I think you’ve covered that already, Josh, and “Did Josh have to alter the Pro Tools files to accommodate this?” And most importantly, “how was this logged for post-production?” That’s really the crux of this question.
Simon Hayes
Let me start off by saying it’s really nice to hear a question from Simon Norman, another young sound mixer who I’ve got an awful lot of respect for; but if I can be glib Simon, when you say did this happen? Yes, every 10 minutes, every single day for 155 days.
Josh Winslade
Yeah, so I think the way Simon [Norman] has phrased the question is about the idea of preserving those stems within Pro Tools and then how we would, you know, report that back to post-production, and there are a few points where that happens. But Simon [Hayes] is right: every single day there were micro-edits and things that were done just to clean things up, and those requests don’t just come from on set. You know, you have Stephen Oremus and Dominick Amendum on set from the music department, every single day, requesting small changes be made. You might get a new version of a song the night before, because you’re constantly adapting to changes on set. When we did No One Mourns the Wicked, there were heavy, heavy edits to that song, because the whole way that they were going to shoot it changed in the moment. So basically, within five minutes, we had to edit two different versions of the song together to make a third version, and we’re sat in the back of a van out in Ivinghoe, me and Dominick Amendum, editing this music. So massive changes were done, and micro changes, every single day. The way that’s recorded – there are three or four different ways that it’s recorded. The first way is that all the stems are sent to Simon, so it’s recorded: you can listen to it in the rushes that Simon sends, and that’s why that is very important – we should talk about that more later, as well. So, it’s recorded in that sense: I send a certain amount of stems, separated out, down to Simon. As well as those stems, I’m sending an LTC [timecode] track; that’s the timecode that I’ve generated. Basically, we look at the songs we’ve got, and I assign each one a timecode, completely separate from the Pro Tools timecode or the time-of-day timecode. A song gets assigned its unique timecode once it’s been delivered: okay, that’s the timecode, that’s the song. And what that’s doing is saying: this song, when this note at this time happens, this is the bit of timecode that goes with it. That’s for post-production. Avid can take an auxiliary input for a secondary timecode. That is what they would load up and read from, so that it syncs all the music. And that’s how that would work. The third place it’s recorded is in my stems that I bounce out at the end of the day. So, I would bounce out Ensemble as a separate stem; I’d bounce out a list of about 11 different stems. That was part of a process for post that always depended on how much detail they wanted to go into in the assembly edit, which we will get into later, because it’s another one of the reasons that I send what I send to Simon. But basically, if they don’t have enough choice in what I send to Simon for the assembly edit – not a final edit or anything like that – then they can dip into my stems that I’ve bounced out, and that will give them more variety, a more detailed collection of stems. The fourth place that it’s recorded is in my session, which is saved every single day. Every major edit I would tend to save as well, giving it a very clear name about why we did what we did. So, “removed two bars for time to Ari foot on stairs”, you know, “looped two bars at bar 16 for this”, “slowed down”; very detailed project save-names, basically, that would go into notes.
And the last place it would be recorded, which I think is the main thing Simon was asking about in terms of notation: there was an assistant music associate called Bronwyn Chan – who was later replaced on the film by Millie Ackerman, just to give her credit too, because Bronwyn had another project – who would sit there with us. As well as me being close to Simon, the music department always sat very close to us, because they would come to me with changes and edits and things. This is a really important point: the music associate role started off with Bronwyn and later became Millie.
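[Editor’s note: to make Josh’s per-song timecode scheme concrete, here is a minimal sketch in Python. It is purely illustrative – the frame rate, song list and function names are our assumptions, not production details; the point is only that each delivered song owns its own slice of timecode, separate from time-of-day, so any note maps to exactly one LTC value.]

```python
from dataclasses import dataclass

FPS = 24  # assumed frame rate for this sketch; the project's actual rate isn't stated here

@dataclass
class Song:
    title: str
    ltc_start_hour: int  # each song "owns" its own timecode hour

def assign_timecodes(titles):
    """Give every delivered song a unique LTC hour, completely separate
    from time-of-day or Pro Tools session timecode."""
    return {t: Song(t, hour) for hour, t in enumerate(titles, start=1)}

def ltc_at(song: Song, seconds_into_song: float) -> str:
    """Timecode string for a musical moment, e.g. a specific note."""
    total_frames = round(seconds_into_song * FPS)
    total_seconds, ff = divmod(total_frames, FPS)
    mm, ss = divmod(total_seconds, 60)
    return f"{song.ltc_start_hour:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

songs = assign_timecodes(["No One Mourns the Wicked", "What Is This Feeling?"])
print(ltc_at(songs["What Is This Feeling?"], 83.5))  # -> 02:01:23:12
```

[Because the hour digit identifies the song, an Avid reading this track as an auxiliary timecode can re-sync any take to the right bar of the right song – which is exactly the workflow Josh describes.]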
Simon Hayes
It was just a different approach from what we did on Les Misérables, where there was a continuity assistant – a script supervisor’s assistant who had studied music and was a high-level musician herself – who was assigned to us at all times and would do exactly the same thing. Which is basically… from a film point of view, guys, if you haven’t done a musical but you’re in the film industry, think about it like this: it’s a script supervisor attached purely to the music department, making script notes based on music changes and continuity notes based on what sound and music are doing.
Josh Winslade
Yes, so she’d [Bronwyn] have her end-of-day report, and at the end of the day she would come to me; most of the time she’s not even asking, in the same way that a script supervisor’s assistant logging a timecode at the beginning of a take isn’t asking. I have a timecode reader on my rig as well, always giving a readout of the timecode that I’m playing back – it’s my first visual cue that my machine’s not playing back, if that timecode’s not rolling. And it’s big: a big timecode readout bar that sat at the top of the rig, another element we had to add. Automatically, in the background, she’d be taking down notes for edits, and she’d be asking where I was playing back from at certain points. She was also already building stacks in Pro Tools for the music editors from Simon’s takes. If there were days with no playback – there weren’t many, but if there was a day in a week with no playback – she would be pre-building stacks in Pro Tools for the music editors, going through and making those cuts where we cut, essentially, so that it was already ingested for the music editors.

So those are the five or six different places it was recorded – a lot of places. The most important one is the timecode; but obviously, as soon as things get really complicated and you’re breaking the timecode and removing the timecode, then once you get to post you need to know where those cuts are, because you have to re-sync, and so on and so forth. So with Bronwyn making those notes and then also stacking and syncing on set, it was this phenomenal process where the music is able to come into post and it’s kind of already there, ready. But yeah, the timecode we generated was always played back to Simon, and it was the DNA, the lifeblood, of the music. Any edit I made to a stem – and say I had a hundred stems in the session – if I inserted time or changed anything, I would also make the same change to the timecode “sausage”, as we like to call it; it’s not really a stem, but whatever happened to the stems happened to that timecode track as well. And then there were certain situations where it didn’t really matter what you did: like if you’re looping time, it doesn’t matter whether you cleared it or looped it, it was just a reference.
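[Editor’s note: the “timecode sausage” discipline – whatever edit hits the stems also hits the generated LTC track – is easy to picture as code. A hypothetical sketch, treating every track, including the LTC track, as a list of event times in seconds:]

```python
# Hypothetical model: a session maps track names to sorted event times
# (seconds). The LTC "sausage" is just another track, so one edit function
# applied to the whole session can never let music and timecode drift apart.

def remove_span(events, start, end):
    """'Remove two bars for time': drop events in [start, end) and
    shift everything after the cut earlier by the removed length."""
    length = end - start
    return [t - length if t >= end else t
            for t in events if not (start <= t < end)]

def apply_to_all(session, edit, *args):
    return {name: edit(events, *args) for name, events in session.items()}

session = {
    "Vocals.Elphaba": [0.0, 4.0, 8.0, 12.0],
    "Ensemble":       [0.0, 2.0, 8.0, 10.0],
    "LTC_sausage":    [float(s) for s in range(13)],  # one marker per second
}

edited = apply_to_all(session, remove_span, 4.0, 8.0)
# Every track, including LTC_sausage, loses the same four seconds, so post
# can see exactly where the cut happened and re-sync from either side of it.
```

[The jump left in the LTC track is the point: a discontinuity in timecode tells post precisely where an on-set edit was made.]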
Simon Hayes
And don’t forget that this timecode is also cueing lights as well. We haven’t covered that, have we?
Josh Winslade
No. So, lights and some practical effects as well. The massive wheel that you see Fiyero running around on and moving, with dancers on it, that was all triggered by timecode. Same with the lights: in every single track we’ve got lighting cues which are cued by timecode, so we’re sending it out. Jon Chu wanted these lights, and Alice [Brooks – the cinematographer] wanted these lights, to occur at very specific moments in the song. So, you know, when there’s the trill of “what is this feeling?” and Cynthia opens her eyes like that – “what is this feeling, so sudden and new?” – they wanted that purple light to come on and shine over her face. And it’s one thing having a monitor in front of an assistant who’s then able to cue a light, and it’s another just being able to give one XLR to a lighting desk so they can programme it so that every single time it happens exactly when it needs to.
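[Editor’s note: for readers who haven’t met timecode-chased cueing: the lighting desk receives the same LTC that drives playback and fires pre-programmed cues as the code rolls past them. A toy sketch of the idea – the cue times and labels here are invented, and real desks implement this internally from their LTC input:]

```python
FPS = 24  # assumed frame rate, as in the earlier sketch

def tc_to_frames(tc: str) -> int:
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = map(int, tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * FPS + ff

# Invented cue list: (frame at which to fire, action)
cues = sorted([
    (tc_to_frames("03:01:12:00"), "purple wash on Elphaba's face"),
    (tc_to_frames("03:02:05:12"), "start wheel motor"),
])

def chase(cues, incoming_frames, fired):
    """Fire every not-yet-fired cue that the incoming timecode has passed."""
    for frame, action in cues:
        if frame <= incoming_frames and frame not in fired:
            fired.add(frame)
            print(f"FIRE: {action}")

fired = set()
chase(cues, tc_to_frames("03:01:12:10"), fired)  # -> FIRE: purple wash on Elphaba's face
```

[The practical win Josh describes is exactly this: once programmed against the song’s timecode, a cue lands identically on every take, with no operator watching a monitor.]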
Simon Hayes
And for people that work in live television – Dancing on Ice, for instance – this is going to sound completely normal, and it is normal, but it’s not normal in the film world. So what we’ve really tried to do is take all of these elements from different genres that excel in different areas and combine them into what we could provide for Wicked.
Josh Winslade
I mean, it’s a great point, because the job it’s most similar to would be a Sound No.1 in a theatre; it’s that element of cueing, and the timecode going out there, so it has a lot of similarities. So yeah, we cued those lights, and also some practical effects: the wheel, for certain moves and things like that, had to happen at specific moments of the song, otherwise the camera wouldn’t be in the right position to capture the thing. So that was, again, multiple departments coming together – stunts and effects and everyone – in weeks of rehearsals, getting all of that sorted. And I think it also goes back to the fact that nothing had to be added to the rig to make that happen. We already had a timecode feed on the rig. We already had 32 available analogue outputs, which was double what we thought we needed at the beginning. So it also goes back to those early conversations with Simon, sitting down and going, “Please tell me everything that you’ve learned in 30 years of PSM-ing and doing musicals.” So those six different things are kind of how we recorded those edits and those changes.
Jason Nicholas
Was there also a master word clock? You have all these digital devices with Dante and MADI and the RME…
Josh Winslade
Yeah, Dante was the master, and then Simon, with the Scorpio and the Prodigy, was able to read that. My RME [Babyface Pro] then read that and fed it to Pro Tools. And that was the best way to do it, partly because Simon’s Scorpio is where the audio was being recorded, so that’s where the clocking mattered, and partly because Simon has Dante elements to his rig that are completely separate. Me and Robin, you know, we had two days of routing.
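[Editor’s note: “Dante was the master” means the whole digital chain hangs off a single clock leader, with every other device slaving down a tree from it. Here is a tiny sanity-check sketch; the device names come from the conversation, but the exact topology is our reading of it, not a verified wiring diagram:]

```python
# Each device names its clock source; exactly one device (the Dante
# network's elected master) has no source. Two masters, or a loop,
# would mean clicks and drift between recorders.
clock_source = {
    "Dante network": None,             # clock leader
    "Scorpio":       "Dante network",  # Simon's recorder slaves to Dante
    "Prodigy":       "Dante network",
    "RME Babyface":  "Dante network",  # Josh's interface reads the same clock
    "Pro Tools":     "RME Babyface",   # Pro Tools clocks from the RME
}

def check_clock_tree(sources):
    leaders = [d for d, s in sources.items() if s is None]
    assert len(leaders) == 1, "exactly one clock master allowed"
    for device, src in sources.items():
        seen = set()
        while src is not None:  # walk up the tree; a loop would never reach the leader
            assert src in sources and src not in seen, f"bad clock path at {device}"
            seen.add(src)
            src = sources[src]

check_clock_tree(clock_source)  # passes silently for the tree above
```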
Simon Hayes
Can I just interrupt at that point? It’s really important, guys. We’ve name-checked a lot of people here. But another one of my first assistants is Robin Johnson, who’s been with me since 1998. And basically, he is the brains of the bunch. He’s the technical glue that holds me and Arthur together.
Josh Winslade
I’ve seen him [Robin] performing surgery on a SuperCMIT [Schoeps microphone] on set…
Simon Hayes
When we talk about this whole rig and the Pro Tools rig – which I think it took Josh 30 minutes to talk about at the beginning of this podcast – I don’t want to take credit for that. I want to take credit for being able to say, “I’ve got a guy who can build this and can creatively work out what we need technically.” And that’s Robin Johnson. If there was ever a minute issue between all of these pieces of equipment talking to each other, it wasn’t Josh that sorted it out, it wasn’t me that sorted it out: it was Robin Johnson. He is an incredibly skilled individual, and literally I wouldn’t be able to do this without him. That covers it, Josh, doesn’t it? I cannot speak highly enough of the technical understanding, from a physics perspective, that Robin brings to the team. His knowledge is just phenomenal. He’s qualified as a scientist.
Josh Winslade
Yeah, he is. But it’s everything, including the things you wouldn’t expect him to have knowledge of. Dante networks and MADI-to-Dante conversion and everything like that, I can completely understand. But then he gets into the world of RF [radio frequency – the generic term for wireless audio]. For example, the way Wicked was split up, if you divide it into quarters: the first quarter was all on stages; the middle two quarters were all in massive fields, massive sets; and then the last quarter we were back in the studio, bar a couple of reshoots we popped out to Ivinghoe for. So we knew this massive RF challenge was coming up. And me, as someone relatively new to this team, watching a team with 30 years of experience working together – they’re not worried, they’ve seen it all, they know exactly what they’re going to do. Robin had an idea; he got a two-hour pre-call the day before just to finalise it and set it all up. We’d done four hours of overtime the night before building it and getting it up there, but a two-hour pre-call, just in case anything wasn’t quite right.

But this rig was phenomenal. I think we had 18 or 20 channels – I can’t quite remember, Simon, apologies – but around 20 feeds of analogue audio on cable drums, 250-metre drums, that were hoisted up. Getting these drums up the scaffolding, then dropping the lines and cable-tying them, took 40 minutes. And at the other end, the system Robin had to come up with – to amalgamate all the receivers, the crew comms, my comms, the RF, and the sends for the wireless IEMs that Ari and Cynthia were listening to; to bring that all together, get it down these lines and separate it all at the other end – it was just crazy, and he just did it. It never stopped working. There was never any faff, there was never any time lost to figuring it out; it was just done, because genuinely these boys have seen and done a lot in their careers, and it was absolutely amazing to watch that happen. I just remember helping to build this rig with them, thinking, “This is phenomenal. I’ve never seen this many channels off-boarded on an analogue rig this long.” In my head it’s just, “One MADI cable, that’s what you want, to do it that way.” But the technology wasn’t quite there yet to do it that way, the way that Simon wanted to do it. So they went back to what they know and what they know how to do very well, which was this off-board rig, and it was so cool.

And what it gave me and Simon, who were next to each other off the set: we never moved once. I can’t tell you how many times there were 20-minute, 30-minute delays because an E-Z Up was being moved on set, because something was in shot, and during that time we’re talking, planning, prepping; we’re talking about how to get the next shot better. We’re not moving, we’re not stressing, we’re not worrying, because of this rig, and because they’ve seen it before and they knew this was the best way to do it. Get this thing high up in the middle of the whole set, and it never failed. It didn’t fail once. The range on it was absolutely phenomenal. Everything came down those drums to Simon’s sound van; my sound van’s right next to Simon’s. One MADI cable between me and Simon, and that was it – we didn’t move for weeks on end.
Simon Hayes
Let me interrupt you, just because this is really, really important for the guys listening who are potentially going to come onto these types of films later in their careers. If I’m having to think technically for weeks on end, it means that my brain can’t think creatively, okay? If I’m presented with technical problems on a day-to-day basis and having to think technically, I’m just spending my time keeping up with the shooting process. I’m not spending my time thinking about the bigger picture and the film that we’re making and the performances that we’re recording and what I’m supplying to sound post. And, you know, I don’t want to have my mind filled up with technical issues – and I don’t mean problems; I’m talking about having to think about the technical side of production sound mixing. I don’t want to spend any time thinking about that. That should just be a given. Eric Clapton doesn’t have to think about how his rig is cabled up when he goes on stage and plays. He’s only thinking about the song that he’s playing, okay, and it’s no different for us, guys. Any time that we’re spending thinking about the technical endeavour is time that we cannot be thinking about the creative endeavour, and that’s a mistake…
Neil Hillman
That’s part one of our special episode on the technical aspects of creating the soundtrack for Wicked, with the movie’s on-set music editor, Josh Winslade, and production sound mixer, Simon Hayes. In part two, we turn our attention to the challenges Simon faced in capturing the live vocals of Cynthia Erivo and Ariana Grande, and how he set about ensuring that his complicated technical arrangements meant that the artists were free and empowered to deliver the stunning performances that audiences around the world are now enjoying in cinemas. Look out for part two dropping into your favourite podcast platform such as Apple Podcasts, Spotify or Amazon Music, and don’t forget that the very best way to help us build our offering, and continue to attract fabulous guests, is to subscribe to our Apple and Biscuit Show YouTube channel. It will cost you nothing, no data is captured, but becoming a YouTube subscriber to the Apple and Biscuit Show means the world to us. And lastly, as ever, thanks for listening.
Announcer Rosie
The Apple and Biscuit Show is written, produced and presented by Jason Nicholas and Dr. Neil Hillman. It is edited and mixed by Jason Nicholas in our Sydney studio.
