Ask Abbey Road: Recordist Matt Jones on reverb, sidechaining and where to spend your budget
Abbey Road Studios recordist Matt Jones discusses his go-to reverbs, how to configure reverbs in a session and sidechaining.
This is the second of a three-part interview with Abbey Road recordist Matt Jones. In the first part, Jones shared details of a recordist’s job scope, the importance of having a checklist and tips on landing a job in the industry.
Jones began working at Abbey Road in 2012 straight out of university, rising through the ranks from runner to assistant to recordist. Here, he discusses his go-to reverbs, how he configures reverbs in a session, and explains sidechaining.
George: If an orchestral scoring project has a limited budget for recording, which instruments would you record live over others?
Matt Jones: This is a tricky one, because as soon as you start adding real elements to a piece of music that’s otherwise built around samples, you can really begin to show the fakeness of the sampled instruments versus the real ones. So, you have to be quite careful if you’re just going to replace some parts.
I would say that something you can almost always get away with using samples for would be any percussion elements within a score. Having the real thing is fantastic and we get to do it quite a lot at Abbey Road. It does add a lot to the score if you can afford it, but there are so many great samples out there that are fine if you can’t. Probably the next few instruments that would fall into that category would be pianos, harps and any keyboards – they’re slightly more percussive instruments, anyway.
I’d say, if I knew I only had a minimal budget, any solo instruments that feature in the score would be the place to start recording live. That’s just to get the character and uniqueness of a real person playing the solo you’ve written. That will make it stand out and give it some character.
There are a surprising number of parallels between metal music and film music
James: How do you find your background in brass and metal helps you during recording sessions?
MJ: I think having a background in playing any kind of music helps you. I don’t think my brass and metal background necessarily helps me any more or less than another instrument would. Being able to read music and follow a score is a necessity for the orchestral sessions that we do.
Having been recorded in studios myself is massively helpful, because it’s a way of being able to see things from both sides of the glass. It’s great to have an appreciation of what you can do as an engineer to bring the best performance out of a musician, whether that’s adjusting a headphone mix or giving useful instructions over talkback.
More specifically, brass was my introduction to music theory and music, generally. So it’s served me well, even though I didn’t really play in orchestras, I played in brass bands. That’s a great introduction to knowing how sections of musicians operate together. As for the metal side of things, I’m still waiting to do a proper metal session at Abbey Road – one day!
There are a surprising number of parallels between metal music and film music, I’ve always thought. Especially the big, epic stuff. In terms of having a lot of elements and having to tame them and fit them into the mix – it’s inspiration, if nothing else. You listen to some modern metal records, and the production on that stuff is insane.
It’s definitely good to be familiar with a wide range of styles and types of music. In this job, you need to know what people mean when they say they want to sound like a particular artist, or they want their guitars to sound like the guitarist from a specific genre of music. You need to be able to make broad changes to a sound sometimes to quickly get in the ballpark of what they’re going for so they can feel comfortable and get the results that they want.
MusicTech: Can you think of an example of when you’ve had to dramatically change a sound?
MJ: Sometimes it can happen if you’re doing ensemble recording with a string section or something and you set up assuming you want quite a lush, reasonably ambient film-esque sound – and it quickly becomes apparent that actually, you need to get in a lot closer to the musicians and you want to hear the scratching of the strings. You want everything to be a bit more close and poppy and a bit less polished. That could just be a case of moving all the microphones quickly, or making them a few feet lower.
MT: Do you have any other tips for getting the best performance out of an artist?
MJ: It’s mainly making sure they are comfortable with what they are hearing – making sure they’ve got enough click if they need click, or the right bit of the drum kit. Or just making sure they’ve got a nice cup of tea or glass of water, or whatever else they might need. That’s if it’s a solo or band situation. Obviously, I’m not going to make 100 people in an orchestra a cup of tea, but if a singer wants a honey and lemon, that’s no problem. I’ll always make someone a cup of tea.
I’m a big fan of Soundtoys Little Plate, because you load it up and it sounds good and there’s only one knob to adjust
Jason: What’s your philosophy for using reverb in a mix? How do you configure reverbs in a session?
MJ: The way you use reverb will differ a reasonable amount depending on the style of music that you’re mixing. It also depends on the method of delivery – how you’re going to pass the mix on to the next person in the chain. Saying that, even if you’re not delivering stems, I often find it easier to work as though you are. So I’ll typically have all my tracks laid out in chunks within my session, so I have all my drum tracks at the top and then I have the auxiliaries below that chunk of tracks, where things like reverbs and delays that apply to those tracks will live.
Below that, you have the bass and any auxes for that, then the guitars, then the strings, then keys, and so on.
It’s not one reverb for every instrument, but one for each group of instruments that I would call a stem. For example, all the guitars will probably share a reverb. All the keys will probably share a reverb.
I’ll have a reverb or reverbs for each of those stems. They might not differ that much in their settings between each one. The keyboard might share the same settings as the guitar reverbs, but they will have their own instance of the plug-in and their own aux to go through for it. That way, once you send everything down to your sub-mixes or your stems at the bottom of the session, you know that everything’s clean. So, if you do get asked for a drum stem, you’re not going to print it and then suddenly get piano reverb all over your drums or things like that. I just find that a more flexible way to work.
As far as the reverbs themselves, I’ll often put an EQ before the reverb. Typically just high- and low-pass to shape the input to the reverb in ways that can be helpful. I find it useful to tame excessive presence or sibilance in a vocal-reverb send, for example. That’s because it can trigger reverbs in annoying ways, so if you can get rid of that before it hits the reverb, it can help the reverb sit behind the source material that’s feeding it a bit nicer, making it a bit easier to blend it in.
This will be after any plug-ins on the vocal. You might want the vocal sound to be quite bright and upfront, but you don’t necessarily want that brightness setting off the reverb too much. So sometimes it’s nice to just low-pass some of that. It’s also worth high-passing a drum-reverb send to get rid of any of that woofy build-up that can be annoying.
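The idea of shaping a send before it hits the reverb can be sketched with simple first-order filters. This is a toy illustration in Python/NumPy of the high- and low-pass staging Jones describes, not the behaviour of any particular plug-in, and the cutoff values are assumptions for demonstration:

```python
import numpy as np

def one_pole_lowpass(x, sr, cutoff_hz):
    """First-order low-pass: tames brightness/sibilance before the reverb."""
    a = np.exp(-2.0 * np.pi * cutoff_hz / sr)
    y = np.zeros_like(x)
    state = 0.0
    for i, s in enumerate(x):
        state = a * state + (1.0 - a) * s
        y[i] = state
    return y

def one_pole_highpass(x, sr, cutoff_hz):
    """First-order high-pass (complement of the low-pass):
    removes woofy low-end build-up from, say, a drum send."""
    return x - one_pole_lowpass(x, sr, cutoff_hz)

def shape_reverb_send(x, sr, hp_hz=150.0, lp_hz=6000.0):
    """High- and low-pass the send signal before it feeds the reverb.
    The 150 Hz / 6 kHz corner frequencies are illustrative, not prescribed."""
    return one_pole_lowpass(one_pole_highpass(x, sr, hp_hz), sr, lp_hz)
```

In a DAW this is simply an EQ inserted on the reverb aux ahead of the reverb plug-in; the sketch just makes the signal flow explicit.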
It’s often quite nice to have two reverbs going on, especially for drums and vocals. I’m a fan of having a fairly short plate reverb to give that initial illusion of air around a vocal and then a longer room or plate underneath that to provide that richness.
If the music allows, it can be fun to play around and be a bit more creative with reverb. Like having a sweeping, long reverb into a ping-pong-type delay for some movement, or by distorting the send to the reverb, or trying to find an extreme reverb where the decay works with the tempo of the song. Of course, it’s all too easy to get carried away once you start doing things like that and you’ll find that a lot of music doesn’t really have the space for that kind of fun. But, every now and then, you’ll get a track that you can be a bit more creative with, which I always enjoy.
I often find it’s cool to sidechain sounds from a source that doesn’t already exist in a song
MT: What are your go-to reverbs?
MJ: For the plate side of things, I’m a big fan of Soundtoys Little Plate, because you load it up and it sounds good and there’s only one knob to adjust. So, that’s very quick to get yourself up and running. But if you want a more detailed plate, then the Waves Abbey Road plates are excellent, and you can adjust those a bit more to get a more tailored sound. As far as rooms go, the Valhalla stuff is great. That’s most of what I use day to day.
MT: Do you ever use reverb on orchestral material?
MJ: Definitely. We’ve got a luxury at Abbey Road in that things that were recorded in Studio One already sound quite big and reverberant, if you’ve mic’d it that way. But it’s always nice to have a little bit more on hand. Then, obviously, if you’ve recorded an orchestra in a smaller space, you’ll need to rely on reverbs to give you that lushness and sense of space.
Nihal: What is a sidechain? And where do you find sidechaining most useful?
MJ: A sidechain is an effect that you have on one sound that is triggered by the level of another sound. The classic example is when you have a compressor on a bass track and you set it so it ducks the level of the bass whenever a kick drum hits. This is so the kick can poke through for that moment when they’re sharing a lot of the same frequency space. So, rather than EQing to remove some of those frequencies from the bass, it can be nice just to duck it a little bit. It’s also a great way to get a track pumping if you use it in a more extreme way.
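The classic kick-ducks-bass setup can be sketched as an envelope follower on the key input driving a gain stage on the bass. This is a minimal Python illustration of the principle, not any specific compressor, and the threshold, ratio, and time constants are assumed values:

```python
import numpy as np

def envelope(signal, sr, attack_ms=5.0, release_ms=80.0):
    """One-pole envelope follower on the sidechain (key) input."""
    atk = np.exp(-1.0 / (sr * attack_ms / 1000.0))
    rel = np.exp(-1.0 / (sr * release_ms / 1000.0))
    env = np.zeros_like(signal)
    level = 0.0
    for i, x in enumerate(np.abs(signal)):
        coeff = atk if x > level else rel
        level = coeff * level + (1.0 - coeff) * x
        env[i] = level
    return env

def sidechain_duck(bass, key, sr, threshold=0.2, ratio=4.0):
    """Duck `bass` whenever the `key` signal (e.g. a kick) exceeds threshold.
    Above threshold, the overshoot is reduced by `ratio`, as in a compressor."""
    env = envelope(key, sr)
    over = np.maximum(env - threshold, 0.0)
    gain = (threshold + over / ratio + 1e-12) / (env + 1e-12)
    gain = np.minimum(gain, 1.0)  # never boost, only duck
    return bass * gain
```

Pushing the ratio up and the release time out is what produces the extreme "pumping" effect Jones mentions.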
Having synths or pads – elements of a track with more sustain – being quite heavily compressed, but dependent on other elements in a track, is an excellent way to add movement and that swelling effect that’s sometimes appropriate.
I often find it’s cool to sidechain sounds from a source that doesn’t already exist in a song. I was mixing a track recently that I recorded for BBC Introducing with Olivia Nelson where all the low end comes from this Moog synth – and it’s a really hefty mono sound and it works perfectly in the track, but it needed some kind of interest and movement. So I tried sidechaining it from the kick, because it was easy, but it didn’t really give us the shape that we wanted.
So we set up a new audio track – I think I just took one instance of a kick hit – and copied and pasted it around, with clip gain up and down for accents. Ultimately, we reversed it and had it slightly off the beat. You end up with this loop on an audio track that would sound utterly ridiculous if you ever actually listened to it in the track. But if you send that to the key input of your bass compressor, it’s really straightforward to adjust the swells and the movement in the bass. Plus, it’s a bit more interesting than just ducking a bit every time the kick drum hits.
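The ghost-trigger idea – one kick hit copied around a silent loop with clip-gain accents, reversed and nudged off the beat – could be sketched like this. It is a hypothetical helper (all names and parameters are my own, not Jones’s session), producing an array that would feed a compressor’s key input rather than ever being heard:

```python
import numpy as np

def ghost_trigger(kick_hit, pattern, step_len, offset=0, reverse=False):
    """Build a silent-to-the-listener key track: place one kick hit at each
    non-zero step of `pattern`, scaled by that step's gain (the accents),
    optionally reversed and shifted `offset` samples off the grid."""
    hit = kick_hit[::-1] if reverse else kick_hit
    loop = np.zeros(step_len * len(pattern))
    for step, gain in enumerate(pattern):
        if gain > 0.0:
            start = step * step_len + offset
            end = min(start + len(hit), len(loop))
            loop[start:end] += gain * hit[: end - start]
    return loop
```

In a DAW, the equivalent is an audio track routed only to the compressor’s key input, so the loop shapes the ducking without appearing in the mix.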
MT: Any other sidechain trickery that’s been a success?
MJ: I think sidechain compression on reverbs is often quite fun. If you have a really long reverb that produces a pad-type sound, having those pumping a little bit can be fun sometimes. It creates an interesting texture.