What’s the same between mixing and mastering?
Those definitions might make mixing and mastering sound similar—and they are.
Some of the tools and techniques used in both are related. Mastering is done using many of the same kinds of processing that you apply in your mix.
Mixing and mastering are both done by engineers to help turn raw tracks into a satisfying recording.
The differences may seem subtle, but they’re important when it comes to the role that each step plays in the lifecycle of a song.
What are the differences between mixing and mastering?
With the similarities out of the way, let’s dive into what sets mastering apart from mixing.
Here are five reasons mixing and mastering should be considered completely separate processes:
1. Mastering processes a stereo mix
When you’re mixing a song, you have access to all the recorded tracks individually.
That means you’re free to turn them up or down, change their pan positions, or even remove them from the mix entirely.
Mastering is only ever applied to a finished stereo mix.
That means the mastering engineer has no control over the contents of the mix. They can only affect the entire stereo track.
That’s one of the reasons mastering processors typically make much more transparent changes to the sound.
2. Mastering mostly makes subtle changes
During the mix, it’s not unusual to make significant changes to a sound for it to fit alongside your other elements.
Sometimes you might high-pass most of a sound’s low end to make room to mix your kick and bass.
Or maybe you apply intense compression for a stylish squashing effect.
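To give a rough sense of what high-pass filtering does, here is a minimal one-pole high-pass filter sketched in plain Python. It’s a simplified stand-in for the much steeper filters a DAW EQ actually uses, and the cutoff and sample rate are just example values:

```python
import math

def high_pass(samples, cutoff_hz, sample_rate):
    """One-pole high-pass filter: attenuates content below cutoff_hz."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)   # filter time constant
    dt = 1.0 / sample_rate
    alpha = rc / (rc + dt)                 # closer to 1.0 = lower cutoff
    out, prev_x, prev_y = [], 0.0, 0.0
    for x in samples:
        y = alpha * (prev_y + x - prev_x)  # passes changes, blocks steady offsets
        out.append(y)
        prev_x, prev_y = x, y
    return out

# A constant (0 Hz) signal is removed almost entirely...
dc = high_pass([1.0] * 2000, cutoff_hz=100, sample_rate=44100)
# ...while a rapidly alternating (high-frequency) signal passes through.
hf = high_pass([1.0, -1.0] * 1000, cutoff_hz=100, sample_rate=44100)
```

The same idea scales up: a mix engineer sweeps the cutoff until the rumble below the kick and bass region is gone, while everything above it passes through largely untouched.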
Mastering takes the complete opposite approach. Compression and EQ are still used, but mastering engineers almost never make broad, sweeping changes to the sound.
Instead, the first rule of mastering is “do no harm.” Any changes to the frequency balance or dynamics of the song should be transparent.
That means that at times you might not even notice a big difference in the basic tone of your track after mastering—but that’s a sign that you’ve done a good job mixing your song!
3. Mastering uses specialized tools
The plugins or hardware processors used in mastering have familiar categories like EQ and compression, but they’re pretty different from the kind you mix with in your DAW.
Mastering EQs and compressors are designed for ultra-high performance and transparency.
Think about it. If you’re affecting every single sound in your mix with an EQ, it better do its job as well as it possibly can!
Mastering hardware is some of the most expensive gear you can imagine. In some cases a single stereo channel can cost well over $10,000!
And that’s before you even factor in acoustic treatment and high end studio monitors.
Experienced engineers will tell you that the environment where you record and mix is just as important as the gear you use—if not more.
That goes double for mastering. Mastering studios spend enormous amounts of money to make sure their listening environment is as close to neutral as it can possibly be.
4. Mastering focuses on consistency and presentation
The main changes that happen in mastering make your song sound more consistent across different playback systems and help it fit in alongside other tracks in a commercial library.
Listeners hear music in more diverse listening environments than ever before.
Your song has to sound as good everywhere your fans hear it as it does in your home studio.
That means it needs to be optimized for playback on everything from a pair of earbuds to an enormous club sound system.
It’s harder than it seems to achieve that balance, but that’s where mastering comes in.
One of the most critical tasks in mastering is to create a compelling balance of frequencies where nothing stands out—no matter what playback device you’re listening on.
It takes a good ear and a lot of expertise to get it just right.
5. Mastering meets technical standards
All the music you consume on a platform like Spotify or Apple Music has to meet strict technical standards.
The most important of these are the targets for loudness and dynamic range.
Loudness seems simple: it’s just how loud something is, right? Unfortunately, when you’re dealing with audio signals, it’s not that straightforward.
The way we experience loudness depends as much on our perception and cognition as the intensity of sound waves in the air.
For example, sounds can seem louder or quieter depending on their frequency.
That’s why the audio industry has developed measurement standards such as LUFS (defined in ITU-R BS.1770) to ensure that mastering engineers are all aiming for the same target.
These standards can be hard to interpret and require specialized metering plugins to track effectively.
Mastering engineers use them to create the ideal balance between dynamic range, headroom and perceived loudness.
Keeping all that in mind while still processing the track in a musical way is a pretty big challenge!