You’ve just finished a mix. It sounds great. You export it, take a break, and come back an hour later to listen with fresh ears. Now you’re not so sure. Is this version better than the one from yesterday? Should you have kept the brighter EQ on the vocals? Was the bass punchier before you added that compression?
Without a systematic way to compare, you’re stuck guessing. You flip between versions, but one moment you prefer version A, the next moment version B sounds better. Your ears are tired, your perspective is shot, and you have no idea which mix to deliver.
TL;DR
- Level matching is critical: Louder mixes always sound "better"—match volumes before comparing
- Use reference tracks: Compare your mix to professionally mastered songs in your genre
- Blind testing removes bias: Hide which version is which to make objective choices
- Focus on key elements: Compare vocal clarity, low-end weight, drum punch, stereo width
- Use dedicated tools: Plugins like Metric AB, Reference, and MCompare automate level matching and switching
- Test on multiple systems: Studio monitors, headphones, car, phone—mixes should translate everywhere
- Take breaks: Ear fatigue kills objectivity—compare in short sessions with rest periods
- Version control matters: Save and label every mix iteration for easy A/B comparison later
Here’s the thing: professional mixers don’t rely on memory or vague impressions. They use A/B testing—systematic comparison between two versions of a mix, or between their mix and reference tracks—to make objective decisions. This removes guesswork, reveals problems you didn’t know existed, and ensures your mix translates across different playback systems.
In this guide, I’ll break down the tools, workflows, and best practices that pros use to A/B test audio mixes. Whether you’re comparing your rough mix to a polished reference track or choosing between three different vocal treatments, you’ll learn how to make confident, data-informed decisions.
What is A/B Testing in Audio Mixing?
A/B testing in music production means comparing two (or more) versions of audio by rapidly switching between them to identify differences. The goal is to hear subtle changes that you’d miss if you listened to versions separately.
Common A/B testing scenarios:
- Your mix vs. reference track: Compare your work-in-progress to a professionally mastered song to check tonal balance, loudness, and overall vibe
- Mix version comparison: Test different mix decisions—version A with brighter vocals vs. version B with darker, more intimate vocals
- Before/after plugin testing: Bypass and re-engage an EQ or compressor to verify it’s actually improving the sound
- Stem comparison: Compare the full stereo mix to individual stems (drums, bass, vocals) to check the internal balance
- Format comparison: Test lossy (MP3, AAC) vs. lossless (WAV, FLAC) exports
The human ear adapts quickly. After 30 seconds of listening to version A, that becomes your new "normal." When you switch to version B, even small differences jump out. This is why A/B testing is more revealing than listening to versions separately with hours or days in between.
Why A/B Testing Matters (And Why Your Ears Lie)
Your ears and brain play tricks on you. Understanding these biases is the first step to objective mixing.
The Loudness Bias Problem
Louder always sounds better. This isn’t subjective—it’s how human hearing works. When you compare two mixes and one is even 0.5 dB louder, your brain perceives the louder version as clearer, punchier, more exciting, and more professional.
This is the single biggest trap in audio comparison. You make a change (boost the highs, add compression, whatever), and it sounds better—but it’s just louder, not actually better. Without level matching, every comparison is corrupted by loudness bias.
Real-world example: You add an EQ to brighten your vocal. It sounds immediately better. But the EQ plugin also increased the output by 2 dB. You A/B the plugin on/off, and the "off" version sounds dull and lifeless. You conclude the EQ is essential. The truth? The brightness wasn’t the improvement—the volume was.
Professional A/B testing demands level-matched comparison. More on this below.
Ear Fatigue and Adaptation
Your ears adapt to whatever you’re listening to. After 60-90 minutes of mixing, you lose the ability to judge balance, tonal quality, or dynamics accurately. What sounded too bright an hour ago now sounds normal. What felt punchy earlier now feels weak.
A/B testing gives you a reality check. By comparing to a reference track or an earlier version, you hear how far your perspective has drifted.
Confirmation Bias
You spent an hour dialing in that compressor. Of course it sounds better—you invested time and effort. Your brain wants to believe the work was worth it. This is confirmation bias, and it kills objectivity.
Blind A/B testing removes this bias. If you don’t know which version has the compressor and which doesn’t, you make decisions based purely on what sounds better, not on what you think should sound better.
Essential Concept: Level Matching
Before you compare anything, match the perceived loudness of both versions. This is non-negotiable for accurate A/B testing.
How to Level Match Manually
Step 1: Measure peak and RMS levels: Use your DAW’s metering or a plugin like Youlean Loudness Meter to measure both versions. Note the LUFS (Loudness Units relative to Full Scale) or RMS (Root Mean Square) values.
Step 2: Adjust to match: Lower the louder version’s output gain until both measure the same LUFS/RMS. Don’t boost the quieter one—always reduce the louder one to avoid clipping.
Step 3: Use quick switching: Set up your DAW so you can flip between versions instantly (hotkeys, routing, whatever works). The faster the switch, the easier it is to hear differences.
Step 4: Verify your signal path: Play a 1 kHz sine wave through both signal paths. If it measures identically on both, any remaining level difference comes from the mixes themselves, not from your routing.
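The arithmetic behind these steps can be sketched in a few lines of Python. This is a toy illustration using plain RMS on raw sample lists; real meters use K-weighted LUFS per ITU-R BS.1770, and the function names here are invented for the example, not taken from any plugin.

```python
import math

def rms_dbfs(samples):
    """RMS level in dBFS for float samples in the -1.0..1.0 range."""
    mean_square = sum(s * s for s in samples) / len(samples)
    return 10 * math.log10(mean_square)

def match_levels(version_a, version_b):
    """Turn down the louder version so both measure the same RMS.
    Never boosts the quieter one, so there is no risk of clipping."""
    diff_db = rms_dbfs(version_a) - rms_dbfs(version_b)  # > 0 means A is louder
    gain = 10 ** (-abs(diff_db) / 20)                    # linear gain reduction
    if diff_db > 0:
        version_a = [s * gain for s in version_a]
    else:
        version_b = [s * gain for s in version_b]
    return version_a, version_b

# Toy example: version B is about 6 dB hotter than version A
sr = 44100
version_a = [0.25 * math.sin(2 * math.pi * 440 * n / sr) for n in range(sr)]
version_b = [0.50 * math.sin(2 * math.pi * 440 * n / sr) for n in range(sr)]
a, b = match_levels(version_a, version_b)

# After matching, the remaining level difference is ~0 dB
print(round(abs(rms_dbfs(a) - rms_dbfs(b)), 2))  # -> 0.0
```

Note the sign convention: because dB differences for signal levels follow 20 × log10(ratio), reducing a mix by X dB means multiplying its samples by 10^(−X/20).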
This manual process works but slows down workflow. That’s why dedicated A/B plugins exist.
Plugins That Auto-Level Match
Modern reference plugins handle level matching automatically. Here’s how they work:
You load your reference track into the plugin. The plugin analyzes its loudness, then applies gain reduction to match your current mix’s level (or vice versa). Now when you switch between your mix and the reference, both are at identical perceived loudness. Any difference you hear is tonal balance, dynamics, or spatial characteristics—not volume.
Popular tools with auto-level matching:
- Metric AB (ADPTR Audio) - Analyzes up to 16 reference tracks, auto-matches loudness, shows spectrum comparison
- Reference (Mastering The Mix) - Auto-level matching, visual frequency comparison, A/B switching
- MCompare (MeldaProduction) - Crossfades between versions, auto-gain matching, supports multiple tracks
- AB Assist (NUGEN Audio) - Quick A/B with automatic gain compensation for comparing mix revisions
These tools save hours of manual level matching and make A/B testing part of your regular workflow instead of a chore.
The A/B Testing Workflow: Step-by-Step
Let’s walk through a practical A/B testing session from start to finish.
Step 1: Prepare Your Versions
For mix-to-reference comparison: Export your current mix as a WAV or AIFF file. Choose 2-3 reference tracks in your genre that represent the sound you’re aiming for.
For mix version comparison: Export both versions (before/after, option A/option B) as separate files. Name them clearly: "SongName_Mix_BrightVocal.wav" and "SongName_Mix_DarkVocal.wav."
For plugin testing: Set up your DAW so you can bypass the plugin with a hotkey. Make sure bypassing doesn’t change the gain (use the plugin’s output gain control to match levels).
Step 2: Set Up Your A/B Tool
Using a plugin: Insert your A/B reference plugin on your master channel (or a separate bus). Load your reference track(s) into the plugin. Let it analyze and auto-match levels.
Using DAW routing: Create a separate track for your reference. Route both your mix and the reference through a summing bus. Use mute buttons to switch between them. Add a gain plugin on the reference track to manually match levels.
Using Feedtracks or similar platforms: Upload both versions to the same project. Most audio collaboration tools let you play versions back-to-back or overlapped for instant A/B comparison without leaving the browser.
Step 3: Focus on Key Elements
Don’t try to hear everything at once. Compare one element at a time.
Vocal clarity: Is your lead vocal as clear and present as the reference? Does it sit forward without being harsh?
Low-end weight: Do your kick and bass have the same power and definition as the reference? Is the low end tight or boomy?
Drum punch: Do your drums hit with the same impact? Are the transients (the initial "smack" of each hit) as sharp?
Stereo width: Does your mix feel as wide? Are elements placed similarly in the stereo field?
Frequency balance: Use a spectrum analyzer to compare overall tonal balance. Is your mix darker, brighter, or mid-heavy compared to the reference?
Dynamic range: Is your mix as dynamic, or is it more compressed? Does it breathe, or does it feel squashed?
Step 4: Take Notes
Write down what you hear. Vague mental notes like "the reference sounds better" won’t help you mix.
Specific observations sound like this:
- "Reference vocal has more presence around 3-5 kHz"
- "My low end is muddier—too much energy in 100-200 Hz range"
- "Reference drums have sharper transients, mine sound softer"
- "My mix is narrower—guitars aren’t spread as wide"
These specific notes translate into actionable mix changes.
Step 5: Implement Changes and Retest
Based on your notes, adjust your mix. Boost vocal presence, cut low-mid mud, tighten compression on drums, widen guitars. Export a new version and repeat the A/B process.
This iterative loop—compare, note differences, adjust, retest—is how professional mixes get refined.
Best Practices for Accurate A/B Testing
Follow these guidelines to get the most out of your comparisons.
Match Levels Within 0.5 dB
Even 1 dB of difference introduces loudness bias. Use plugins with auto-gain matching, or manually adjust until levels are identical.
How to verify: Use a loudness meter plugin (Youlean, iZotope Insight, stock DAW meters) set to LUFS or RMS mode. Both versions should read within ±0.5 dB.
Switch Quickly (Under 2 Seconds)
The longer the gap between versions, the harder it is to hear differences. Set up instant switching via hotkeys, plugin controls, or mute buttons.
Your brain’s auditory memory is incredibly short, on the order of a few seconds. If it takes 10 seconds to switch from A to B, you’ve already forgotten the details of A.
Use Short Comparison Loops
Don’t compare entire songs. Loop a 10-15 second section that highlights the difference you’re testing.
Example: If you’re comparing vocal treatments, loop the first chorus where the vocal is most exposed. Listen to that section on repeat, switching between versions every 2-3 loops.
This focused approach reveals subtle differences that get lost when comparing full tracks.
Take Regular Breaks
After 15-20 minutes of A/B testing, your ears fatigue and your judgment degrades. Take a 10-15 minute break. Walk away, drink water, reset your ears.
When you come back, the differences will be obvious again.
Test on Multiple Systems
A/B testing on studio monitors is step one. But also test on:
- Headphones (closed-back and open-back if you have both)
- Car stereo (where many people listen to music)
- Phone speaker (the worst-case scenario—if it works here, it works everywhere)
- Laptop speakers (common consumer playback)
- Bluetooth speaker (another common consumer scenario)
If your mix holds up across all these systems, it translates well.
Use Mono Playback for Balance Checks
Sum your mix to mono and A/B it against the reference in mono. This reveals balance issues that stereo width can hide.
If your vocal disappears in mono but the reference vocal stays clear, you have a balance problem, not a stereo problem.
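The mono fold-down itself is trivial to reason about. This short Python sketch, using toy signals and invented names, shows why an out-of-phase element can vanish completely when a stereo mix is summed to mono:

```python
import math

def mono_sum(left, right):
    """Sum a stereo pair to mono the way a mono playback chain does: (L + R) / 2."""
    return [(l + r) / 2 for l, r in zip(left, right)]

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

sr = 44100
tone = [math.sin(2 * math.pi * 220 * n / sr) for n in range(sr)]

# In-phase element (panned center): survives the mono fold at full level
in_phase = mono_sum(tone, tone)

# Out-of-phase element (e.g. from a "widener" that inverts one channel):
# cancels entirely when the channels are summed
out_of_phase = mono_sum(tone, [-s for s in tone])

print(round(rms(in_phase), 3))      # -> 0.707 (full level)
print(round(rms(out_of_phase), 3))  # -> 0.0 (this element vanishes in mono)
```

Real mixes sit between these extremes: partial phase cancellation thins an element out rather than deleting it, which is exactly the kind of loss a mono A/B check reveals.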
Document Your Findings
Keep a mixing journal or Google Doc where you log A/B test results. Over time, you’ll spot patterns: "I always boost vocals too much," or "My low end is consistently muddy compared to references."
These patterns inform your mixing instincts and help you avoid repeated mistakes.
Blind Testing: Removing Bias from Decisions
Blind testing means you don’t know which version is playing. This removes psychological bias and forces you to judge purely by sound.
How to Set Up a Blind Test
Using a plugin: Some A/B plugins (like Metric AB and MCompare) include a "blind" or "shuffle" mode. Load your versions, enable blind mode, and the plugin randomizes which version plays when you switch. You won’t know if you’re hearing A or B until you reveal the answer.
Manual method: Ask a friend or collaborator to set up two versions in your DAW without telling you which is which. Rename them "Version 1" and "Version 2." Listen, choose your favorite, then reveal which is which.
Export and randomize: Export both versions with randomized filenames ("Track_X.wav" and "Track_Y.wav"). Listen in a fresh DAW session where you don’t know which is which. Pick your favorite, then check your export folder to see which file corresponds to which mix decision.
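The export-and-randomize method can be scripted so you never see the real names until you choose. Here is a hypothetical sketch; the "Take" labels and filenames are invented for illustration:

```python
import random

def blind_labels(versions, seed=None):
    """Assign neutral labels ("Take 1", "Take 2", ...) to mix versions in
    random order. The returned label -> filename key is the answer sheet,
    to be revealed only after you have picked a favorite."""
    rng = random.Random(seed)
    shuffled = list(versions)
    rng.shuffle(shuffled)
    return {f"Take {i + 1}": name for i, name in enumerate(shuffled)}

key = blind_labels(["SongName_Mix_BrightVocal.wav", "SongName_Mix_DarkVocal.wav"])
for label in key:
    print(label)  # listen as "Take 1" / "Take 2" with no clue which is which

# ...after choosing a favorite, reveal the mapping:
print(key)
```

In practice you would copy the audio files to the neutral names and write the key to a file you only open after deciding, but the shuffle above is the entire trick.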
Why Blind Testing Matters
You spent three hours adding compression to your drums. When you A/B with and without it, of course the compressed version sounds better—you invested time, and your brain wants that investment to pay off.
But in a blind test, you might discover the uncompressed version actually sounds more natural and punchy. The compression was solving a problem that didn’t exist.
Blind testing is humbling. It exposes when your assumptions are wrong and when "improvements" are actually making things worse.
When to Use Blind Testing
Choosing between mix versions: You have three different vocal EQ treatments. Blind test to pick the best one objectively.
Testing expensive plugins: Does that $200 compressor really sound better than the stock plugin? Blind test them.
Validating mastering changes: Is the mastered version actually better, or just louder? Level-match and blind test.
Settling creative disagreements: Bandmate says the guitar is too loud, you disagree. Blind test multiple levels and let the team vote without knowing which is which.
Tools for A/B Testing Audio Mixes
The right tools make A/B testing seamless. Here’s what professionals use.
DAW-Based A/B Testing
Logic Pro: Use the Bounce-in-Place feature to create alternate versions on separate tracks. Mute/unmute to switch. Add a Gain plugin to manually level-match.
Pro Tools: Duplicate your master track, apply different processing to each, and use the mute/solo buttons to switch. You can also Command-click (Ctrl-click on Windows) an insert to bypass it for an instant before/after comparison.
Ableton Live: Create multiple return tracks or freeze alternate versions. Use Ableton’s built-in Utility plugin for gain adjustments to level-match.
FL Studio: Use the Mixer’s routing to create separate output buses for different versions. Switch between them using mute controls.
All modern DAWs support basic A/B testing, but dedicated plugins make it faster and more accurate.
A/B Reference Plugins
Metric AB (ADPTR Audio) - $149
The industry standard for professional mixing and mastering. Load up to 16 reference tracks, and Metric AB auto-matches loudness, displays spectrum analysis, and lets you switch between your mix and references instantly.
Why it’s great: Visual frequency comparison shows exactly where your mix differs from the reference. The stereo width analyzer reveals if your mix is too narrow or too wide. Level matching happens automatically in seconds.
Best for: Mixing and mastering engineers who need professional-grade analysis and comparison tools.
Reference (Mastering The Mix) - $99
Designed for speed and ease of use. Load a reference track, and Reference auto-matches loudness, displays EQ curves, and gives you visual feedback on how to adjust your mix.
Why it’s great: The visual EQ overlay shows "boost here, cut there" suggestions based on the reference. It’s faster to set up than Metric AB and more beginner-friendly.
Best for: Producers who want quick visual feedback on tonal balance without deep analysis.
MCompare (Melda Productions) - Free
A free A/B comparison plugin that crossfades between up to 8 tracks. Auto-gain matching, spectrum analysis, and simple controls.
Why it’s great: It’s free. For budget-conscious producers, this is a no-brainer starting point.
Best for: Beginners and home producers who need basic A/B functionality without spending money.
AB Assist (NUGEN Audio) - $149
Focused on comparing mix revisions rather than reference tracks. Quickly switch between two versions of your mix with automatic gain compensation.
Why it’s great: Designed specifically for the "before/after" workflow. Load two exports, and AB Assist handles level matching and instant switching.
Best for: Mix engineers juggling multiple revisions for clients who want to compare versions efficiently.
Cloud-Based A/B Testing (Feedtracks, Notetracks)
Cloud collaboration platforms let you upload multiple mix versions and A/B them directly in the browser—no plugins required.
How it works: Upload "Mix_v1.wav" and "Mix_v2.wav" to the same project. The platform displays both versions on a timeline, and you can switch between them or play them back-to-back. Some platforms (like Feedtracks) include waveform overlay comparison, so you can see visual differences between versions.
Why this matters: When working with clients or collaborators remotely, you can share a link and let them A/B versions without needing DAW access. They listen, choose their favorite, leave timestamped comments, and you get clear feedback.
Best for: Remote collaboration where multiple stakeholders need to compare and provide feedback on different mix versions.
Free A/B Testing Options
DAW stock tools: Use your DAW’s mute/solo, bypass, and gain controls for manual A/B testing. It’s slower than dedicated plugins, but it’s free and always available.
Audacity (free, open-source): Load two versions on separate tracks, level-match manually using the Amplify effect, and use solo/mute to switch.
MCompare (free Melda plugin): Mentioned above—fully functional A/B plugin at no cost.
- Lacinato ABX (free, cross-platform): Software for blind ABX testing. Load two files and the software plays A, B, and a hidden X; over repeated trials you judge whether X is A or B, and your score shows whether you can reliably hear a difference at all.
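Under the hood, the ABX protocol is simple. This sketch shows the core loop; the guessing function stands in for a human listener and is purely illustrative, not part of any real tool:

```python
import random

def run_abx_trials(n_trials, listener_guess, seed=None):
    """Each trial, X is secretly assigned to A or B. The listener says which
    one X is, and we count correct identifications. A score far above chance
    (about n_trials / 2) suggests a genuinely audible difference."""
    rng = random.Random(seed)
    correct = 0
    for trial in range(n_trials):
        x = rng.choice(["A", "B"])      # hidden assignment for this trial
        if listener_guess(trial) == x:
            correct += 1
    return correct

# A "listener" who cannot hear any difference is effectively guessing,
# so their score hovers around 50% no matter how many trials you run.
score = run_abx_trials(1000, lambda trial: random.choice(["A", "B"]))
print(f"{score}/1000 correct")
```

This is why ABX results are convincing: a listener who scores 9 or 10 out of 10 is almost certainly hearing a real difference, while a score near 5 out of 10 means the "improvement" may be imaginary.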
Common A/B Testing Scenarios
Comparing Your Mix to a Reference Track: Match the tonal balance, loudness, and overall vibe of a professionally mastered song in your genre. Choose 2-3 reference tracks, load them into an A/B plugin (Metric AB, Reference, MCompare), let the plugin auto-match loudness, loop the chorus, and switch back and forth noting differences in vocal level, low-end weight, drum punch, and stereo width.
Choosing Between Mix Versions: Decide which of two (or more) mix decisions sounds best. Export each version as a separate file, load all into an A/B plugin, level-match, loop a key section (usually the chorus), and choose the winner based on which feels most natural and emotionally impactful.
Before/After Plugin Testing: Verify that a plugin is actually improving the sound, not just making it louder. Insert the plugin, make adjustments, use the plugin’s output gain control to match the bypassed level, set up a hotkey to bypass, loop a section, and toggle bypass on/off every few seconds. If you can’t hear a clear improvement when levels are matched, remove the plugin.
Mix Revisions for Clients: Compare your latest revision to the previous version and verify that client feedback has been addressed. Export the new revision, load both versions into an A/B tool, level-match, loop the sections where changes were made, and A/B the before and after to confirm the change is noticeable and correct.
Testing Lossy Export Formats: Verify that your MP3 or AAC export sounds acceptable compared to the lossless WAV/FLAC master. Export a WAV and an MP3 (320 kbps), load both into an A/B plugin, level-match, and A/B across different sections listening for artifacts like loss of high-end air, smearing of transients, or phase issues.
What to Listen For During A/B Testing
Knowing what to focus on makes A/B testing more effective. Here’s a checklist of key elements.
Tonal Balance (Frequency Response)
Low End (20-200 Hz): Is the bass and kick drum weight similar? Is it tight and defined, or boomy and muddy?
Low Mids (200-500 Hz): Does the warmth and body match? Too much here sounds boxy, too little sounds thin.
Mids (500 Hz - 2 kHz): Where vocals, guitars, and most melodic elements live. Does the clarity match?
High Mids (2-5 kHz): Presence and intelligibility. Does the vocal cut through? Are consonants clear?
Highs (5-20 kHz): Air, sparkle, and space. Does your mix have the same shimmer as the reference, or is it darker/brighter?
Use a spectrum analyzer plugin to visualize frequency balance during A/B testing. If your mix has a 5 dB bump at 3 kHz and the reference doesn’t, that’s a clue.
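As a toy illustration of that "bump at 3 kHz" idea, you can measure a single frequency component directly with a one-bin DFT. This is illustrative code, not how analyzer plugins are actually implemented, and the signals are invented for the example:

```python
import cmath
import math

def bin_level_db(samples, freq, sr):
    """Level (dB) of one frequency component via a single-bin DFT --
    a toy stand-in for one bar of a spectrum analyzer."""
    n = len(samples)
    acc = sum(s * cmath.exp(-2j * math.pi * freq * i / sr)
              for i, s in enumerate(samples))
    return 20 * math.log10(2 * abs(acc) / n)

sr = 44100
# "Reference" and "my mix": the same 3 kHz content, but the mix has it 6 dB hotter
reference = [0.25 * math.sin(2 * math.pi * 3000 * n / sr) for n in range(sr)]
my_mix    = [0.50 * math.sin(2 * math.pi * 3000 * n / sr) for n in range(sr)]

diff = bin_level_db(my_mix, 3000, sr) - bin_level_db(reference, 3000, sr)
print(round(diff, 1))  # -> 6.0: the mix has ~6 dB more energy at 3 kHz
```

A real analyzer computes this across thousands of bins at once via the FFT, but the takeaway is the same: a consistent positive difference in a band is a concrete, addressable clue, not a vague impression.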
Dynamics and Compression
Punch vs. sustain: Does your mix hit as hard, or is it more compressed and squashed?
Breathing room: Does the mix feel dynamic and alive, or flat and lifeless?
Transients: Are drum hits, guitar plucks, and vocal consonants as sharp and defined?
Over-compression makes mixes sound loud and dense but removes dynamics and energy. Under-compression makes mixes feel uncontrolled and inconsistent.
Stereo Width and Imaging
Width: Is your mix as wide as the reference, or narrower? Are guitars, synths, and backing vocals spread similarly?
Mono compatibility: Sum both mixes to mono and A/B. Does your mix hold up, or do elements disappear due to phase issues?
Center focus: Are the key elements (vocal, kick, bass, snare) anchored in the center with the same weight as the reference?
Use a stereo imaging plugin or goniometer to visualize width during A/B testing.
Loudness and Perceived Energy
Peak loudness: Measured in dBFS—how close to 0 dB are the loudest moments?
Average loudness: Measured in LUFS—what’s the overall perceived volume?
Subjective energy: Does your mix feel as exciting, impactful, and engaging?
Remember: loudness bias means you must level-match before comparing. Otherwise, you’re just comparing volume, not quality.
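The peak vs. average distinction is easy to demonstrate numerically: a heavily squashed waveform can share the exact same peak as a dynamic one while carrying far more average energy. A toy Python sketch, with invented helper names:

```python
import math

def peak_dbfs(samples):
    """Highest instantaneous sample level, in dBFS."""
    return 20 * math.log10(max(abs(s) for s in samples))

def rms_dbfs(samples):
    """Average (RMS) level, in dBFS -- a rough proxy for perceived loudness."""
    return 10 * math.log10(sum(s * s for s in samples) / len(samples))

sr = 44100
sine = [math.sin(2 * math.pi * 100 * n / sr) for n in range(sr)]
# Hard-clipping the sine into a square wave: identical peak, no dynamics
square = [1.0 if s >= 0 else -1.0 for s in sine]

# Both peak at 0 dBFS, but the squashed wave averages ~3 dB louder --
# exactly why peak meters alone can't tell you which mix will *feel* louder
print(round(peak_dbfs(square) - peak_dbfs(sine), 1))  # -> 0.0
print(round(rms_dbfs(square) - rms_dbfs(sine), 1))    # -> 3.0
```

This is the same mechanism behind over-limited masters: the peak reading stays pinned while the average creeps up, and perceived energy rises even though nothing musical changed.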
Clarity and Separation
Vocal intelligibility: Can you understand every word as clearly as in the reference?
Instrument separation: Can you hear individual elements distinctly, or does everything blur together?
Frequency masking: Are elements fighting for the same space (e.g., guitar and vocal both occupying 2-4 kHz)?
A cluttered mix with poor separation sounds amateurish. A clear, well-separated mix sounds professional.
How Feedtracks Supports Mix Comparison
While plugins handle in-DAW A/B testing, cloud platforms like Feedtracks streamline version comparison and client collaboration.
Upload multiple versions to one project: Instead of emailing "Mix_v1.wav" and "Mix_v2.wav" as separate attachments, upload both to a Feedtracks project. Clients and collaborators can A/B them directly in the browser.
Waveform overlay comparison: See visual differences between versions. If you boosted the drums in v2, the waveform peaks will be visibly higher. This helps collaborators understand what changed without needing technical explanations.
Timestamped feedback on specific versions: Clients can leave comments like "At 1:45 in v2, vocal is perfect now" or "v1 had better low end at 0:32." This version-specific feedback prevents confusion about which iteration you’re discussing.
Permanent version history: Unlike Dropbox’s 30-day history or Google Drive’s limited versioning, Feedtracks keeps every version permanently. Six months later, you can still pull up "Mix_R01" to compare against the final master.
Collaborative decision-making: When multiple stakeholders (artist, producer, label rep) need to weigh in on which mix version to use, they can all listen, A/B, and leave feedback in one place instead of separate email threads.
This workflow complements in-DAW A/B testing: use plugins for detailed technical comparisons while mixing, use Feedtracks for client-facing version comparisons and feedback collection.
Common A/B Testing Mistakes
Not Level Matching: The #1 mistake. If you don’t match levels, you’re comparing volume, not quality. Always use auto-gain matching plugins or manually adjust until levels are within 0.5 dB.
Comparing Too Many Versions at Once: A/B testing works. A/B/C/D/E/F testing creates confusion. Limit comparisons to 2-3 versions max. If you have more options, do multiple rounds: A vs. B, then winner vs. C, etc.
Testing While Fatigued: Your ears lie when they’re tired. If you’ve been mixing for 90 minutes straight, take a break before A/B testing. Fresh ears hear differences clearly; fatigued ears hear mush.
Focusing on the Wrong Sections: Don’t A/B the intro when the issue is in the chorus. Loop the section where the difference matters most.
Ignoring Mono Compatibility: A mix that sounds wide and lush in stereo but falls apart in mono has phase problems. Always A/B in mono to check.
Over-Relying on Visual Analysis: Spectrum analyzers and meters are useful, but don’t mix with your eyes. If the analyzer says the mixes match but your ears hear a difference, trust your ears.
Not Documenting Results: You A/B test, make a decision, move on—then forget why you chose version B over version A. Write it down. Next time you face a similar choice, you’ll have a reference.
Conclusion: Make Better Mixing Decisions with A/B Testing
A/B testing removes guesswork from mixing. Instead of wondering if your vocal is too loud or your low end is too muddy, you compare your mix to professional references and hear exactly where the gaps are. Instead of hoping version B is better than version A, you level-match, blind test, and choose objectively.
The workflow is simple: level-match, switch quickly, focus on specific elements, take breaks, and document your findings. The tools—from free options like MCompare to professional plugins like Metric AB—automate the tedious parts and let you focus on listening.
Start small. Choose one reference track in your genre, load it into a free A/B plugin, and compare your next mix. Loop the chorus. Listen to the vocal. Listen to the low end. Listen to the stereo width. Note the differences. Adjust your mix to close the gap. Re-export and retest.
This iterative process—compare, adjust, retest—is how professional mixes get refined. It’s not glamorous, but it works. Over time, you’ll internalize the tonal balance, dynamics, and spatial characteristics of great mixes. Your ears will improve. Your instincts will sharpen. And your mixes will sound closer to the references you admire.
A/B testing isn’t about copying references exactly—it’s about understanding what makes them work and applying those principles to your own creative vision. Use it as a learning tool, a quality check, and a reality filter for when your ears drift.
Your next mix doesn’t have to be a guessing game. Test it. Compare it. Refine it. Trust the process.
Compare Mix Versions with Ease
Feedtracks makes version comparison simple—upload multiple mixes, A/B them in-browser, collect timestamped feedback, and keep every version forever. No plugins required.
Try Feedtracks Free →
Frequently Asked Questions
What’s the difference between A/B testing and referencing?
A/B testing is the general technique of comparing two versions of audio by rapidly switching between them. Referencing specifically means comparing your mix to a professional reference track. Referencing is one type of A/B testing.
Do I need expensive plugins for A/B testing?
No. You can A/B test using only your DAW’s built-in tools (mute buttons, gain plugins, bypass controls). Dedicated plugins like Metric AB and Reference make the process faster and more accurate with auto-level matching and visual analysis, but they’re not required to get started.
How do I know if my A/B comparison is accurate?
Level-match within 0.5 dB using a loudness meter, switch between versions in under 2 seconds, and test in short loops (10-15 seconds). If you follow these rules, your comparisons will be accurate.
Should I A/B test every mix decision?
For major decisions (vocal level, overall tonal balance, compression settings), yes. For tiny tweaks (0.5 dB adjustments on background elements), it’s overkill. A/B test the changes that matter most.
What’s the best reference track to use?
Choose a professionally mixed and mastered song in your genre that has a similar vibe and instrumentation to your track. Ideally, use 2-3 references to avoid copying one specific sound too closely.
Why does my mix sound worse after comparing to a reference?
References are professionally mixed and mastered, often by world-class engineers in expensive studios. Your mix won’t match them immediately—that’s normal. Use references to identify gaps and improve incrementally, not to feel discouraged.
Can I A/B test in headphones, or do I need studio monitors?
Both work. Studio monitors give you a more accurate representation of how your mix will sound on speakers, but headphones are fine for A/B testing. Ideally, test on both (and on car speakers, phone speakers, etc.) to ensure your mix translates everywhere.
Related Articles
- Audio Review Tools for Music Producers: 2025 Comparison
- Version Control for Audio Projects: Never Lose a Mix Again
- How to Give Effective Feedback on Music Mixes
- Mixing Workflow: Step-by-Step Process for Beginners
About the Author: The Feedtracks team helps audio professionals streamline their mixing and collaboration workflows with version comparison, timestamped feedback, and permanent storage.
Last Updated: March 2026