MQA: A Review of controversies, concerns, and cautions

Editor’s Note 1: MQA Ltd. was sent a copy of this article several days prior to the scheduled publication date. The company requested a phone conversation, which took place earlier this week. MQA was encouraged to write a response for inclusion with the article below, but it respectfully declined to submit a formal response.

Editor’s Note 2: The author of this article is writing under a pseudonym. While he is unknown to the readers, his identity has been verified by Audiophile Style. He has no vested interest in the audio business, other than being a consumer of music.

Editor’s Note 3: The technical assertions made in this article have been thoroughly checked by independent engineers, both in and out of the audio industry. To the best of our knowledge everything technical in this article is factually correct and may be duplicated at any time by anyone with the requisite skills.

– Chris Connaker

MQA: A Review of controversies, concerns, and cautions.
February 25, 2018
Archimago, for Computer Audiophile

“Controversy is only dreaded by the advocates of error.”

– Benjamin Rush

I want to thank Chris for reaching out and giving me the opportunity to post an article on Computer Audiophile about MQA. As you may be aware, over the last 3 years, I’ve been posting various findings and impressions about MQA on my blog, Archimago’s Musings. Furthermore, I appreciate Chris’ willingness to allow me to post this under my pseudonym @Archimago. I know there are perceived issues with anonymous postings; I commented on this in the forum here if you want to read more about my rationale.

If you have read my writings on MQA, you will know that I have been a critic and have expressed on a number of occasions some problems I see with this data format and overall “system” of audio playback. However, I believe I have been reasonably diplomatic in expressing those concerns and criticisms. In my opinion, we can certainly examine the merits and failings of a data format without getting hysterical or personal. I hope you will find my tone in this article to be reasonable.

My intent for this article is to provide a relatively broad but detailed overview. When appropriate, I will include links in the body of this article and footnotes below for further reading. I will embed a few images for reference, realizing that charts and graphs can also be found elsewhere and perhaps in more detail. With the power of Internet search engines at our fingertips, numerous subjective opinions and results of objective tests are also readily found elsewhere. The core of what I’m interested in discussing in this essay is the simple question: “Why has controversy surrounded MQA to this extent?” By the end of this, I hope most readers will be essentially “caught up” on the discussions and debates surrounding MQA among audiophiles. As always, ultimately you decide whether you think MQA is worthwhile.

While the question above may be easy to ask, the answer is multifaceted and difficult to express thoroughly given the complexity and nuances of a system with multiple parts incorporating a number of ideas. Considering the volume of back-and-forth arguing found here and elsewhere, it appears that MQA has touched a nerve at the core of the audiophile hobby.

Looking at the extent and the expense audiophiles go to in order to achieve high quality playback, we can say that the audiophile pursuit is one of trying to achieve an ideal; we even see the phrase “perfectionist audio” employed to describe this hobby. “We” seek the highest fidelity in audio playback and typically approach it with great passion (1). The company must have realized from the start that it “threw its hat into the ring” to be debated and dissected when MQA was marketed to audiophiles through the mainstream audiophile press (2). Since then, at least every few months, MQA has featured prominently in the audiophile magazines with articles claiming significant audible benefits (3), and over the last few years it has been regularly (if not incessantly) mentioned in digital hardware reviews as a desirable new feature (4).

Although there are likely other factors involved, let us focus on three major areas of contention:

  1. MQA takes aim at a foundational level, positing itself as a viable and “desirable” format.
  2. MQA positions itself as sounding “better” than what we currently have.
  3. MQA over-reaches the role of a traditional data format and aims to be a “philosophy”, raising DRM concerns.

Let us explore each of these in some detail.

1. A foundation for a “desirable” format? Was there a need?
As we near the end of the second decade of the 21st Century, most of us are familiar with multiple media distribution formats, whether for the audio or video we consume. If we consider just the digital audio world: in the 1980s we were introduced to the CD; by the mid-1990s audiophiles had heard about HDCD, if not owned a decoding device; and by the turn of the century, SACD and DVD-A battled it out for high resolution dominance. While more SACD titles have been released, neither truly captured the public’s fancy. By the end of the 2000s, with the Blu-ray format, we have even seen “audio only” Blu-ray discs. Yet again, as with SACD and DVD-A, the physical high resolution formats have barely made a dent in the music marketplace.

While physical media for digital audio floundered over the last decade, the consumer with a general-purpose computer has learned to be “agnostic” about how the data itself is packaged. Since the early 2000s, “computer audio” has seen massive growth among the public and among audiophiles. CDs can be “ripped” easily and perfectly, commodity computer hardware can be assembled to build multi-terabyte media servers, generations of DACs have been manufactured to allow high-resolution playback, and ubiquitous network technology has allowed streaming through the home and across the Internet. Whether the music data is encoded as MP3, AAC, FLAC, WAV, AIFF, DSF, DFF, etc. matters little because, with the right software, any format can be freely decoded so long as the encoding is open and accessible.

With this wealth of audio media encoded in an open fashion, numerous software playback options exist, including several free or “open source” ones. One also has the freedom to choose from a multitude of hardware (not just computers and DACs, but cell phones, digital audio players, audio streamers, home-theater receivers). For those willing to invest some time, one could go even deeper and explore sophisticated fine-tuning of playback, with DSP techniques for example. The market has also provided entrepreneurs with opportunities to create turnkey server and playback systems catering to different needs and at various price points. Into this world of freedom and innovation comes MQA, aiming to disrupt the status quo as a new “format” to succeed over others; one that the company insists is capable of ideal universal utility, the one format that music labels can use to “guarantee delivery of the Studio sound”.

Let us take a step back and consider… Prior to the introduction of MQA, was there a collective desire among audiophiles for yet another data format to fulfill service gaps? Were there many audiophiles or music lovers requesting that their DACs have an “authentication” indicator? Did many people complain about major format incompatibilities (other than maybe iTunes with FLAC)? Were there complaints either among the consumers or in professional circles that high-resolution PCM and DSD sounded suboptimal? I think many would answer “no” to each of those questions. Many times, I have seen MQA described as being “a solution in search of a problem”.

The company Meridian initially targeted MQA as a data format for “high resolution” streaming (5) to capitalize on the growth of streaming services (were there even many consumers asking for high resolution audio streaming over the Internet?). The claim was that MQA would reduce the data rate to something more manageable than native high resolution PCM while at the same time delivering better-than-CD sound quality (6). Certainly, this is a worthwhile goal as an engineering exercise, but there are many ways of achieving it without creating a new proprietary data format. For example, we can already achieve bitrates similar to MQA with higher-than-CD quality using “free” and open file formats like FLAC. How about lossless compressed 18-bit 96kHz FLAC as described by Miska? In fact, at the same data bitrate as MQA, would most audiophiles streaming music not be satisfied with simply lossless compressed 24/48?
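As a rough sanity check on that bandwidth argument, consider the raw numbers. The sketch below is only a back-of-envelope calculation; the ~0.6 FLAC compression ratio is my assumption and varies considerably with musical content:

```python
# Back-of-envelope bitrate comparison. The 0.6 compression ratio is an
# assumed "typical" FLAC figure; real ratios depend on the music.

def pcm_bitrate_kbps(bits, rate_hz, channels=2):
    """Raw (uncompressed) stereo PCM bitrate in kilobits per second."""
    return bits * rate_hz * channels / 1000

for label, bits, rate in [("16/44.1 (CD)", 16, 44100),
                          ("24/48 (MQA container)", 24, 48000),
                          ("24/96", 24, 96000),
                          ("24/192", 24, 192000)]:
    raw = pcm_bitrate_kbps(bits, rate)
    print(f"{label:22s} raw = {raw:6.0f} kbps, FLAC ~ {raw * 0.6:6.0f} kbps")
```

Since an MQA stream is itself FLAC-packed 24-bit 44.1/48kHz data (see footnote 6), plain lossless 24/48 lands in the same bandwidth ballpark without requiring a proprietary decoder.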

Though streaming might be the prime target for MQA, aspirations appear to be even broader (we will talk more about this in part 3 below). Music download sites have been willing to sell these files (7) and over the years, we have even seen claims of MQA-encoded CDs being sonically beneficial (8).

While the company has made the MQA data format “compatible” with standard PCM playback (9), they claim that when properly decoded, whether through computer software or using a compatible DAC with the appropriate firmware, the sound quality will be at the level of the “original” high resolution master source (which could have been at 24/192 or even higher like DXD 24/352.8). This leads us to a second major point of contention…

2. Is MQA sonically “better”?

Among the multitude of music lovers out there, audiophile hobbyists are those who most desire progress in sonic fidelity. If there are benefits to be gained, “we” will typically be the ones most interested in incessantly and passionately exploring the potentials and possibilities. Perhaps this was the rationale for why MQA was so strongly promoted to the audiophile press, who then proceeded to “push” the product among hi-fi consumers. However, we must remember that the audiophile hobby itself is deeply divided over epistemic authority (i.e., how do we actually figure out what is truly valid improvement and progress in sound quality?). There have even been papers written about the claims to knowledge and the tensions that exist between the “objectivists” and “subjectivists” (10).

Within this climate of epistemic tension, MQA heightens the strain: it challenges the established sampling theorem with claims that it “goes beyond Nyquist/Shannon”, it provides no objective evidence that it surpasses current capabilities, and, even worse, independent objective evaluations have demonstrated that MQA appears to degrade quality, as we will soon discuss. Furthermore, the majority of strong positive testimonies in support of MQA seem to come from those who have a relationship with the Industry (either personally or out of mutual financial interests), those more committed to subjective-only assessments, or some combination of both.

From the start, MQA insisted that their techniques achieve “studio sound”. They also claimed that the format is “lossless” yet “compatible” with current playback systems. The claim is immediately hard to accept given the implications of the data reduction. How can something be truly lossless high resolution, be backward compatible, and do all this with even fewer bits than already-efficient compression algorithms?

Though not necessarily the exact inner workings of today’s MQA encoding system, the patent diagram from December 2013 gives us a valuable glimpse into the nature of the scheme:

[Image: Figure 7A from the December 2013 patent application]

For all to see, right on that diagram, is the fact that this system gives preference to a certain number of most-significant “baseband” bits (the top 13 bits or so in the block diagram on the right) in order to achieve playback compatibility. It then incorporates a lossy component within the encoded lower bits (the “sub-band” bits). Though MQA never admitted it, and audiophile magazines never acknowledged this obvious fact until recently, the system is no doubt “partially lossy” (11).

Without using a decoder, digital subtraction comparisons between MQA files and original PCM sources do seem to achieve about 13 bits of average correlation null depth, which probably means something like 14 or 15 bits of audio quality if we throw in another bit or two for noise-shaped dithering. In early 2017, after the release of software MQA decoding in Tidal, I was able to compare songs that appeared to be from the same master and demonstrate correlation down to ~14 bits in one track, with portions down to ~17 bits once “unfolded” (12).

These results are certainly in line with comments made in MQA interviews that potential bit-depth accuracy has been reduced to less than 24-bits (13) as implied by the block diagram. The exact amount of resolution varies depending on how the music was encoded.
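For those who want to check such numbers themselves, here is a minimal sketch of the digital-subtraction null test described above. The filenames are hypothetical and the soundfile package is simply my choice of WAV reader; it also assumes the two files are already sample-aligned and level-matched, which any real comparison must verify first:

```python
# Minimal digital-subtraction null test (sketch). Assumes the two files
# are already sample-aligned and level-matched -- the null depth means
# nothing unless that has been verified.
import numpy as np
import soundfile as sf  # pip install soundfile

ref, fs_ref = sf.read("original_pcm.wav")   # hypothetical filenames
test, fs_test = sf.read("mqa_version.wav")
assert fs_ref == fs_test and ref.shape == test.shape

diff = ref - test
rms = lambda x: np.sqrt(np.mean(x ** 2))
null_db = 20 * np.log10(rms(ref) / rms(diff))

# ~6.02 dB of null depth corresponds to one bit of agreement
print(f"null depth: {null_db:.1f} dB  (~{null_db / 6.02:.1f} bits)")
```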

Beyond bit depth, another area of contention when assessing MQA’s sound quality comes from the company’s claim of achieving temporal accuracy; the famed ability to perform “de-blurring” on the music. There are many claims around this, including “neuroscience” as the rationale, with the often-quoted value of a 5µs threshold for human temporal auditory resolution (14). Whether in articles online or in MQA’s marketing material (15), it is typically (though not always) suggested that filtering is a significant part of the technique used to improve time-domain performance.

Over the years, we have come to discover the nature of the MQA filters themselves thanks to some fantastic work by Måns Rullgård and his exploration around deciphering the “rendering” stage of MQA. Using his insights and software, I posted the various MQA filter impulse responses with the AudioQuest Dragonfly in July 2017.

For this summary article, let us look at the impulse response of the “prototype” MQA digital filter that is applied commonly during decoding and upsampling, found among a number of tested MQA DACs:

[Image: impulse response of the “prototype” MQA upsampling filter]

There are some problems with this filter from the perspective of high fidelity playback; I’ll just show a few issues here. First, it is extremely weak and does not suppress imaging (or “up aliasing” as I’ve also seen it called) well. We can show this effect quite prominently when looking at or listening to MQA-encoded music that began life at 44.1 or 48kHz. Very obvious examples are pop recordings such as the Bruno Mars album below, originally at a 44.1kHz sampling rate, fed into the MQA encoder and then unfolded to 88.2kHz within the Tidal software. (I first became aware of this issue when I came across this YouTube video a while back featuring Beyoncé’s Lemonade.)

[Image: spectrum of an MQA-unfolded Bruno Mars track showing imaging above 22.05kHz]

Notice that the actual music content is filtered off below 22.05kHz (the Nyquist frequency of the 44.1kHz sampling rate), a gap is present due to the filtering, and then the very obvious imaging artifact is easily seen in the top octave above 22.05kHz (like an attenuated mirror image). These frequencies should not be there and were not part of the “studio sound”. Yes, we are looking at ultrasonic effects here. However, if MQA is being marketed to audiophiles pursuing “perfectionist audio”, why is this obvious distortion acceptable?
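To see why a weak reconstruction filter produces exactly this kind of mirror image, here is a self-contained sketch using generic filters (not MQA’s actual coefficients, so this is illustrative only): zero-stuff a 19kHz tone from 44.1kHz to 88.2kHz, then low-pass with a short filter versus a long one and measure the image that appears at 44.1kHz − 19kHz = 25.1kHz:

```python
# Why short reconstruction filters leak images: upsample a 19 kHz tone
# 2x by zero-stuffing, low-pass at the old Nyquist (22.05 kHz), then
# measure the image at 44100 - 19000 = 25100 Hz. Generic filters only --
# not MQA's actual coefficients.
import numpy as np
from scipy import signal

fs, f0 = 44100, 19000
x = np.sin(2 * np.pi * f0 * np.arange(fs) / fs)    # 1 second of tone

def upsample2x(x, numtaps):
    y = np.zeros(2 * len(x))
    y[::2] = x                                     # zero-stuff to 88.2 kHz
    h = 2 * signal.firwin(numtaps, 22050, fs=2 * fs)
    return signal.lfilter(h, 1.0, y)

def bin_db(y, f):
    Y = np.abs(np.fft.rfft(y))
    return 20 * np.log10(Y[int(round(f))] + 1e-12)  # 1 Hz bins (1 s at 88.2k)

for taps, name in [(17, "short/weak filter"), (255, "long/steep filter")]:
    y = upsample2x(x, taps)
    print(f"{name}: image suppressed by "
          f"{bin_db(y, f0) - bin_db(y, fs - f0):.1f} dB")
```

A short filter simply cannot attenuate much by 25.1kHz, so the image leaks through, just as in the spectrum above.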

Secondly, the minimum phase filter design introduces temporal distortion by creating phase anomalies, especially at higher frequencies, as discussed recently on my blog. As a summary, here are the group delay graphs for the different filters available on the Mytek Brooklyn DAC, one of which is the MQA filter:
[Image: group delay of the Mytek Brooklyn DAC filter options]

Clearly, the MQA and minimum phase settings are not flat lines; they introduce varying amounts of group delay on playback, especially at higher frequencies. What this means is that, given the same starting time, an 18kHz frequency component of the sound would actually be delayed by about 40µs relative to a 100Hz tone using that MQA filter on 44.1kHz material. Sure, we are only talking about microsecond differences, which would be significantly reduced with 88.2/96kHz material, but the point is that this was supposed to be a system that improved time-domain characteristics! If indeed the system is “de-blurring”, presumably they have some way to deal with the group delay introduced during playback. As far as I am aware, there has been no technical demonstration showing evidence of actual “de-blurring”.
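This effect is easy to reproduce with any minimum-phase low-pass. The sketch below uses a generic scipy-designed filter (again, not the actual MQA coefficients) to show the same qualitative behavior: near-zero delay at low frequencies, rising delay toward the top of the band:

```python
# Group delay of a generic minimum-phase low-pass at 44.1 kHz -- not the
# MQA filter itself, but it shows the same qualitative behavior.
import numpy as np
from scipy import signal

fs = 44100
h_lin = signal.firwin(63, 20000, fs=fs)   # linear-phase prototype
# scipy's homomorphic method halves the length and approximates the
# square root of the prototype's magnitude -- fine for this demo.
h_min = signal.minimum_phase(h_lin)

w, gd = signal.group_delay((h_min, [1.0]), w=4096, fs=fs)
gd_us = gd / fs * 1e6                     # samples -> microseconds

for f in (100, 1000, 18000):
    i = np.argmin(np.abs(w - f))
    print(f"group delay at {f:>5d} Hz: {gd_us[i]:5.1f} us")
# The 18 kHz component arrives measurably later than the 100 Hz one,
# whereas a linear-phase filter delays all frequencies equally.
```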

Thirdly, tests have demonstrated that MQA’s processing (at least with ESS-based DACs like the Mytek Brooklyn and AudioQuest Dragonfly) seems to have a higher tendency to suffer from intersample overloading. Here is an example using the Mytek Brooklyn DAC again:
[Image: wideband spectra of a 0dBFS 20kHz tone, white noise, and noise floor for the Mytek Brooklyn filter options]

These are overlaid graphs of a 20kHz 0dBFS sine wave, wideband white noise, and the noise floor recorded off the Mytek Brooklyn DAC using 44.1kHz signals with the different filter settings. This kind of graph is often shown in Stereophile reviews as a way to characterize the effects of reconstruction filters (see description of the “Reis Test”).

Notice the distortion introduced by the 0dBFS 20kHz tone in the form of multiple distortion peaks with the MQA filter. These are obviously artifacts of the reconstruction filter, likely created by overloading from intersample peaks. None of the other filter settings do this. This behavior may be significant with modern productions where the average volume is loud and dynamic compression is heavy. Again, it raises the question of whether MQA represents a step forward in high-fidelity reproduction, and whether a filter design such as this should be implemented broadly across numerous devices when the other options here clearly appear to be better.
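For readers unfamiliar with intersample overs, the sketch below shows the textbook worst case rather than an MQA measurement: a peak-normalized fs/4 tone whose true waveform peaks about 3dB above the highest sample value, demanding headroom the reconstruction stage may not have:

```python
# Classic intersample-over demo (generic -- not an MQA measurement):
# an fs/4 tone at 45 degrees phase has all samples at +/-0.707 of peak,
# so peak-normalizing the *samples* pushes the true waveform ~3 dB over.
import numpy as np
from scipy import signal

fs, f0 = 44100, 11025                      # f0 = fs/4
n = np.arange(8192)
x = np.sin(2 * np.pi * f0 * n / fs + np.pi / 4)
x /= np.abs(x).max()                       # samples now read "0 dBFS"

y = signal.resample_poly(x, 8, 1)          # approximate reconstruction (8x)
print(f"true peak: {20 * np.log10(np.abs(y).max()):+.2f} dBFS")
# -> roughly +3 dB: the reconstruction filter must reproduce levels above
# full scale, and fixed-point filter stages can clip, creating distortion.
```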

Finally, we can see the effect these distortions make with the Mytek Brooklyn DAC, culminating in worse total harmonic distortion and noise (THD+N) than the other filter options for this device:

[Image: THD+N vs. frequency for the Mytek Brooklyn filter options]

Notice that this gradually increasing THD+N, starting below 10kHz and rising to 1% at 20kHz, is consistent with other MQA devices like the Meridian Explorer2:
[Image: THD+N vs. frequency for the Meridian Explorer2]

Unfortunately, on the Explorer2, one does not have the choice to switch to another filter.
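Since THD+N figures come up repeatedly here, a note on what is actually being computed may help: notch out the test tone and compare the power of everything left over (harmonics plus noise) to the tone itself. Below is a minimal sketch of that computation; the synthetic 1kHz tone and its noise level are mine, and it assumes the capture contains an integer number of cycles (otherwise a window and leakage correction are needed):

```python
# THD+N sketch: remove the fundamental with a narrow spectral notch and
# compare residual power (harmonics + noise) to the fundamental's power.
import numpy as np

def thd_n_percent(y, fs, f0, notch_hz=100):
    Y = np.abs(np.fft.rfft(y)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(y), 1 / fs)
    fund = np.abs(freqs - f0) <= notch_hz    # bins belonging to the tone
    return 100 * np.sqrt(Y[~fund].sum() / Y[fund].sum())

fs, f0 = 44100, 1000
t = np.arange(fs) / fs                       # exactly 1000 cycles in 1 s
tone = np.sin(2 * np.pi * f0 * t) + 20e-6 * np.random.randn(fs)
print(f"THD+N = {thd_n_percent(tone, fs, f0):.4f} %")   # ~0.003 %
```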

It would be unfortunate if MQA ends up being the “default” or only filter for a device, given its relatively poor performance when playing standard PCM. Arguably, it would also be unfair to compare standard PCM vs. MQA using this filter, as good PCM playback on a good DAC would generally use settings with less distortion. In my opinion, DAC manufacturers that incorporate MQA need to make sure the MQA filter is not active by default: it should be easy to turn off, and engaged only during actual MQA decoding/rendering.

Realize that others have raised these concerns as well. To name just a few, Jim Lesurf took note of the “lazy filter shape” in June 2016 along with exploration of aliasing components. Bit-depth reduction and filter anomalies were identified in Xivero’s detailed “Hypothesis Paper to support a deeper Technical Analysis of MQA” (early 2017) where they went even deeper into the patent texts, discussed time and frequency-domain equivalency, and explored alternative compression schemes without the need for a decoder like MQA. Doug Schneider in SoundStage! reported on these anomalies; as far as I am aware, this is the only audiophile publication that has discussed and acknowledged the existence of these “false frequencies” in a timely manner.

While there are a few other findings I can point out, I will leave the reader to explore these other issues elsewhere. Suffice it to say, there is no clear objective reason to think that taking a high resolution “studio master” file, running it through the MQA encoder which drops the actual bit-depth, and then decoding and upsampling using their weak reconstruction filters would result in higher fidelity playback through one’s DAC. And if there are objective explanations for how the sound can be made “better”, in my opinion, MQA is clearly not doing a convincing job explaining the technology despite their attempts with debatable charts, graphs, and impulse responses.

It would be fair at this point to ask: “Is it possible that subjectively MQA is clearly better – like what the audiophile press wrote about when they heard MQA?”

Unfortunately, apart from what seem to be limited, mostly closed listening sessions, I don’t know if MQA Ltd. has been “brave” enough to demonstrate A/B comparisons to broad audiences. In fact, it was rather disturbing that through 2016, the MQA audio show demos consisted of simply MQA files being played, without even a comparison against standard CD-resolution material (16). They even tried to explain this away in interviews. This in itself might not be as frustrating if it were not for the magnitude of the almost euphoric claims from reviewers and magazine writers insisting on nothing less than world-changing “paradigm shifts” that would benefit the consumer.

In mid-to-late 2017, I decided to try an “Internet Blind Test” using actual MQA Core-decoded audio (captured from Audirvana+ output) with simulated MQA filtering, using some demo tracks from 2L, so that listeners able to play 24/192 high resolution files could try to experience the difference MQA could make. Remember, “de-blurring” could have been demonstrated without special encoding or “origami” folding if MQA had released 24/192 files with the “effect” baked in. With 83 respondents worldwide, there was no significant preference for the MQA-decoded version compared to an equivalent 24/96 high-resolution sample (17). On the one hand, this is good, as it implies a level of transparency. However, this was certainly nothing like the claims of “obvious” audible differences expressed in the press, that MQA was “better than Hi-Res” due to the de-blurring and such!
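To be clear about what “no significant preference” means statistically: with 83 respondents, a fairly lopsided split is needed before chance can be ruled out. A quick illustration (the 45/38 split below is hypothetical, not the actual breakdown from my test; see the blind test article for the real numbers):

```python
# Significance of a preference split among 83 listeners. The 45/38
# split is hypothetical -- purely to illustrate the statistics.
from scipy.stats import binomtest

n, k = 83, 45
res = binomtest(k, n, p=0.5, alternative="two-sided")
print(f"{k}/{n} prefer version A -> p = {res.pvalue:.2f}")
# p is ~0.4 here, far above 0.05; roughly 51 of 83 would be needed
# before the preference could be called statistically significant.
```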

Over the years, others have documented subjective listening tests in good detail (18). At this point, we await results from McGill University, which announced in October 2017 that it would run listening comparisons between MQA and un-encoded audio. Let’s see if they find clear differences.

3. A broad “philosophy” – cui bono?

Finally, we are confronted by MQA’s claims that they are promoting not just a “format”, but also a “philosophy” (19). They see it as a philosophy of breaking free from adjudicating quality through traditional objective parameters like bit-depth and sample rate; the idea that file size and bitrate do not correlate with sonic quality. Based on this view, MQA has determined that everything captured in the studio, and everything humans can hear, can be “encapsulated” in the MQA 24/48 combination lossless-lossy container. In other words, they are arguing that they know the full ability of human hearing based on “tremendous advancements … in neuroscience”, and that as a result a file format does not need to include the full bit depth (noise floor) or full lossless frequency response (sample rate) of an original high resolution studio recording. If this is true, music labels can then just release all “hi-res” material in this single compressed file type.

As much as MQA might detest the comparison, this is no different from the basic goal of lossy encoding and applying psychoacoustic understanding to audio compression, as per MP3. The problem is that MQA refuses to acknowledge this! They seem to fear using the term “lossy” even though, by definition, the encoding process is unable to exactly reconstruct on playback the high-resolution data fed into it.

MQA also dissuades us from making comparisons in the digital domain, probably because doing so shows that the data does not exactly maintain the original source quality, preferring instead to defer to the nebulous concept of an “end-to-end analog solution” (20). As I recently expressed, this is not a full end-to-end analogue system if it cannot account for preamps, amps, speakers, and room anomalies; these are precisely the components and factors most likely to affect what we ultimately hear! Furthermore, you would think the analogue output from two MQA DACs would appear “more similar” when decoding an MQA track, right? After all, it is all supposed to be “authenticated”. Alas, using a high quality ADC to record the output from a Meridian Explorer2 and a Mytek Brooklyn DAC, the comparison did not reveal any special correlation in sound quality between an MQA decode and standard PCM playback. As such, I have seen no evidence that MQA, recorded from the analogue DAC output, helps the listener approach some kind of idealized “studio sound” target (21).

The fact that there is a lossy element, as well as the distortions shown above, is ironic considering that MQA’s strongest supporters are reviewers and magazine writers who seem to have unwavering faith in their own subjective assessments of sound quality. Many of these individuals feel they can hear differences between cables and unusual “tweaks” of all kinds that are objectively unsubstantiated or unquantifiable. Yet when something like MQA is quantifiably adding distortion and reducing resolution, these same individuals describe “obvious” improvements!

The ideas around end-to-end analogue, the claim that MQA can correct digital and time-domain errors to the listener’s benefit, and the insistence that their “origami folded” 24-bit 44.1/48kHz files can deliver “master quality audio” are statements of faith around the philosophy being promoted. This might all sound good as talking points and for running ads, but it is clearly lacking in concrete substance when we peer a little deeper.

But wait, so far we’ve only touched on one part of the “philosophy” promoted by MQA. Much of the rest of their philosophical ideas revolve around an uncomfortable business model that reaches broadly, affecting the whole production and playback chain. In February 2017, Linn was bold enough to post that they saw MQA as nothing more than an attempt at a “supply chain monopoly”. The result is a “tax” on hardware, software and the media, ultimately passed on to consumers of course. Should this “philosophy” be broadly accepted and the business model successfully implemented, it would no doubt be good for MQA Ltd.’s financial statements.

But who else might gain from this “philosophy”? I think we have to look at why the “Big 3” music labels seem to want to “get in” on this system. Warner Music was the first to make an agreement in May 2016, followed by Universal in February 2017, and Sony Music in May 2017. These entities control about 75% of the music market.

Connecting the dots, we see that Spencer Chrislu (MQA Director of Content Services) acknowledged in August 2016: “If a studio does their archive at 24-bit/192kHz and then uses that same file as something to sell on a hi-rez site, that is basically giving away the crown jewels upon which their entire business is based” (22). What this implies is that MQA is a way to defer release of the full resolution “studio master”; an opportunity to sell music lovers a version that, by definition, must not hold the full value of said “jewels”. And so it goes, perceived opportunities to sell the same music yet again because the precious, awesome-sounding crown jewels are safe in those concealed music vaults…

We must then finally discuss the issue of Digital Rights Management (DRM). I know, MQA does not prevent one from copying the FLAC-compressed file. I acknowledge that MQA does not “phone home” to confirm access before playback. But let’s think broadly about the definition of DRM as given in the Oxford dictionary (as good a definition as any):

[Image: Oxford dictionary definition of “digital rights management”]

Does MQA control digital rights using its technology to prevent “unauthorized” access? Of course it does. It requires licensing of software to decode the proprietary format, hardware manufacturers need to work with MQA to ensure compliant firmware, and all music needs to be “authenticated” in an MQA-authorized fashion (23). Without authorization, one has no right/ability to decode, listen to, access, or process the high-resolution data buried by MQA. If one reverse-engineered the decoding algorithm, including the “access protection” mechanism, and released the software to do this without obtaining permission/licensing, one would of course expect to be contacted by MQA’s legal team.

Remember that MQA’s “authentication” is not just a common cyclic redundancy check or a flag to turn on a blue light as with Pono (24). It utilizes a 3072-bit signature embedded in the control stream, used with a hash of the audio data and presumably some sort of key within the decoder software/firmware (25). If desired, it appears the encoder can be instructed to limit the quality of undecoded playback (by selecting the bit-depth of the control stream, below which resides the encoded data). Furthermore, at least in versions of MQA firmware from last year, there was evidence of an ability to descramble purposely-affected data streams.
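To make the distinction from a simple checksum concrete, here is a generic illustration (emphatically not MQA’s actual scheme, whose details are proprietary) of what a 3072-bit public-key signature over a hash of audio data implies: anyone holding the public key can verify, but only the private-key holder can create streams a decoder will flag as “authentic”:

```python
# Generic public-key signature illustration -- NOT MQA's actual scheme.
# Unlike a CRC, which anyone can recompute, only the private-key holder
# can produce a signature the decoder will accept.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

signer = rsa.generate_private_key(public_exponent=65537, key_size=3072)

audio = b"baseband audio bytes..."        # stand-in for real stream data
sig = signer.sign(audio, padding.PKCS1v15(), hashes.SHA256())

# The decoder ships only the public key: it can verify, never create.
signer.public_key().verify(sig, audio, padding.PKCS1v15(), hashes.SHA256())
print("verified -> decoder lights the 'authenticated' indicator")
```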

The concept of embedded keys and provisions for variable audio quality is not foreign to Meridian’s way of thinking, considering their 2014 patent (26) aiming to provide “conditional access to a lossless presentation” and “control over the level of degradation of the signal”. Even though the mechanisms described in the patent are currently not implemented in MQA, there is nothing to say they cannot be built into the infrastructure being created. Remember, in time, if MQA were successful, there would be increasing control over authorized playback software and device firmware across the product lines of various manufacturers. Since these are all reprogrammable software algorithms, currently absent “features” could be incorporated.

The view from 30,000 feet and the birth of a new paradigm?

Obviously, I have been presenting an opinion (with evidence) from a perspective that is far from flattering to the claims made by MQA. As a consumer and participant in this “music lover” “audiophile” hobby, I admittedly see very little to gain and clearly much to lose especially for the consumer’s freedom of choice.

But what of MQA’s supporters? As time goes by, with each article published in the mainstream magazines, even those who seem to be in support of this “format” have finally made it clear that MQA is “lossy” in nature, seem uncertain of the claims about it sounding “better” than the original hi-res audio, and are not particularly convinced about the “neuroscience” claims. For example, let us have a look at Jim Austin’s latest article in Stereophile (March 2018 issue).

That article actually summarizes well many of the suspicions that I and other critics of MQA have raised over the years and summarized above. The “case for MQA” from supporters sounds very much like an apologetic defense of the music industry. The Stereophile article plainly asserts that the desires of audiophiles really do not matter: “The best we can hope for is a system designed to serve the interest of others – the industry, musicians, and casual (mobile) music listeners – but that is also good enough that we can live with it.” Yes, it’s not hard to understand how it may serve the Industry (i.e., the music labels) by preserving “crown jewels”, maintaining a perception of mystique so albums can be re-released, and offering DRM potential. But are we sure the artists will make any more money? And since when did the “casual mobile music listener” care about high resolution streaming, considering that the largest services like Apple Music and Spotify don’t even support 16/44.1 lossless (much less show a desire to increase their bandwidth to deliver 24/48 MQA)?

Furthermore, since when did the audiophile press decide that supporting the Industry was more important than perhaps a bit of prudent objective analysis and being considerate of the inconveniences and upgrade costs that this might pose to their readership? Was there a thoughtful debate about this? Are they representing the interests of their readers, truly promoting “perfectionist audio”, or did they perhaps jump the gun a bit without thinking things through? By the way, can someone explain to me why I should consider investing in those $75,000 speakers, $3000 interconnects, and $2000 power cables advertised in the magazines if in the near future, perhaps the only new “high resolution” music I could buy is of the “good enough” and “lossy” variety?

Jim Austin even said this: “Buy those 24/192 downloads while you can”. Why not also suggest: “Buy those 24/176.4, 24/96, 24/48, 24/44.1, unaffected 16/44.1, DSD downloads while you can”?

Imagine a world where MQA is wildly successful and the only new digital releases from the major labels are in MQA. You can stream MQA, you can buy the MQA files, and even CDs are MQA-CDs (“Buy those unaffected CDs before they all become MQA-CD remasters!”) (27). The unsuspecting music lover who has never come across a critical article on MQA might initially be impressed that these are supposedly “hi-res” 24/48 MQA files, or be told that the 16/44.1 MQA-CD contains some secret sauce that makes it sound amazing. Initially, the sound quality might be okay on all the equipment he/she owns. But what if, over time, the encoding system starts to degrade the sound of the undecoded data? At some point, what if the undecoded file offers only something like 10-bits of resolution unless it’s played back through an MQA certified device?

Before you accuse me of paranoia and courting conspiracy theories, we know that the MQA decoder already has a wide tolerance for how many bits are devoted to the PCM “baseband” on the encoding side (this is useful because some music may only need 14 bits, allowing more data to be devoted to optimizing the “sub-band” MQA encoding). It’s not really a question of whether the system can be made to exercise “control over the level of degradation of the signal” through the encoding-decoding system, but rather of what assurance the consumer has that it will not be used to force “obsolescence” on playback systems that do not implement MQA. Is it wise to accept this level of potential control in the long run by signing on to a closed system?

Finally, suppose MQA enjoys a period of relative success and one buys a library of encoded albums. What happens if MQA for some reason goes out of business? Without updates and new devices incorporating the decoder, unless someone figures out how to decode MQA so it can be fully converted to standard PCM (as HDCD generally can be these days), that library of “high resolution” files might end up undecodable on future playback systems. This is one of the perils of orphaned DRM. Who knows, one might even see the rise of MQA 2.0 encoded media with “even better” quality that current devices cannot fully decode and for which manufacturers are unable to provide updated firmware. What then? Buy yet another device that supports the newest “standard” when all along free and open options were available?!

As it has been said, the price of liberty is vigilance. The debates and questions raised here and elsewhere in my opinion are all part of the due process of assessing the value of this “philosophy” and how it affects the quality and freedoms we currently enjoy as music customers. This is true not just for today and MQA, but worth considering for whatever might come our way down the road.

Speaking of roads, I noticed that Jim Austin began his recent article with a quote from Yogi Berra – “When you come to a fork in the road, take it.”

While catchy and cute, this of course does not apply here. The idea of facing a fork in the road is a false choice and, at best, wishful thinking from those promoting MQA. The “road” is already well paved: open, free, mature and robust file formats. This highway already allows broad creativity and innovation without major licensing impediments (especially for smaller companies), and it has enough lanes to accommodate the needs of music lovers whether they’re happy with MP3 or desire huge DSD256 downloads. In my opinion, MQA is an optional turn-off with little content at this point (28), leading down an unpromising, dimly lit, narrow path with toll booths along the way. Should we bother with this detour?

Ultimately, remember that the music industry can be wrong, audiophile magazines can be wrong, as an individual, I can be wrong (and my wife says I often am!). But the consumer is always right – which is exactly why “we” call the shots. Let’s see how this goes…

Acknowledgements:

I would like to thank Måns Rullgård (mansr) and Mitch Barnett (mitchco) for their generosity of spirit, allowing me to pick their brains, for providing stylistic suggestions, and for their time in reviewing this article. Also, a big thank you to my audio engineering friend for his invaluable insights, and willingness to run some MQA DACs through his Audio Precision gear for many of those graphs and measurements.

Footnotes and Further Reading:

  1. I openly admit that passion can be a complex affair in audiophilia as discussed in MUSINGS: Passion, Audiophilia, Faith and Money.
  2. Early articles included entries from Stereophile and The Absolute Sound dating back to December 2014. Papers written by MQA include: AES paper “A Hierarchical Approach to Archiving and Distribution” (October 2014) and the JAES article “Sound Board: High-Resolution Audio, A Perspective” (October 2015).
  3. Such as listening sessions here, here, here, here, here, and here. Notice that The Absolute Sound even declared MQA “better than Hi-Res!” on an issue cover, and editor Robert Harley even stated that “MQA is the most significant audio technology of my lifetime.”
  4. Even to the point recently in February 2018 where it was claimed: “most DACs can’t play MQA files, so they are already obsolete”.
  5. For example, the general technology site Trusted Reviews expressed the idea that MQA was aiming to be the “lead format” for streaming in early 2015. Note also that MQA Ltd. has since spun off independently from Meridian.
  6. Remember that although MQA can be encoded at different bit depths and sample rates, the typical stream is equivalent to 24-bits and 44.1 or 48kHz which can then be losslessly compressed with something like FLAC. The actual size of an MQA stream is therefore typically at least 30% larger than a lossless compressed 16/44.1 CD-quality PCM file.
  7. Like 2L and e-Onkyo Music. Interestingly, at one point in March 2017, HIRESAUDIO supposedly intended to stop selling MQA but I see they still have a number of MQA albums online.
  8. MQA-CDs!? Apparently, this is a good thing…
  9. Basically, the most significant bits of an MQA file are unencoded PCM (the “baseband” audio bits), so when you play these files through a standard DAC, the quality is claimed to be around that of 16/44.1 or 16/48 audio. The “sub-band” data bits, which encode the MQA “hi-res” data, act as low-level noise. More details later in this article.
  10. For a good academic review, see this paper by Perlman, in Social Studies of Science 2004 – Golden Ears and Meter Readers: The Contest for Epistemic Authority in Audiophilia.
  11. Finally, Jim Austin in Stereophile (March 2018 issue) acknowledges the fact that MQA contains lossy elements. See also my October 2016 article: MUSINGS: Keeping it simple… MQA is a partially lossy CODEC.
  12. Undecoded comparisons were made in 2016 when 2L released samples. Then in 2017, I compared tracks from Madonna, Buena Vista Social Club, and Led Zeppelin. Admittedly, my sample size for this is small and perhaps more work can be done to further explore bit-depth correlations in the future if anyone is still interested.
  13. See this interview with Bob Stuart where he describes MQA’s bit depth as “typically 15.85” and “up to 17-bits” at 31:05. Although his claim is that these numbers reflect undecoded performance, I have yet to see evidence in actual music playback that MQA decoding retains >17-bits of resolution.
  14. The 5μs value showed up in the MQA Q&A article in Stereophile (August 2016) and I think is erroneously referenced. The best I can find is that this is referring to papers such as Kunchur’s “Audibility of temporal smearing and time misalignment of acoustic signals” (Technical Acoustics, 2007) with an estimated threshold down to around 6μs.

It is already understood that even 16/44.1 CD-resolution digital is capable of time-domain resolution of 110ps and MQA accepts the figure of 220ps.
  15. A good example of the link made between temporal accuracy and filters is this Sound-On-Sound article published in August 2016.
  16. Here’s a report from LAAS last year supposedly with an A/B test. I attended one of these disappointing MQA demos at the 2016 Vancouver Audio Show.
  17. You can read about the blind test “Core Results” here. There were also subgroup analyses where I could not find a preference even with audiophiles using more expensive gear. Finally, some of the subjective comments might be interesting. Notice that although not statistically significant, in many of the comparisons, there was a slight preference for the non-MQA version.
  18. Here is a listening test from August 2017 that I thought was well written and described at the Airshow Mastering Room studio.

Of note, there was a comment on Bruno Putzeys’ Facebook page that MQA themselves did not run scientific tests. At this point, I have not seen any evidence in the literature or “white papers” from the company itself of controlled listening tests.
  19. You can read more about MQA described as a “philosophy” by Bob Stuart, expressed in this interview.
  20. This article in AudioStream (January 2016) suggests that MQA is capable of correcting digital anomalies and time domain anomalies by characterizing the whole audio production chain and playback components like “DSP loudspeaker”. Who knows, there’s a small chance that a full chain like this down to the level of these DSP speakers may achieve a higher level of accuracy, but this would be a rather rare and atypical system.
  21. Speaking of “authentication” from the perspective of the “studio sound”, what does this even mean? Just like the home user, each studio will have their own set-up with speakers, amplifiers and mixing consoles among multiple other devices used during production. Is there such a thing as a standard “studio sound” using MQA certified speakers? Of course not. Artists and engineers weave their magic using what they have at their disposal to create what was intended without needing to think about how MQA would “de-blur” the sound. As demonstrated earlier, MQA has the potential to alter that sound and the resolution captured in the studio. MQA supposedly delivering the final sound “as the artist intended” is more than a little hard to believe.

Recording industry members like Dr. Mark Waldrep (Dr. AIX) and mastering engineer Brian Lucey have been vocal about concerns regarding MQA. Worth reading their comments and impressions.
  22. This is of course the (in)famous “crown jewels” statement. Seriously folks, what crown jewels!? Sure, there are some good sounding recordings in the archives. But are they also referring to the multitude of slowly but surely degrading old analogue tapes in storage that have been re-released ad nauseam? Old digital recordings from the 80’s and 90’s done with archaic ADCs? Or some of the new recordings done with Pro Tools, many probably highly dynamically compressed?

Even if a label releases the “studio master” 24/192, that doesn’t mean a hi-res remix, or remaster cannot be created in the future. Consider the 2017 Eagles’ Hotel California 40th Anniversary with hi-res on Blu-Ray already released as 24/192 by HDtracks in 2013, or have a look at how many high-resolution variants of Kind Of Blue are out there including PCM and DSD versions.
  23. The underlying cryptographic signatures are provided by Utimaco’s infrastructure. There was also a presentation in December 2017 at the 34th Chaos Communication Congress (34C3) by Christoph Engemann and Anton Schlesinger describing MQA as “A clever stealth DRM-Trojan”. That description is likely quite accurate.
  24. Pono’s blue light was simply a flag that told the device to turn on the LED for files bought through the Pono Store… No actual checks to make sure the music data itself was error-free – see here for more information from the JRiver folks.
  25. Of interest, it appears that only the “baseband” bits are being “authenticated”. In an experiment here by FredericV, when the lower 8 bits are dropped, the MQA “blue light” still shines even though it’s recognized as 16-bit audio. Maybe this is all that MQA-CDs are?

On a related note, doesn’t the existence of this blind spot in the authentication mechanism immediately disqualify the MQA blue light as something in which a consumer should have any faith that the file is of “guaranteed” provenance?
  26. “Versatile Music Distribution” patent, February 2014.
  27. “First Major-Label MQA CD”. Steve Reich’s Pulse/Quartet. Knowing what we know about MQA, why is this any cause for celebration? CDs are 16/44.1, which means that the MQA control stream with its cryptographic signature is now embedded, taking up bits that would previously have carried audio. This could be innocuous if the music’s noise floor is relatively high, but it’s not like a 24-bit MQA file where there’s space for some amount of lossy-encoded data in the “sub-band”. It would be very interesting to compare the MQA-CD output with an actual 24/96 FLAC or even an unaltered standard 16/44.1 file.
  28. Recently on my blog, a reader posted a survey of Tidal finding 7406 unique MQA albums as of early February 2018. Considering that there are something like 48M tracks on Tidal, and conservatively assuming an average of 12 tracks per album, this means that only about 0.2% of Tidal content is MQA-encoded material.
