When you’re a music-loving kid and the cost of one vinyl LP – in those days it was either that or the audio cassette at which my friends and I sneered – is roughly eight hours of not-so-hard part-time labour at Woolworths, you soon become resentful of the surface noise on those LPs. So when the compact disc was launched, it seemed like a liberation from the tyranny of clicks and pops.
Of course, many high-fidelity enthusiasts soon turned against the CD, sometimes against digital entirely. But not me. I’ve come back somewhat to vinyl over the past half-decade – I’ve even gotten hold of some shellac and a device on which to play it – but digital is still 99.9% of my listening. And perhaps 98% of that is from TIDAL, from CDs I’ve ripped to my NAS in FLAC, or from DSD and other high-resolution content I’ve accumulated from various sources.
But I still have my CDs, and sometimes it’s nice to spin them up. To this end, I’ve kept my Denon DCD-755AR CD player. This was kind of one step short of top-of-the-line back in the day. That day was in August 2000, almost 21 years ago. The player still works, and still measures quite respectably. And it also has an optical digital output. Which means that you can use it as a CD transport.
Or can you?
So, last week I was testing a high-end DAC. It has network streaming as well, and for reasons I don’t recall, I decided to test its output not just with my test signals streamed over the network, but with them delivered to its optical input. I plugged in the Denon CD player via optical and ran the same tests.
When running the test with 16-bit, 44.1kHz signals from the inbuilt streamer, the noise level was at an impressive -97.8dBA. That’s about as good as it can get with 16-bit audio. Really. Mathematically. 16-bit audio has an intrinsic noise floor of -96.3dB.
But when I ran the test with the exact same signal burnt to a CD-R and played on the Denon, with the signal being fed to the DAC via optical, the noise level was at an unbelievable -115.4dBA. Here’s what it looked like, compared to the noise floor when streamed:
I mean “unbelievable” literally. That’s the kind of result I typically get when using 24-bit test signals, but the test signal on the CD-R was 16 bits, of course. What could be going on?
Well, for this particular Denon player, it was about the Alpha Processing. Denon was – still is, I suppose – quite proud of its Alpha Processing. This CD player has the first version. Later it was enhanced to AL24 Processing, then a few steps later to AL32 Processing. The original might as well have been named AL20 Processing, for here’s how Denon describes it:
Alpha processing was the world's first technical formula for reproducing 16-bit data in 20-bit quality. The ALPHA processor interpolates the data recorded on the original CD so that the waveform is more natural. This will result in a more pure waveform that it is much closer to the CD digital signals without any processing.
Of course, “natural” is a marketing term. Interpolation is where you take two data points and estimate what’s supposed to be between them. You can use an average, or take into account other data points to develop some kind of weighted average. That said, it remains merely an estimate.
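To make the idea concrete, here’s a minimal Python sketch of what linear interpolation does. The function name and sample values are mine, not Denon’s, and real upsampling filters are far more elaborate – but the principle is the same:

```python
def lerp_upsample(samples, factor=2):
    """Insert (factor - 1) estimated points between adjacent samples.

    Each inserted value is just a weighted average of its
    neighbours - an estimate, not information recovered from
    the original recording.
    """
    out = []
    for a, b in zip(samples, samples[1:]):
        out.append(a)
        for i in range(1, factor):
            out.append(a + (b - a) * i / factor)
    out.append(samples[-1])
    return out

# Two captured samples; the inserted midpoint is merely their average.
print(lerp_upsample([0.0, 1.0]))  # [0.0, 0.5, 1.0]
```

However sophisticated the weighting, the new points are guesses dressed up as data.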
A slight digression
Have you watched one of those police procedural shows from the 1990s or early 2000s – they seem to have stopped doing this now, thankfully – where there’s security footage of the criminal in action, and the lead detective says to the tech guy, “enhance!”? Soon enough, the indecipherably fuzzy blob of a car’s number plate sharpens up into a pretty decent high-res image. And, of course, by episode’s end, the bad guy has been caught.
Well, the reason that they have stopped doing that in these shows (I think) is because it is transparently ludicrous. You can’t reconstruct what’s between the actually captured pixels. Take a low-resolution image into Photoshop and apply all the filtering and sharpening processes available, and you wouldn’t be able to generate a usable image.
The same applies to digital audio. Any interpolation is necessarily a guess. And, in reality, the mere process of digital to analogue conversion has the effect of creating an interpolation anyway.
The lesson to be learned
So, what I am saying is that interpolating from 16 bits to 20 bits doesn’t produce anything useful. But what it may do is change the apparent noise floor in certain test signals. If, for example, you have a 16-bit test signal which contains a section of “digital zero” for testing the noise floor, this may indeed be resampled – from a 16-bit digital zero to a 20-bit digital zero.
As noted, the intrinsic noise floor for 16-bit audio is -96.3dB. For 20-bit audio it’s -120.4dB. That nicely encompasses the actual -115.4dBA measured result.
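Those figures fall straight out of the arithmetic. Here’s a quick Python check – the helper function is mine, just the standard level of one least-significant bit relative to full scale:

```python
import math

def noise_floor_db(bits):
    # Level of one least-significant bit relative to full scale:
    # 20 * log10(2 ** -bits) dB.
    return 20 * math.log10(2 ** -bits)

print(round(noise_floor_db(16), 1))  # -96.3
print(round(noise_floor_db(20), 1))  # -120.4
```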
So it seems very reasonable to conclude that the Denon DCD-755AR CD player not only applies its Alpha Processing to its analogue outputs (measured noise: -113.3dBA) but also to its optical digital output.
And that’s a problem.
What do you want in a CD transport?
The DAC I was using retails at $5400. I chose it because of its superb attributes as a DAC. (Let’s disclose which DAC I’m talking about: the Simaudio Moon 280D – review to follow this week.) I want it to perform its magic on the unchanged, unadulterated digital audio from the CD transport. In other words, I want a CD transport to extract digital audio from a CD, perform any required (hopefully none) Red Book correction to the stream, and then deliver that to the DAC. I want the stream to be bit perfect. What I don’t want it to do is insert its own judgement on how the digital stream ought to be “improved”.
I don’t have a great collection of CD decks available to see if this is a widespread problem with CD players. But I do have a few DVD and Blu-ray players with optical digital audio outputs. I grabbed one – an Oppo unit from about fifteen years ago – and ran the same test. It gave a similarly high – and wrong – noise level result. I suspect that this kind of thing is built into the chipsets of a lot of CD players.
But I still wanted to confirm that the 280D would act as expected if fed a proper 16-bit signal via its optical input.
I searched around. Did I have a gadget that wouldn’t do some kind of manipulation of the digital signal before delivering it optically? It turned out that I did: a Google Chromecast Audio. It also delivered a -97.8dB noise floor. Indeed, the graph of the noise almost perfectly overlaps that when I streamed the test signal (and also when delivered via USB from a computer):
The Chromecast Audio is a relatively naïve device that doesn’t try to achieve a more “natural” sound through increasing the bit depth. So, perversely, this cheap gadget delivers a more accurate digital stream than the nominally better-quality Denon CD player.
Okay, the noise floor is lower on artificial test signals … so what?
Ah, if that were the only change. The increase in bit depth from 16 to 20 bits had side effects. Essentially, it increased both harmonic distortion and intermodulation distortion by an order of magnitude. THD for the Moon 280D streaming the signal, and also for when it was fed via USB, was 0.00034%. Note: three zeros after the decimal point. When fed from the Denon player via optical, it was 0.00255%. That’s 7.5 times higher. IMD when streamed or supplied via USB was 0.00357%. From the Denon via optical it was 0.053%, or nearly fifteen times higher!
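Those multipliers are just the measured figures divided out – you can confirm them in a couple of lines:

```python
# Measured distortion figures, in percent, as reported above.
thd_streamed, thd_denon = 0.00034, 0.00255
imd_streamed, imd_denon = 0.00357, 0.053

print(round(thd_denon / thd_streamed, 1))  # 7.5
print(round(imd_denon / imd_streamed, 1))  # 14.8
```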
(And, no, it’s not the optical input that’s at fault. The Chromecast did increase the THD and IMD somewhat – to 0.00069% and 0.00362% respectively – but to nowhere near that caused by the Denon. And let’s face it, a Chromecast Audio, as convenient as one can be, is no-one’s idea of an audiophile device.)
There are consequences to this manipulation of the digital audio stream. Let’s look at the THD graph:
The green trace is weird. That’s the Denon feeding the optical input of the DAC. I think that those bumps around the 1kHz test signal suggest that the Denon CD player is getting rather jittery in its old age. But what’s interesting here are the spikes at 3kHz, 5kHz, 7kHz and so on all the way through to 21kHz. It seems that all the distortion components are odd-order ones, which are the more objectionable, sound wise. Especially the higher order ones. To be clear, they are still very low in level – the -98dB spike of the 3rd harmonic is 0.0013%. But they are completely absent when the Moon 280D is fed a clean signal via its internal streamer, and even with the Chromecast Audio, there’s only a little unobjectionable 2nd harmonic (it’s at 0.0004%).
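For anyone wanting to check conversions like that -98dB spike working out to 0.0013%, here’s a hypothetical one-line helper (mine, assuming dB relative to the fundamental):

```python
def db_to_percent(db):
    # A component at X dB below the fundamental has an amplitude of
    # 10 ** (db / 20) relative to it, expressed here as a percentage.
    return 10 ** (db / 20) * 100

print(round(db_to_percent(-98), 4))  # 0.0013
```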
Now, I have a total of N=1 when it comes to testing the digital output of CD players. But the fact that DVD players only a few years later were also manipulating the bits of their digital audio output suggests that the problem – which is what I consider it to be – was widespread. Some kind of interpolation may well have been built into many standard chipsets.
What I want is a bit-perfect extraction of the contents of a CD, delivered to my DAC. It’s possible that some regular CD player may be capable of delivering that, but how would you know unless you’re in a position to measure noise levels to better than -100dBA?
As for me, I’m searching out a CD transport. It will be useless without a DAC, of course. But I think that the Simaudio Moon 280D deserves a real, extracted, unadulterated stream of bits from a CD, not some fake “reconstructed” attempt at an “improved” stream.