Hi-Fi Life (so far)

The Dark Ages

My first awareness of Hi-Fi came from glossy advertisements in the Sunday supplements. The age of the “music centre” had (just) passed and the in-vogue technology was the “tower” – preferably one that lit up like a Jodrell Bank control console and contained a record deck, cassette player, amp, tuner and the inevitable “graphic equalizer” – all ideally encased in a cabinet with a smoked glass front. I particularly remember the Philips “Black Tulip” system as seeming especially desirable.

Philips X70 “Black Tulip” Hi-Fi
(image used with kind permission of Vintage Collection)


Soon, like any self-respecting teenage geek, I learned that in the realm of true Hi-Fi one had separates: a record player, an amp, and speakers. Graphic equalizers were frowned upon, and even tone controls were seen as a bit iffy. Received wisdom was that the most important thing was the source (the record player), on the not-unreasonable premise that if what was being extracted from the vinyl was no good, it could not be rescued downstream. So began my quest for decent sound, ending (largely through purloining things from my father’s system) with a Garrard 401 turntable (which type – amazingly – still seems to command good money on eBay), an SME Series V tonearm (which – even more amazingly – is still available in a newer incarnation starting from £2,050) and a fiddly moving-coil cartridge. Everything sounded okay on its own terms, but the trouble came when comparing this “Hi-Fi” to live music. I’d go to London and listen to Klaus Tennstedt conduct the London Philharmonic Orchestra in a Mahler symphony, then come home and listen to the same forces perform the same work on record; there was no comparison. Nor was the problem limited to this setup: every other “audiophile” setup I heard exhibited the same kinds of problems, which were (as Ken Rockwell recently wrote in typically forthright fashion) essentially caused by LPs:

LPs are awful. Audiophiles are often hoping that I'm endorsing LPs, but no. These "plastic dog plops," as one mastering engineer referred to them, are loaded with noise, wow & flutter, distortion, echoes, room feedback and even pitch changes from never being pressed on-center. LPs usually have their lows cut, or at least summed to mono. Some people prefer the added noise and distortion, much as a veil hides the defects in an ugly woman's face, allowing our brain to fill-in what we want to see. 

Perfect Sound Forever?

Soon after CD was launched commercially as an audio format, it became clear which way things were heading, and so in 1984 I happily sold off my LP collection and purchased a Philips CD100 top-loading CD player. It was odd recently to take my son to the Science Museum and see one of these units as part of a “how we used to live” exhibit! This CD player, with its signal routed (via potentiometer) into a home-made power amplifier (a J. Linsley Hood design like this) and feeding a pair of cheap-but-good KEF Coda III speakers, clearly offered a step change from anything I’d heard from LP-based systems – although amazingly some people claimed at the time, and still claim, that LP could compete with digital sound; either they had something very wrong with their systems or else never listened to live music!

This kind of combination (with occasional modest improvements from upgrading the player or the speakers) was my “Hi-Fi” for the next 25 years – although one significant improvement came in year 21 with the addition of a subwoofer (in this case an MJ Acoustics Pro 50 Mk II). It takes a while to get the levels set right, but a properly integrated subwoofer not only fills in the all-important bottom octaves but seems to make the whole sound (for example in orchestral music) more realistically airy.

Inside a home-brew power amp

But how “hi” was the “fi” of this system? Although pretty good it always seemed there were a few things not quite right…

  • Character: I tend to classify loudspeakers as “happy” or “sad” – and the distinction sometimes only becomes apparent after long acquaintance. In general I prefer (happy) speakers with good presence and a tendency to warmness – speakers with a recessed middle ultimately give less pleasure even if their “sparkling” treble and/or deep bass may have sounded good at first.
  • Volume handling: as speakers get louder (at an orchestral climax for example) they can have a tendency to sound more and more strained – and savour more of “speakers” than of the music. This is particularly the case with less-powerful transistor-based amplification – which also tends to impart a certain unpleasant hard character to the sound.
  • Real bass: real bass (as experienced in the concert hall) is not warm, fuzzy and indistinct, but has a musical and visceral quality far removed from the inchoate whumping noises emitted by low-quality sub-woofers.
  • Detail: more is not necessarily better (lo-fi reproduction can sometimes accentuate things a high-fidelity setup won’t); what matters is the presentation of different instruments within the overall blend, such that one can concentrate on a particular line if one wants to. For stereo recordings, a good strong stereo image helps here, as each instrument will be coherently positioned at a location in the sound stage.

Audio File

The problem of too many CDs seemed like a good opportunity for a re-think: if these could all be ripped to a server and then streamed, might this not also be an opportunity to upgrade the “Hi-Fi”?

Looking around, it seemed the answer might be yes. In particular, a British firm caught my attention: AVI Hi-Fi and its ADM 9.1 active loudspeakers. AVI’s robust promotional material seemed congenial (its lambasting of cable fetishism is worth a read), and its premises made engineering sense, in particular that:

  • The digital source does not matter – a cheapo off-brand CD player will extract the same bits from a CD as the most exotic “transport” (funny how this turns on its head the old wisdom of the source mattering most)
  • The DAC does matter, hence the best that money can buy should be used
  • Even given a good speaker design, there are advantages to be had from doing away with the need for a passive crossover (as the AVI site has it: “this is fact not hype”!).
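The first premise – that any competent drive extracts the same bits from a CD – is one you can check directly: rip the same track with two different drives and compare checksums. A minimal sketch (the file paths are invented for illustration):

```python
import hashlib

def file_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical rips of the same track from two different drives:
# identical digests mean both drives extracted identical bits,
# so no downstream "transport" quality can distinguish them.
# file_digest("rip_cheap_drive/track01.wav") == \
#     file_digest("rip_exotic_transport/track01.wav")
```

If the digests match, the two rips are bit-for-bit identical, and everything that matters thereafter happens in the DAC and beyond.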

The ADM 9.1s are thus an almost-complete “system” needing only to be fed a signal (preferably bits via their optical input, to take advantage of the high-quality inbuilt DAC). Each unit is a ported two-driver active speaker; each requires mains power; and each contains two amplifiers – 75 watts for the tweeter and 250 watts for the woofer. The master speaker unit accepts line-level input and/or up to two optical inputs via TOSLINK. The connection to the slave unit is via RCA phono cable, and there is an additional line-level output to drive a subwoofer (I continue to use my MJ Acoustics unit, but AVI also make their own matched subwoofer for rather more money). The setup is controlled by a simple remote handset which allows for source selection and volume control.

And how do they sound? Well, after a few weeks making sure to eliminate any false first impressions, I find them superb. All the traditional problem areas of hi-fi reproduction have been addressed, in particular:

  • There is no apparent false warmth; the lower midrange and bass are tight and precise (and even better with a subwoofer). But “lack of warmth” does not imply these are cool speakers, more that they are neutral … if you’ve ever listened to high-end headphones you’ll recognise the free, uncoloured type of sound. I’ve read some reports that these speakers are too clinical. I don’t agree: it’s just fidelity.
  • Volume: these babies go loud, and do so without losing the plot. A nice side-effect of this is that even at volumes approaching “realistic levels” they are not unduly fatiguing (I mean “realistic levels” for classical music!).
  • Detail: the stereo image is particularly solid, and maybe this helps the very strong sense of detail – so for example when listening to a Beethoven symphony the subsidiary background motoric rhythms in the strings are “there” if you care to listen to them.

I was expecting the ADMs to be a mercilessly revealing lens through which to view the problems of early recordings, but – most surprisingly – good-quality analogue material sounds better than ever: Peter Maag’s legendary 1960 recording of Mendelssohn's Scottish Symphony, for example, sounds gorgeous. I wonder what the reason for this is.

Are they perfect? No, not quite – listening to the human voice one is sometimes aware of a slight cabinet-i-ness to the sound. But this is a picky caveat – what I’m hearing now is the closest to real I’ve ever heard from my own or other systems.

"Perfect Sound Forever" ?
An ADM 9.1 unit with a CD and Logitech Squeezebox. Karajan's Eine Alpensinfonie (pictured)
was the first commercial CD pressed; the rather nasty recording quality of this,
and many other early CDs, became associated with digital recording itself and
set back the cause of digital audio among audiophiles.

The end of the CD

To drive the ADM 9.1s I use a Logitech Squeezebox. The long process of ripping my CD collection is underway, but it is increasingly clear that music in future is going to come straight off the net. I’ve just bought the “studio master” 24-bit FLACs of a thrilling new set of Mozart symphonies from Sir Charles Mackerras and the Scottish Chamber Orchestra. From a server upstairs the bits are sent over a wireless network to the Squeezebox and then down an optical cable straight into the ADM 9.1s. I’m not sure my ears are able to detect the advantage of 24-bit over 16-bit material, but the end result is absolutely riveting musical reproduction. Maybe this kind of system will form the template for my next 25 years of Hi-Fi life …

[Disclaimer: I have no connection with any of the companies or products mentioned in this posting!]

Update — February 2012

I am still as impressed with these speakers as when I wrote this post originally. I have had a query about how well the MJ Acoustics subwoofer works: again, very well. The one tricky thing is getting it properly integrated with the main speakers. My (not very technical) method for doing this is to use a recording with a very well-recorded bass drum, and then adjust the settings until it sits properly in the orchestral texture. My current disc of choice for this exercise is the excellent recording of Mahler's Symphony No 1 by the Pittsburgh Symphony Orchestra conducted by Manfred Honeck. The settings achieved by doing this, at least in my listening room, can be seen in the photograph below.

Subwoofer settings

SC 34 WG meetings in Paris last week

The croissants of AFNOR

Last week I was in Paris for a stimulating week of meetings of ISO/IEC JTC 1/SC 34 WGs, and as the year draws to a close it seems an opportune time to take the temperature of our XML standards space and look ahead to where we may be going next.

WG 1 (Schema languages)

WG 1 can be thought of as tending to the foundations upon which other SC 34 Standards are built - and of these foundations perhaps none is more important than RELAX NG, the schema language of many key XML technologies including ODF, DocBook and the forthcoming MathML 3.0 language. WG 1 discussed a number of potential enhancements to RELAX NG, settling on a modest but useful set which will enhance the language in response to user feedback.

A proposed new schema language for cross-reference validation (including ID/IDREF checking) was also discussed; the question here is whether to have something simple and quick (that addresses ID/IDREF validation for RELAX NG, say), or whether to develop a more fully-featured language capable of meeting challenges like cross-document cross-reference checking in an OOXML or ODF package. It seems as if WG 1 is strongly inclining towards the latter.
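To make the "simple and quick" end of that spectrum concrete, here is a toy sketch of ID/IDREF checking in Python. It matches on attribute *names*, which is a simplification: a real schema-driven validator would learn which attributes are IDs and IDREFs from the schema, and a fuller language would also handle references that cross document boundaries.

```python
import xml.etree.ElementTree as ET

def check_idrefs(xml_text: str) -> list[str]:
    """Report every idref attribute that points at no declared id.

    Toy illustration only: attributes literally named "id" are
    treated as IDs and those named "idref" as references.
    """
    root = ET.fromstring(xml_text)
    # First pass: collect all declared IDs in the document.
    ids = {e.get("id") for e in root.iter() if e.get("id") is not None}
    # Second pass: every reference must resolve to a declared ID.
    errors = []
    for e in root.iter():
        ref = e.get("idref")
        if ref is not None and ref not in ids:
            errors.append(f"unresolved idref: {ref!r}")
    return errors
```

For the package case (an OOXML or ODF file containing many XML parts), the ID table would need to span multiple documents, which is exactly the sort of thing a dedicated cross-reference language could express declaratively.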

Other work centred on proposing changes to clean up the unreasonable licensing restrictions which apply to "freely-available" ISO/IEC standards made available by the ITTF: the click-through license here is obviously out of date, and text is required to attach to the schemas so that they can be used on more liberal, FOSS-friendly terms. (I mentioned this before in this blog entry.)


WG 4 (OOXML)

WG 4 had a full agenda. One item of business requiring immediate attention was the resolution of comments accompanying the just-voted-on set of DCOR ballots. These had received wide support from the National Bodies, though it was disappointing to see that the two NBs which had voted to disapprove (Brazil and Malaysia) had not sent delegates to the meeting. P-members are obliged both to vote on ballots and to attend meetings in SCs, and so these nations are not properly honouring their obligations as laid down in the JTC 1 Directives:

3.1.1 P-members of JTC 1 and its SCs have an obligation to take an active part in the work of JTC 1 or the SC and to attend meetings.

I note with approval the hard line taken by the ITTF, who have just forcibly demoted 18 JTC 1 P-members who had become inactive.

Nevertheless, all comments received were resolved and the set of corrigenda will now go forward to publication, making a significant start to cleaning up the OOXML standard.


The other big topic facing WG 4 was the thorny problem of what has come to be called "Strict v Transitional": in other words, deciding on some strategy for dealing with these two variants of the 29500 Standard.

The UK has a clear consensus on the purpose of the two formats. Transitional (aka "T") is (in the UK view) a format for representing the existing legacy of documents in the field (and those which continue to be created by many systems today); no more, and no less. Strict (aka "S") is viewed as the proper place for future innovation around OOXML.

Progress on this topic is (for me) frustratingly slow – ah! the perils of the consensus-forming process – but some pathways are beginning to become visible in the swirling mists. In particular it seems there is a mood to issue a statement that the core schemas of T are to be frozen, and that any dangerous features (such as the date representation option blogged about by WG 4 experts Gareth Horton and Jesper Lund Stocholm) be removed from T.

This will go some way towards clarifying for users what to expect when dealing with a 29500-conformant document. However, I foresee further work ahead, since within the two variants (Strict and Transitional) there are many sub-variants which users will need to know about. In particular the extensibility mechanism of OOXML (MCE) allows additional structures to be introduced into documents. And so, is a "Transitional" (or "Strict") document:

  • Unextended?
  • Extended, but with only standardized extensions?
  • Extended, but with proprietary extensions?
  • Extended in a backwards-compatible way relative to the core Standard?
  • Extended in a backwards-incompatible way?

I expect WG 4 will need to work on conformance classes and content labelling mechanisms (a logo programme?) to enable implementers to convey with precision what kind of OOXML documents they can consume and emit, and for procurers to specify with precision what they want to procure.

WG 5 (Document interop)

WG 5 continues its work with TR 29166, Open Document Format (ISO/IEC 26300) / Office Open XML (ISO/IEC 29500) Translation Guidelines, setting out the high-level differences between the ISO versions of the OOXML and ODF formats. I attended to hear about a Korean idea for a new work item focussed on the use of the clipboard as an interchange mechanism.

This is interesting because the clipboard presents some particular challenges for implementers. What happens (for example) when a user selects content for copying which does not correspond to well-formed XML – say, from the middle of one paragraph to the middle of another? I am interested in seeing exactly what work the Koreans will propose in this space ...
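A tiny sketch of why that case is awkward (the document and the selection offsets are invented for illustration): the raw character range of a mid-paragraph-to-mid-paragraph selection is not itself well-formed XML, so a clipboard exporter cannot simply hand it to a parser – it must re-balance the enclosing elements first.

```python
import xml.etree.ElementTree as ET

# A document with two paragraphs:
doc = "<body><p>the quick brown fox</p><p>jumps over the dog</p></body>"

# A user selection from mid-first-paragraph to mid-second-paragraph
# yields this raw character range:
fragment = "brown fox</p><p>jumps over"

# The fragment has a close tag with no matching open tag, so it is
# not well-formed and cannot be parsed as-is:
try:
    ET.fromstring(fragment)
    well_formed = True
except ET.ParseError:
    well_formed = False

# One common strategy is for the exporter to re-open and re-close the
# enclosing elements around the selection before placing it on the
# clipboard, e.g.:
balanced = "<body><p>brown fox</p><p>jumps over</p></body>"
```

Specifying exactly how (and how much of) that enclosing context gets reconstructed is presumably where the real standardization work would lie.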

WG 6 (ODF)

Although I had registered for the WG 6 meeting, I had to take the Eurostar home on Thursday and so attempted to participate in Friday's WG 6 meeting by Skype (as much as rather intermittent wi-fi connectivity would allow).

From what I heard of it, the meeting was constructive and business-like, sorting out various items of administrivia and turning attention to the ongoing work of maintaining ISO/IEC 26300 (the International Standard version of ODF).

To this end, it is heartening to see the wheels finally creak into motion:

  • The first ever set of corrigenda to ISO/IEC 26300 has now gone to ballot
  • A second set is on the way, once a mechanism has been agreed for re-wording those bits of the Standard which are unimplementable
  • A new defect report from the UK was considered (many of these comments have already been addressed within OASIS, and so fixes are known)

Most significant of all is the work to align the ISO version of ODF with the current OASIS standard so that ISO/IEC 26300 and ODF 1.1 are technically equivalent. The National Bodies present reiterated a consensus that this was desirable (better, by far, than withdrawing ISO/IEC 26300 as a defunct standard) and are looking forward to the amendment project. The world will, then, have an ISO/IEC version of ODF which is relevant to the marketplace while waiting for a possible ISO/IEC version of ODF 1.2 – as even with a fair wind this is still around two years away from being published as an International Standard.


I'll update this entry with links to documents as they become available. To start with, here are some informal records :-)



SC 34 WG meetings in Paris next week

Once again I feel that bubbling up of almost schoolboy fervour that presages a set of SC 34 meetings. In Paris no less (AFNOR shall be our hosts): the city of love, fine art and rognons à la moutarde.

What is tasty on SC 34’s menu? Well, four working groups are meeting next week:

  • WG 1 (which I convene) will be carrying forward its work on foundation standards – particularly the schema languages of DSDL. We have two proposed new projects to discuss: one a schema language focussed on cross-reference validation; the other on associating schemas with documents using processing instructions. Probably our most successful schema language, RELAX NG, is due for an update and several new features are up for discussion: keep them coming!
  • WG 4 (OOXML) will continue its intensive maintenance on ISO/IEC 29500 – not least in handling a new set of approved corrigenda (just voted on) and dealing with the day-to-day grind of correction and improvement. There are larger questions to answer too, in particular those which concern the relationship between the Strict and Transitional forms of OOXML. I have led the preparation of a background paper on this which (thanks to the newly open WG 4 mail archive) can be accessed as a public document (PDF). I predict some lively discussion!
  • WG 5 (OOXML/ODF interop) will continue its work examining how (or whether) the two formats may be used by systems which hope to interoperate. TR 29166 - dedicated to this topic - continues to take shape ahead of its projected finishing date in 2011.
  • WG 6 is the newly-created WG dedicated to the JTC 1 side of maintenance of ISO/IEC 26300:2006 (aka ODF v1.0). As a newly-created group there will no doubt be a certain amount of administrivia to be got through, but there are more substantial issues looming too: defect reports to be advanced and the longer-term project of amending ISO/IEC 26300 to bring it into alignment with ODF 1.1 – there is general agreement that it makes sense to reduce marketplace confusion by cutting the number of standard (and non-standard) “ODF” variants out there and aligning versions between standards organisations.

Stay tuned (and follow hashtag #sc34 for real-time updates) …

Return to Wicken Fen

Windpump and Sky

Last year I made an autumn photo trip to Wicken Fen, and this morning, seeing the light, I decided on a repeat visit. It was more sombre today, but quite interesting in a muted way, I think.

These images were each merged from three exposures and tone-mapped from HDR with Photomatix.

Guardian of the Fens