Where is there an end of it? | All posts by alex

OOXML and Microsoft Office 2007 Conformance: a Smoke Test


This is one in a series of popular blog articles I am re-publishing from the old Griffin Brown blog which is now closed down. This article is from April 2008. It is the same content as the original (except for some hyperlink freshening).

At the time of posting, this entry caused quite a furore, even though its results were – to me anyway – as expected. Looking back I think what I wrote was largely correct, except I probably underestimated the difficulty of converting Microsoft Office to use the Strict variant of OOXML — this would require more than just surgery to the de-serialisation code!


 

I was excited to receive from Murata Makoto a set of the RELAX NG schemas for the (post-BRM) revision of OOXML, and thought it would be interesting to validate some real-world content against them, to get a rough idea of how non-conformant the standardisation of 29500 had made MS Office 2007.

Not having Office 2007 installed at work (our clients aren't using it – yet), the first problem is actually getting a reasonable sample for testing. Fortunately, the Ecma 376 specification itself is available for download from Ecma as a .docx file, and this hefty document is a reasonable basis for a smoke test ...

The main document ("document.xml") content for Part 4 of Ecma 376 weighs in at approx. 60MB of XML. Looking at it ... I'm sorry, but I'm not working on that size of document when it's spread across only two lines. Pretty-printing the thing makes it rather more usable, but pushes the file size up to around 100MB.
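The pretty-printing step can be sketched with Python's standard library (a minimal sketch; `ET.indent` needs Python 3.9+, and note this loads the whole document into memory, so for a 60MB part a streaming tool such as `xmllint --format` may be kinder):

```python
import xml.etree.ElementTree as ET

def pretty_print(src, dst):
    """Re-indent an XML file so each element starts on its own line."""
    tree = ET.parse(src)              # loads the entire document into memory
    ET.indent(tree, space="  ")       # available from Python 3.9
    tree.write(dst, encoding="utf-8", xml_declaration=True)
```

The roughly 40MB growth in file size is just the indentation whitespace this adds.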

So we have a document and a RELAX NG schema. All that's necessary now is to use jing (or similar) and we can validate ...
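For anyone wanting to repeat the exercise: a .docx file is just a Zip archive, and the main WordprocessingML part conventionally lives at word/document.xml. A minimal sketch for pulling it out (filenames here are illustrative):

```python
import zipfile

def extract_part(docx_path, part_name="word/document.xml", out_path="document.xml"):
    """Extract one XML part from a .docx package (which is a plain Zip archive)."""
    with zipfile.ZipFile(docx_path) as z:
        data = z.read(part_name)
    with open(out_path, "wb") as f:
        f.write(data)
    return data
```

Validation is then a jing invocation along the lines of `java -jar jing.jar wml.rng document.xml` (the schema filename being whatever your copy of the RELAX NG schemas uses).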

Validating against the STRICT model

The STRICT conformance model is quite a bit different from Ecma 376, essentially because that format's most notorious features (non-ISO dates, compatibility settings like autospacewotnot, VML, etc.) have been removed. Thus the expectation is that existing Office 2007 documents might be some distance away from being valid according to the strict schemas.

Sure enough, jing emitted 17MB (around 122,000 messages) of invalidity reports when validating in this scenario. Most of them seem to involve unrecognised attributes or attribute values: I would expect a document which exercised a wider range of features to generate a more diverse set of error messages.
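With output at that volume, eyeballing individual messages is hopeless; what I actually wanted was a tally by message type. jing's messages take the form `file:line:column: error: message`, so a sketch along these lines (the quote-normalisation is my own device, so that, say, complaints about different attribute names group together):

```python
import re
from collections import Counter

# jing reports errors as "file:line:col: error: message"
MSG = re.compile(r"^.*?:\d+:\d+:\s*error:\s*(.+)$")

def tally(lines):
    """Count jing error messages, normalising quoted names so variants group together."""
    counts = Counter()
    for line in lines:
        m = MSG.match(line)
        if m:
            # collapse quoted names ("w:whatever") so messages differing only
            # in the offending name count as the same type
            counts[re.sub(r'"[^"]*"', '"..."', m.group(1))] += 1
    return counts
```

Feeding jing's stderr through something like this is how the "most of them seem to involve unrecognised attributes" impression above can be made concrete.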

Validating against the TRANSITIONAL model

The TRANSITIONAL conformance model is quite a bit closer to the original Ecma 376. Countries at the BRM (rather more than Ecma, as it happened) were very keen to keep compatibility with Ecma 376 and to preserve XML structures at which legacy Office features could be targeted. The expectation is therefore that an MS Office 2007 document should be pretty close to valid according to the TRANSITIONAL schema.

Sure enough (again), the result is as expected: relatively few messages (84) are emitted, and they are all of the same type, complaining for example about the element:

<m:degHide m:val="on"/>

since the allowed attribute values for val are now "true", "false", etc. — this was one of the many tidying-up exercises performed at the BRM.
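Mechanically, this particular class of invalidity is trivial to repair. A toy sketch (a real converter would of course need the OOXML namespaces and the full list of boolean-valued attributes, which I've reduced here to a hypothetical set passed in by the caller):

```python
import xml.etree.ElementTree as ET

# legacy on/off values and their post-BRM replacements
ON_OFF = {"on": "true", "off": "false"}

def fix_boolean_attrs(root, bool_attrs):
    """Rewrite legacy on/off attribute values as the true/false now required.

    bool_attrs is the (hypothetical) set of attribute names known to be boolean.
    """
    for el in root.iter():
        for name, value in el.attrib.items():
            if name in bool_attrs and value in ON_OFF:
                el.set(name, ON_OFF[value])
    return root
```

That 84 such messages are the only TRANSITIONAL failures suggests the gap really is this narrow.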

Conclusions?

Such a test is only indicative, of course, but a few tentative conclusions can be drawn:

  • Word documents generated by today's version of MS Office 2007 do not conform to ISO/IEC 29500
  • Making them conform to the STRICT schema is going to require some surgery to the (de)serialisation code of the application
  • Making them conform to the TRANSITIONAL schema will require less of the same sort of surgery (since they're quite close to conformant as-is)

Given Microsoft's proven ability to tinker with the Office XML file format between service packs, I am hoping that MS Office will shortly be brought into line with the 29500 specification, and will stay that way. Indeed, a strong motivation for approving 29500 as an ISO/IEC standard was to discourage Microsoft from this kind of file format rug-pulling stunt in future.

What's next?

To repeat the exercise with ISO/IEC 26300:2006 (ODF 1.0) and a popular implementation of OpenDocument. Will anybody be brave enough to predict what kind of result that exercise will have?

SC 34 meetings in Stockholm last week


Stockholm dawn
The weather was mostly grey, and the days busy, limiting photo taking time …

 

I have just returned from a week of meetings of ISO/IEC JTC 1/SC 34 in Stockholm, Sweden. Here's an update on what happened ...

WG 1

WG 1 met on the Monday (as usual, we "blaze the trail", working out how the public transport works, locating nearby bars, etc).

Our main project over the last 8 years has been the DSDL project, a multi-part standard for XML schema languages. The final parts of this jigsaw are falling into place: at this meeting we agreed to ballot DSDL Part 11, a specification being co-developed with W3C for Schema Association. Jirka Kosek – a project editor – is the driving force behind this work in WG 1. Rick Jelliffe has also completed a draft of a revision to DSDL Part 3 (more commonly known as Schematron), which will now be sent to ballot to collect National Body feedback.
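The mechanism at the heart of the schema association work is a processing instruction placed at the top of a document, pointing at its governing schema(s). A sketch of what an association might look like (filenames hypothetical, and the PI syntax is that of a work in progress, so subject to change):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-model href="invoice.rng" type="application/xml"
            schematypens="http://relaxng.org/ns/structure/1.0"?>
<invoice>...</invoice>
```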

ISO Zip

The need for an "ISO Zip" standard has long been recognized; I remember this topic first being raised in SC 34 four years ago at our Seoul meeting. No progress has been made since then, but there have been some problems caused by the lack of a standard — not least the sudden disappearance of a document that ODF relies on!

As the number of XML-in-Zip formats mushrooms, WG 1 has now decided to grasp this nettle and propose a project for "Document Packaging" which will aim to deliver a minimal (yet compatible) file format specification particularly suitable for XML-in-Zip document formats. If successful the new standard will be usable as a drop-in replacement for the currently non-standard references to an over-featured (for their purposes) ZIP specification used by such formats as OOXML, ODF and EPUB.

The New Work Item Proposal was presented to the SC 34 plenary where it was agreed (without dissent) that it should be balloted. National Bodies will comment over the next three months and their responses considered at the Tokyo meetings in September.

An unpleasant surprise

Over the last few years I have been editing a project for Extensible Datatypes (ISO/IEC 19757-5, currently an FCD). As is usual with experts in SC 34, the text is prepared using schema-governed XML and processed into XSL-FO for rendering to PDF according to ISO's layout specification — we have a specification, TR 9573-11 AMD, specifically for that purpose.

This work on my text is nearly done, so imagine my surprise to learn that, all of a sudden, ITTF is now only accepting work that is in "Word format" (on inquiry, this means Microsoft® Word™ 2007). The decision has caused dismay among many SC 34 experts and reeks more of the short-term interests of NBs' commercial publishing wings than of any concern for document quality, adaptability or long-term preservation. It is a shame that ownership of Windows and MS Office is now apparently a prerequisite for being a JTC 1 Project Editor, and I can imagine more than a few eyebrows being raised if any future International Standard version of ODF needs to be prepared using Microsoft Word!

WG 4

WG 4 (OOXML - ISO/IEC 29500 - maintenance) met for two and a half days. There are so many strands of activity here deserving of comment that I will write a separate blog post on this later this week. Stand by!

WG 6

WG 6 (ODF - ISO/IEC 26300 - maintenance) met on Thursday afternoon and Friday morning and was well-attended. The main work of this group is production of an International Standard version of ODF 1.1 that rolls in errata to date, and excellent progress is being made on this. Special praise must go to Dennis Hamilton for pulling an all-nighter (remotely participating from Seattle) and addressing some of the gnarlier problems of text production!

In general, it is good to see the suspicions of the past years now firmly set aside and all participants pulling together in the right direction for the good of ODF. It is also especially good to see the process now working better (if not perfectly) and admitting an International dimension to ODF maintenance; this success is due in no small part to the diligence and diplomacy of the WG 6 Convenor, Francis Cave, who, it was jokingly suggested at the Plenary, should be appointed for a term of 30 years as convenor, rather than the normal three!

St Francis
Francis Cave, WG 6 convenor

The CJK lobby

As is usual at these meetings, various kinds of lobbying for various kinds of thing were taking place. Perhaps the prize for most effective operators must go to the CJK (China, Japan, Korea) participants who are working hard to raise awareness of their requirements for page layout and typography. Japan is promoting a project for specifying support for KIHONHANMEN in OOXML and ODF extensions, and plans are being made for a 'Workshop on CJK Issues related to OOXML and ODF extensions' in May (details will appear on the SC 34 web site when available). I wish them well ...

 

CJK
Experts from China, Japan and Korea in discussion

Rolleiflexes

Rolleiflexes

My father-in-law, possibly amused by watching me dick around with a DSLR and laptop over the weekend, decided to dig his camera equipment out of storage.

These are the cameras he used, over three decades, to take the pictures for his magnum opus (so becoming the first non-Russian to be awarded the Russian Academy of Fine Arts’ gold medal). He asserted he'd always been pleased with Rolleiflex ...

The episode has a useful pay-off ... it established with my wife a new baseline for the number of cameras it is reasonable for a man to own :-)

Document Format Standards and Patents

This post is part of an ongoing series. It expands on item 9 of Reforming Standardisation in JTC 1.

Background

Historically, patents have been a fraught topic, co-existing uneasily with standards. Perhaps (within JTC 1) one of the most notorious recent examples surrounded the JPEG Standard and, in part prompted by such problems, there are certainly many people of good will wanting better management of IP in standards. Judging by some recent developments in document format standardisation, it seems probable that this will be the area where progress can next be made …

Most recently, the Fast Track standardisation of ISO/IEC 29500 in 2007/8 saw much interest in the IPR regime surrounding that text, with much dark suspicion surrounding Microsoft's motives. However, the big development in this space – when it came – was from an unexpected direction …

The i4i Patent

Back in the SGML days I remember touring the floor of trade shows and noticing the S4-Desktop product from Canadian company Infrastructures for Information, Inc (i4i). Like a number of other products at the time (including Microsoft's own long-forgotten SGML Author for Word, or Interleaf’s BladeRunner) it attempted to make Word™ a structure-aware authoring environment, based on the (accurate) belief that while many companies wanted structured data they didn't want to have to grapple with pointy brackets.

Keen to avoid the phenomenon that Rob Weir describes whereby

There is perhaps no occasion where one can observe such profound ignorance, coupled with reckless profligacy, as when a software patent is discussed on the web.

I will avoid any punditry about the ongoing legal course of this patent. Those interested would do well to read IP lawyer Andy Updegrove's post (and follow-up) on the legalities of this matter.

On the technical merit of the patent, though, there appears to me to be unanimity among disinterested experts qualified to judge. For example, Jim Mason (for 22 years the chair of the ISO committee responsible for all-things-markup) commented:

[T]his technique did not originate with i4i. It was already established in other commercial products and was, in effect, standardized in ISO/IEC 8613, Office Document Architecture. ODA essentially described a binary format for word-processor document representation, which worked by pointers into a byte stream. Its original interchange format, ODIF, started as a representation of that structure, but it was extended to have an alternative SGML stream, exported by a process similar to that described in the i4i patent. So there was prior art, specifically prior art described in public standards.

This point was expanded on by markup veteran Rick Jelliffe, who concluded:

By the end of the judgment I was left thinking "what interactive XML system with any links wouldn't be included in this?" which is utterly ridiculous.

I was creating SGML systems from 1989, and the i4i patent is just as obvious then as it is now.

In a Guardian Interview i4i chairman Loudon Owen seemed to make it clear that the patent would not be licensed on a reasonable and non-discriminatory (RAND) basis (at least – or especially – where Microsoft are concerned):

On licensing to Microsoft, Owen sounds on the edge of anger: "No. No. This is our property. We are going to build our business. There's no right for Microsoft to use it and go forward." But i4i could license it at some humungous, eye-watering price that Microsoft might have to pay, surely? No, says Owen.

The Wider Context

As part of its amicus brief (PDF) in the Bilski case pending before the Supreme Court, IBM offered what might be termed the orthodox pro-patent position. In a section headed “Software Patent Protection Provides Significant Economic, Technological, and Societal Benefits” we thus find a footnote quoting this text:

Given the reality that software source code is human readable, and object code can be reverse engineered, it is difficult for software developers to resort to secrecy. Thus, without patent protection, the incentives to innovate in the field of software are significantly reduced. Patent protection has promoted the free sharing of source code on a patentee’s terms—which has fueled the explosive growth of open source software development.

While it is somewhat surprising to learn here of the affinity between FOSS and patents, the point is of course that the idea of patents is not wholly without foundation: a state-sanctioned restraint of trade (for such is a patent) is justified in allowing innovators to monetize their inventions. However, increasingly, when we listen to the voices of actual FOSS (and non-FOSS) people, the view seems to be that any advantages are outweighed by the problems of patents. For example Mike Kay (developer of the superb Saxon family of XSLT, XQuery, and XML Schema processing products), in an open letter to his MP, argues against software patenting in a piece which is well worth reading in its entirety:

The software business does not need incentives to innovate. If you don't innovate, you die. [...] [I]n the software business, patenting of ideas benefits no-one: certainly, it does not benefit society or the economy at large, which is the only possible justification for governments to interfere with the market and grant one company a monopoly over an idea.

And, in specific reference to the i4i patent:

recently an otherwise unsuccessful company has been awarded a similar [i.e. 9-figure] sum against Microsoft, for an idea which most people in the industry considered completely trivial and obvious.

More colourfully Tim Bray lists some horror-story cases (again well worth reading) and opines that the whole patent system is "too broken to be fixed". He also addresses the question of whether patent activity benefits society, and comes down firmly against:

And here are a few words for the huge community of legal professionals who make their living pursuing patent law: You’re actively damaging society. Look in the mirror and find something better to do.

The Myth of Unencumbered Technology

Given the situation we are evidently in, it is clear that no technology is safe. The brazen claims of corporations, the lack of diligence by the US Patent Office, and the capriciousness of courts means that any technology, at any time, may suddenly become patent encumbered. Technical people - being logical and reasonable - often make the mistake of thinking the system is bound by logic and reason; they assume that because they can see 'obvious' prior art, then it will apply; however as the case of the i4i patent vividly illustrates, this is simply not so.

Turning to document format standards, we can see there most certainly are known and suspected patents in play. For example:

  • the i4i patent mentioned above (which, in his Guardian interview, the i4i Chairman refuses to rule out as applying to ODF)
  • 45 unspecified patents which Microsoft has claimed OpenOffice.org infringes, some number of which may relate to the ODF specification (and which Sun and Microsoft agreed a cease-fire over until 2014 - at least as far as Sun is/was concerned)
  • an unknown number of unspecified patents which have led IBM to include ODF under its Interoperability Specifications Pledge
  • an unknown number of unspecified patents which have led Microsoft to include OOXML under its Open Specification Promise (though presumably clear OOXML-specific patents such as US Patent 7,676,746 are in scope here)

Now, as is clear from the above, large corporations have a preferred means of neutralising their IP stake in standards: by "promises", "covenants" and the like.

The question for standardizers remains: is the current situation acceptable? and if not, what can be done to improve it?

The ISO Rules (and Are They Followed?)

Since 2007 the "big three" International SDOs (ISO, IEC and ITU-T) have operated a common patent policy predicated on the wholly reasonable premise that standards should be "accessible to everybody without undue constraints". The policy is implemented in detail by JTC 1 (which joins the forces of ISO and IEC) and which – as we know – governs the International Standards ODF and OOXML.

The Policy as implemented in the Directives has several aspects, which I would categorise as falling under the following headings …

Personal Disclosure

Anybody aware of an IPR issue has a duty to speak out:

any party participating in the work of the Organizations should, from the outset, draw their [sic] attention to any known patent or to any known pending patent application, either their own or of other organizations. (ISO Directives Part 1, Clause 3)

And indeed committee secretaries and chairs are routinely reminded by Geneva to issue a request for IPR disclosure at meetings, to jog people's memory.

Formal Disclosure in Standards

Readers of Standards can expect to have the IPR/patent situation made explicit in the text before them, and accordingly there are many textual items mandated for Standards to which patents apply. In particular it is stated that "[a] published document for which patent rights have been identified during the preparation thereof, shall include the following notice in the introduction:"

The International Organization for Standardization (ISO) [and/or] International Electrotechnical Commission (IEC) draws attention to the fact that it is claimed that compliance with this document may involve the use of a patent concerning (…subject matter…) given in (…subclause…).

Centralised Record-keeping

A JTC 1 "patent database" (served as a huge HTML document) is maintained in Geneva which gathers together all the patents applying to published standards, and the terms under which patent holders have agreed to make licenses available.

Clear Access Rights

Patent Holders who have signed the licensing declaration to ISO, IEC or ITU-T agree to license their patents under a clear regime: either RAND, ZRAND (i.e. RAND with a free-of-charge license), or – exceptionally – on a per-case commercial basis. Anybody accessing the patent database is able to see this and, by referring to the ISO/IEC governing documents, know what it means, not least because no deviations from Geneva's wording are permitted:

the patent holder has to provide a written statement to be filed at ITU-TSB, ITU-BR or the offices of the CEOs of ISO or IEC, respectively, using the appropriate "Patent Statement and Licensing Declaration" Form. This statement must not include additional provisions, conditions, or any other exclusion clauses in excess of what is provided for each case in the corresponding boxes of the form.

Problem Handling

And if things go wrong:

2.14.3 Should it be revealed after publication of a document that licences under patent rights, which appear to cover items included in the document, cannot be obtained under reasonable and non-discriminatory terms and conditions, the document shall be referred back to the relevant committee for further consideration.

Unfortunately, when we hold up the big two document standards of ODF and OOXML against the goals set out, we see there is work still to be done …

Moving Forward

While the "broken stack" of patents is beyond repair by any single standards body, at the very least the correct application of the rules can make the situation for users of document format standards more transparent and certain. In the interests of making progress in this direction, it seems a number of points need addressing now.

  • Users should be aware that the various covenants and promises pointed to by the US vendors need not be relevant to them as regards standards use. Done properly, International Standardization can give a clearer and stronger guarantee of license availability – without the caveats, interpretable points and exit strategies these vendors' documents invariably have.
  • In particular it should be of concern to NBs that there is no entry in JTC 1's patent database for OOXML (there is one for DIS 29500, its precursor text: a ZRAND promise from Microsoft); there is no entry whatsoever for ODF. I would expect there to be declarations from the big US vendors who profess patent interests in these standards, and I would expect this to be addressed as a matter of urgency (perhaps in parallel with the publication of these standards' forthcoming amendments)
  • In the case of the i4i patent, one implementer has already commented that implementing CustomXML in its entirety may run the risk of infringement (and this is probably, after all, why Microsoft patched Word in the field to remove some aspects of its CustomXML support). OOXML needs to be referred back to its committee (this may be JTC 1, not SC 34) for a decision on what happens next. My personal guess is that CustomXML will be left in OOXML Transitional (patent-encumbrance will be just one more of the many warning stickers on this best-avoided variant), and modified in, or removed from, OOXML Strict
  • When declaring their patents to JTC 1, patent holders are given an option whether to make a general declaration about the patents that apply to a standard, or to make a particular declaration about each and every itemized patent which applies. I believe NBs should be insisting that patent holders enumerate precisely the patents they hold which they claim apply to ODF or OOXML, as this will give greater transparency about what is (or is not) covered and prevent the vague threat ("there may be patents but we're not saying what") which seems to apply at the moment.

There is obviously much to do, and I am hoping that at the forthcoming SC 34 meetings in Stockholm this work can begin. Certainly, anybody reading this blog post now knows there are outstanding IPR issues which we as standardizers have a duty to raise …

Hi-Fi Life (so far)

The Dark Ages

My first awareness of Hi-Fi came from glossy advertisements in the Sunday supplements. The age of the “music centre” had (just) passed and the in-vogue technology was the “tower” – preferably one that lit up like a Jodrell Bank control console, and which contained a record deck, cassette tape player, amp, tuner and the inevitable “graphic equalizer” – all preferably encased in a cabinet with a smoked glass front. I remember the Philips “Black Tulip” system as seeming particularly desirable.


Philips X70 Black Tulip 3e Set
Philips' Black Tulip Hi-Fi
(image used with kind permission of Vintage Collection)

Gaslight

Soon, like any self-respecting teenage geek, I learned that in fact in the realm of true Hi-Fi one had separates: a record player, an amp, and speakers. Graphic equalizers were frowned upon, and even tone controls were seen as a bit iffy. Received wisdom was that the most important thing was the source (the record player) – on the not-unreasonable premise that if what was being extracted from the vinyl was no good, it could not be rescued downstream. So began my quest for decent sound, ending (largely through purloining things from my father’s system) with a Garrard 401 turntable (which type – amazingly – still seems to command good money on eBay) and SME Series V tonearm (which – even more amazingly – is still available in a newer incarnation starting from £2,050), together with some fiddly moving coil cartridge. Everything sounded okay on its own terms, but the trouble came when comparing this “Hi-Fi” to live music. I’d go to London and listen to Klaus Tennstedt conduct the London Philharmonic Orchestra in a Mahler symphony, then come home and listen to the same forces perform the same work on record; there was no comparison. Nor was the problem limited to this setup: every other “audiophile” setup I heard exhibited the same kinds of problems, which were (as Ken Rockwell recently wrote in typically forthright fashion) essentially caused by LPs:

LPs are awful. Audiophiles are often hoping that I'm endorsing LPs, but no. These "plastic dog plops," as one mastering engineer referred to them, are loaded with noise, wow & flutter, distortion, echoes, room feedback and even pitch changes from never being pressed on-center. LPs usually have their lows cut, or at least summed to mono. Some people prefer the added noise and distortion, much as a veil hides the defects in an ugly woman's face, allowing our brain to fill-in what we want to see. 

Perfect Sound Forever?

Soon after CD was launched commercially as an audio format, it became clear which way things were heading, and so in 1984 I happily sold off my LP collection and purchased a Philips CD100 top-loading CD player. It was odd recently to take my son to the Science Museum and see one of these units as part of a “how we used to live” exhibit! This CD player, with its signal routed (via potentiometer) into a home-made power amplifier (a J. Linsley Hood design like this) and feeding a pair of cheap-but-good KEF Coda III speakers, clearly offered a step change over anything I’d heard from LP-based systems, although amazingly some people claimed at the time – and still claim – that LP could compete with digital sound; either they had something very wrong with their systems or else never listened to live music!

This kind of combination (with occasional modest improvements from upgrading the player or the speakers) was my “Hi-Fi” for the next 25 years – although one significant improvement came in year 21 from the addition of a subwoofer (in this case an MJ Acoustics Pro 50 Mk II). It takes a while to get the levels set right, but a properly integrated sub-woofer not only fills in the all-important bottom octaves, but seems to make the whole sound (for example in orchestral music) more realistically airy.


In the Power Amp
Inside a home-brew power amp

But how “hi” was the “fi” of this system? Although pretty good it always seemed there were a few things not quite right…

  • Character: I tend to classify loudspeakers as “happy” or “sad” – and the distinction sometimes only becomes apparent after long acquaintance. In general I prefer (happy) speakers with good presence and a tendency to warmness – speakers with a recessed middle ultimately give less pleasure even if their “sparkling” treble and/or deep bass may have sounded good at first.
  • Volume handling: as speakers get louder (at an orchestral climax for example) they can have a tendency to sound more and more strained – and savour more of “speakers” than of the music. This is particularly the case with less-powerful transistor-based amplification – which also tends to impart a certain unpleasant hard character to the sound.
  • Real bass: real bass (as experienced in the concert hall) is not warm, fuzzy and indistinct, but has a musical and visceral quality far removed from the inchoate whumping noises emitted by low-quality sub-woofers.
  • Detail: more is not necessarily good (as sometimes low-fi reproduction can accentuate things a high-fidelity set-up won’t); what matters is the presentation of different instruments in the overall blend, such that one can concentrate on something if one wants to. For stereo recordings, a good strong stereo image helps here, as each instrument will be coherently positioned at a location in the sound stage.

Audio File

The problem of too many CDs seemed like a good opportunity for a re-think; if these could all be ripped to a server then streamed, might this also be an opportunity to upgrade the “Hi-Fi” also?

Looking around, it seemed the answer might be yes. In particular, a British firm caught my attention: AVI Hi-Fi and its ADM 9.1 active loudspeakers. AVI’s robust promotional material seemed congenial (its lambasting of cable fetishism is worth a read), and its premises made engineering sense, in particular that:

  • The digital source does not matter – a cheapo off-brand CD player will extract the same bits from a CD as the most exotic “transport” (funny how this turns on its head the old wisdom of the source mattering most)
  • The DAC does matter, hence the best that money can buy should be used
  • Even given a good speaker design, there are advantages to be had from doing away with the need for a passive crossover (as the AVI site has it: “this is fact not hype” !).

The ADM 9.1’s are thus an almost-complete “system” needing only to be fed a signal (preferably bits via their optical input, to take advantage of the high-quality inbuilt DAC). Each unit is a ported two-driver active speaker; each requires mains power; and each contains two amplifiers – 75 watts for the tweeter and 250 watts for the woofer. The master speaker unit accepts line-level input and/or up to two optical inputs via TOSLINK. The connection to the slave unit is via RCA phono cable, and there is an additional line-level output to drive a sub-woofer (I continue to use my MJ Acoustics unit, but AVI also make their own matched sub-woofer for rather more money). The setup is controlled by a simple remote handset which allows for source selection and volume control.

And how do they sound? Well, after a few weeks making sure to eliminate any false first impressions I find them superb. All the traditional problem areas of hi fi reproduction have been addressed, in particular:

  • There is no apparent false warmth; the lower midrange and bass are tight and precise (and even better with a sub-woofer). But “lack of warmth” does not imply these are cool speakers, more that they are neutral … if you’ve ever listened to high-end headphones you’ll recognise the free, uncoloured type of sound. I’ve read some reports these speakers are too clinical. I don’t agree: it’s just fidelity.
  • Volume: these babies go loud, and do so without losing the plot. A nice side-effect of this is that even at volumes approaching “realistic levels” they are not unduly fatiguing (I mean “realistic levels” for classical music!).
  • Detail: the stereo image is particularly solid, and maybe this helps the very strong sense of detail – so for example when listening to a Beethoven symphony the subsidiary background motoric rhythms in the strings are “there” if you care to listen to them.

Most surprisingly, I was expecting the ADM’s to be a mercilessly revealing lens through which to view the problems of early recordings, but – on the contrary – good quality analogue material sounds better than ever – something like Peter Maag’s legendary 1960 recording of Mendelssohn's Scottish Symphony (for example) sounds gorgeous. I wonder what the reason for this is.

Are they perfect? No, not quite – listening to the human voice one is sometimes aware of a slight cabinet-i-ness to the sound. But this is a picky caveat – what I’m hearing now is the closest to real I’ve ever heard from my own or other systems.


"Perfect Sound Forever" ?
An ADM 9.1 unit with a CD and Logitech Squeezebox. Karajan's Eine Alpensinfonie (pictured)
was the first commercial CD pressed; the rather nasty recording quality of this,
and many other early CDs, became associated with digital recording itself and set
the cause of digital audio back among audiophiles.

The end of the CD

To drive the ADM 9.1’s I use a Logitech Squeezebox. The long process of ripping my CD collection is underway but it is increasingly clear music in future is going to come straight off the net. I’ve just bought the “studio master” 24-bit FLACs of a thrilling new set of Mozart symphonies from Sir Charles Mackerras and the Scottish Chamber Orchestra. From a server upstairs the bits are sent over a wireless network to the Squeezebox and then down an optical cable straight into the ADM 9.1’s. I’m not sure my ears are able to detect the advantage of 24-bit over 16-bit material, but the end result is absolutely riveting musical reproduction. Maybe this kind of system will form the template for my next 25 years of Hi-Fi life …

[Disclaimer: I have no connection with any of the companies or products mentioned in this posting!]

Update — February 2012

I am still as impressed with these speakers as when I wrote this post originally. I have had a query about how well the MJ Acoustics subwoofer works: again, very well. The one tricky thing is getting it properly integrated with the main speakers. My (not very technical) method for doing this is to use a recording with a very well-recorded bass drum, and then adjust the settings until it sits properly in the orchestral texture. My current disc of choice for this exercise is the excellent recording of Mahler's Symphony No 1 by the Pittsburgh Symphony Orchestra conducted by Manfred Honeck. The settings achieved by doing this, at least in my listening room, can be seen in the photograph below.

Subwoofer settings