Where is there an end of it? | All posts by alex

Canon S90

New Toy

 

I like gadgets, and here’s a new one. I’m not expecting it to replace my DSLR, but it has a couple of big advantages over it – it weighs practically nothing and fits in a shirt pocket.

My former Canon point and shoot, an IXUS 800 IS (called the SD 700 in the USA), has now pretty much given up the ghost: the accumulation of pocket lint and grit in the mechanics of the lens extension means it frequently stops working with a “lens error”, and after I dropped it and cracked open the casing, a liberal wrapping of duct tape has been necessary to stop light leaks. It was the fragility of a mechanical lens extension that made me swear I’d never buy another camera like it, and yet Canon have tempted me with this model, which – crucially – has a sensible blend of sensor size, pixel density (it’s a 10-megapixel camera) and lens speed, which means it will produce very nice shots in good light and still perform reasonably in less bright conditions.

I’ll try hard not to drop this one (I seem to average one drop per year), as it seems unlikely to fare well. It’s very small – which is okay, as I have small hands and am used to holding point-and-shoot cameras in a dinky way after my years with the IXUS 800 – and the buttons feel rather cheap and cheerful, especially the very free-spinning rear dial (though curiously, the function dial on the top of the camera is quite meaty).

One of the most interesting features of this P&S is its (official) ability to shoot RAW, allowing one to sidestep the over-zealous noise reduction that often characterises this type of camera. My favourite RAW converter, DxO Optics Pro, has built-in support for the S90, so I’m up and running with my favourite conversion presets straight away. Interestingly, the RAW images exhibit a huge amount of lens distortion – at the wide end it’s a bit like looking through a goldfish bowl – and this is corrected in camera (as it is by DxO Optics Pro) in software. This camera is thus very much a fusion of hardware and software in what it produces … but then, thinking about it, I suppose that’s true, to some degree, of all digital cameras.

 

Boats of the Fleet Lagoon
Boat on the Fleet Lagoon (taken with Canon S90)

Rating Standards Organizations?

Guide Michelin
Can we rate standards organizations like we rate restaurants?

As the summer holiday season approaches, I follow with interest the appearance of this year’s batch of restaurant guides. Alongside the voluminous old favourites such as the Guides Michelin and Gault Millau, there has been plenty of fuss in the media over the publication of the S.Pellegrino World’s 50 Best Restaurants. Armed with such guides – and especially if cross-referencing them – it is possible to rove across our European paradise confident of finding dining tables where the cuisine will be of excellent quality.

Perhaps echoing something of the same mood, there has been some talk recently of being able to rank standards development organizations (SDOs). Over at the TalkStandards site, there is an ongoing debate about “whether standard setting organizations can be ranked or compared against each other, and if so, what criteria should be used and who could make such an assessment.” It is an intriguing idea – could such work enable us to adopt standards from certain top-rated organizations with the same degree of confidence as we order from the menu at Pierre Gagnaire?

I think that would be difficult. Here’s why …

SDOs are observably inconsistent

All

It’s the people, stupid!

A

Who sits in judgement

Of course

The curious case of UOML

UOML

Nikon Colour Character

On Cromer Pier
Classic Nikon D300 territory: soft light, cool palette

On a recent trip to Norfolk, I forgot to take a camera battery charger and so after my trusty Nikon D300 ran down I was forced to resort to my vintage (well, 2005) Nikon D50 as backup.

Now, I’ve had the D300 for over a year and still feel I’m coming to terms with it. It’s somehow a serious camera, and it turns out serious-looking images; in particular, when it comes to colour it’s at its best with cooler, softer, almost pastel-like scenes – which it renders with a painterly subtlety. When it attempts warmth, it seems to me to veer off course and crash into a citrus palette. As I’ve written before, it’s possible to do something about this. Yet somehow the original character of the sensor seems to come through.

The D50, in contrast, seems to produce punchier, more vivid (and not-so-serious) images. What you see below is straight out of the camera, converted from RAW to JPEG using DxO Optics Pro and DxO user Andy_F’s default settings (check out the forums to find Andy’s excellent work on better colour accuracy from RAW conversions). And so, if I’m not imagining all this, in some ways (whisper it) I might even prefer the look I can get from the earlier camera …


Colman's Mustard Traffic
Bam! The Nikon D50 delivering obvious, saturated colours

Microsoft Fails the Standards Test

The second anniversary of the approval of ISO/IEC 29500 (aka OOXML) is upon us. The initial version of OOXML (ECMA-376 1st Edition) was rejected by ISO and IEC members in September 2007, and it was only after extensive revisions and a bitter standards war in the following months that a revised format was finally approved on April 2, 2008.

The key breakthrough of the revision process was the splitting of the specification into two variant versions, called “Strict” and “Transitional”. The National Bodies confined all the technologies they found unacceptable to the Transitional format and dictated text to be included in the standard intended to prohibit its further use:

“The intent […] is to enable a transitional period during which existing binary documents being migrated to DIS 29500 can make use of legacy features to preserve their fidelity, while noting that new documents should not use them. […]

This annex is normative for the current edition of the Standard, but not guaranteed to be part of the Standard in future revisions. The intent is to enable the future DIS 29500 maintenance group to choose, at a later date, to remove this set of features from a revised version of DIS 29500.”

I was convinced at the time, and remain convinced today, that the division of OOXML into Strict and Transitional variants was the innovation which allowed the Standard to pass. Enough National Bodies could then vote in good conscience for OOXML knowing that their preferred, Strict, variant would be under their control into the future while the Transitional variant (which – remember – they had effectively rejected in 2007) would remain purely for the purpose of accurately specifying old documents: a useful aim in itself.

Promises and reality

Just before the final votes were counted, Microsoft made some commitments. Mr Chris Capossela (Senior VP, Microsoft Office) wrote an open letter promising what would happen if the Standard passed. Two years on, we can fill out a report card for a couple of these promises and determine how well Microsoft is doing …

Microsoft's promise on standards support in products


“We've listened to the global community and learned a lot, and we are committed to supporting the Open XML specification that is approved by ISO/IEC in our products.”

On this count Microsoft seems set for failure. In its pre-release form, Office™ 2010 supports not the approved Strict variant of OOXML, but the very format the global community rejected in September 2007, and subsequently marked as not for use in new documents – the Transitional variant. Microsoft is behaving as if the JTC 1 standardisation process never happened, using technologies (like VML) in a new product which even the text of the Standard itself describes as “deprecated” and “included […] for legacy reasons only” (see ISO/IEC 29500-1:2008, clause M.5.1).
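
As an aside for the technically curious: it is easy to peek at which variant a document declares. A 29500 WordprocessingML document announces its conformance class via a w:conformance attribute on its root element (if absent, the document is Transitional), and any mention of the VML namespace is a further Transitional tell-tale. A rough Python sketch – the file name is illustrative, and a real validator does far more than this:

import zipfile
import xml.etree.ElementTree as ET

# Namespace URIs used by ISO/IEC 29500:2008 WordprocessingML and by the
# deprecated VML drawing language.
W = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"
VML = "urn:schemas-microsoft-com:vml"

def inspect_docx(path):
    """Crude peek at a .docx: its declared conformance class, and whether
    the main document part mentions the VML namespace at all. A real
    validator checks every part of the package against the schemas."""
    with zipfile.ZipFile(path) as z:
        data = z.read("word/document.xml")
    root = ET.fromstring(data)
    # w:conformance on the root w:document element; "transitional" if absent
    conformance = root.get("{%s}conformance" % W, "transitional")
    return conformance, VML.encode() in data

conformance, uses_vml = inspect_docx("example.docx")  # illustrative file name
print("declared conformance:", conformance, "| mentions VML:", uses_vml)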

Knowledgeable experts present at the Ballot Resolution Meeting, knowing what Microsoft planned, have publicly restated the international consensus position with alarm. XML standards guru Rick Jelliffe (an Australian delegate at the meeting) wrote:

“If [Microsoft’s] default format is OOXML Transitional, then they have abandoned support for an Open Standards process: OOXML was only made a standard because of the changes that were made at the BRM. The original ECMA version of OOXML (which is the basis of Transitional) was soundly rejected, let no-one forget.”

And Danish expert and BRM delegate Jesper Lund Stocholm, running an analysis of Office 2010 files, wrote:

“It has been the fear of many that Microsoft will never, ever care at all about the strict conformance clause of ISO/IEC 29500, and my tests clearly [are] a sign that they were right.”

Microsoft, however, takes a different view to the independent experts. Their representatives will argue (with some justification) that terms like “legacy”, “deprecated”, and “new document” are tricky to define, but this argument then extends to the bizarre assertion that the Strict variant need never be supported. I believe, however, that countries expect a more reasonable, plain-dealing approach to their clearly expressed intent – not this kind of wheedling sophistry. Mr Capossela writes that Microsoft has “learned a lot”; on the evidence before us now, that was wishful thinking.

Microsoft's promise on standards maintenance


“We are committed to the healthy maintenance of the standard once ratification takes place so that it will continue to be useful and relevant to the rapidly growing number of implementers and users around the world.”

It all started so well – defect reports came in from many National Bodies and (via Ecma) from Microsoft themselves. A number of useful improvements were made to the text correcting obvious defects, and (in the Transitional variant) fixing some of the evident mismatches between what the Standard said and what legacy documents actually contained.

But as time has gone on, the situation has deteriorated. At the recent Stockholm meetings, corrections agreed at the February 2008 Ballot Resolution Meeting were still being implemented, and while fixes which were evidently required for Office 2010’s headline conformance behaviour have been given the red-carpet treatment, some other reports from National Bodies have been left to languish. Unusually, in Stockholm one of SC 34’s working groups (WG 2) recommended to the plenary that the OOXML maintenance group (WG 4) be reminded to answer overdue defect reports – in the ISO world that counts as a diplomatic incident!

Most worrying of all, it appears that Ecma have ceased any proactive attempt to improve the text, leaving just a handful of national experts wrestling with this activity. It seems to me that Microsoft/Ecma believe 95% of the work has been done to ensure the standard is “useful and relevant”. Looking at the text, I reckon it is more like 95% that remains to be done, as it is still lousy with defects.

Ironically, the failure to resource maintenance properly is only going to damage Microsoft Office in the longer term. The simple validators developed by me (Office-o-tron) and by Jesper Lund Stocholm (ISO/IEC 29500 Validator) reveal, to Microsoft’s dismay, that the output documents of the Office 2010 Beta are non-conformant, and that this is in large part due to glaring uncorrected problems in the text (e.g. contradictory provisions). It is also a worrying commentary on the standards-savviness of the Office developers that the first amateur attempts of part-time outsiders find problems with documents which Redmond’s internal QA processes have missed. I confidently predict that fuller validation of Office documents is likely to reveal many problems, both with those documents and with the Standard itself, over the coming years.

So – while maintenance is happening, I think calling it “healthy maintenance” would be over-optimistic given the current circumstances.

Someone has blundered?

Microsoft has many enemies who will no doubt see the current state of affairs as proof that Microsoft never even intended to be a good standards citizen. Indeed, standards and XML veteran Tim Bray, writing shortly after the standard’s approval, made a prediction which could now seem impressively prophetic:

“It’s Kind of Sad • The coverage suggests that future enhancements to 29500, as worked through by a subcommittee of a subcommittee of a standards committee, are actually going to have some influence on Microsoft. Um, maybe there’s an alternate universe in which Redmond-based program managers and developers are interested in the opinions of a subgroup of ISO/IEC JTC 1/SC 34, but this isn’t it.

I suppose they’ll probably show up to the meetings and try to act interested, but it’s going to be a sideline and nobody important will be there. What Microsoft really wanted was that ISO stamp of approval to use as a marketing tool. And just like your mother told you, when they get what they want and have their way with you, they’re probably not gonna call you in the morning.”

For me, the puzzle of it is that in many respects Microsoft does appear to get it. Senior management seems to want standards conformance, as Mr Capossela’s letter demonstrates – indeed, strategically, playing fair by standards has always seemed like the most obvious way for the corporation to extract itself from the regulatory thickets that have entangled it over the past decade. Microsoft employs many eminent and standards-aware people of unimpeachable record – they also obviously “get it”. And on the ground in the standards committees there are many delightful, talented and diligent people who seem fully signed up to a standards-aware (dare I say “non-evil”?) approach – as the recent SC 34 meetings in Stockholm again evidenced.

And if we look elsewhere within Microsoft we can see – for example from their engagement with HTML 5 and work on MSIE – that they can move in the right direction when the will is there.

So why – given the awareness Microsoft has at the top, at the bottom, and round the edges – does it still manage to behave as it does? Something, perhaps, is wrong at the centre – some kind of corporate dysfunction caused by a failure of executive oversight.

But whether Microsoft senior management have directed the company to behave badly, or whether they have failed to control a bad corporate impulse, is ultimately of no interest or concern to the National Bodies engaged in Standardization: for them, the effect is the same. Some responses will, however, be necessary.

Moving forward

If Microsoft ship Office 2010 to handle only the Transitional variant of ISO/IEC 29500 they should expect to be roundly condemned for breaking faith with the International Standards community. This is not the format “approved by ISO/IEC”, it is the format that was rejected.

However, it is foolish to believe they won’t ship it as is – and before long the world will be faced with responding to that release. In my view moving forward from there will be difficult …

  • Governments, corporations, other large entities – in fact, anyone – procuring office systems with a requirement for standards conformance need to have their eyes very wide open about what precisely they will be getting with systems that create new documents in extended Transitional ISO/IEC 29500.
  • Microsoft Office 2007 (the current version) reads and emits unextended Transitional ISO/IEC 29500, and so – strangely – may represent a high-water mark of Microsoft Office standards conformance. Anybody wanting to work just with documents which (modulo defects) are fully specified by Standards wholly under International control may want to stick with this version of the software.
  • Microsoft should make a public open commitment to support OOXML Strict fully. A service pack bringing this support to Office should be developed as a priority.
  • JTC 1 explicitly created the Transitional variant with the intention they would “at a later date, […] remove this set of features”. Now is the time to start the wheels in motion for this removal (the text will of course remain available for the perfectly good reason that the legacy needs to be documented).
  • Any assurances Microsoft has given to regulatory bodies (such as the EU Commission) about standards conformance must be looked at very carefully, giving full consideration to the circumstances of this release.
  • Ecma need to commit adequate resources to standards maintenance and pro-actively seek to improve the text, working together with SC 34, if there is any appetite to improve the Standard to the point where it can be a trouble-free, or even good, basis for interoperable office applications.

In short, we find ourselves at a crossroads, and it seems to me that without a change of direction the entire OOXML project is now surely heading for failure.

Defect-o-tron?

At recent SC 34 standards meetings, I've been thinking it would be useful to have XML models for all the documents we habitually have to deal with: agendas, meeting reports, proposals, defect reports and so on.

One particular focus for me at the moment is defect reports. Currently, I prepare these as XML documents conforming to the following home-brewed schema (in RELAX NG compact syntax):

namespace xh = "http://www.w3.org/1999/xhtml"

start = comment-batch
# any mixed XHTML content: XHTML-namespace elements, their attributes, and text
any-xhtml =
  (element xh:* { any-xhtml }
   | attribute * { text }
   | text)+
comment-batch =
  element comment-batch {
    (attribute xml:lang { token }
     & attribute xml:id { token }),
    spec,
    submitter,
    comment+
  }
spec = element spec { text }
submitter =
  element submitter {
    attribute type { token },
    text
  }
comment =
  element comment {
    attribute xml:id { token },
    type,
    document,
    (clause | corrigendum),
    page?,
    problem,
    proposal?
  }
type = element type { "G" | "T" | "E" | "C" }
clause = element clause { text }
corrigendum = element corrigendum { text }
problem = element problem { any-xhtml }
proposal = element proposal { any-xhtml }
document = element document { text }
page = element page { text }

As you can see, it's pretty simple and leverages XHTML for the meat of the content (validation is left as an exercise for the reader). Using this, a comment on a Standard looks like this:

<comment xml:id="GB-AB-25" xmlns:xh="http://www.w3.org/1999/xhtml">
<type>T</type>
<document>ISO/IEC 29500-1:2008</document>
<clause>2.4</clause>
<page>3</page>
<problem>
  <xh:p>The first and last items in the bulleted list are contradictory,
    since a document which uses the extensibility mechanisms of Part 3 cannot
    be valid to the Part 1 (or 4) schema.</xh:p>
  <xh:p>Furthermore, Part 1 need not mention the Part 4 schema here.</xh:p>
</problem>
<proposal>
  <xh:p>Qualify the validity requirement with a caveat that such
    documents are pre-processed with the behaviour specified in Part 3
    (once that behaviour is described normatively).</xh:p>
</proposal>
</comment>
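
As for that validation exercise, one possibility is to convert the compact syntax into the XML syntax with trang and then use lxml – a minimal sketch, assuming the schema is saved as comment-batch.rnc and a batch of comments as gb-comments.xml (both file names invented here):

# First convert the compact-syntax schema, e.g. with trang:
#   trang comment-batch.rnc comment-batch.rng
# (lxml's RELAX NG support reads the XML syntax, not the compact one.)
from lxml import etree

schema = etree.RelaxNG(etree.parse("comment-batch.rng"))
batch = etree.parse("gb-comments.xml")

if schema.validate(batch):
    print("batch is valid")
else:
    for error in schema.error_log:
        print("line %s: %s" % (error.line, error.message))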

Batches of comments are processed into XHTML documents (with XSLT) for review and correction, and – once approved – can then be forwarded for processing with the normal maintenance mechanisms. In the case of WG 4 (OOXML) there is a web form available for submission – but getting my nice XML into the form’s fields requires much tedious copying and pasting.
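
The XSLT step, incidentally, is easily driven from a script – a sketch with lxml, where the stylesheet and file names are again my own inventions:

from lxml import etree

# Apply the review stylesheet to a comment batch and save the XHTML result.
transform = etree.XSLT(etree.parse("comments-to-xhtml.xsl"))
result = transform(etree.parse("gb-comments.xml"))
with open("gb-comments-review.html", "wb") as f:
    f.write(bytes(result))  # bytes() serializes the transformation result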

I suppose the form-filling could be automated with a programmatic HTTP client (a sketch of which appears below). But thinking about this also made me realise the defect-generation process itself could be (partly) automated: there are so many formulaic problems with the OOXML text that it should be possible to write programs which feed errors in the text directly into the maintenance process. Well, it would be, were it not for the various intervening bureaucratic processes. Hmmm, defect-o-tron anybody?
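
Here is the shape that form-filling automation might take, using the requests library. To be clear: the URL and the form field names below are placeholders of my own invention – the real form’s fields would need inspecting first.

from lxml import etree
import requests

# Placeholder URL -- not the real WG 4 submission form.
FORM_URL = "https://example.org/wg4/defect-report"

XML_ID = "{http://www.w3.org/XML/1998/namespace}id"

def flat_text(comment, name):
    """Flattened text content of a child element (XHTML markup stripped)."""
    el = comment.find(name)
    if el is None:
        return ""
    return etree.tostring(el, method="text", encoding="unicode").strip()

batch = etree.parse("gb-comments.xml")
for comment in batch.iter("comment"):
    fields = {
        "id": comment.get(XML_ID),
        "type": flat_text(comment, "type"),
        "document": flat_text(comment, "document"),
        "clause": flat_text(comment, "clause"),
        "page": flat_text(comment, "page"),
        "problem": flat_text(comment, "problem"),
        "proposal": flat_text(comment, "proposal"),
    }
    requests.post(FORM_URL, data=fields).raise_for_status()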