Microsoft's OOXML Extensions for Office 2010

By Rick Jelliffe
February 18, 2010 | Comments: 12

Microsoft has now published pages detailing the extensions to OOXML it is using in Office 2010.

OOXML is an extensible format, with a feature called Markup Compatibility and Extensibility (MCE) that allows better management of extensions: in particular, signalling whether alternative media versions are available and whether extension elements can be treated as decorative throwaways or as information that an OOXML consumer must utilize.
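To make the mechanism concrete, here is a minimal sketch of the "ignorable namespace" idea: a producer marks an extension namespace as ignorable (via mc:Ignorable), and a consumer that does not understand that namespace strips its elements and attributes rather than rejecting the document. The namespace URI and element names below are hypothetical, and this is only an illustration of the concept, not the full normative preprocessing model of IS 29500 Part 3 (no mc:ProcessContent, no mc:PreserveElements handling).

```python
import xml.etree.ElementTree as ET

MC_NS = "http://schemas.openxmlformats.org/markup-compatibility/2006"
EXT_NS = "urn:example:glow-extension"  # hypothetical extension namespace

DOC = (
    f'<doc xmlns:mc="{MC_NS}" xmlns:x="{EXT_NS}" mc:Ignorable="x">'
    '<p x:emphasis="glow">plain text <x:glow radius="5">extension-only content</x:glow></p>'
    '</doc>'
)

def strip_ignorable(elem, ignorable_ns):
    """Remove descendant elements and attributes whose namespace the
    consumer has been told it may ignore."""
    for child in list(elem):
        ns = child.tag[1:].split('}', 1)[0] if child.tag.startswith('{') else ''
        if ns in ignorable_ns:
            elem.remove(child)          # a real MCE processor might instead
        else:                           # honor mc:ProcessContent here
            strip_ignorable(child, ignorable_ns)
    for name in list(elem.attrib):
        if name.startswith('{') and name[1:].split('}', 1)[0] in ignorable_ns:
            del elem.attrib[name]

root = ET.fromstring(DOC)
# A real processor would map the mc:Ignorable prefix list ("x") to the
# in-scope namespace URIs; here we pass the URI set directly.
strip_ignorable(root, {EXT_NS})
print(ET.tostring(root, encoding="unicode"))
```

An older consumer running this sees the plain text intact, with the unknown glow markup silently dropped, which is the "decorative throwaway" case described above.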

You can see overviews or details here for:

It would be consistent if the ISO standard for OOXML included this information, at some stage, in some form. MCE allows a kind of nice layering, by time, for evolving standards: I think it is a real mistake in ODF 1.2 not to adopt MCE also, just for future-proofing: it makes older applications more robust in the face of newer format versions as well as vice versa.

Without such a mechanism for practical modularity, standards groups get trapped in the big rut of major releases, even for minor features. It discriminates against niche requirements and goes against agility and innovation. To a great extent, ODF without MCE is a white flag of surrender by the ODF TC: an admission that the basic shape and feature set of office suites will be set by Microsoft, and that ODF will re-implement them in some neater way that fits in with the FOSS platforms. Perhaps it is just realistic, but, sorry, it seems like corporatist thinking.

Why so much emphasis on Backstage?

This section also has some Office 2010 XML formats you wouldn't expect to see in ISO OOXML: for example, the XML formats for making custom user interfaces. I like XUIs: we had great success using SwiXML for Java, a decade ago. Glancing through these formats, the one that caught my eye was the customizations for Backstage, which is the name Microsoft give to the highly enhanced document properties box.

Actually, it seems Backstage is concerned with all aspects of a document that don't relate directly to things you see on the page WYSIWYG, which are taken care of by direct manipulation and the Ribbon interface: what Clay Satterfield calls OUT features.

Satterfield says (of the Office team working on the Office 2007 feature set and concentrating on the IN features):

What we were sorely lacking was the WYSIWYG equivalent for the OUT features.
What made this particularly scary for us internally is that for the foreseeable future, the OUT features are the ones that are growing rapidly. Documents are now rarely simple files authored by one person who keeps it on his hard drive until he prints. Collaboration and sharing are critical. Documents are key parts of complicated business processes. There's a ton of context surrounding documents, and increasingly, that context needs to surface within the authoring application.

My company Topologi's old Collaborative Markup Editor allowed you to check things like who was editing which document and which directories they had edited in recently, to send documents to a whiteboard arrangement to deal with issues, and to chat, all based on a peer-to-peer system.

I think it is the way that things will go, eventually. It is not just the document that we need to share or see, it is also work histories, actual window screenshots, notes, and so on. Automatic peer-to-peer systems provide a kind of zero-configuration way to allow this.

So I am quite intrigued by Backstage. I tend to see it as a reversal of the usual sales pitch: instead of "Buy our application: it can add value to your existing backend" it is "Buy our backend: it can add value to your existing office suite."



Rick, name me 3 other standards other than OOXML that use MCE? OK. If that is too hard, then name me 2 other standards. Still too hard? OK. I'll tell you. Except for Microsoft's XPS, there are absolutely ZERO other XML standards that use MCE.

So if this is such a critical piece to schema evolution, then why is it used by ZERO W3C standards, ZERO OASIS standards, and by ZERO other standards in all of JTC1?

So your claim that ODF 1.2 made a "real mistake" and is showing a "white flag of surrender" and is not "robust", for doing EXACTLY what 99.9999% of XML standards do, is dubious at best. At the very least, the lack of any interest in MCE over the past few years shows that almost no one agrees with your observation. Somehow, even without Microsoft's benevolent and wise hand to show us the way, markup standards have managed to evolve over the years and the sky has not yet fallen (and I just checked).

Remember, MCE is quite unproven. There is almost no implementation experience. Of those apps that implement OOXML today, very few implement the complex and expensive MCE preprocessing steps correctly, not even Office 2007. IMHO, it is over-engineered and makes MCE-hacked markup unprocessable by standard XML tools without adding that expensive and complex preprocessing step.

The real test of whether MCE is useful in practice is when Office 2010 comes out with all its unstandardized extensions. When that meltdown comes, then tell us again how wonderful MCE is.

And just for the record, Rick, now that Microsoft appears poised to take another pass at smashing their format (in the Office 2010 guise) through JTC1, are you again on their payroll? If not now, please be sure to tell us when that happens.

Rob: That more standards would be well-advised to use MCE can hardly be countered by saying that few currently do. (I note that you ignore the substantive comment on the concern that the ODF TC is becoming a follower rather than a leader with a personal attack, as usual. In fact, it is almost comical that you should respond to a comment that ODF TC is in danger of being a follower by asking "who else uses MCE?")

I never said ODF 1.2 is a "real mistake". I said that the lack of this particular feature in ODF 1.2 is a real mistake.

The issue of how to handle versioning in XML is a significant one. MCE is not the only standard to deal with it, but it is the only one pulled out as a nice layer for other systems to use, clone or copy.

You may have heard of standards called SOAP (which has its mustUnderstand attribute), or XSLT (which has its xsl:fallback element), or XSD 1.1 (which uses a positive and negative wildcard system: they are discussing mustUnderstand and fallback too, btw).

They all try to address versioning, and all of them came (or started) before MCE. XSLT, SOAP, XSD and OOXML are perhaps four of the six most important XML languages, the others being HTML and ODF. Now that I think about it, XHTML also has a very strong versioning system: see Modularization of XHTML (I worked on those schemas very early on), but it concentrated on allowing profiles. HTML already has ignore/strip semantics for unknown elements in head and body respectively, of course.

So I think ODF is actually the odd man out here, among the major vocabularies.

I wasn't aware that MS didn't implement MCE correctly (do you have a reference for this?). It wouldn't surprise or alarm me, but so what? It doesn't alter the need.

Versioning becomes a serious problem, but a lot of people want to bury their heads in the sand: namespace-based MustUnderstand/fallback vocabularies like MCE seem very useful in this regard, but they are most effective if installed before you need them: they future-proof. Installing them when you need them is too late.
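The fallback side of such vocabularies can also be sketched briefly. MCE's mc:AlternateContent lets a producer offer a rich version of some content plus a plain fallback; a consumer takes the first mc:Choice whose Requires list it understands, otherwise the mc:Fallback. The sketch below mirrors that idea under simplifying assumptions (the "v2" namespace and element names are invented, and matching on raw prefixes stands in for the real prefix-to-namespace resolution), so it is not the normative IS 29500 Part 3 algorithm.

```python
import xml.etree.ElementTree as ET

MC = "http://schemas.openxmlformats.org/markup-compatibility/2006"

DOC = f"""<root xmlns:mc="{MC}" xmlns:v2="urn:example:v2">
  <mc:AlternateContent>
    <mc:Choice Requires="v2"><v2:fancyShape/></mc:Choice>
    <mc:Fallback><oldShape/></mc:Fallback>
  </mc:AlternateContent>
</root>"""

def select_content(alt, understood):
    """Return the children of the first Choice whose Requires prefixes are
    all understood by this consumer; otherwise the Fallback's children."""
    for choice in alt.findall(f"{{{MC}}}Choice"):
        requires = choice.get("Requires", "").split()
        if all(prefix in understood for prefix in requires):
            return list(choice)
    fallback = alt.find(f"{{{MC}}}Fallback")
    return list(fallback) if fallback is not None else []

root = ET.fromstring(DOC)
alt = root.find(f"{{{MC}}}AlternateContent")

old_consumer = select_content(alt, understood=set())    # takes the Fallback
new_consumer = select_content(alt, understood={"v2"})   # takes the rich Choice
print([e.tag for e in old_consumer], [e.tag for e in new_consumer])
```

The point of installing this before you need it is exactly the one made above: the old consumer's code path already exists when the new namespace arrives, so degradation is graceful rather than fatal.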

(And, even then, MCE doesn't address all problems: from a different angle, I tend to think MCE is aimed too much at 'pass-fail', while office application developers need graceful degradation more.)

I am not on the payroll of Microsoft. Are you saying that someone on the payroll of a large monopoly is inherently duplicitous?

As I said, there are ZERO other standards that use MCE. Let's see how it works in practice. Those working on ODF have 20+ years practical experience in versioning office document formats. We know what works and what doesn't. I'm not impressed by marketing claims on OOXML/MCE. Let's see how it works in practice.

Rob: Well, I have 20+ years working in markup of complex documents, so I am not impressed, in return.

And, if we are talking about experience in managing the largest number of versions of office document formats over the years, MS would definitely be the ones you would expect to have learned a thing or two.

If MCE itself is too galling to use, make up your own, drawing on SOAP, XSLT, XSD etc. But it will just create problems down the track by not having any useful infrastructure in place.

Before saying anything, I'd like to clarify that I AM on the payroll of a large multi-national corporation, and I get paid by that big corporation to do things like attend standards meetings, participate in these sorts of technical debates, and so on. So the opinions I express may be the opinions of the large corporation I work for, or they may be my own personal opinions, or both. In that, Rob, we are brothers.

As for MCE in practice, I just tried something. I opened Word 2010 (pre-release but damn close version) and typed in a sentence, then I applied a reflective glowing text effect to one word in that sentence. That's a new text effect in Word 2010, which some other applications may not support. Then I saved it as a DOCX, where Word 2010 stored that new text effect in an ignorable namespace.

Then I opened it in Swriter 3.2. The document opens fine, and I see all of the text, but without the glowing reflection on that one word. The system seems to be working as intended.

I tried to open this document in my IBM Lotus Symphony 3 Beta 2, but it seems to have stopped working. I get an hourglass for a couple of seconds, then nothing happens, although it was running fine a few days ago. But I'd imagine it works the same there as in 3.2, since they're both running the same code and presumably have the same implementation of OOXML under the hood.

I know of no problems with our MCE implementation, and in particular I know of no way in which it doesn't conform to the normative requirements of IS29500 Part 3. If anyone knows of such an issue, I'd love to hear about it. (Hear about the issue itself, that is, and not a general unsubstantiated claim.)

Actually, ODF 1.0 (the pre-IS:26300 version) had features akin to MCE in section 1.5 (the language governing foreign elements and attributes). But they were unfinished because a key ODF TC member who was working on it changed employers and had to drop out. So compatibility attributes were omitted.

And over the longer haul, it became evident that about five more elements would be required to achieve round-trippability between OpenOffice.org and MS Office, which proved futile because OOo stripped all but its own foreign elements and attributes.

At JTC 1, the ODF MCE-like features were effectively gutted by the switch in requirement keyword definitions from RFC 2119 to ISO/IEC Directives Part 2 definitions. That switch dropped two mandatory interoperability requirements from every occurrence of "may" or "optional" in the spec, including those relating to the foreign elements and attributes.

The result was that OOo's destruction of foreign elements and attributes became arguably conformant and the foreign elements and attributes were transformed into merely a vehicle for implementation-specific extensions to ODF that plague ODF interop to this day.

The MCE-like round-tripping methodology is far from experimental. The same techniques have been used in WordPerfect since version 6.0 to round-trip documents without data loss among all versions from 6.0 through X4, despite unforeseen features in the later versions.

I agree that MCE needs improvement. E.g., cascading attributes could provide more graceful degradation, and more detailed guidance on the application and preservation of the compatibility attributes and their content is also needed.

But the underlying concepts are sound and proven. I suspect that Rob is just expressing his company's normal aversion to actually walking the ODF interop walk. IBM prefers to just talk the ODF interop talk and hurl distracting ad hominem attacks at those who file interop bug reports against the ODF specification, customer requirements and the law bedamned.

@ "Ad just for the record, Rick, now that Microsoft appears poised to take another pass at smashing their format (in the Office 2010 guise) through JTC1, are you again on their payroll? If not now, please be sure to tell us when that happens."

What precisely does that have to do with the merits of what Rick said, Rob?

Paul: Yes, he has to cling to the narrative that the only people who supported an ISO standard OOXML were those paid by Microsoft. So when someone supports it and is not being paid, they must instead be expecting to be paid: amakudari. It doesn't give him much room to be effective in dealing with the people he alienates. I don't see why we should bother ourselves over it. (I have always thought it is highly unlikely that MS would offer me a job, by the way, for the simple reason that they now have people in place with the kinds of skills that I have to offer in standards work. And I don't really want to leave Sydney. I would have been a better fit with Google, because of their gap in standards expertise a couple of years ago, though I would need to get up to speed on my computer science to fit in.)

As far as IBM's record, I don't think it is right to demonize the whole firm based on the excesses of a handful of cage-rattling employees. Most other IBM participation in standards is very civil, congenial and productive in my experience: in one other group I didn't care for the attitude of a Lotus guy, but I wouldn't generalize to that company either. All the big companies blow hot and cold about standards and interop, between periods of innovation and consolidation. What they say one day, they will try to wriggle out of tomorrow. The best we can do is to try to lock them into the cage as much as possible when they are sniffing around in it.

Hi, Rick,

Sorry I left the misimpression that I was demonizing IBM staffers as a group. I did limit my observation to IBM's actions in regard to ODF. I'm well aware that other IBMers have done some outstanding work in regard to interoperability, notably on the W3C Compound Document by Reference Framework, which has a conformance section I think both ODF and OOXML would benefit from in the future, if adapted to encompass "authoring agents" as well as "user agents." It also provides the needed guidance for developing sorely needed profiles.

Neither ODF nor OOXML adequately specifies the APIs needed for interoperability of different implementations. OOXML simply has a better start: a budding round-trip compatibility framework that ODF lacks. Both also need profiles and APIs specified for the exchange of profiled documents. And CDRF is an already-developed framework designed for the purpose, with implementations.

But that's another interop measure that Rob Weir vetoed during the formation meeting of the ODF Interoperability and Conformance TC, along with all other suggestions for a defined work plan to actually achieve the interoperability of ODF implementations.

Indeed, I'm unaware of any I.T. standard other than ODF toward which IBM staffers have exhibited a consistent anti-interoperability agenda. That is why I limited my criticism to the company's role with ODF. But I couldn't limit that criticism to Rob; he's demonstrably had assistance from other IBMers in his anti-ODF interop work.

So my apologies to any IBMers who understood my criticism as being directed toward all IBM staffers as a group. I confine my criticism to those who have played a role in keeping ODF under-specified whilst falsely proclaiming to the world that ODF already has multiple interoperable implementations; e.g.,

"Try putting that into an internal product plan: list of features that let users switch to software from other providers, possibly open source. Have you done that? ... If you provide an ODF-compliant application you have. Providers of software that implement open and non-vendor controlled standards are used to living with the higher level of risk that substitutability engenders. ..."

Bob Sutor, Interoperability and Substitutability (5 September 2006).

By the way, Rick, I agree with your observation that: "The best we can do is to try to lock [the big companies] into the cage as much as possible when they are sniffing around in it." But I prefer your more colorful description when you suggested shoving real standards up their backsides. :-)

Paul: My impression, which may be entirely wrong, is that Rob tends to think that at a certain point plugfests (moderated mass vendor tests) are more effective for getting interoperability than the wording of standards.

I think it is an entirely reasonable view, especially where standards work and plugfests compete for scarce resources within an organization.
When you only have one or two implementations of a standard, the coordination costs of a plugfest are less than the coordination costs of International Standards.

And I would go further, that sometimes it is better to underspecify a standard, let the chips fall where they lie, and then quickly finish off the standard to reflect the reality, for better or worse.

(The HTML5 editor Ian Hickson has been very vocal that standards for deployed software should follow the reality. Again, I think it is reasonable, but probably wrong.)

However, the gotcha is that the plugfest/reality justification can easily just become a power grab: large corporations effectively trying to minimize the influence of other stakeholders.

I don't see so much harm in marketing people proclaiming all sorts of wondrous benefits of their products: that's what they do. Grown ups have an endless supply of grains of salt. A vision and optimism is useful. If the spec and its implementations don't make the grade, then they fail either during acceptance testing or early on in deployment.

I obviously think that conformance is really important. And objectively verifiable conformance. But I think that many of our XML tools don't quite test the things that need to be tested for conformance: this goes beyond MCE or Schematron etc.

The way things work is that it will take a few years for alternative strategies to take hold. The incoming generation of technical people coming into standards work will have been weaned on the milk of test-driven development; conformance will be their worldview as much as functionality. (Well, that is my hope anyway.)

Hi again, Rick.

I apologize in advance for the length, but an important point you raised requires a fairly elaborate response.

I've got no per se problem with plugfests in conjunction with work on a standard. But problems arise when the lessons learned aren't funneled back into improvement of the standard, particularly a standard aimed at global implementation by a multitude of developers.

The problem in that context is that non-attendees don't obtain the benefits of the lessons learned. For example, someone who can't go globe-hopping to attend the plugfests or someone who wants to enter the market later. The ODF Plugfest participants have attempted to work around that problem by maintaining a wiki summarizing their lessons learned, but in my opinion the wiki is a legally inadequate solution to the problem.

One problem is that competitor collaborations are heavily regulated. Antitrust enforcement officials are deeply suspicious of competitor collaborations because of their anti-competitive potential. See e.g., Competition Commissioner Neelie Kroes, Being Open About Standards, Europa Rapid Press Releases (10 June 2008), (text of speech)("Allowing companies to sit around a table and agree technical developments for their industry is not something that the competition rules would usually allow. So when it is allowed we have to look carefully at how it is done.")

Such issues were on the European Commission's table when it was required to render a decision on proposed activities of the X/Open Group. 87/69/EEC: Commission Decision of 15 December 1986 relating to a proceeding under Article 85 of the EEC Treaty (IV/31.458 - X/Open Group). There, a group of large vendors sought approval for a collaborative effort to develop Unix API standards.

(32) The definitions which the group adopts are made publicly available. In this respect the definitions constitute an open industry standard. However, non-members as opposed to members cannot influence the results of the work of the group and do not get the know-how and technical understanding relating to these results which the members are likely to acquire. Moreover, non-members cannot implement the standard before it has been made publicly available whereas the members are in a position prior to implement the interfaces which the Group defines because of earlier knowledge of the final definitions and, possibly, of the direction in which the work is going. In an industry where lead time can be a factor of considerable importance, membership of the group may thus confer an appreciable competitive advantage on the members vis-à-vis their hardware and software competitors. Considering the wider importance which is likely to be attached to the standard, this advantage in lead time directly affects the market entry possibilities of non-members. The advantage in question is different in nature from the competitive advantage which the participants in a research and development project naturally hope to get over their competitors by offering a new product on the market; they hope that their new product will result in a demand from users but their competitors are not prevented from developing a competing product whereas in the present case non-members wanting to implement the standard cannot do so before the standard becomes publicly available and, therefore, are placed in a situation of dependence as to the members' definitions and the publication thereof.


(35) In the circumstances of the case, an appreciable distortion of competition within the meaning of Article 85 (1) may result from future decisions of the Group on interfaces in combination with decisions on admission of new members to the Group.

The Commission gave the X/Open Group a go-ahead nonetheless, but only upon added conditions with ongoing oversight, largely because of assurances that the project would publish its standard within a set, very short period of time and of the net benefit to the public of having standardized Unix APIs. The governing substantive law is in substance the same in the U.S. (although procedures vary) and likely in other nations that have patterned their relevant competition laws on the U.S. or E.U. models.

I discuss that case because it is the E.C. decision most closely resembling the facts involved with the ODF Plugfests. But I will stress that its fundamental legal principles have been repeatedly reaffirmed and are even more tightly constrained today.

The X/Open Group decision's fact pattern is not on all fours with the ODF Plugfests. For example, attendance at the plugfests is not limited to specific companies. But the ODF Plugfests are not developing a standard, so the collaboration among their competing participants is immediately suspect. Furthermore, they embody none of the required due process requirements, such as consensus decision-making and established procedures for resolving antitrust issues. Their wiki does not move them within the shelter of law that allows competitors to collaborate in developing an open industry standard.

Moreover, the ODF Plugfests are not structured to achieve timely integration of lessons learned into the ODF specification. The connection of the Plugfests with the ODF standard is tenuous at best. The Plugfests are conducted under the auspices of the ODF Interoperability and Conformance TC, which has no authority to make revisions to the standard and is therefore hardly in a position to commit to speedy integration of the lessons learned into the ODF specification. That authority rests only with the OASIS Office TC ("ODF TC"), subject to a final ballot by the OASIS membership. This is one of the major reasons why I pushed to have what became the OIC TC established as an ODF TC subcommittee instead.

I've also seen no indication that the ODF Plugfest recently held at The Hague honored the E.U. competition law requirement that an advance ruling on the legality of such a collaboration among competitors be obtained from D.G. Competition, as was done by the X/Open Group.

Indeed, the big vendors on the ODF TC have steadfastly blocked pressure to specify the conformity requirements that are essential to achieve interoperability since the TC's establishment in 2002. In recent years, Rob Weir has been a prime mover in deflecting all such efforts. As only one example of a multitude of such instances, I suggested on the Office TC comment list as follows:

A parliamentary procedure suggestion:

If a formal proposal is made to produce one core conformance class or profile in ODF 1.2 that fully complies with the JTC 1 Directives requirement of specifying "clearly and unambiguously the conformity requirements essential to achieve the interoperability" and there is an up or down vote on whether to do it, the naysayers will be fairly shrieking for pressure from customers and government competition regulators.

Rob replied:

"As a practical matter, if I phrased a question in that way, most members would likely abstain. Since JTC1 Directives are far from unambiguous in this and other areas, and the topic clearly has policy and legal implications that is out of depth for the average technical contributor to the TC, including myself, a large percentage of abstentions and a lack of decision would be the natural outcome. We need to break it down and swallow the elephant 'one bite at a time'."

(Both quotes from the Office TC comment list.) Never mind that Rob certainly has IBM lawyers to consult, that the JTC 1 Directives were drafted to be used by those who develop standards, and that a JTC 1 standard is required to have those interoperability conformance requirements before it is ever adopted as an international standard, absent the express consent of the ISO and IEC CEOs. Nonetheless, Rob ducks even allowing a vote on whether to adopt compliance with the JTC 1 Directives' interoperability requirements as an ODF TC goal.

One might attempt to comprehend Rob's excuses for going the plugfest route rather than repairing the specification itself despite his admission that "interoperability is most efficiently achieved by conformance to an open standard where the standard clearly states those requirements which must be met to achieve interoperability." Rob Weir, A Follow-up on Excel 2007 SP2's ODF Support, An Antic Disposition (7 May 2009).

But when I attempted to do so, I quickly found myself mired in nonsense such as Weir's pressing for OpenOffice.org to be designated as the ODF reference implementation. See e.g., Rob Weir, ODF Interoperability --- The Price of Success, presentation slide 22 (OpenOffice.org Conference in Barcelona, Spain, 19 September 2007). ("Let's work to make OpenOffice.org be the full reference implementation for ODF!")

This despite Rob's admission at about 44 minutes into his presentation that "ISO doesn't have the concept of a reference implementation." Ibid., video record.

And of course one of the ways one might hope to achieve de facto recognition of OpenOffice.org as the ODF reference implementation is to keep the ODF specification dark and mysterious. But this appearance of an IBM drive to keep the codebase it uses at the center of the ODF universe via manipulation of the specification (an appearance bolstered by Rob's pummeling of Microsoft for having exercised some of the considerable discretion granted by the specification to use Microsoft's own spreadsheet formula markup) itself raises a host of related competition law issues. De jure industry standards are required to be vendor-neutral, leveling the competitive playing field. But leveling the ODF competitive playing field has not yet found its way onto IBM's agenda.

While application-level interoperability work can in some circumstances make sense from a technical standpoint, neglect of a standard's specification is on the main road to illegality and frustration of market requirements.

For example, the Agreement on Technical Barriers to Trade repeatedly prohibits technical standards or conformity assessment procedures from creating "unnecessary obstacles to international trade." That is but a restatement of antitrust law in the E.U. and the U.S. applicable to industry standard development organizations.

Moreover, in my opinion the JTC 1 Directives' requirement that standards "clearly and unambiguously specify the conformity requirements that are essential to achieve the interoperability" does no more than translate the same TBT Agreement prohibition into the I.T. context. The only difference worthy of mention in context is that the same principles are extended by the TBT Agreement to Member nations and their national standardization bodies.

In summary, governing law expects fully-specified standards as the price tag of application-level interoperability collaborative work among competitors. I would have no issues with the ODF Plugfests were the ODF TC actively working to remove the gross under-specification problems, were the plugfests conducted by the ODF TC, were necessary legal permissions obtained, and were adequate procedures established to quickly roll the lessons learned back into the specification.

But I have seen no evidence that anything but the opposite is true. The ODF Plugfests seem aimed at delaying repair of the specification and keeping the code base IBM uses at the center of the ODF universe, whilst creating the public misperception that someone is actually working on ODF specification's profound interoperability issues. The glaring absence of such work in my carefully considered opinion strips the ODF Plugfests of all plausible legitimacy.

The Plugfest participants appear to be just competitors sitting around a table agreeing on the future of the industry, as Commissioner Kroes put it, ignoring relevant legal constraints and users' interoperability requirements. Most participants probably aren't aware of the legal constraints, but ignorance of the law isn't an available defense in the legal context under discussion. Nederlandse Federatieve Vereniging voor de Groothandel op Elektrotechnisch Gebied and Technische Unie BV v. Commission, (joined cases T-5/00 and T-6/00), Judgment of the Court of First Instance (First Chamber) (16 December 2003), para. 10 ("It is not necessary for an undertaking to have been aware that it was infringing the competition rules laid down in the Treaty ...").

Switching topics, Bob Sutor was the IBM Vice President in charge of standardization activities at the time of the quoted publication, not a marketing guy.

Paul: The nub of your comments is that ODF's conformance (and process) has not been meeting the minimum required to meet community expectations, including upcoming legal expectations, and that this is largely the result of corporate machinations. It is not far-fetched or implausible to me, but the fat lady has not sung yet: let's see what the OASIS review process and any subsequent ISO review flushes out. I am Mr Glass Half-Full.

As long as ODF 1.2 has some substantial advances over ODF 1.0 in regard to conformance (generally and for particular issues), I am sure it will be accepted by JTC1 or SC34: the direction and the willingness to do maintenance is important.

Any conception of standards that relies either on their technical perfection or their necessary excellence is bogus: even stabilized standards. About 17 years ago, I came to the conclusion that a standard is a community (or a fragment of a community's discourse) and a process as much as a text. That is not to say that the text and committee processes must not be as tight as possible, of course.

On the X/Open case, I think it can be distinguished from the ODF plugfests because

1) The ODF plugfest is held under the auspices of a national government body (the Dutch NOIV). It is unthinkable that a government agency cannot arrange interoperability tests. I believe X/Open was strictly a vendor affair.

2) The ODF plugfest does not produce standards or formal results, as I understand it. X/Open was to produce some standards.

3) That standards need to be proved by implementations is hardly odd (especially protocols and exchange standards.) A Plugfest not only helps debug software, it helps debug a standard. This is very valuable. You could even see a Plugfest as a way of validating or objectively drawing out the conformance requirements needed in a standard: proving that some text is ambiguous or incomplete for example.

4) You can see Plugfests as an ad hoc requirements gathering and prototyping phase for the test suite for a standard. (Actually, IIRC ISO does not really do test suites: it tends to want to just set the abstract qualities that a test suite for a standard would have, and leave the details to organizations closer to the field.) This is certainly not contradictory to formal committee work.

But I do agree that it would have been better for the results of the PlugFest to have been made public, even if just as screenshots rather than a scorecard. Looking at the composition and affiliations of standards committees is fair game: I tend to look from the cartelization POV but it is not the only one.

And I do agree that Plugfests need to be subsidiary to formal committee work, as far as a standard goes.

I think people can reasonably differ in their expectations of how practical are the kind of objective tests I push for: for example, IMHO I think ODF and OOXML should both be specified functionally, i.e., in relation to an objectively testable 2D page interface, so that terms like left and right, top and bottom, before and after, margin and Z-order are susceptible to automated testing rather than just visual checks by humans. It could be done with an instrumented PDF or XPS, for example. But I am aware that this goes well beyond the expectations and current texts of ODF and OOXML, and may intrude into implementation issues superficially. So it is a reasonable view (though not mine) that this kind of major alteration is not feasible for the required timeframe, the perfect being the enemy of the good.

As for conformance requirements, it is possible for a standard to have conformance requirements that are both explicit and loose. (In the same way that a fridge is a conforming "XML Infoset", notoriously.) This needs to be called out in the Scope statements: "This is a loose standard" for example.

This kind of thing is one reason why, the more important a standard is, the more it needs to be reviewed externally. This is a role that ISO should be performing (but largely failed to for the ODF 1.0 PAS which rather bypassed normal scrutiny.)

When a standard is running very late, it is prudent for convenors to try hard to limit the scope of what gets discussed, even though this is unsatisfactory to all concerned. The system needs to have a check-and-balance for this tendency, i.e. formal reviews which can send the draft back if it does not meet some bottom line.

B.t.w., just because ISO does not (formally) have a reference implementation concept, it does not mean that OASIS could not have a reference implementation, if the OASIS rules allowed that.

Readers: You may be interested in my blog post on conformance, "Classes of Fidelity for Document Applications".
