Open Standards are no silver bullet

Don't run it up the flagpole if I cannot run it on my desktop?

By Rick Jelliffe
June 14, 2009

I have just been reading academics Shah and Kesan's really excellent, air-clearing paper Running code as a part of an open standards policy. They have an interesting take on the Massachusetts ODF episode.

Normally I am not super keen on the running code argument as it is glibly stated: that you should have several open source implementations before a technology is standardized. Of course, the more a technology has been proven and debugged before some standards body standardizes it, the better its chance of having legs. So I was pleased to see Shah and Kesan focussing on open standards policy: which standards (i.e., from a voluntary standards body) should a government mandate, especially for purchasing? It is a critical distinction that eludes many.

Some good quotes to give the flavour:

As we later point out, these advocates are assuming an open standard will lead to a vibrant market of multiple independent interoperable implementations that removes vendor lock-in. A review of the Massachusetts open formats policy shows how this blind faith in open standards can lead governments awry.
Despite ... argument that the combination of ODF and OpenOffice.org would bring consumer choice and freedom, there is little evidence of any significant cost savings or increased flexibility that has arisen with the open formats policy.
The key problem with all of these definitions is that they focus exclusively on the development of an open standard -- for example, ensuring that the standard was developed publicly and is freely available. This has led to a conflation of the definition of open standards with their consequences. In his prescient article on open standards, Joel West (2004) warns that the definition of open standards may be confused with the consequences of standards, i.e., multiple implementations.
Our view is that there are several factors that can explain the conflation of the definition of open standards and its consequences. First, there is a natural conflation between open standards and open source. ... With an open source project, it is implicit that an implementation exists; however, the same is not true for open standards. ... Second, for simple open standards, implementation can be trivial and taken for granted. ... Third, vendors may try to conflate an implementation with a standard.
A running code requirement would have led Massachusetts to defer adopting ODF.

And their conclusion?

Open standards play a vital role in software development and adoption. The advantages of open standards make it reasonable that governments will seek to adopt open standards. However, the Massachusetts experience suggests that governments must not judge open standards by how they are written, but by how widely they are implemented. After all, without multiple interoperable independent implementations, i.e., "running code", governments may find themselves suffering from lock-in to an open standard solution.

The running code requirement is not new, but it has been forgotten by governments in their rush to adopt open standards. Adding a running code requirement to an open standards policy puts an emphasis on how the standard is actually being used. We believe if adopters of open standards insist on running code, software developers and vendors will further support open standards and their interoperability. The result will be an array of economic and technological benefits.

I would go further than Shah and Kesan. I would say that, for mandating standards for software procurement, the objective test for "running code" should be filtered through the lens of the ISO/IEC 9126 software quality model.

Take the issue of maturity: some projects will not require a mature standard or mature implementations of it. Some will.

Or take the issue of software functionality: some projects require full-fidelity interoperability, some don't. This is a point I made in my blog Classes of fidelity for document applications.

In terms of ISO 9126, the use of an open standard relates to a document's external quality. But the other two classes of quality, internal quality and quality-in-use, are also factors, and they don't just disappear. Acceptance testing and trials happen.
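To make that concrete, here is a minimal sketch, in Python, of what a project-specific "running code" test might look like once it is parameterized by maturity and fidelity. Everything here is invented for illustration: the class, attributes and thresholds are hypothetical proxies, not ISO 9126's actual metrics.

    from dataclasses import dataclass

    @dataclass
    class Implementation:
        # Hypothetical proxies for assessment inputs; a real procurement
        # evaluation would be far richer than these attributes.
        name: str
        vendor: str
        years_shipping: float          # crude proxy for maturity
        fidelity_class: str            # e.g. "full", "editable", "viewable"
        passes_acceptance_tests: bool  # crude proxy for quality-in-use

    def meets_running_code_bar(impls: list, required_fidelity: str,
                               min_maturity_years: float) -> bool:
        """A project-specific running-code test: are there multiple
        independent implementations that meet THIS project's needs?"""
        qualifying = [i for i in impls
                      if i.years_shipping >= min_maturity_years
                      and i.fidelity_class == required_fidelity
                      and i.passes_acceptance_tests]
        # Independence: at least two distinct vendors must qualify.
        return len({i.vendor for i in qualifying}) >= 2

    impls = [Implementation("AppA", "VendorA", 6.0, "full", True),
             Implementation("AppB", "VendorB", 4.0, "full", True)]
    print(meets_running_code_bar(impls, "full", 3.0))  # True

The point of the sketch is only that the bar is project-relative: an archival project that needs viewable fidelity can pass with implementations that a full-fidelity editing project would reject.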

So I don't think I agree with Shah and Kesan's minor assertion that governments "have forgotten" these things: the rubber hits the road sooner or later. I seem to have a much less panicked view of the problems in ODF and OOXML than other people. Maturing standards and their implementations is not a fast process; for a start, ubiquitous adoption requires the retirement of the previous generation of technology, and five or six years may be a good ballpark.

Continued pressure by governments on the major vendors is necessary, whether the vendor is Sun, Corel, Adobe or Microsoft.



2 Comments

While I agree with the general thrust of the argument (basically: a minimum of two independent, interoperable implementations will shake the 'bugs' out of both the standard and the implementations), I'm baffled by their ridiculously poor choice of catchphrase.

"Running Code" has now been redefined by them to mean "two or more independant, interoperable implementations". Even this blog gets confused by this egregiously poor bit of jargon appropriation. "Don't run it up the flagpole if I cannot run it on my desktop?" says your subhead, something that is a total non sequitur given their definition.

They also seem very vague on the impact of open source implementations of standards on vendor lock-in. Maybe open source wasn't the focus of this particular paper, but since lock-in was, it seems a curious omission. That also rather undermines their headline-grabbing claim about ODF.

Good points, both about the choice of catchphrase and about my subhead.

"Running code" of course is associated with the IETF regime of rough consensus and running code, which is very pragmatic and works well for small, layered software (the kind I like.)

If we take it that the document is the protocol between a producer and a consumer application, then certainly "running code" requires two notionally different applications. (But the IETF usage is more about having a kind of reference implementation that ensures feasibility, I suppose.)
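To illustrate, a toy sketch only, and not anyone's real API: if the document is the protocol, the basic "running code" check is a round trip between two independent applications. The producer and consumer below are trivial stand-ins so the example is self-contained.

    # Toy round-trip interoperability check. 'produce' and 'consume' stand
    # in for two independent applications (say, one writing ODF and another
    # reading it back); here they are trivial placeholders.

    def round_trips(produce, consume, content: str) -> bool:
        """True if content written by one implementation is recovered
        intact when read back by a different implementation."""
        document = produce(content)    # application A serializes
        recovered = consume(document)  # application B parses
        return recovered == content

    encode = lambda text: text.encode("utf-8")  # stand-in producer
    decode = lambda data: data.decode("utf-8")  # stand-in consumer

    assert round_trips(encode, decode, "Hello, interop")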

I have written in my blog and elsewhere that I think there needs to be an idea of "open technology": an open standard plus open source implementations (and small and modular).

I think that, just as open standards processes are not enough (there also needs to have been a balance of interests), their argument is that an open source implementation is not enough (there need to be substitutes).

I don't think they are really concerned with shaking out bugs, but with having competition and choice (with bug-shaking hopefully being a side-effect, though it certainly may not be so...).
