Ken Krechmer has been one of the leading proponents of the idea of Open Standards, and his ideas are frequently cited. He recently made some typically interesting comments which I think deserve wider currency:
Microsoft exerts control of its software APIs in many ways. One of the more significant is that Microsoft can change such APIs at any time with on-line updates. This prevents other companies, unless supported by Microsoft, from maintaining compatibility with the Microsoft APIs. So any standardization process that hopes to offer "openness" must provide a means to control such updates as part of its process. I don't think this point is addressed in the papers I have scanned.
Taking a technical approach to solving these problems has a much higher chance of success in the near term. The OOXML and ODF situation is a sad state of affairs in the standardization world, but not for the reasons often noted. Fighting to create a single standard is an outdated standardization approach. When memory is very cheap it becomes practical to support two or more ways to implement compatibility (OOXML and ODF are both standards for document format compatibility). What is necessary is a standardized mechanism to identify, negotiate and select which way compatibility will be achieved. I term standardized mechanisms that support all three functions (identify, negotiate and select) "adaptability standards." The Fundamental Nature of Standards: Technical Perspective (http://www.csrstds.com/fundtec.html) offers one lower layer approach to adaptability standards.
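Krechmer's three functions can be sketched as a tiny negotiation between two endpoints. This is purely my own illustration of the idea, under hypothetical function names; no actual adaptability standard defines these operations:

```python
# A toy sketch of an "adaptability standard": each side identifies the
# compatibility mechanisms it supports, they negotiate the intersection,
# and one is selected by a shared preference rule. All names here are
# hypothetical illustrations, not part of any real standard.

def identify(supported):
    """Advertise the compatibility mechanisms this endpoint can use."""
    return set(supported)

def negotiate(ours, theirs):
    """Find the mechanisms both endpoints share."""
    return ours & theirs

def select(candidates, preference):
    """Pick one shared mechanism by an agreed preference order."""
    for fmt in preference:
        if fmt in candidates:
            return fmt
    return None

ours = identify(["ODF", "OOXML"])
theirs = identify(["OOXML", "PDF"])
print(select(negotiate(ours, theirs), preference=["ODF", "OOXML", "PDF"]))
# prints OOXML
```

The point of the sketch is that neither side needs to impose a single format in advance: as long as the identify/negotiate/select mechanism itself is standardized, multiple compatibility standards can coexist.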
In internet connected systems, once adaptability standards are used it may not even be necessary to have compatibility standards.
Krechmer's Adaptive Standards presuppose the kind of frameworking and support for modularity that I have been banging on about for the last decade: see How to promote organic plurality on the WWW and the recent The Cathedral and the Bazaar and Standards.
As I understand it, Krechmer's Adaptive Standards approach implies not only that you should have standards organized into frameworks and modules, but that for internet frameworks, if the standard specifies how or where to download the appropriate converter/handler module (like a codec), then you don't actually need a standard for the converter/handler itself. The framework is all that is necessary.
I currently cannot go nearly that far: the requirement that document standards have lives that outlive any particular system pretty well puts the kibosh on it. However, I certainly do agree that concentrating on the adaptive standard as the bones on which to hang the flesh has to be the way forward. (This is also true of software.) And I am not sure that HTTP's content negotiation mechanism, perhaps an example of the kind of thing Krechmer is suggesting, has been any kind of success.
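For readers unfamiliar with it, HTTP content negotiation is exactly an identify/negotiate/select mechanism: the client's Accept header identifies the media types it can handle, with quality weights, and the server selects the best representation it can produce. A minimal sketch of the server's side (ignoring `type/*` wildcards and other refinements for brevity):

```python
# A minimal sketch of HTTP server-driven content negotiation: parse the
# client's Accept header into (media-type, quality) pairs, then pick the
# highest-weighted type the server can actually produce.

def parse_accept(header):
    """Parse an Accept header into a list of (media_type, q) pairs."""
    prefs = []
    for part in header.split(","):
        fields = part.strip().split(";")
        mtype = fields[0].strip()
        q = 1.0  # default quality weight per the HTTP spec
        for param in fields[1:]:
            name, _, value = param.strip().partition("=")
            if name == "q":
                q = float(value)
        prefs.append((mtype, q))
    return prefs

def choose(available, accept_header):
    """Select the available media type the client weights highest."""
    prefs = parse_accept(accept_header)
    best, best_q = None, 0.0
    for mtype in available:
        for pref, q in prefs:
            if pref in (mtype, "*/*") and q > best_q:
                best, best_q = mtype, q
    return best

print(choose(
    ["application/xhtml+xml", "text/html"],
    "text/html;q=0.9, application/xhtml+xml, */*;q=0.1",
))
# prints application/xhtml+xml
```

The mechanism has been in HTTP since the 1990s, yet in practice most servers simply send one representation regardless, which is partly why I question how successful it has been.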
The price-of-memory argument seems strange, unless it means that there is no reason why document formats cannot contain components in different alternative standards at the same time: this seems similar to the point I made in Can a file be ODF and Open XML at the same time (see also Harmonization by augmenting ODF with OOXML elements.)
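Since both ODF and OOXML are ZIP packages, the idea of one package carrying alternative representations side by side is at least mechanically plausible. A sketch of the notion, with entirely illustrative member names (this is not how either format actually arranges its parts):

```python
# A sketch of one container holding the same document in two alternative
# standard formats. The member paths and XML snippets are hypothetical
# illustrations only; real ODF and OOXML packages use different layouts.
import io
import zipfile

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    # The same document, stored once per format.
    z.writestr("odf/content.xml", "<office:document-content/>")
    z.writestr("ooxml/document.xml", "<w:document/>")

with zipfile.ZipFile(buf) as z:
    names = z.namelist()
print(names)
# prints ['odf/content.xml', 'ooxml/document.xml']
```

A consumer would then only need the adaptability mechanism, selecting whichever representation it understands, which is the cheap-memory trade-off Krechmer is pointing at.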