Deliberate non-conformances in XML Schema implementations

Really, how could it be any other way?

By Rick Jelliffe
August 5, 2010

From Saxon's Michael Kay, on the XML-DEV mailing list today:

On interoperability, there are at least three reasons why you might get different results from different processors. One is because the specification leaves the behaviour of certain things implementation-defined (for example, whether a "dangling reference" to an unavailable component is an error, in the case where that component isn't actually used). A second reason is bugs - which should be fairly rare in reputable products. A third is deliberate non-conformances, which should be rare but aren't - sometimes vendors make the decision that they don't like part of the standard and they are going to do things differently. Unfortunately one of the most commonly used tools for XML schema development suffers particularly from this, and until users vote by taking their money elsewhere, the situation won't change. It's a case of buyer beware.
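To make Kay's "dangling reference" example concrete, here is a minimal illustrative schema sketch (the namespace and element names are hypothetical, invented for this example). The import has no schemaLocation, so the type ext:PartType may never be resolvable; whether that is an error when no instance ever uses the part element is exactly the kind of behaviour the specification leaves implementation-defined.

```xml
<!-- main.xsd: hypothetical schema for illustration only. -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:ext="urn:example:parts"
           elementFormDefault="qualified">

  <!-- No schemaLocation: the imported components may be unavailable. -->
  <xs:import namespace="urn:example:parts"/>

  <!-- Dangling reference: ext:PartType might never be resolved.
       If no instance document actually uses "part", processors
       legitimately differ on whether to report an error. -->
  <xs:element name="part" type="ext:PartType"/>

  <!-- This element validates fine on its own. -->
  <xs:element name="order" type="xs:string"/>
</xs:schema>
```

Some processors report the unresolved reference as soon as the schema is compiled; others defer the error until the component is actually needed for validation. Both readings are defensible under the specification, which is Kay's first source of divergent results.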

But how can the buyer beware about unnamed problems in unnamed products?

Michael says that reputable products will have only fairly rare bugs, but his integrity then forces him to raise the issue of deliberate non-conformance. Surely the adopters of one of the most commonly used tools considered it reputable when they bought it? Where are the web sites with product reviews that expose these things? How do would-be adopters find out? Why hasn't the W3C XML Schema working group made deliberate non-conformance a key priority over the last decade? Where are the changes in XSD 1.1 designed to fix the problems that cause vendors to deliberately non-conform?

When vendors refuse to implement part of a standard, it is often a sign of a flaw in the standard just as much as a flaw in the vendors: in particular it can be a sign of underlayering, undermodularity, over-complexity or some other unworkability.

Another example that springs to mind is Java, where the initial graphics system (AWT) proved insufficient, so a new graphics system (Swing) was added; but two significant players decided to bolt on their own graphics systems instead (MS with J++, IBM with SWT), which rather goes against the WORA ("write once, run anywhere") design principle. Whatever MS and IBM's motives (which may be as rotten competitively as they were reasonable technically; no flames, please), the point is that the Java development process was not capable of adjusting Java in some way to cope: Java remained monolithic and has not thrived on the desktop, MS went off with C#, and IBM went SWT anyway, spearheaded by Eclipse.

How could it possibly be any other way with W3C XML Schemas, given that the specification is an incoherent mess, with one thing tacked onto the next, well beyond the point where major vendors are interested in supporting either XSD 1.1's latest barnacles or its improvements? XSD is about five standards tacked together, with half-baked ideas of type derivation that come and go, and ideas of streamability that come and go, largely dictated by one sector of the XML industry (the DBMS vendors), seemingly in an effort to complexify XML enough that their products' capabilities become relevant to the XML age.
