The derivatives crisis and standards

By Rick Jelliffe
September 28, 2008 | Comments: 4

We are awash with explanations of the current mess (and I don't understand why under-insurance isn't mentioned more often) but there is one that should cause XML and standards people to prick up their ears: the idea that the problem is caused by derivatives being too hard to value.

What I have found particularly interesting in my brief excursions into the dens of the technical beavers of large financial institutions is how much the move to standards requires shoehorning: these institutions typically have very large investments in transactional or closed-world systems with long life-cycles. Indeed, for some of them the life-cycle is so long, and the facilities so mission-critical to the business, that for all intents and purposes they will never go away.

Anyone who has implemented the ACORD insurance standard for XML will probably attest to how many non-ACORD extensions are necessary. And I have heard of a financial institution where mortgage requests from brokers get processed disconnected from information about which broker made the request: the broker has to re-send the information in order to get billed. This is not because of some reasonable anonymising system for privacy or whatever; it is apparently merely because their approval system was designed before the mortgage-broker market existed and has never been upgraded.

We hear a lot of talk about Web 2.0, but has the financial sector even got to Web 1.0, really?

What do I mean by that? That there are businesses without websites? No, that is trivializing Web 1.0. To understand the value of the web for reporting information, it is necessary to take a look at Roy Fielding's REST thesis, and to take two key things from it: first, that data interchange should be rich, and second that everything important should be identified.

Information-rich data interchange

The reduction in the cost of memory and data transmission has naturally changed the economics of information, and led to the idea that organizations can be (internally and externally) generous with information rather than parsimonious. The XML phenomenon has both piggybacked on and led this change. Rather than having database reports driven by highly targeted and specific requests (to reduce the number of joins, for instance), the idea is to reduce the amount of filtering at the DBMS side and make data reports available that carry perhaps more than is necessary for any specific task, but which thereby become useful for multiple projects and new, even spontaneous, uses.
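For example (the element and attribute names here are invented for illustration, not taken from ACORD or any other standard), a rich report carries context that a narrowly targeted query would have stripped out:

  <mortgage-application id="app-2008-0042">
    <!-- the fields a billing run strictly needs -->
    <broker id="BRK-1234">Acme Brokers Pty Ltd</broker>
    <principal currency="AUD">450000</principal>
    <!-- extra context that other, perhaps unplanned, uses can draw on -->
    <product>30-year variable</product>
    <submitted>2008-09-28T10:15:00+10:00</submitted>
    <branch>Sydney CBD</branch>
  </mortgage-application>

The billing system ignores most of this, but a risk analyst or an auditor can re-use the same report without commissioning a new one.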

I once spoke with a bank lead developer who said that once they started to make this kind of open, rich information available, they immediately had mainframe congestion problems, because it turned out there were many people in the organization who actually needed this information, but for whom navigating the various procedures to get reports written was too high a cost. The traditional response is to tut-tut and say that, yes, rationing information is a key function of database management; however, adding barriers to the agility of the business is certainly not part of database management, hence the move towards XML!

The prominence of entrenched systems is a major problem for any standard XML data interchange, and it requires rethinking how much interoperability is actually possible. In a recent evaluation of a large financial standard I was involved with, it came out that the whole basis for the standard was in fact dubious, despite the fact that the stakeholders were happy with how the standard was working in practice. The standard was based on a single data interchange with shared semantics from Agent to Institution, yet in practice it was used as a protocol with several round-trips for rework and more information: the Agent would fill in only the minimal information they had to hand, and the Institution would then grade this and return it with requests for more information as needed by its business rules. Despite being concerned with financial information, the schema was fairly untyped to accommodate this interchange.

When there are large, unchangeable, entrenched end-systems, then having agreed semantics in a standard schema results in unusability: the end-system can only use information with semantics that it already has. So financial standards seem much more practical when the receiving system can be built according to the standard, and when the schemas for the standard have enough variation and fuzzy semantics to accept input from multiple systems.
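For example, here is a hypothetical fragment in W3C XML Schema (not taken from the standard in question) of the kind of deliberately loose declaration such a round-trip protocol tends to produce:

  <xs:element name="LoanAmount" minOccurs="0">
    <xs:complexType>
      <xs:simpleContent>
        <!-- declared as xs:string rather than xs:decimal, because on
             early round-trips the value may be missing, approximate
             or annotated rather than a clean number -->
        <xs:extension base="xs:string">
          <xs:attribute name="currency" type="xs:string" use="optional"/>
          <xs:attribute name="status" type="xs:string" use="optional"/>
        </xs:extension>
      </xs:simpleContent>
    </xs:complexType>
  </xs:element>

Everything is optional and stringly-typed; the Institution's business rules, not the schema, decide whether what has arrived is good enough yet.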

XBRL (Extensible Business Reporting Language) is very impressive. (At least the instance language: the XSD-based taxonomy language creeps me out.) It is particularly interesting in paying a lot of attention to making sure that, apart from the semantics or structure of the element, two issues get addressed: first, that every measurement is connected, in the same XBRL instance document, to primary metadata (for example, the precision and the currency); and second, that every fact in the XBRL document is connected, again in the same instance document, to metadata detailing the entity to which that fact relates (which company, which time period, etc.).
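As a minimal sketch of what that looks like (the acme namespace and TotalAssets element are hypothetical, and a real instance would also carry a schemaRef to its taxonomy; the xbrli and iso4217 parts are standard XBRL), every fact points back, via contextRef and unitRef, to metadata held in the same document:

  <xbrli:xbrl xmlns:xbrli="http://www.xbrl.org/2003/instance"
              xmlns:iso4217="http://www.xbrl.org/2003/iso4217"
              xmlns:acme="http://www.example.com/taxonomy/2008">

    <!-- primary metadata: which entity and which period the facts describe -->
    <xbrli:context id="FY2008Q3">
      <xbrli:entity>
        <xbrli:identifier scheme="http://www.sec.gov/CIK">0000123456</xbrli:identifier>
      </xbrli:entity>
      <xbrli:period>
        <xbrli:startDate>2008-07-01</xbrli:startDate>
        <xbrli:endDate>2008-09-30</xbrli:endDate>
      </xbrli:period>
    </xbrli:context>

    <!-- primary metadata: the unit of measurement -->
    <xbrli:unit id="USD">
      <xbrli:measure>iso4217:USD</xbrli:measure>
    </xbrli:unit>

    <!-- the fact: contextRef, unitRef and decimals tie it, inside this same
         document, to its entity, period, currency and precision -->
    <acme:TotalAssets contextRef="FY2008Q3" unitRef="USD"
                      decimals="-6">1234000000</acme:TotalAssets>

  </xbrli:xbrl>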

Making sure that data reports are never disconnected from this kind of primary metadata seems to me to be a pre-condition for proper electronic aggregation, where you need to be able to de-aggregate both for auditing and to allow the disentangling of complex information (such as derivatives).

Universal identification

The other aspect is universal identification: I have been exploring this in the PRESTO columns in this blog.

Universal identification means not that every XML document should be available on the WWW with a URL, but that every significant piece of data at every significant level of granularity in an organization should have a clear, ubiquitous, hierarchical identifier regardless of whether the information can be retrieved using that identifier at any particular point in time.

Of course, in many cases it would be desirable to be able to retrieve some representation of that information using the PRESTO URL. But in other situations, it may be useful to know "We simply don't have that information to hand" or "here is the next-best thing we have" or even just to use the identity information as part of functional specifications.
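To make that concrete, the URLs below are invented (no actual bank publishes them), but they show the idea: each level of granularity gets its own stable, hierarchical name, whether or not a GET on it returns anything today.

  http://reports.example-bank.com/lending/2008/Q3/
  http://reports.example-bank.com/lending/2008/Q3/brokers/BRK-1234/
  http://reports.example-bank.com/lending/2008/Q3/brokers/BRK-1234/applications/app-0042
  http://reports.example-bank.com/lending/2008/Q3/brokers/BRK-1234/applications/app-0042/valuation

A request for the valuation might return the figure, a "we don't have that yet" response, or a pointer to the next-best thing; either way, the identifier itself remains useful in specifications, audit trails and mash-ups.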

If XML documents need primary metadata to prevent them from being disconnected from their contexts in transit, to allow robust message-passing systems (push systems), then universal identification is needed for robust addressable information systems (pull systems), such as typical DBMS access and webpage access: it is what makes reliable mash-ups, the agile and federated re-purposing of data, possible.

Treasure-In, Garbage Out

Now I don't want to get too carried away here: Warren Buffett called derivatives "financial weapons of mass destruction" and "time bombs", and is looking wise. At a certain point of complex mixing, reliable valuation may be impossible regardless of the ease of access to component information and of price stability: indeed, my old accounting professor used to rail against accounting standards and laws which allow estimates of future earnings to be used today, as rendering accounts fundamentally non-objective (Prof. R.J. Chambers and his "continuously contemporary accounting").

Unless there is an accounting-standards emphasis on objective valuations (such as fair present value), we can have all the good standards for financial data interchange we like, and we still won't have reduced society's risk nor improved evidence-based management. Technical standards like XML make detailed reporting feasible; reporting standards like Sarbanes-Oxley help the oversight; but unless the accounting standards themselves are limited to strictly objective valuation practices, parts of our house will still be built of cards.

(Buffett's earlier comments make interesting reading: pages 13 to 15 of the PDF.)


4 Comments

If you don't want the fight to spill out here on your blog, you should delete this quickly.

As much as I agree with that, Rick, nothing sensible will be done for a few months at least. The crisis is being manipulated in the US to meet the objectives of the election. Pelosi wasn't naive; she knew her speech before the vote would destroy any chance of it passing. She had advised members of her party in danger of losing their seats to vote against it.

She bought Obama the election with our 401ks.

This is dirtier than anything I can remember in American politics, and unfortunately, it has worldwide effects. The Americans lost 6.6% approximately. The Aussies lost around 4%. The dead cat is bouncing this morning, but the loss is permanent and the effects will take some time to recover.

We can do a lot with technology. We can't fix blind ambition in pursuit of the Great Get Even.

We can provide all the technology we want (as a 'hardware, software, and services' business); but the question of whether the client runs out of cash, or can run a profitable business, is a user management responsibility.

I assume there will be a meltdown, and a lot of pieces to be picked up afterwards; but I don't know the details. I don't think anyone does.

Any recommendations ? Any productive discussion to be had ?

The world will keep on turning, same as it did after the BRM.

But a lot of people will have a greatly different outlook on it.

Simplify. As always.

The roots of what happened are in the complexity of the instruments that made it so easy to hide the depth of the rip-off. I'm afraid XML isn't up to it or the XML design culture of today isn't. We would have to kill off a lot of the people who have been promoting the excessively verbose metaphorical document designs.
