Analysis 2009: Syndication forms the backbone of the Writable Web

By Kurt Cagle
January 6, 2009 | Comments: 2

The syndication model has long been a major facet of how the web works, but for the most part it's been a one-directional notification mechanism: you publish content, the publication updates a syndication queue, and the next time you query the queue you see the new content.

Increasingly, however, programmers are beginning to recognize that "news feeds" are messages, and as such they can be used to initiate actions elsewhere. One facet of that, hinted at in the previous section, is the notion that you can contain messages - documents - within the payload of a given Atom entry, and those payloads can in turn create or update a collection of such content. Moreover, most SOAP proponents are also moving toward the idea that if you work with message queues, it becomes the responsibility of the queue processor, not the publishing system, to take that content and interpret it based upon other factors, typically those contained in the message itself.
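The payload idea above can be sketched quite compactly. The following is a minimal illustration using Python's standard `xml.etree` library; the `urn:example:orders` payload vocabulary and the `action` attribute are invented for the example, not part of the Atom spec, which only defines the entry wrapper.

```python
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"

def make_entry(entry_id, title, payload):
    """Wrap an arbitrary XML document (the 'message') in an Atom entry."""
    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}id").text = entry_id
    ET.SubElement(entry, f"{{{ATOM}}}title").text = title
    # type="application/xml" signals that the content is an XML payload
    content = ET.SubElement(entry, f"{{{ATOM}}}content", type="application/xml")
    content.append(payload)
    return entry

def extract_payload(entry):
    """What a queue processor would do: pull the document back out and
    decide what to do based on the message itself, not the publisher."""
    content = entry.find(f"{{{ATOM}}}content")
    return content[0] if content is not None and len(content) else None

# A toy payload document carrying its own processing instruction.
doc = ET.fromstring(
    '<order xmlns="urn:example:orders" action="create"><sku>42</sku></order>')
entry = make_entry("urn:uuid:1234", "New order", doc)
payload = extract_payload(entry)
print(payload.get("action"))  # create
```

The point of the sketch is the last function: the queue processor reads the `action` out of the payload itself, exactly the inversion of responsibility described above.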

Jeff Barr of Amazon recently pointed out a new service, Tarpipe, that I find fascinating. It uses a combination of email (itself a messaging service), syndication feeds, and social network APIs to make it possible to create "programs" in which you can do things like post an email with a picture attachment to a drop box, which will then upload that image to Flickr, send out a tweet pointing to the image, and send a confirming email back to you, all through various syndication services. (I tried it myself and found that while some of it works, a few components are still in development.)
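The flow just described is, at heart, a chain of message-consuming steps. Here is a hypothetical sketch of that shape in Python; every function is a stub standing in for a real service call (the URLs and field names are invented), since the point is the pipeline structure, not the APIs.

```python
# Each step consumes the message produced by the previous one.
# All service interactions are stubbed; nothing here is a real API.

def receive_email(message):
    """Drop box: pull the picture attachment out of an incoming email."""
    return {"image": message["attachment"], "sender": message["from"]}

def upload_to_photo_service(ctx):
    """Stand-in for a Flickr-style upload; yields a URL for the image."""
    ctx["url"] = "http://photos.example/" + ctx["image"]
    return ctx

def post_status(ctx):
    """Stand-in for a Twitter-style status update pointing at the image."""
    ctx["status"] = "New photo: " + ctx["url"]
    return ctx

def confirm_by_email(ctx):
    """Send a confirmation back to the original sender (stubbed)."""
    ctx["confirmation"] = "Sent to " + ctx["sender"]
    return ctx

PIPELINE = [receive_email, upload_to_photo_service, post_status, confirm_by_email]

msg = {"from": "kurt@example.org", "attachment": "sunset.jpg"}
result = msg
for step in PIPELINE:
    result = step(result)
print(result["status"])  # New photo: http://photos.example/sunset.jpg
```

A service like Tarpipe lets users wire such chains together graphically; the interesting part is that each link is just a message handed to the next consumer.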

What intrigues me is that Tarpipe represents the mashup in reverse: rather than using client components within a web page or widget to draw in external data feeds, Tarpipe effectively creates a "mash-down," orchestrating disparate services through feeds. I expect to see more of this (and more concern about the security implications of what it represents). The irony here is that what I see emerging in all of this is ultimately what SOAP and Web Services promised in the late 1990s and early 2000s - the ability to control and update complex distributed systems via messaging - albeit with a different set of messaging protocols and a very different architectural vision.

Syndication is also at the heart of the writable web in other ways. The AtomPub protocol, which I wrote about extensively in last year's analysis, is now definitely being picked up by players all across the industry, from Microsoft to Google to IBM. At the 2008 Web Services East conference, I was gratified to see IBM showcase a syndication server, built around Atom and AtomPub, that fit this model very nicely (I believe it has since shipped as part of WebSphere Portal, but I'm not sure), though there was still too much strict client/server architecture in it.

I think one big story in 2009 will be the rise of orchestrated synchronization. Orchestration - getting asynchronous processes to work together in a fashion that still ultimately produces a meaningful result - is hard, especially when those processes run across a network as broad as the Internet. The pieces are emerging now, however; watch the W3C XProc Working Draft in particular. While XProc still tends to be a little too tightly focused on synchronous processing, it is beginning to break out of that viewpoint as people start thinking about the asynchronous side of things.
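The core difficulty named above - asynchronous steps that must still yield one meaningful result - can be shown in a few lines. This is a minimal sketch using Python's standard asyncio module; the feed names and delays are invented, and the "fetch" is simulated rather than a real network call.

```python
import asyncio

async def fetch_feed(name, delay):
    """Stand-in for polling a remote syndication feed."""
    await asyncio.sleep(delay)          # simulated network latency
    return (name, ["entry-from-" + name])

async def orchestrate():
    # The steps run concurrently and may complete in any order...
    results = await asyncio.gather(
        fetch_feed("blog", 0.02),
        fetch_feed("wiki", 0.01),
    )
    # ...but the orchestrator imposes a deterministic merge at the end,
    # which is what turns scattered completions into a meaningful result.
    merged = {}
    for name, entries in results:
        merged[name] = entries
    return merged

merged = asyncio.run(orchestrate())
print(sorted(merged))  # ['blog', 'wiki']
```

XProc describes this kind of step composition declaratively in XML rather than in code, but the orchestration problem it has to solve is the same one the merge step handles here.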

I met two of the three editors of the spec for the first time this year: Norm Walsh at the 2008 MarkLogic Conference in San Francisco and Alex Milowski at Balisage 2008 in Montreal (I've corresponded with Henry Thompson of the University of Edinburgh over the years, but haven't yet met him, alas). Each of them sees XProc as filling a very real niche, not only by making it possible to create pipelines (and to black-box pipelines for abstraction purposes) but also by helping people understand that such pipelining architectures ultimately represent the real future of messaging and syndication services.

XProc is now in Candidate Recommendation status, so unless something radical changes, it's likely that XProc will end up a full Recommendation by March 2009. On its heels is another W3C spec worth looking at: the Service Modeling Language (SML), which despite its name (the W3C really needs someone to come up with more appropriate names) is in effect an official endorsement of the use of Schematron alongside the W3C XML Schema Definition Language (XSD). Schematron, written by O'Reilly blogger, standards expert, and avowed hermit Rick Jelliffe (sorry, Rick, couldn't resist), provides a constraint language to be used in conjunction with other schema languages, making it possible to use XPath to determine whether a given element or attribute pattern holds a value that is valid under the business logic. Combine SML/Schematron with XProc and you have the ability to create extraordinarily rich pipelines capable of processing according to dynamic business rule changes.
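Schematron's model - a context that selects nodes plus a test that asserts a business rule over them - is easy to mimic. Below is a sketch in Python: the contexts are real XPath expressions handled by `xml.etree`, but the tests are Python predicates standing in for Schematron's XPath assertions, since the standard library's XPath subset can't compare values. The rules and document are invented for illustration.

```python
import xml.etree.ElementTree as ET

RULES = [
    # (context XPath, test predicate, failure message)
    (".//item", lambda e: float(e.get("price", "0")) > 0,
     "item price must be positive"),
    (".//item", lambda e: e.get("sku") is not None,
     "item must carry a sku attribute"),
]

def validate(doc):
    """Run every rule's test against every node its context selects."""
    failures = []
    for context, test, message in RULES:
        for elem in doc.findall(context):
            if not test(elem):
                failures.append(message)
    return failures

doc = ET.fromstring('<order><item sku="42" price="9.95"/><item price="-1"/></order>')
for failure in validate(doc):
    print(failure)
```

The second `<item>` trips both rules. In real Schematron the same checks would be `<sch:rule context="item">` elements with `<sch:assert>` tests, which is exactly the kind of business-logic validation that grammar-based schemas handle poorly.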

Expect to see a number of working applications built around these principles well into 2009.

One final area to watch in this space is Twitter. While I'm not exactly a hard-core twitterer, I've begun trying to pass on the cool things I find each day over Twitter, not only as a means of communicating with others but as a relatively painless way to bookmark them with context. I see Twitter in conjunction with Tarpipe becoming a powerful tool - in essence, you can launch cascading services directly from your browser (I actually have two Twitter editors in Firefox that I alternately launch from). Again, while I think Twitter will stay largely confined to the blogosphere, its utility as both a status-message carrier and a command line shouldn't be overlooked.

Regarding possible future use of syndication technologies and transports (Atom/RSS) for complex, mission-critical messages or transactions: I find that some of the mechanisms currently missing from them are found in today's TDN (Transaction Delivery Network) technologies - delivery assurance, security/encryption, and error recovery.

As these shortcomings are addressed, it's likely only a matter of time before this form of message transport, based on the widely adopted RSS/Atom architectures, becomes mainstream.

I've recently seen some very interesting things done with a combination of Atom and XMPP. I agree that Atom over HTTP lacks both delivery assurance and error recovery, although a compelling case can be made that Atom over HTTPS does address security/encryption. However, using XMPP networks as the primary transport, with Atom acting as the semantic messaging layer, answers both delivery assurance and error recovery.

For that matter, Atom over SOAP has similar properties, and moreover it unites the characteristics SOAP has (autonomous messaging) with the publish/subscribe characteristics of Atom/AtomPub. In both the XMPP and SOAP messaging cases, Atom is effectively tunneling through the other protocol.
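The tunneling described in these comments is structurally simple: the Atom entry rides unchanged inside the other protocol's envelope. A minimal sketch, again with Python's `xml.etree`, using the real SOAP 1.1 and Atom namespaces but an otherwise invented entry:

```python
import xml.etree.ElementTree as ET

SOAP = "http://schemas.xmlsoap.org/soap/envelope/"
ATOM = "http://www.w3.org/2005/Atom"

def wrap_in_soap(atom_entry):
    """Carry an Atom entry inside a SOAP 1.1 Body: SOAP supplies the
    messaging semantics, Atom supplies the payload vocabulary."""
    envelope = ET.Element(f"{{{SOAP}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP}}}Body")
    body.append(atom_entry)
    return envelope

entry = ET.Element(f"{{{ATOM}}}entry")
ET.SubElement(entry, f"{{{ATOM}}}title").text = "Tunneled entry"

envelope = wrap_in_soap(entry)
# The entry is recoverable unchanged from the SOAP body on the far side.
body = envelope.find(f"{{{SOAP}}}Body")
print(body.find(f"{{{ATOM}}}entry") is not None)  # True
```

An Atom-over-XMPP arrangement would look much the same, with the entry nested in an XMPP stanza instead of a SOAP Body; in both cases the outer protocol contributes delivery guarantees the bare HTTP transport lacks.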
