What is Great About the Web

By Eric Larson
October 20, 2008 | Comments: 6

I'm not sure many people really understand what is truly great about the Web and why it works. Most developers see the web as a technology platform and nothing more. HTML, JavaScript, and CSS are simply tools that must be used to satisfy requirements. The sad part is that this lack of understanding will cripple the web and more importantly, hurt users.

What has happened is that as the web has become mainstream among developers, the technology is now broadly understood. The positive side is that coding to web standards and providing robust applications is commonplace. The negative side of the equation is that applications have become constrained by the current state of the art, and with design patterns, best practices, and architectures essentially decided, there is less and less innovation.

The lack of innovation stems decidedly from the standards bodies. It is clear the web became popular and powerful through adherence to standards. The standards made it possible for browsers to exist at all. Everyone agreed (at one level or another) to work together, and the Web is the result. The problem is that we continue to look to standards bodies for innovation. That has obviously not worked, as the progression of the web has once again been driven by innovative ideas and libraries rather than new standards.

Looking back, what was great about the web was that everyone agreed. Whether or not the people involved wanted to agree, the result is a set of standards that effectively became the de facto laws of the Web. JavaScript is how you write client-side applications. The DOM is the way you access elements on the page. The page is made up of (X)HTML. RSS and Atom have effectively become the de facto standards for XML representations of data, with JSON being another up-and-comer. The great thing about the web is that through the browser, everyone more or less agreed on a set of technologies and it all just worked.
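To make the data side of that concrete, here is a minimal sketch with a hypothetical feed entry (the title, date, and element names are invented for illustration, not taken from any real feed): the same record expressed as JSON and parsed natively, and as an Atom-style XML entry read through the same DOM APIs the page itself relies on.

    // The hypothetical entry as JSON: parse it straight into a JavaScript object.
    // (JSON.parse is built into modern browsers; older pages relied on json2.js or eval.)
    var entryAsJson = '{"title": "A Hypothetical Post", "updated": "2008-10-20"}';
    var entry = JSON.parse(entryAsJson);
    console.log(entry.title); // "A Hypothetical Post"

    // The same entry as Atom-style XML: parse the string and walk it with the DOM.
    // (DOMParser is available in most browsers; older IE used an ActiveX equivalent.)
    var entryAsXml =
      '<entry>' +
      '<title>A Hypothetical Post</title>' +
      '<updated>2008-10-20T00:00:00Z</updated>' +
      '</entry>';
    var doc = new DOMParser().parseFromString(entryAsXml, "application/xml");
    console.log(doc.getElementsByTagName("title")[0].textContent); // "A Hypothetical Post"

Either form carries the same information; what everyone agreed on was which shapes of data they were willing to read and write.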

The issue now is that the web is ready for a true version 2.0 in terms of the de facto web APIs, but we are having trouble moving past the second system syndrome. The standards have only become bloated and have added little value. There is now a legacy requirement that was not present when the web first emerged. Generally, it is tough to see where the web is headed and how to get there.

The positive side of all this is that JavaScript seems to be the language bringing the "Great" back to the Web. The major JavaScript libraries have managed to make browser incompatibilities a manageable problem and, what's more, they are actively trying to push potential standards through practical implementations. jQuery is easily the most influential library in that it has changed the chosen query mechanism of the page from DOM traversal to CSS selectors. It has also embraced the functional aspects of JavaScript, helping to reveal its quirks as actual valuable features. Most importantly though, common APIs are surfacing between the different libraries. This is having an effect on browser development, which will eventually force standards bodies to do what they should have always done: document the current standards found in implementations.
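As a concrete illustration of that shift, here is a minimal sketch using hypothetical markup (an element with id "nav" containing links of class "external"; the names are invented, and the jQuery line assumes the library is loaded on the page): first the raw DOM-traversal style, then the equivalent jQuery call, where a CSS selector does the querying and the chainable API replaces the explicit loop.

    // DOM-as-query-mechanism style: walk the tree with the standard APIs.
    var nav = document.getElementById("nav");
    var links = nav.getElementsByTagName("a");
    for (var i = 0; i < links.length; i++) {
      // Naive class check, good enough for this sketch.
      if ((" " + links[i].className + " ").indexOf(" external ") !== -1) {
        links[i].style.color = "red";
      }
    }

    // jQuery style: the CSS selector does the querying, and the functional,
    // chainable API replaces the loop.
    $("#nav a.external").css("color", "red");

The interesting part is not the line count but that the selector syntax developers already know from CSS becomes the query language for the page.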

At the end of the day, every developer can help to improve the web by making an effort to try something new. The web is important because it has changed the lives of users. That is far from being an entirely technical task. Innovative user interface ideas and new methods of dealing with data all help to improve the state of the art, and more importantly, improve the web for users. Even though some of the core technologies of the web are showing their age, it doesn't mean we as developers can't make an effort to continue to improve the usefulness of the web for users.



6 Comments

Sigh... one more time.

By hijacking the word 'standards' and diluting the meaning for the sake of marketing power, the original web supporters guaranteed innovation would grind to a halt.

o Standards: for technology that has reached a level of market penetration where it is effective, in terms of cost and technology, to regularize it, both in the sense of working out the incompatible bits and of enabling procurement authorities to cite it.

o Specifications: for technology where a set of requirements has been formally declared to enable any group to propose and develop an implementation that can provably meet those specifications.

The web engineers have no one but themselves to blame for the slowing innovation in an environment where feedback is almost instantaneous and reach is at 100% for all affected parties.

Hey Eric,

I'm really sorry to say that after thoroughly reading your article I found little value in it.

You basically state one idea across many, many lines: "standards are great, but they hold back innovation". Moreover, you did not really argue for it; it pretty much sounded to me like a circular definition.

Moreover, I can hardly be convinced that standards are not, by themselves, a factor that encourages progress in the first place. Isn't having stable ground to build on going to encourage you to build something bigger, richer, etc.?

@claudius iacob

The problem with relying on the standard to innovate is that there is no direct tie-in with actual users. Sure, companies like Microsoft or IBM want a spec to adhere to in order to feel confident they are not wasting time, but relying on a waterfall-esque methodology for your innovations doesn't seem to have panned out in terms of creating new and innovative products.

It is definitely possible to create great technology via standards, but I don't believe it is the rule. The best way to find an audience is by producing something people want, and the recent specifications offered by standards bodies leave something to be desired in that regard.

If it is a real standard, the market has already found it useful, appealing, and accepted. That is why it is 'standard' and not just 'ratified'.

A standard can certainly build a foundation for innovation as long as the innovation can be marketed and evolves separately from the standard. It is the coupling in which no product can change unless the standard changes first that bedevils innovation. That is why the smarter markets apply both specifications and standards and separate them both by process and by the expectations set for each within the communities to which the innovations are marketed.

The web community has damaged the market's expectations of standards. In so doing, it has hobbled innovation in its products, but it is the community that does this, not the standard itself, which one is free to ignore as long as no certification marks are applied.

@len

You do bring up a good point regarding the opportunity to ignore. The problem is that ignoring the standard effectively invalidates it. This is different from something like RFC 2119 'SHOULD' semantics, which allow for a way to stray from a specification detail.

Your point about placing blame on the community for some of the issues associated with standards is also spot on. The only thing I would add is that a standard is created from a specification, written by a standards body, which is supposed to be made up of members of the community. I'm not suggesting the community is not to blame, but rather that standards bodies are part of the community, so the distinction is rather gray.

The standard may be a follow-on to a specification, but a specification should be implemented and live for a while before a standard is contemplated. My point is to ask why a standard is being written at all, who does it help, and what does it standardize?

A standard is citable legally. That should be where it gets the most use: systems that have a requirement to operate or interoperate at a rigorous level of fidelity feature-wise. It shouldn't be used to kick-start a new technology or application. That's the job of a specification.

When I say ignore, I mean avoid it altogether. A test mark is a legal emblem affixed to a product which attests that it complies with the features so signified. I don't mean avoiding features within a system and using a flag to denote that; that conflates the role of the standard and the application. A standard OR specification may provide such flags to enable levels of certified performance, but that is not what I mean by ignore.

When a consortium is formed for some market or product type, standards may be one of the things produced and authoritatively controlled by the organization. It may have different rules with regard to the IP of technical contributions. It may have policies and processes for taking IP/technology from members and including them in specifications and standards, and the rules may be different for both under the same consortium. These policies and rules should be expressed in the participation agreements for membership.

The web is not a web of standard applications. It is a web of competing applications. Of that set, some subset may be competing by complying with standards or may be implementations of public specifications. In short, a standard is not the place to begin development of innovative technology. It is the place to tame it. One doesn't create or legislate innovation.

Innovation is bred. Its parents may be standard and are certainly specified, but perhaps not both at the same time. And taking features from systems found in the wild and breeding them into the elite cultivar (the standard application) is certainly one way to bring innovation into standardized technology.

Keep in mind: first there were typesetting codes from different vendors, then Generic Codes (GenCodes) that different vendors agreed to use, then a Generic Markup Language (GML) product from IBM with a specified means for creating system-independent codes, then a standardized generic markup language (SGML) independent of any vendor because it was owned by a non-vendor standards organization (ISO), then a retrograde application of that (HTML) that grew up in the wild, and then a specification for a subset of that standard cross-bred with the requirements of the application grown in the wild: thus, XML.

XML was created many times in many places before it was ever specified at the W3C. The status as a standard made little difference except in formalizing what was already being practiced by innovative companies according to needs which the parent standard did not meet.

This process did not occur because people didn't have the web or know better: it is because they did know better and the web made no difference except in providing the wild child.
