Every so often I entertain rather disturbing ideas as I survey the large and varied landscape of computer programming, and this particular question - "Are computer languages irrelevant?" - popped up one day as I was thinking about cloud computing, web services and the landscape of code in which we programmers find ourselves so deeply enmeshed.
Certainly the question "which is the best computer language?", thrown into a room full of programmers, is likely to leave only a couple of them - bloodied and battered - to emerge from the room alive, and then only so long as you don't get into different versions of the same language. Yet increasingly I wonder whether the particular languages themselves really matter all that much.
In other words, the advantages of using a language that provides detailed control over memory management and garbage collection (and as such lets developers move "closer to the metal") are outweighed by the simplicity of higher-level languages ... or, put another way, the productivity gained by using scripting languages to develop and maintain applications usually far outweighs the inefficiency of the resulting code - performance, in fact, isn't the deciding factor.
Certainly there are plenty of counter-examples where performance is a factor, but over time they are becoming rarer, outweighed by factors such as bandwidth and network latency. Most desktop applications written today have a direct pipeline to the web, whether for updating state, storing entities, communicating with others via a server or verifying the user's identity. Not a few of these "web-enabled" applications are being replaced, in fits and starts, by true "web apps", written to run within a browser, albeit one with its own localized data store.
The history of computing is a history of abstraction and virtualization. Most contemporary programmers are probably unaware that the earliest programs were completely linear - the idea of procedural modules began as a convenience for minimizing duplicated code: the association of a name (and from there a series of slots representing input and output parameters) with a given set of instructions.
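To make that earliest abstraction concrete, here is a minimal sketch - in Python, purely for illustration, since the language genuinely doesn't matter here - of what a named procedure with input and output "slots" buys you:

```python
# A named procedure: a set of instructions bound to a name, with "slots"
# for inputs and an output, so the logic is written once rather than
# duplicated inline at every point it is needed.

def monthly_payment(principal, annual_rate, months):
    """The name plus its parameter slots form the whole abstraction."""
    r = annual_rate / 12.0
    return principal * r / (1.0 - (1.0 + r) ** -months)

# The same instructions, reused with different inputs:
print(monthly_payment(10000, 0.06, 36))  # a three-year loan
print(monthly_payment(25000, 0.05, 60))  # a five-year loan
```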
Object-oriented programming is often seen as the next stage of that process, but most people fail to appreciate that the real "next stage" was the deployment of discrete modular components that exposed specific properties and methods according to a common standard of integration. In its own way, Visual Basic was perhaps more important than C++, because it most clearly established the principle that if you build that common framework, you can drop in anyone's component, not just the ones you write yourself or that come "out of the box". Componentization of software can thus be seen as a layer of virtualization.
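A hedged sketch of that principle, with invented names standing in for the "common framework" and for a vendor's drop-in component:

```python
# The "common standard of integration" as a sketch: the host application
# codes against an agreed contract, and any vendor's component that
# honors the contract can be dropped in. All names here are hypothetical.

from abc import ABC, abstractmethod

class SpellChecker(ABC):
    """The shared contract: the properties and methods everyone agrees on."""

    @abstractmethod
    def misspellings(self, text: str) -> list[str]:
        """Return the words this component considers misspelled."""

class VendorASpellChecker(SpellChecker):
    """One vendor's drop-in implementation; the host never sees inside it."""

    def misspellings(self, text: str) -> list[str]:
        return [w for w in text.split() if w.lower() == "teh"]

def proofread(text: str, checker: SpellChecker) -> None:
    # The host knows only the contract, not who wrote the component.
    print("Flagged:", checker.misspellings(text))

proofread("Fix teh typo", VendorASpellChecker())
```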
HTML helped to embody that concept perhaps better than anything else to date. There is nothing intrinsically magical about an HTML document - until such time as it is reified as a web page. It is a pure abstraction, and the mechanism that converts that abstraction into something of substance is the browser itself, written in some computer language or other.
Yet the wonder of that HTML page is that, usually with the exception of very minor discrepancies, the same document will render in more or less the same way regardless of what language is used to implement the rendering and functionality. The reason, of course, is that the world has standardized on the expected behavior of such documents, and as such the rendering language is largely irrelevant. The qualifications I'm giving here reflect the fact that this is still a trend: not all browsers conform to these standards yet (though most are giving it a good try).
Virtualization of the operating system is also taking place. Most languages designed this decade do not talk directly to the underlying system. Instead, the language runs on a virtual machine that in turn performs the execution at the lowest level. Web browsers are employing this strategy as well, to the extent that they are becoming increasingly difficult to distinguish from operating systems in terms of their abstractions - they have to be, in order to support those abstractions across platforms.
Web services represent the other aspect of this virtualization, where the term describes network-based componentization: localized client components embedded within applications that communicate with the server as a data abstraction. Mash-ups are of course a very visible embodiment of this principle, but componentization also takes place on the server, where the flat "scripts" of web development a decade ago are being replaced by distinct components that, as often as not, are themselves represented by a declarative abstraction.
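As a sketch of how thin that client-side layer can be, consider the following Python fragment. The endpoints and field names are hypothetical placeholders, not real services - the shape of the glue is the point:

```python
# Each remote component is reached purely as a data abstraction - JSON
# over HTTP - and the mash-up is just a few lines of glue. The URLs and
# field names below are invented for illustration.

import json
from urllib.request import urlopen

def fetch_json(url: str) -> dict:
    """Treat a remote service as nothing more than structured data."""
    with urlopen(url) as response:
        return json.load(response)

def mash_up(events_url: str, weather_url: str) -> list[str]:
    events = fetch_json(events_url)    # one server component
    weather = fetch_json(weather_url)  # another, possibly elsewhere
    # The languages behind either service are invisible from here.
    return [f"{e['name']} ({weather['forecast']})" for e in events["items"]]
```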
With both client and server, componentization clearly demarcates a boundary between internal component code and glue code. As components proliferate (and the standards for integrating them coalesce), the imperative glue code in turn disappears - in fits and starts, admittedly, but the direction is clear. With component standardization, the language a component is written in becomes secondary to its expected behaviour - the computer language becomes irrelevant.
This same process of virtualization is now taking place at the hardware level, and it is fundamental to cloud computing. Run a hypervisor to manage the machine instances, and you can create a virtual computer within minutes, each one running its own distinct operating system with applications not only loaded but in many cases already running.
The machine - the hardware - becomes irrelevant, an abstraction; as a developer or IT manager you are far more likely to be concerned with the virtualized machine than the physical one. Standardization is already under way in the virtualization market, making it possible for a given virtual computer to be created not just on company X's infrastructure, using its memory and processing systems, but on company Y's as well, through the same web interfaces - in a manner that makes the transition from X to Y invisible to the end user.
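What that provider-neutral standard might look like from the developer's side is easy to sketch. The REST interface below is entirely invented - no real cloud API is implied - but it captures the idea that moving from X to Y should amount to changing a base URL:

```python
# A hedged sketch of provider-neutral provisioning. The endpoint and
# request fields are hypothetical - no real cloud API is implied.

import json
from urllib.request import Request, urlopen

def create_instance(provider_url: str, image: str, memory_gb: int) -> dict:
    """POST a machine spec to whichever provider exposes the interface."""
    spec = json.dumps({"image": image, "memory_gb": memory_gb}).encode("utf-8")
    request = Request(
        provider_url + "/instances",
        data=spec,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urlopen(request) as response:
        return json.load(response)

# Migrating from company X to company Y changes one argument:
# vm = create_instance("https://compute.x.example", "ubuntu-lts", 4)
# vm = create_instance("https://compute.y.example", "ubuntu-lts", 4)
```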
If this model sounds familiar, it should be - it's basically the same model that public power utilities have used for trading load surpluses and deficits with one another. It makes it possible to create system instances on demand, and as demand picks up or drops off, underutilized computer farms can pick up the slack.
Yet the point to keep in mind with all of this is that throughout this decade and into the next, localized, specialized "one-off" implementations in imperative language A vs. language B are also going to disappear, in favor of standardized abstract declarative descriptions in which it is the behaviour, not the language, that is standardized.
This doesn't, of course, mean that the languages "go away". Declarative state diagrams are meaningless without the ability to turn those diagrams into some formal (imperative) behaviour ... but the specific languages used to implement those behaviours are irrelevant to the overall discussion of the system.
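A tiny sketch makes the division of labor plain: the state diagram is pure data, and the engine that animates it - written here in Python, though it could be anything - is interchangeable:

```python
# The declarative part: states and transitions as plain data. This could
# just as easily be XML or JSON authored in a graphical editor.
TRAFFIC_LIGHT = {
    "green":  {"timer": "yellow"},
    "yellow": {"timer": "red"},
    "red":    {"timer": "green"},
}

def run(diagram: dict, state: str, events: list[str]) -> str:
    """The imperative part: a generic engine that walks the diagram.
    Rewrite it in Ruby, Java or C and the behaviour is unchanged."""
    for event in events:
        state = diagram[state][event]
    return state

print(run(TRAFFIC_LIGHT, "green", ["timer", "timer"]))  # -> red
```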
This obviously has relevance to tech publishing (as well as to development, job seeking, college curricula and IT strategies) in the very near future. Componentization by its very nature means that you have a third-party industry of component development at every level of abstraction, along with a large floating pool of integrationists (a.k.a. mash-up artists).
Additionally, the more componentized the industry and the higher the degree of abstraction of the componentization layer, the more likely that these same integrationists will not necessarily need to be programmers at all, but instead will be "power users" who are also domain experts in their respective industry verticals.
They will need to know their domain-specific languages - some XML representation, most likely - but even here they will seldom be exposed to the raw markup itself. Most XML component models tend to be composable, which in turn means they can usually be rendered neatly via any number of "drag-and-drop" graphical UIs or forms-based editors. Thus, while it is still necessary (indeed, far more necessary) that the user understand the specific domain model, the need to understand that model on a programmatic level is going to continue to decline.
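A small example of such a composable model - the vocabulary here is invented, but any order-processing vertical would look much like it. A domain expert fills in a form, a tool emits the markup, and a few lines of code (in any language) consume it:

```python
# A hypothetical XML domain model: composable elements a forms-based
# editor could emit without the author ever seeing the raw markup.

import xml.etree.ElementTree as ET

ORDER_XML = """
<order id="1042">
  <item sku="A-17" quantity="2" price="9.50"/>
  <item sku="B-03" quantity="1" price="24.00"/>
</order>
"""

order = ET.fromstring(ORDER_XML)
total = sum(
    int(item.get("quantity")) * float(item.get("price"))
    for item in order.findall("item")
)
print(f"Order {order.get('id')} total: {total:.2f}")  # Order 1042 total: 43.00
```

The expert's job is to know that an order has items with quantities and prices - not to care whether Python, Java or anything else does the tallying.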
The long-term effect of this is to continue to grow the "designer" at the expense of the programmer, but you will also see the rise of ontologists (data modelers) who model the respective domains in the first place. The closest analog to these roles in most existing IT shops is the application architect, except that what is being architected is not a code API but a domain-specific language (though it can be argued that the one is essentially an early form of the other).
This is certainly an ongoing process. Are the differences between Ruby and Java important today? Yes. Are they as important as they were five years ago? No. Will they be less important five years from now? Almost certainly.
Distributed services, componentization and virtualization are the hallmarks of the emerging generation of programming. While cloud computing is certainly a part of this phenomenon, that "cloud" is not just the server layer; it is also the dynamic, shifting sea of mobile devices, laptops and intelligent objects. For these to communicate with the cloud and with one another, building around a specific computer language is not only irrelevant but potentially a barrier - the mechanism of communication is the standard protocol. This in turn informs the methodology of software development, and from there the business community that makes use of this technology.
Kurt Cagle is an Online Editor for O'Reilly Media.