Doom and gloom seems to be the order of the day. The stock markets seem to be making daily 4% to 5% drops, followed by the very occasional 5% to 6% surge (perhaps the epitome of random walks happening within a vector field, but that's probably getting a little esoteric). The US automotive industry appears to be on its last legs, the investment mega-banks now trying to pretend that they are commercial lending institutions seem to be disintegrating nonetheless, and even in the tech sector companies large and small are shedding workers before going under themselves.
It's been a long time since most sectors of the US economy have seen a recession of this magnitude, and comparisons between the collapse of 1929 and the collapse of 2009 are inevitable. In both cases, the economy seized up because malinvestment, poor oversight and greed-induced stupidity finally caught up with the titans of finance (and their political agents), and the velocity of money ground to a halt.
So does this mean that the world's going to start turning a faded sepia and proceeding along in the herky-jerky frame-rate of ancient box movie cameras? Whither the railroad hobos, the prohibition gangsters, the gumshoe detectives? I want to see Okie sharecroppers in threadbare overalls staring at sand-storms, dammit!
For all that there are some fairly haunting similarities between the thirties and the twenty-oughts (I wish SOMEONE would figure out what you call the first decade of a new century), there are also some profound differences, and these differences are very likely to shape both the current crisis and our response to that crisis.
One of the first things to keep in mind is that this is a global crisis. The crisis of the 1930s was "global" as well, but that globe had about a third as many human beings on it, the global economy was about a tenth its current size, and the various economies were at best only loosely integrated (and even that loose integration applied mainly to the nations of Western Europe; most of the world was effectively running on its own internal economies).
In 1930 commercial air travel was in its infancy, the telephone was still very much a luxury item (only about 20% of the US population owned a telephone in 1929) and many parts of the country were not yet electrified. It was possible (just) to get from one end of the US to the other in an automobile, but most of that involved travelling on dirt roads or, if you were lucky, gravelled stretches. The average twenty-one year old had only a high school level of education (if that), and only about one in eight people of that age had ever been to college. The US was very much a resource economy - producing food, oil, wood, metals, and so forth - and many people in Europe felt that US-made finished goods were far inferior to what could be produced at home.
Fast forward eighty years, and the world is astonishingly different. The global population has tripled (and is going to go up by another 3-4 billion people by 2050, by most estimates), and the economy is far larger and more complex. The US is no longer a resource economy (though its neighbor to the north, Canada, most certainly still is), but neither is it a manufacturing economy. That's not to say that the US doesn't have manufacturing - even in an economy that isn't a "manufacturing" economy, the US still produces nearly a hundred times the amount of finished goods (in constant dollars) that it did eighty years ago. It's just that the most significant things that the US currently produces are ideas.
Now this may seem counterintuitive. You can't eat ideas. You can't wear them, you can't power your car with them, you can't shelter from the weather with them. Yet the fact that the US has created both a large commercial software sector and a shadow "open source" sector, has established itself as the primary producer of entertainment (including the increasingly virtual-worlds entertainment sector) and continues to play a large role in the technological infrastructure space (especially on the network protocol side) indicates that those ideas nonetheless have a certain fundamental substance to them.
I think it can be argued that the biggest damage the last eight years have wrought has been to our idea-making faculties. This is not an ideological argument - after the World Trade Center buildings fell in 2001, the country, from its leadership to the man in the street, became fixated on security. The problem with security is that while a certain level of security is prudent, too much of it stifles communication (and hence innovation). People who are fearful do not reach out to others, but instead withdraw inward.
Moreover, too much emphasis on security means that you become risk averse - and as such become vulnerable to hucksters who sell you on risk mitigation tools and risk-free investments that prove in the end to be far more risky than one thought, because risk - informational uncertainty - can never really be removed from any transaction ... it can only be transferred. In the end, someone has to pay for that risk.
The best ideas are risky ... and are often not even beneficial to the originator. At O'Reilly we recently had a discussion about the distinction between invention and innovation. Invention, the creation of truly novel ideas, especially the paradigm changers, is comparatively rare. It requires focused dedication, persistence, intelligence and a willingness to fight the status quo. This is because the status quo - our society overall - is resistant to the idea of change, and inventions by their very nature bring change.
Innovation, on the other hand, is much more incremental in nature, representing the improvement of existing foundational ideas in order to apply them to different domains (or markets). Innovation is not disruptive and is for the most part fairly predictable, but it comes at the cost of diminishing returns.
Consider, for instance, the cell phone. The cell phone is an invention that (like most inventions) has obvious antecedents. Radios, of course, have been around since the pioneering work of both Guglielmo Marconi and Nikola Tesla in the 1890s. Yet the invention of the cell phone, which is of course based on the radio, required the development of small, light-weight and powerful batteries and solid-state electronics, the deployment of sufficient cell towers, and a particular 1960s science fiction show in order to come into reality. It also required a change in regulatory stance at the FCC, something that took from the late 1940s to 1967 to accomplish.
Even given that, cell phones didn't become a practical reality until 1973, when Dr. Martin Cooper of Motorola initiated the first cell phone call. Pointedly, the call was made to his technology rival, Joel Engel, Bell Labs' head of research. This also suggests (as does the case of Marconi and Tesla) that competitive one-upmanship may be a critical factor in invention.
For all that, going from the brick-like phones of yore to the flip-phone communicator of Star Trek would take another twenty-five years, and going from that to the iPhone - in which the ever-increasing sophistication of the computer chips within mobile phones finally reached a stage where the phones essentially became computational platforms in their own right - took another decade. Each of these is an innovation, an improvement of the core concept, but we are reaching a stage where the effective value of each innovation is smaller than that of the one before it.
Note the year, however, that the cell phone came out - 1973. This was a tumultuous time for business - oil prices had shot up to stratospheric levels, high inflation was becoming accepted as normal and the economy was definitely in the doldrums. In the mid-1970s, there were comparatively few "professional" computer programmers, and the ones that did exist were typically classed either as a rather oddball form of clerical worker or as "computer scientists".
Very few inventions are completely novel. The cell phone's most obvious antecedents were the car phone and the radio, just as the radio's antecedents were the telegraph and the telephone, and the conceptual idea of the cellphone (and the radio, for that matter) had been around for years or even decades before either of them appeared. Yet typically in order for invention to happen, the technology has to be in place to support it, and moreover what's needed more than anything else is the time to make mistakes.
Invention occurs when the downside to taking risks is lowest, which typically means a time when the expectation of making money from the invention is also lowest. This was perhaps the single biggest flaw in the incubator concept of the 1990s, and it explains the disappointingly small number of truly innovative concepts that come about when developers have to confront third-quarter earnings.
Typically, in both cases, what you have is an investor who has gathered a bunch of "idea" generators together in the hope that they will create a world-changing new technology (whether the investor is a venture capitalist or a company CEO investing in R&D is beside the point). These idea-generators may put forth a lot of ideas, some very good, but none of them good enough to become a game-changer, and the investor watches as the money he has invested begins to disappear. As the money gets closer to the empty mark, the investor becomes increasingly impatient with his idea-generators, and starts to restrict their ability to make mistakes. Eventually, the atmosphere becomes so toxic that the ones most likely to make those game-changing innovations leave (and in many cases, once released from the confines of expectations and allowed to experiment on their own, come up with the very game-changer that the investor was hoping to see).
The open source community has learned this lesson repeatedly over the years. Many of the most inventive projects in the open source community came about because the developer needed to solve a problem, and having solved that problem recognized that there might be other things to which the solution could be applied. Solving that initial problem was the true invention, and more often than not it was not done through "sanctioned" channels, but was done at a time when, ironically, the pressure to innovate was lowest.
This is an instructive lesson to keep in mind as the economy continues to fall apart. If you measure your personal success by the amount of money that you have, it's going to be a hard time - we are in a deflationary spiral right now that will have a fairly dramatic effect upon nominal wages. The money that you make in your next job will possibly be less than what you are making now - maybe much less. Of course, this will be true for most people, and there are a number of indications that relative wages will also, in the long run, become more equitable as a consequence. Moreover, as money drains out of the system, those dollars become more valuable. Thus, as painful as deflation is in the short term (and it can be devastating for many), once the system reaches an equilibrium, wages and compensation actually tend to be better in relative terms.
Be that as it may, technologists - both those who create the technology and those who are most adept at using it - are also by nature problem solvers. We are entering an era that military types would describe as a target-rich environment ... there are problems all over the place. When times are good, there are generally fewer problems that need to be solved, and as such people trying to make money from technology try to manufacture problems that their particular innovation can solve - when in fact the only real problem involved is that these people don't have as much money as they'd like. The reality is that when times are good, most people's response to difficulties is to "throw money at it" - which usually ends up deferring the actual resolution of the problem by making it someone else's.
When times are bad and money becomes scarce, the problems typically become much more intractable, and throwing money at the problem becomes much less attractive. Ironically enough, this is probably a good thing - the money serves only to get in the way. This is a time for inventors.
There are problems all across the spectrum. Climate change is an intriguing one, because we face the reality of having to decrease carbon footprints while demographics have already baked in a further 50% rise in population between now and the middle of the century. Behavioural changes can make some difference there, but changing societal behavior patterns can take an incredibly long time (on the order of decades or even centuries), especially when the changes involved happen on a time-frame that, while blazingly fast on an ecological scale, still appears slow to human awareness. This means that we need to apply technology not only to handle the physical problems (reducing the production of effluents) but also the behavioural issues.
Similarly, energy and water issues (which are related to one another and to climate change) require that we apply technological sophistication and novel ways of thinking to the process of ensuring our energy and water needs are met. The goal here is not to make a profit, but rather to solve a very real problem (though the best solutions often have the potential to generate profits, just not necessarily from the most obvious direction).
On a societal level, we're faced with problems of similar scope. It's becoming obvious that the economic system that was built around the needs of a post-World War II global economy is no longer effective in an age of near instantaneous communication, floating currencies and a still growing global population. In the past, such changes were made by economists, typically ones with strong political biases. We as systems theorists need to recognize that the economy is a system, and like most systems it can be shaped and programmed - while at the same time it is a very non-linear system in which risk and value play a very real part. Perhaps it is time for geeks to start viewing themselves as actors in the economic sphere, as they are typically the ones called upon to build the pieces of that system in the first place.
Similarly, the whole notion of work and compensation is undergoing a profound change. The days when people drive sixty miles each way every day in order to sit in an office are coming to a close, having come to be seen as terribly inefficient by contemporary priorities (even if they made sense at some point in the past). The way that governments tax people is also out of kilter with the increasingly distributed viewpoint, as is the nature of many of the jobs (or the values provided by those jobs), yet we continue blithely to base policy upon models that are no longer relevant.
Our educational system is similarly out of kilter, because the educational establishment itself has not even begun to come to terms with the concept that we live in an information-rich but context-poor world. If the strength of our society comes from our ability to manipulate information to determine the point at which invention becomes feasible, then not teaching these skills is not only counterproductive but ultimately self-defeating.
The emergence of open source technology, open standards and agile methodologies is no accident. Rather, they have emerged out of a spirit of inventiveness that is only peripherally related to the investor's desire for profit, and indeed they constitute a necessary survival structure for a post-capitalist society. They are themselves inventions, as much as the radio and the cell phone, that we have created in order to handle our needs at this stage in our society. Existing structures - governmental, business, political - are either disintegrating or (where agile enough) reorganizing in order to accommodate that same perceived need.
Our role, as technologists, is to facilitate that change, to make society agile enough to respond to the things beyond our control and to intelligently shape those things that we can control. This need isn't going away simply because the paycheck is going away - if anything, the opposite is true. We have unresolved problems in our society, and until we learn how to reprogram that society to better handle those problems, the paycheck will likely not be back anytime soon.
Kurt Cagle is a writer and technologist, and is an Online Editor for O'Reilly Media.