Southern California Linux Expo: freedom in a service economy, and more

By Andy Oram
February 21, 2009

SCALE is a comfortably-sized conference with a grandly scoped mission. It ranges from a beginner track where attendees can ask, "How does free software benefit me?" to a developer track that dissects kernel APIs and vulnerabilities. Business models, women in computing, and intellectual property are all discussed. And every session I attended had a full room. In this blog I'll discuss the things I heard today.

Keynote: Kuhn on application service providers

Bradley Kuhn of the Software Freedom Law Center (and formerly of the Free Software Foundation) devoted his keynote to the difficulties posed by Software as a Service and application service providers, which I recently wrote about. After decrying the tendency of social networks and services toward lock-in and forcing users to relinquish control over their data (often with undesirable privacy policies), Kuhn joined some other observers in suggesting the alternative of designing peer-to-peer, federated systems.

The two examples cited by Kuhn are Jesse Vincent's Prophet and the identi.ca microblogging service. The premise of Prophet is that each user keeps control over his or her own data. The premise of identi.ca is to propagate data to other services as requested by the user, and to allow the user to extract data easily. (These are both in very early stages, so documentation is lacking.)

In particular, identi.ca includes a protocol for distributing microblogs that other sites could use to interoperate. Currently, identi.ca allows a user to automatically repost microblogs on Twitter, but Twitter doesn't reciprocate.
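The talk didn't go into the protocol's details, but the federation idea can be sketched in a few lines of Python: each node pushes new posts to every peer that follows it, so posts can cross service boundaries in both directions. All the names and the data model here are invented for illustration; they are not identi.ca's actual protocol.

```python
class Node:
    """A microblogging node that pushes posts to federated peers (hypothetical sketch)."""

    def __init__(self, name: str):
        self.name = name
        self.timeline: list[str] = []
        self.peers: list["Node"] = []

    def follow(self, other: "Node") -> None:
        # Ask `other` to push its future posts to us.
        other.peers.append(self)

    def publish(self, text: str) -> None:
        post = f"{self.name}: {text}"
        self.timeline.append(post)
        for peer in self.peers:
            # Federation: the post crosses service boundaries automatically.
            peer.timeline.append(post)

identica = Node("identi.ca-user")
twitter = Node("twitter-user")
twitter.follow(identica)   # reposts flow identi.ca -> Twitter...
identica.follow(twitter)   # ...and, with an open protocol, back again

identica.publish("hello, federation")
print(twitter.timeline)  # ['identi.ca-user: hello, federation']
```

The asymmetry Kuhn criticized is visible here: today only one of the two `follow` calls actually works.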

Avoiding lock-in becomes complex in a Web 2.0 world of peer production. For instance, it's generally easy to extract data from your own home page on a social network, but not to go through all the friends' sites and forums where you may have posted and bring your contributions back. This issue is the reverse side of the recent flap over a social network's attempt to change its privacy policy. On one hand are the problems faced by users who want to port their accounts and personas to a new service; on the other are the problems faced by a service when users leave.

One audience member pointed to the notorious fickleness of social network members, who regularly dissolve and regroup on the latest, coolest service. Kuhn said an open protocol for data exchange would facilitate this and promote innovation by making it easier to try new services.

I then pointed out that portals and social networks desire lock-in because they want more eyes in order to sell more advertising. I gave Kuhn an opening by declaring that the advertising-based business model is not sustainable for most services, and he laid down the position that software developers should make their first priority designing software that's good for the users, and then let business models be found for them.

Kuhn idealized the past a bit, in my opinion, oversimplifying the history of service development. He said that user freedom was not challenged by the first Internet services because they consisted of distinct software programs in a client/server paradigm. So long as you could run your favorite mail user agent (client), you didn't have to worry whether the server was Microsoft Exchange or Sendmail. Each side was free to make its choice, secure that RFCs would define the protocol for exchanging mail.

But in fact, there are a host of implicit contracts between the client and the server, going far beyond RFCs. Just taking up Kuhn's email example, we have all had the experience of sending a message that never got to its destination and not knowing why. Was it because we attached a file that exceeded a certain unstated size? Was it because our text included the word "specialist"? (That word contains a common prescription drug name as a substring.)
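The "specialist" anecdote comes down to naive substring matching. A minimal Python sketch (the blocked-term list is invented for illustration) shows how such a filter silently produces false positives:

```python
# Naive substring-based mail filter. "specialist" contains the drug name
# "cialis" as a substring, so an innocent message gets flagged.
BLOCKED_TERMS = ["cialis", "viagra"]

def is_flagged(message: str) -> bool:
    """Return True if any blocked term appears anywhere in the message."""
    text = message.lower()
    return any(term in text for term in BLOCKED_TERMS)

print(is_flagged("Please ask our specialist about the report."))  # True: false positive
print(is_flagged("Please ask our expert about the report."))      # False
```

The sender never learns which unstated rule was violated, which is exactly the hidden-contract problem.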

The flap over Comcast's restrictions on BitTorrent traffic shows that the Internet is rife with unstated contracts and hidden violations of those contracts, going down to the TCP level.

Certainly I'd love to see the early promise of peer-to-peer (which was first applied to applications around 2000) fleshed out in a new generation of services. But the problems of peer-to-peer were already evident to researchers at that time. They are scattered through the book I edited on that subject and summarized in two papers I wrote in 2004: From P2P to Web Services: Addressing and Coordination and From P2P to Web Services: Trust.

Further discussion of these issues takes place on autonomo.us and DataPortability.

Bacon on building community

Ubuntu Community Manager Jono Bacon (who is writing a book on community for O'Reilly) put some of the basic principles on which he operates into a one-hour session. Strong community, ironically, depends partly on honoring and promoting individuals. Someone who posts answers to forums, contributes software patches, or helps the community in other ways should be recognized and allowed to build a reputation. This is social capital that benefits everyone in the community.

A project must also be careful to constantly cultivate diversity: don't get stuck going back to the same people over and over for advice.

One of Bacon's most interesting revelations concerned how he views his own role as a community manager. He said that individual teams or local user groups have already built up reputations among their members, and with them the group's social capital; his main job is to bind these teams together by allowing them to share it.

He ended by taking this mission to a higher level and urging the audience not to get stuck on their individual projects, whether they be Ubuntu or Fedora or KDE or whatever, but to view the whole free software space as a community to which we all contribute and from which we all benefit.

Patents: the impedance mismatch with software

Rob Tiller, VP and Assistant General Counsel for IP at Red Hat, covered software patents with a focus on the recent Bilski case, which I covered in a long article. He pointed out that the majority decision took an unusual direction: looking not just at legal principles such as what is patentable, but at the broad effects of granting a patent on innovation. The judges admitted that too broad a patent could hinder innovation, and particularly tried to protect fundamental processes from patents. Tiller called for coordinated defensive efforts (such as Linux Defenders and Peer-to-Patent) and for incremental changes to patent law that Congress and the courts could conceivably be expected to institute.

After following arguments over patents and writing articles about them for years, I've identified two fundamental ways patents conflict with software practice, particularly open source.

Secrecy versus sharing

Patent law is based on the assumption that innovations are developed in secret, an idea I started to develop in a recent article. If you show off your invention in someone else's kitchen, it has become public and you've lost the right to patent it. (A historic court case turned on that incident.) NDAs are rampant in industry to preserve secrecy until the patent application is filed. A patent is a conduit by which a trade secret can become public information.

Contrast this philosophy with the open source movement, where someone throws a couple ideas up on a blog or releases the first scraps of source code as soon as he can get his thoughts down on a screen. Given the burgeoning research on the value of group discussion and diverse contributions, modern innovative practices may over the long term render the patent system irrelevant.

Software innovation as an economic rather than a technological activity

To programmers, the unpatentable status of software is obvious. A meandering trail of court decisions leading up to Bilski establishes the precedent that a mental process has to run on a machine (unless it can be shown to have some other tangible result) in order to be patented.

But programmers know that their mental processes never have to run on a machine. I don't need a machine to find the prime factors of a 200-digit number; just let me put a bunch of people to work with pencils and paper for a long enough time. The machine is incidental.
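The pencil-and-paper procedure behind that claim is ordinary trial division; rendering it in Python only speeds up steps that people could, in principle, carry out by hand. The function name and the sample number are my own choices for illustration.

```python
def prime_factors(n: int) -> list[int]:
    """Trial division: the same arithmetic a roomful of people could do on paper."""
    factors = []
    d = 2
    while d * d <= n:
        # Divide out each factor as many times as it appears.
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:          # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(8051))  # [83, 97]
```

For a 200-digit number the pencils would wear out long before the answer arrived, but nothing in the procedure requires a machine; the machine is, as I say, incidental.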

As I pointed out over a year ago, things look very different to people in the world of business and economics where patents matter. They have seen more and more processes that used to be implemented mechanically--and therefore to be unambiguously patentable material--come to be implemented in software. Just look around at your car, your clocks, your thermostat, and see all the formerly mechanical activities being accomplished through software. The time is visibly coming when even the distinction between genetics and software will break down.

So people who honestly believe that patents promote innovation see that software must be brought under the patent umbrella. Technical arguments are secondary to them, and must be circumvented to preserve the patent system.

Some economists are challenging the patent system. It is unlikely to be unseated entirely, but the combination of social and technological changes--if the law recognizes and reacts constructively to them--may relegate it to a much smaller area of research and engineering.

Short takes

Don Marti told me a bit about advances in ATA over Ethernet (AoE), which he hopes will be widely deployed soon in products. Not only could it shake up the storage industry and force competing products to drastically lower their prices--he thinks it could work in tandem with distributed filesystems to create a new age of access to robust and reliable network storage. What was previously available only to well-endowed companies will now be open to all through free software.

Eric Mandel described the automation of system builds on multiple systems through Cobbler and the better-known Puppet. You can take auto-configuration down to the level of installing and deleting particular user accounts. Once you have a configuration installed on a Cobbler server, you can just connect a bare-metal host and have the configuration running on it in minutes.
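Cobbler and Puppet each have their own configuration language; as a language-neutral sketch, here is the idempotent "declare desired state, then converge" idea that Mandel described, applied to user accounts. The state model and names are invented for illustration, not either tool's actual API.

```python
# Declarative configuration sketch: declare the desired state of user
# accounts, then converge the current state toward it. Running converge()
# twice yields the same result, which is the idempotence these tools rely on.
desired = {"alice": "present", "bob": "absent"}

def converge(current: set[str], desired: dict[str, str]) -> set[str]:
    state = set(current)
    for user, want in desired.items():
        if want == "present":
            state.add(user)       # create the account if missing
        elif want == "absent":
            state.discard(user)   # delete the account if it exists
    return state

print(sorted(converge({"bob", "carol"}, desired)))  # ['alice', 'carol']
```

The point is that you specify *what* the host should look like, and the tool works out *how* to get there on each machine.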

I got a nice demo of 3tera's Applogic, whose distinction in the cloud computing area is to provide building blocks such as MySQL and PostgreSQL database servers, NAS storage, load balancers, etc. Applogic is licensed by data centers offering cloud services or by large corporations provisioning their own systems. Customers can specify particular storage sizes, CPU resources, and so forth through the graphical interface, and can even configure auto-scaling when CPU usage or latency hits a certain threshold.
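The auto-scaling behavior in that demo amounts to a threshold check on monitored metrics. A minimal Python sketch of the decision rule (the thresholds, names, and metrics here are illustrative assumptions, not 3tera's API):

```python
def scale_decision(cpu_pct: float, latency_ms: float,
                   cpu_threshold: float = 80.0,
                   latency_threshold: float = 250.0) -> str:
    """Return 'scale_up' when either metric crosses its configured threshold."""
    if cpu_pct > cpu_threshold or latency_ms > latency_threshold:
        return "scale_up"
    return "hold"

print(scale_decision(cpu_pct=92.0, latency_ms=120.0))  # scale_up
print(scale_decision(cpu_pct=40.0, latency_ms=120.0))  # hold
```

Real systems add hysteresis and cooldown periods so a briefly spiking metric doesn't trigger a flurry of provisioning, but the core trigger is this simple.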

Another company in this space, RightScale, came up in discussions at tonight's BOF on Amazon EC2.

I ended my day with the Weakest Geek quiz show, which dragged a bit but achieved some real suspense toward the end.

