SCALE has been growing over the years, drawing attendees who, I suspect, used to attend LinuxWorld/Open Source World. I just flew in from Boston to take in the conference for the first time. I was rewarded with a presentation on libproxy by developer Nathaniel McCallum, along with other tidbits from a day filled mostly with tutorials and community meetings for free software projects. (The main part of the conference is today and tomorrow; come on down if you're in the area.)
libproxy: fixing what sucks

There's a wealth of proxies for the web, email, FTP, and so on. In addition to enforcing corporate policies and providing some application-level security, proxies can be valuable for reducing demands on Internet infrastructure. How often has someone in your organization gone wild over some video and sent an email blast around the office saying, "You've got to download this!" When the next 200 employees click on the link, wouldn't it be better if it were already in a cache on the local network?
But application support for proxies is inconsistent. Some applications aren't written to know that these intermediaries could be in place at all. Many others use hand-crafted systems for getting and storing user proxy preferences. And some use a flawed protocol called WPAD (Web Proxy Auto-Discovery) that was rejected by the IETF as a real standard but has turned into a de facto standard because Microsoft uses it.
GNOME and KDE have tried to introduce a degree of standardization, offering robust but incompatible proxy-finding protocols. It would be a relief to reduce the diversity in implementations from thirty or forty applications to just three or four desktops, but they would still be idiosyncratic and some applications wouldn't work everywhere.
According to McCallum, proxy application APIs are a neglected stepchild of the free software family because developers tend to work in places that don't install proxies. They don't have to struggle with the problems faced by the vast majority of other desktop users.
Not McCallum. He and his colleagues have done a pretty exhaustive study of application support for proxies. He pointed out some of the more subtle problems with current implementations.
For instance, ad hoc solutions (such as checking the http_proxy environment variable) may be enough to get an application started. But on modern laptops and mobile devices, an application lasts longer than the network connection. If you move to a new location, you may find that you have to restart the application so the new proxy can be found.
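The ad hoc approach looks simple enough in code, which is part of why it persists. Here's a minimal sketch of it (the function name is my own illustration); note that the environment is inherited once, at process start, which is exactly why a long-running application on a roaming laptop ends up with a stale proxy:

```python
import os

def get_proxy_from_env(scheme="http"):
    """Ad hoc approach: read the proxy from the process environment.

    The environment is copied into the process when it starts, so if
    the laptop moves to a network with a different proxy, a long-running
    application keeps using the old value until it is restarted.
    """
    # Both lowercase and uppercase spellings are seen in the wild.
    return (os.environ.get(f"{scheme}_proxy")
            or os.environ.get(f"{scheme.upper()}_PROXY"))

# Simulate the shell having exported http_proxy before launch:
os.environ["http_proxy"] = "http://proxy.corp.example:3128"
print(get_proxy_from_env())  # http://proxy.corp.example:3128
```

Getting the application started is the easy part; nothing here notices when the network, and therefore the correct proxy, changes underneath it.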
This is a classic problem crying out for standardization, and libproxy tries to fill that gap. It provides a fixed configuration format, location, and protocol for applications to get user preferences. On the back end, it provides plugins for common environments. GNOME has started to work with libproxy, for instance, and OpenSolaris developers plan to do so.
Migrating to libproxy simplifies the user experience as well as application development. Existing applications will have to be rewritten, but for an interface that provides only three calls (two of them without arguments), how much trouble can that be? And if new and better protocols for autoconfiguration and credential exchange are introduced in the future, they can be widely adopted simply by writing a new back-end for libproxy.
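To make the division of labor concrete, here is a rough Python sketch of the shape of such a design: one small front-end object that applications call, with interchangeable back-end plugins behind it. The class and plugin names are my own illustration, not libproxy's actual API, though the "direct://" fallback convention is libproxy's own:

```python
import os

class EnvironmentPlugin:
    """Back-end plugin that consults http_proxy-style environment variables."""
    def get_proxies(self, url):
        scheme = url.split(":", 1)[0]
        value = os.environ.get(f"{scheme}_proxy")
        return [value] if value else []

class ProxyFactory:
    """Front end: applications talk to this one object, never to the
    back ends directly, so new back ends benefit everyone at once."""
    def __init__(self, plugins=None):
        # A real implementation would load desktop-specific plugins
        # (GNOME, KDE, WPAD, ...); we stub in just the env-var one.
        self.plugins = plugins if plugins is not None else [EnvironmentPlugin()]

    def get_proxies(self, url):
        # Re-query the back ends on every call, so a roaming laptop
        # picks up a new network's proxy without an application restart.
        for plugin in self.plugins:
            proxies = plugin.get_proxies(url)
            if proxies:
                return proxies
        return ["direct://"]  # libproxy's convention for "no proxy"

os.environ["http_proxy"] = "http://proxy.corp.example:3128"
pf = ProxyFactory()
print(pf.get_proxies("http://example.com/"))  # ['http://proxy.corp.example:3128']
```

The application-facing surface really is tiny (construct a factory, ask it for proxies for a URL), which is what makes the "how much trouble can migration be?" argument plausible.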
The next release of libproxy (0.3) will store credentials and pass them to the proxy. The ultimate goal is a quick and efficient connection method that passes a socket to the application. libproxy currently works on Linux, but Mac and Windows versions are on the way. It currently has interfaces for C, .NET, and Python. And of course, libproxy is a typical shoestring project that can always use more volunteers.
Open source for educators
Friday's keynote talk enumerated the advantages of free software for schools. Naturally, the first thing that pops into the minds of the general public when one mentions free software is reduced costs, and it would be a disservice to deny that it's a major benefit to schools, which could do a lot better with their money than spend it on license fees. (This point takes on additional poignancy in California, whose public schools are facing an enormous budget deficit right now.)
But the keynoter (whose name I didn't get, unfortunately) took the discussion of costs to a new level, suggesting that teaching could change significantly when everybody had regular access to computers. And he pointed out other advantages of free software. For instance, in proprietary environments, how rarely does everybody have access to the same software, and how much more rarely are they all running the same version? Standardization in a free software environment is much easier.
Furthermore, schools should prepare students to deal with workplaces that have free software. According to a Gartner survey, 80% of companies already do. Such software may be on servers and in the back office, but it's still a force employees have to deal with sometimes.
I should mention that free software in the education field is one of the major themes of this conference. Women in free software is another.
How system administrators could better document their practices
I dropped in on a session about internal documentation led by Chris St. Pierre and wished I could have stayed the whole time; the subject interests me because of my own search for better project documentation. One of his suggestions--to keep all files under revision control, including documentation--was conventional. His recommendation to do all documentation on a wiki was more intriguing.
The most interesting idea in the part of the talk I attended was St. Pierre's "hierarchy of documentation needs," a riff on Abraham Maslow's famous classification of human needs. In such a hierarchy, you want to start at the bottom and move up: you must meet the lower needs to create the foundation for each of the higher needs. From the top (least critical) to the bottom (most critical), the sysadmin's hierarchy is:
- Beauty (well-edited, nicely formatted)
- Consistency (provide TOCs, a recognizable structure like man pages, cross-linking)
- Maintainability (versioning, a format that presents a low barrier to entry for editing)
- Accessibility (present to users as web pages)
- Existence (get it down in words!)
I would try to fit findability in there as part of accessibility, and improving quality as perhaps a new level in the hierarchy.
TCO for open source
I spent a few minutes in this BOF. The presenter said it's difficult to find case studies with hard data, although one member of the audience said he had seen some. In contrast to Microsoft claims that tech support is more expensive with open source software, one attendee said that tech support costs nearly vanish in places he's worked. (I suspect that, given Microsoft's devotion to integrating its products, support for their software is probably pretty easy so long as you use it the way the developers expect and don't try to tie in non-Microsoft packages.)
Another attendee reminded us to consider the costs of maintaining systems that can access archived data; that maintenance is much cheaper if the data is stored in an open format.
But in some environments, open source does add costs. And converting from one environment to another always introduces costs. (This may explain the finding of one study I saw, whose source I don't remember: converting to open source raises costs over the first five-year period but significantly reduces them thereafter.)
Finally, I attended a session on OpenSolaris. Intel has contributed a lot to this project, which helps Sun deliver binaries for Intel processors. Another interesting point, made by a system administrator, is that ZFS can encapsulate lots of configuration information that is usually scattered among multiple files, so moving around disks or operating systems is easier.