Thesis writing

Presenting my thesis (again)

A couple of weeks ago, I presented part of my thesis at the Danish open source conference Open Source Days.

In the process of preparing the presentation, I returned to my thesis and delved into the material in a way that I haven’t done since I wrote it. It was interesting to see how my own ideas have developed in the light of what I have learned and worked with since finishing two years ago. So I’ve continued working on the presentation even after the conference, annotating and adding to it, making a more visual, more up-to-date and – hopefully – more easily approachable version of my thesis than the raw PDF of the whole thing that I’ve shown so far:

Oh, and if you read it – please let me know what you think can be improved. One big part is probably killing some more darlings, so tell me which parts didn’t work for you.

Bit by bit – a review of “Two Bits”

I finally found the time to read Christopher Kelty’s book Two Bits – The Cultural Significance of Free Software. Kelty is one of the few other anthropologists studying Free Software in general, and his work has been a huge inspiration in my thesis work on Ubuntu, so naturally, my expectations were high.

As Kelty argues, we’ve been drowning in explanations of why Free Software has come about, while starving for explanations of how it works. Thus, Kelty’s focus is on the actual practices of Free Software and the cultural significance of these practices in relation to other aspects of our lives.

Kelty’s main argument is that Free Software communities are a recursive public. He defines a recursive public as a public “whose existence (which consists solely in address through discourse) is possible only through discursive and technical reference to the means of creating this public.”

It is recursive in that it not only contains a discourse about technology, but also in that this discourse is made possible through and with the technology discussed. And this technology consists of many recursively dependent layers of technical infrastructure: the entire free software stack, operating systems, Internet protocols. As Kelty concludes:

The depth of recursion is determined by the openness necessary for the project itself.

This is a brilliant observation, and I agree that the notion of a recursive public goes far to explain how the everyday practices and the dogmatic concern for software freedom are so closely intertwined in this public.

The book is divided into three parts, each part using a different methodological perspective to examine the cultural significance of Free Software.

The first part is based on Kelty’s ethnographic fieldwork among geeks and their shared interest in the Internet. I found this to be the weakest part of the book. His ethnography covers not the actual practices of Free Software hackers, but rather the common traits among Internet geeks, which certainly supports his argument (that they’re all part of a shared recursive public), but doesn’t give much depth to understanding their motives.

The second part is based on archival research into the many available sources within the various open source communities. In my opinion, this is the best part of the book, with deep and thorough analyses of the actual practices within free software communities, as well as a vivid telling of the pivotal stories of “figuring out” the practices of Free Software.

The final part is based on Kelty’s own participation (anthropologist as collaborator) in two modulations of the practices of Free Software in other fields: the Connexions project at Rice University, and Creative Commons. These are stories of his own work “figuring out” how to adapt Free Software practices in other realms. These practices are still in the process of being developed, experimented with, and re-shaped – like all Free Software practices. And this part gives a good idea of what it feels like to be in the middle of such a process, though it offers few answers.

Being a completely biased reviewer, I’ll stop pretending to do a proper review now, and instead focus on how Kelty’s analysis fits with my own study of the Ubuntu Linux community. Kelty argues that there are five core practices which define the recursive public of Free Software. He traces the histories of “figuring out” these practices very well, and I’ll go through each in turn:

Fomenting Movements
This is the fuzziest of Kelty’s five core practices. I understand it as placing the software developed within a greater narrative that offers a sense of purpose and direction within the community – “fomenting a movement”, as it were. Kelty has this delicious notion of “usable pasts” – the narratives that hackers build to make sense of these acts of “figuring out” after the fact.

In my research, I found it very difficult to separate these usable pasts from the actual history of the Free Software movement, and my thesis chapter on the cultural history of Ubuntu bears witness to that. So I am very happy to see that Chris Kelty has undertaken the monumental task of examining these stories in detail. I find that this detective work in the archives is among the most important contributions of the book.

Sharing Source Code
A basic premise of collaboration is shared and open access to the work done – the source code itself. The crux of the matter is giving access to the software that actually works. Kelty tells the story of Netscape’s failure after it went open source with a telling quote from project lead Jamie Zawinski:

We never distributed the source code to a working web browser, more importantly, to the web browser that people were actually using.

People could contribute, but they couldn’t see the immediate result of their contribution in the browser that they used. The closer the shared source code is tied to the everyday computing practices of the developers, the better. As Ken Thompson describes in his reflections on UNIX development at AT&T:

The first thing to realize is that the outside world ran on releases of UNIX (V4, V5, V6, V7) but we did not. Our view was a continuum. V5 was simply what we had at some point in time and was probably put out of date simply by the activity required to put it in shape to export.

They were continually developing the system for their own use, trying out new programs on it as they went along. Back then, they distributed their work on diff tapes. Now, the Internet allows that continuum to be shared by all the developers involved, with the diffs easily downloaded and installed from online repositories.
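
To make that concrete, here is a minimal Python sketch of what such a diff is – a compact, self-contained description of a change that can be passed around and applied independently of the whole source tree. The file name and contents are invented for illustration, not taken from the actual UNIX sources:

```python
import difflib

# Two hypothetical versions of a shared source file (invented for illustration).
old = [
    "int main() {\n",
    '    printf("hello\\n");\n',
    "}\n",
]
new = [
    "int main() {\n",
    '    printf("hello, world\\n");\n',
    "    return 0;\n",
    "}\n",
]

# unified_diff produces the familiar patch format: context lines plus the
# added and removed lines, headed by the two file names.
for line in difflib.unified_diff(old, new, fromfile="a/hello.c", tofile="b/hello.c"):
    print(line, end="")
```

The point is that the diff itself is the unit of communication – small enough to ship on a tape then, and to pull from an online repository now.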

As I point out in my thesis, this is exactly the case with the development of the Ubuntu system, which can be described as a sort of stigmergy where each change to the system is also a way of communicating activity and interest to the other developers.

Conceptualizing Open Systems
Another basic premise of Free Software is having open standards for implementation, such as TCP/IP, ODF, and the world wide web standards developed by the W3C – all of which allow for reimplementation and reconfiguration as needed. This is a central aspect of building a recursive public, and one I encountered in the Ubuntu community through the discussions of, and inherent scepticism regarding, the proprietary Launchpad infrastructure developed by Canonical, the company financing the core parts of the development of both the Ubuntu system and community.

Writing Licenses
Kelty argues that the way in which a given software license is written and framed shapes the contributions, collaboration and the structure of distribution of that software, and is thus a core practice of Free Software. Kelty illustrates this by telling the intriguing story of the initial “figuring out” of the GPL, and how Richard Stallman slowly codified his attitude towards sharing source code. This “figuring out” is not some platonic reflection of ethics. Rather, it is the codifying of everyday practice:

The hacker ethic does not descend from the heights of philosophy like the categorical imperative – hackers have no Kant, nor do they want one. Rather, as Manuel DeLanda has suggested, the philosophy of Free Software is the fact of Free Software itself, its practices and its things. If there is a hacker ethic, it is Free Software itself, it is the recursive public itself, which is much more than a list of norms.

Again, almost too smartly, the hackers’ work of “figuring out” their practices refers back to the core of those practices – the software itself. But the main point, that the licenses shape the collaboration, is still very salient. As I witnessed in the Ubuntu community, when hackers chose a license for their own projects, it invariably reflected their own practices and preferred form of collaboration.

Coordinating Collaborations
The final core practice within Free Software is collaboration – the tying together of the open code directly with the software that people are actually using. Kelty writes:

Coordination in Free Software privileges adaptability over planning. This involves more than simply allowing any kind of modification; the structure of Free Software coordination actually gives precedence to a generalized openness to change, rather than to the following of shared plans, goals, or ideals dictated or controlled by a hierarchy of individuals.

I love this notion of “adaptability over planning”. It describes quite precisely something that I’ve been trying to capture in my work on Ubuntu. I used Lévi-Strauss’ rather worn duality between the engineer and the bricoleur to describe part of this, but I find that Kelty’s terms better describe the practice of collaboration at a higher level:

Linux and Apache should be understood as the results of this kind of coordination: experiments with adaptability that have worked, to the surprise of many who have insisted that complexity requires planning and hierarchy. Goals and planning are the province of governance – the practice of goal-setting, orientation, and definition of control – but adaptability is the province of critique, and this is why Free Software is a recursive public: It stands outside power and offers a powerful criticism in the form of working alternatives.

As Kelty points out, the initial goal of these experiments wasn’t to offer up powerful criticism. Rather, the initial goal was simply to learn and to adapt the software to the hackers’ own needs:

What drove his [Torvalds’] progress was a commitment to fun and a largely inarticulate notion of what interested him and others, defined at the outset almost entirely against Minix.

What Linus Torvalds and his fellow hackers sought to do was not to produce “a powerful criticism” – those almost always come after the fact, in the form of usable pasts to rally around. Rather, their goal was to build something that would work for their needs, and that allowed them to have fun doing so.

I find that this corresponds very well to the conclusion of my thesis: that the driving goal of the Ubuntu hackers continues to be to build “a system that works for me” – a system that matches their personal practices with the computer. A system that is continually and cumulatively improved through the shared effort of the Ubuntu hackers, each adapting the default system to his or her own needs, extending and developing it as needed along the way. As Kelty writes in his conclusion:

The ability to see development of software as a spectrum implies more than just continuous work on a product; it means seeing the product itself as something fluid, built out of previous ideas and products and transforming, differentiating into new ones. Debugging, in this perspective, is not separate from design. Both are part of a spectrum of changes and improvements whose goals and direction are governed by the users and the developers themselves, and the patterns of coordination they adopt. It is in the space between debugging and design that Free Software finds its niche.
(…)
Free software is an experimental system, a practice that changes with the results of new experiments. The privileging of adaptability makes it a peculiar kind of experiment, however, one not directed by goals, plans, or hierarchical control, but more like what John Dewey suggested throughout his work: the experimental praxis of science extended to the social organization of governance in the service of improving the conditions of freedom.

In this way, Free Software is a continuing praxis of “figuring out” – giving up an understanding of finality in order to continually adapt and redesign the system. It is this practice of figuring out that is the core of the cultural significance of Free Software, as we continue to figure out how to apply these learnings to other aspects of life. Kelty does well to describe his own efforts at “figuring out” in relation to non-software projects inspired by Free Software practices in the final part of the book, though these reflections do not come across as entirely figured out yet.

All in all, it is a brilliant book. But given its Creative Commons license, it poses an interesting challenge to me: Remixing – or modulating, as Kelty calls it – the book with my own work (and that of others – like Biella) to create a new hybrid, less tied up in the academic prestige game.

(Maybe then I can change the title, because that continues to annoy me: Why is it called Two Bits? Apart from the obvious reference to computing in general, it doesn’t seem to have any other relevance particular to Free Software?)

The Community of Practice on Communities of Practice

Some time ago, I was invited by John D Smith to present my thesis work on Ubuntu as a Community of Practice at the CP Square autumn dissertation fest. CP Square is an online community of researchers and consultants working with Communities of Practice – a term coined by Etienne Wenger and Jean Lave which is a central part of the theoretical framework of my thesis.

I gave the online presentation this evening, and if I hadn’t been so darned busy lately with work and moving to a different commune (more on that in a separate blog post), I would have blogged about the presentation earlier so that you all could have had the opportunity to listen in.

Online in this case means via Skype teleconference and a community chat channel, which meant visualizing my audience while talking, and linking to images related to the presentation in the online chat (NB: they’re not sorted. It’s a mess. I’ll add my notes to the images soon to give some sense of a sequence). It’s not the easiest of formats – a lot of energy and rapport gets lost in the ether. But I thought it worked out well. The participants were attentive and inquisitive while remaining constructive and supportive – a real treat.

Actually, I was surprised to get the invitation. But I’ve really relished the chance to revisit my thesis work. As I reread it, I realised that writing the thesis is only the beginning.

Since joining Socialsquare, I’ve been working with all sorts of aspects of online communities, and it’s been great to return to my work on the Ubuntu community and see new ways to extend my old analyses and apply them in new contexts. But most of all, I’ve come back and found just what a good framing the Community of Practice is for understanding online communities, and I hope to learn a lot more about how to apply it from the CP Square community.

How to write a thesis

Writing a thesis is a difficult undertaking. Before I started writing mine, I hadn’t written any assignment longer than 30 pages (my Bachelor’s essay), and it was quite a step up from that to having to structure a huge complex of data that I’d gathered on my own, analyze it and bring it together in a coherent academic argument.

Luckily, I was well helped along the way by my supervisor, Morten, who really reeled me in from time to time when I was going off in weird and unsustainable directions, which happened fairly regularly. He gave me a lot of pointers, which I have summed up here for anybody about to write a major piece of academic argumentation. It may seem simple enough, but trust me: Once you get involved in it, you lose yourself to the writing, and it is difficult to avoid being overly esoteric with regards to your special niche of interest.

  • Be overly pedagogical! Keep a continuous meta-discourse going to explain to the reader why this bit of information is relevant in the grand scheme of things. It may seem obvious to you, since you know what is coming. But the unaccustomed reader won’t.
  • Use lots of part conclusions! Sum up again and again how each bit of analysis is relevant and necessary to make sense of your overall argument.
  • Focus on readability! Don’t use more than a handful of abbreviations, and only ones that you can reasonably expect the reader to know in advance. Use clear examples to explain difficult terms and processes!
  • Be very careful with descriptive passages. They can easily become either dry and boring or light-weight and irrelevant. Keep your focus on the relevant scientific observations. Those are the ones that you are meant to pursue!
  • Make it perfectly clear to yourself which academic or scientific tradition you are aiming to be part of. Are you going for the anthropological insights, or the psychological qualities, or perhaps the computer science bits? There’s no way you can appeal to all of them, and your thesis will otherwise suffer from a lack of focus.
  • Be analytical: Use quotes or specific data to underline your analyses and conclusions. Hack the data! Fashion surprising and worthwhile points from your empirical descriptions.
  • Write descriptively in order to support your analysis – but don’t write naïvely. The description can be an analysis in its own right if used to expose analytically interesting situations and issues.
  • Use and express clear levels within the text: Who is saying what? When are you being analytical and when are you being descriptive? Use meta-commentary to separate the two, but don’t be judgmental. Try colouring the text so that you can see where you are analyzing and when you are describing. Keep these in separate sections! Otherwise it will confuse the reader!
  • Make clear distinctions between what your informants are saying, and what you are saying: Are you using their metaphors and terms? When are you speaking and when are they speaking? You cannot be reflective and critical when using their terms. Use italics and quotes to signify that you are aware of the difference!
  • Be reflective all the time: Ask: Which implicit assumptions do your informants have that shape their demeanor and convictions? For example: What assumptions are inherent in the idea of the transparency of a computer program? How does this assumption shape relations between people?
  • Focus on the relations between your informants! What does it mean to be part of this group? Is it a group? Where do their shared bonds lie?
  • Pick a theoretical perspective and give it more depth! Illuminate it from different angles through various analytical means. Dig deeper!
  • Use diagrams to illustrate and explain tricky analytical points that you find central. Often, a good diagram will express a thousand words of analysis.
  • Each chapter of the thesis should be a paper in its own right – containing its own analytical focus and conclusion. But at the same time, it should lead on to the next chapter. Ask yourself: How does this chapter lead on to what I discuss in the next chapter? Is there a feeling of natural flow between the analyses?
  • Lay out the text as it has to be in the final version. It will make it a lot easier for you to see whether you are within the formal word and page limits. Writing too much will require rewriting and cutting, which is arduous and difficult! Better to write it right the first time.
  • Have a draft chapter ready for review for every meeting with your supervisor. Write a letter along with the draft: Describe how the draft fits with the greater whole of the thesis – what function it fulfills. Make it easy for the supervisor to comment on it in a way that can help you!

Well, I’m sure this seems like pretty self-evident advice, but it is still hard to remember when you’re getting carried away writing about your very favourite obscure detail of the history of the Unix operating system. And you know it has to go the moment you’ve finished it…

The thesis is now available

It’s been a long way underway: first fieldwork, then writing, submitting, defending, editing, and polishing. But now, finally, my anthropological thesis on the social dynamics of the Ubuntu community is available for everybody to read.

You can download the abstract, or the full 2.9 MB PDF file.

I’ve released it under a Creative Commons license so that everybody is most welcome to redistribute it and add their comments.

EDIT: My webhost removed the file due to excessive load. But it is now back on-line.

The thesis is now also mirrored at AsianLinux.org thanks to kind help from Anand Vaidya.

And at Software Libre Rudd-o thanks to Manuel Amador. So please use one of those mirrors in case of any further issues with my webhost.

… graduates!

white box, folded

And thus came the day when I end my association with the University of Copenhagen, after almost 7 years to the day.

I defended my thesis this morning with some success, with fun props and pictures to explain my theoretical perspective. And I passed comfortably, though not without being told that there was a distinct lack of methodological discussion, only barely an academic argument, and that the thesis lacked a proper critical approach to the theories I used. Indeed, I was told that I didn’t “unfold” my material properly, as there were too many theories in play – several of which were contrary to one another.

white box, unfolded

All valid criticism, I suppose. In the end, I’m quite happy with the decisions I made, since I emphasized not only making the field interesting for anthropologists to read about, but also making it readable and interesting for other people who might be interested in the social dynamics of a free software community. I could have added more reflection on my methods, or focused even more on the analytical crisis cases – but as I had already reached the maximum length allowed for the thesis, I could only have done so by cutting something else.

I’d rather describe the many aspects of the Ubuntu community as they are than focus on crisis cases and dilemmas, which are rare and much less typical of the community as a whole. I’ll digest these comments, clarify a few elements in the thesis and rewrite the conclusion – and ever so soon, I’ll put the “director’s cut” of the thesis up here for all to see.

But for now: Celebration! 🙂

Thesis defense

Since coming back from my holidays, I have been in something of a limbo state while waiting to find out the date for my thesis defense. I couldn’t quite tell if I should be panicky with last-minute preparations or summerly relaxed with plenty of time to spare.

Well, now I know. I got the letter this morning:

My thesis defense will take place at 10.00 am on Monday the 27th of August, at the University of Copenhagen. Please do let me know if you plan to attend.

As to making the thesis available on-line: I’m still tweaking a few bits, and haven’t heard back from some of my informants, whom I’ll have to hunt down. Hopefully, it’ll be up for download by the end of next week.

Thesis done!

My thesis, based on my anthropological fieldwork in the Ubuntu community, is finally done, and I turned it in yesterday.

Since I began writing my thesis, I’ve had this as my background screen on my computer:

Don't go native!

‘Going native’ is losing your reflexive anthropological distance by becoming too closely involved with the field. It is taking on some of the cultural traits of the people you study, eventually reaching the point where you can’t even tell yourself apart from your informants.

Naturally, this is a bad point of departure for writing serious anthropological analysis, and I needed that daily reminder not to jump back into the digital conversation flow on IRC and mailing lists and continue my direct involvement with Ubuntu, which would have made writing this thesis so much harder (and most likely made the end result even worse than it has turned out..)

So I left the Ubuntu community for a while, longer than anticipated, actually, as it seems that Parkinson’s Law (stating that “work expands so as to fill the time available for its completion”) remains in effect.

Thus, I won’t be able to defend my thesis until late August because of the University of Copenhagen’s summer break in July. Similarly, I won’t have time to make my thesis available on-line until then: partly because the people I quote in the thesis need a chance to read and approve the data I make public, partly because I will be on summer holidays almost continuously until the 27th of July, and partly because my computer died recently, leaving me with little means of working on the go.

But as a little (tiny) appetizer, here’s the front page:

Thesis front page

Later on, I hope to sum up some of my experiences writing this thesis. I haven’t really been very good at posting updates on how my writing has progressed, but I suppose that is in part due to the thesis tunnel vision that sort of blocks out everything else.

Until then, enjoy the summer!

Debian as the research library of Free Software

I’ve had quite an interesting discussion with Lars Risan in the comments to my recent blog post on Launchpad, about Lars’ paper on the role of technical infrastructure in Free Software development, and I think Lars does well to describe the central tension within such infrastructures:

I think Debian is a tremendously interesting case when it comes to understand human cooperation. Because, not unlike what Eric Raymond pointed out long ago, there is the “Bazaar” of Debian, the heterogeneity of the network. And there is the “Cathedral” of it: The ability to structure the large amount of work to the degree that you can slip a Debian DVD into a Windows-computer, and turn it into a Debian computer in less than an hour. How far can you take this mix of “cathedral” and “bazaar”? How much bazaar can you have before it forks, and how much central control can or must you enforce upon the network? How can you build a system that enables the “beauty of (hacking) mankind” to simply do good, and/or how much must you invoke something like the (partly fantasy figures) of the Ubuntu-Launchpad and the Debian-Cabal?

This much-discussed question of centralization connects well with the goal of World Domination that many Free Software communities state again and again, in a typical “ha-ha, only serious” fashion.

Global domination is a centralistic quest to compete with the cathedrals of Windows and Mac OS, and doing so while maintaining the freedom of learning and technical exploration that is the essence of Free Software will inevitably be a balancing act. It is one that all Linux distributions are performing with varying degrees of success.

One way is, as Canonical proposes, to let companies define the supported Free Software that the end user might need, and guarantee that it is available and that it “just works” – thus creating small pools of centralization, the specific traits of Ubuntu, which merge every six months with the larger decentralized pool of shared knowledge within the bazaar, Debian Unstable.

Mark Shuttleworth describes this relationship by casting Debian as “the Tibetan Plateau of the free software landscape” upon which Ubuntu is built:

By contrast with Debian’s Plateau, Ubuntu is a cluster of peaks. By narrowing the focus and allowing the KDE, Gnome and server communities to leverage the base of Debian without treading on one another’s toes, we can create a K2, and a Kangchenjunga and a Lhotse. Ubuntu’s peaks depend on the plateau for their initial start, and their strong base. Ubuntu needs to be humble about its achievements, because much of its elevation comes from Debian.
[…]
Many people have asked why I decided to build Ubuntu alongside, or on top of, Debian, rather than trying to get Debian to turn into a peak in its own right. The reason is simple – I believe that Debian’s breadth is too precious to compromise just because one person with resources cares a lot about a few specific use cases. We should not narrow the scope of Debian.

At last year’s DebConf, a talk compared Ubuntu’s use of Debian to shopping at a supermarket – a one-stop shop for your free software needs. This innocent analogy sparked a lot of discussion, with many of the Debianistas arguing that this reduced Debian to a giant collection of un-integrated software – which wouldn’t be very interesting at all. As Debian developer Joey Hess put it:

My main motive for contributing to Debian is to make Debian the best distro I can; I don’t mind if others use that work, especially if stuff gets contributed back. But it’s long been clear to me that the most important added value to Debian is not adding another package to the shelf, but finding new ways to integrate our software together. When you’re working mostly above the level of individual software packages, to have your work mostly appreciated on the basis of “component contained in Ubuntu” is not very motivating.

Integrating and creating a universal distribution is still a central motivation for many Debian developers, who may find that Ubuntu is not only stealing their thunder but also compromising their ideals, again with no small amount of distrust towards the dealings of the opaque corporate entity that is Canonical.

But as Shuttleworth also notes, pursuing narrow goals comes at the cost of openness and democracy. The Debian community is obviously unwilling to make such a trade, and as such must look elsewhere for inspiration on how to organize itself towards its goal.

Taking a cue from an earlier comparison of the Free Software communities to the scientific communities, I think it would be worthwhile for Debian to leverage its solid technical infrastructure towards commonly agreed-upon open standards, so that the exchange of knowledge and code becomes possible on equal terms. In that way, I think that Debian is quite well poised to become a sort of “research library of Free Software” – collecting the “monographs” and “articles” of code, cataloguing and organizing them for easy and open access.

Almost all Debian developers, and apparently 76% of all Debian users, use Debian Unstable or Testing.

Debian Unstable, codenamed Sid, is the latest, most volatile version of Debian, to which new packages are uploaded and changes to old packages are added on a daily basis. It contains all of the latest releases from the upstream communities, packaged and categorised. Debian Testing contains all of the packages that haven’t had a critical bug filed against them after spending 10 days in Unstable.
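
As a toy illustration of that migration rule – and only the rule as stated here; Debian’s actual migration scripts weigh urgency, dependencies, architectures and more – here is a sketch in Python with invented package names:

```python
from dataclasses import dataclass

@dataclass
class Package:
    name: str
    days_in_unstable: int       # time spent in Unstable since last upload
    release_critical_bugs: int  # critical bugs filed against the package

def migrates_to_testing(pkg: Package, quarantine_days: int = 10) -> bool:
    # The rule as described above: 10 days in Unstable with no critical bugs.
    return pkg.days_in_unstable >= quarantine_days and pkg.release_critical_bugs == 0

# Invented packages for illustration.
for pkg in [Package("hello", 12, 0), Package("broken-app", 30, 2)]:
    state = "migrates to testing" if migrates_to_testing(pkg) else "stays in unstable"
    print(f"{pkg.name}: {state}")
```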

Only 24% of Debian users use Debian Stable, since releases are so rare – coming out only every 2 or 3 years – much too rarely to keep up with the latest hardware and desktop applications, making Debian Stable relatively unattractive in the long term.

Since Debian Unstable (or Testing, for the less adventurous) fulfills the needs of both the users and the Debian developers, there is little incentive to make an Upstream Version Freeze and actually get a new release out. And since Debian is based solely on volunteer work (with suggestions to pay release managers meeting fierce opposition), bug fixing and release deadlines aren’t enforced that well, further slowing the actual release of Debian Stable.

As Shuttleworth points out, Debian Unstable is “not subject to the same political tradeoffs that are inevitable when dealing with releases, architectures, dates, deliverables, translation, documentation and so on.” – it is the core strength of Debian.

As the developers I interviewed said again and again when asked what they get out of contributing to Free Software projects like Debian: “I get a system that works for me.” To some extent, I think that is what Joey Hess means when he says that his main motivation for contributing is “to make Debian the best distro I can.”

Debian Unstable is not merely about keeping software up to date, but also about improving it, integrating it further within the Debian infrastructure. It is a research library in that it is focused not only on gathering knowledge, but also on using it. It is also a laboratory where Free Software is shared and combined in new and interesting ways.

Maybe that would be the most fitting analogy for what Debian really is: a research library containing all of the Free Software that lives up to a set of strict criteria, offering all the development tools, libraries and applications, kept up to date for new experimentation.

This would fit well with Lars’ counter-argument:

Science is (also) extremely disunified (there are 22.000 different medical journals in the world. “Medicine” (in singular) is known by no-one). And “science” can live with this relative disunity. Can Debian? There is at least one difference: “Debian” as a whole must occasionally come together to produce a release. A release which is unified enough for a computer to work. Fuzziness is simulated by computers, because they work only by strictly separated ones and zeroes. The knowledge of Debian, packed together as a release is pretty much a unity. A unity of which a large degree of coherence is required. The knowledge of science may be unified in an encyclopaedia, but, metaphorically, the discrepancies and glitches of that body of knowledge is so enormous, so “buggy”, that it would make any computer halt very early in the boot.

Much like the knowledge of science can only be gathered for release in an encyclopaedia that is outdated before it is even published, all of the software within Debian will be outdated by the time it is released. Perhaps Debian should take inspiration from Wikipedia rather than from other Linux distributions.

Wikipedia is a reservoir of such diverse knowledge that it would never be in a position where it could be published in paper form. It would require an insane amount of work, flamewars and unhappiness for all involved. It would require a centralization that simply is not present in the Wikipedia community.

But in spite of this, by far the biggest part of Wikipedia is eminently usable and accessible at any given time. As is Debian Unstable.

Perhaps the Debian community should take some comfort in this.

Why Launchpad isn’t taking off just yet

Lars Risan, a Norwegian anthropologist leading a group of researchers at the University of Oslo studying “The Political Economy of Free/Open Software”, recently put up an interesting blog post about the Launchpad technical infrastructure’s effects on the relationship between Ubuntu and its various upstreams – both with regard to Debian, and with regard to the translation work done through Rosetta rather than directly upstream.

Risan raises some relevant issues that have received much discussion within the Free Software communities around Ubuntu, namely: What is Canonical’s intended purpose with Launchpad, and why isn’t it Free Software?

He finds that the main lines of argument revolve around the fact that Launchpad is Canonical’s flagship investment: that the promises of freeing the Launchpad source code will only be kept once Canonical has secured the market for Open Source infrastructure, and that Canonical, much like Google, seeks to trade in free web services in order to profit from unhindered access to the data – the translations, source code, bug reports, specifications and support tickets handled by the system.

Mark Shuttleworth’s reply to these claims emphasizes that he has no problem if people prefer to use something like Pootle instead, and he concludes:

One thing I can say, though, is that a web service (or even a remote app service) can never create the same level of pain that a proprietary OS can do. Having watched what Microsoft has done, I’m largely motivated by a desire to ensure that countries like South Africa never have to pay a tax like that again.

And that’s fine. Like Google, Launchpad is intended to provide a service which you can choose to use or not. But as Risan points out, the service still isn’t very good. In his case study of how Rosetta works, he concludes:

At the moment, then, Rosetta seems not to be Adding Value ™. It is just adding mess. Neither is it evil. It is just bad.

As Risan shows well in his paper, it is not a matter of whether Rosetta technically offers the necessary capabilities, but rather whether the infrastructure can work with the various upstreams – in Risan’s case the Norwegian translators of KDE – to make sure the latest translations are available in distributions like Ubuntu that depend on Launchpad for their translations.

The problem is that with Risan’s translations, Rosetta has simply supplanted the Norwegian KDE translators as the translation upstream, thus actually segregating the community rather than uniting it. And the reason for this may be exactly the Norwegian KDE translators’ hesitancy to drop KBabel and other tools for Launchpad and Rosetta – a platform which many Free Software developers still do not entirely trust, and will not be willing to use until it becomes Free Software itself, or at least until it becomes so good that building an alternative would be too much work – as with Google.

In this way, the adoption of Launchpad continues to be slow, not because of any bad intent from its architects or lack of interest from its potential users, but because it has been built without consideration for the social connections within and between Free Software projects.

Indeed, Launchpad is often described as seeking to “automate social connections between projects” so that patches and data can be exchanged as smoothly as possible with a minimum of inter-community flame-wars (this fits quite well with Ubuntu’s relationship to Debian, where a number of developers continue – rightly or wrongly – to be unhappy with the patches that Ubuntu sends back upstream).

When the project started, the Launchpad developers mapped out all the software repositories of the Free Software world and linked them together, but they did not map out the flow of information between these repositories or how their active inhabitants collaborate. Thus Launchpad does not reflect how the upstreams work, limiting their willingness to adopt it, and since they can’t customize it to fit their needs as they otherwise would – the source code is still unavailable, after all – they simply stay away.

Unlike Google – whose services generally are so easy to use that they require little or no customization – Canonical’s Launchpad is an intricate behemoth of details. Even core Ubuntu developers who use it every day get lost in the system from time to time. It cannot be optimized for a single use case, since Free Software projects, though they appear much alike, have subtly yet vastly different ways of collaborating – partly due to community structures and dynamics, but mostly because of the many different tools they use.

Having the data is not enough. Understanding and incorporating the work flow of the upstreams is also necessary. And nobody will be able to do that better than the upstreams themselves.

This constant negotiation between the technical and the social is the main theme of my thesis, and though I can’t delve into the entirety of Launchpad, I do hope to elaborate further on some of these ideas about the role of technical infrastructure in Free Software projects.