Monthly Archives: February 2006

Web 2.0

I never really took time to read up on all this talk of Web 2.0, but now there's an article celebrating the wonderful new things made possible by AJAX scripting and all the other stuff that makes the web so new and exciting.

I haven't actually used any of the sites they list in the article. They're mostly social tools for your own personal use, such as calendars or startpages, or pages for sharing bookmarks, videos, photos or texts. The Netvibes page does look rather cool with its movable and customizable boxes.

And that seems to be the main gripe against all of this talk of Web 2.0. A lot of techies frown upon the use of the term "2.0", since the number of genuinely new features is severely limited, and this Web 2.0 revolution is mostly social rather than technological.

It seems that the whole "Web 2.0" label is generally meant to imply that the web is evolving rather than being revolutionized. People are beginning to develop distinct habits on the web, customizing and personalizing content to a greater and greater extent. This is probably best seen on MySpace.

But despite the (supposedly) good intentions behind the term "Web 2.0", it does seem forced, tired, and like something meant to impress investors. With the dot-com bubble still in recent memory, there is good reason to frown at this kind of marketing gloss, and its tired rhetoric has been mimicked quite hilariously.


Also, I haven’t played World of Warcraft beyond a few hours to try it out as part of my Digital Rhetorics course, but when you read an article like this, you realize just to what degree people are immersing themselves in the game, and how others are perfectly willing to abuse this. Scary.

Anthropology and the Muhammed cartoons

Last night, I went to attend a debate at the Department of Anthropology on the much-discussed Muhammed cartoons. It focused on the anthropological perspective on the reactions and counter-reactions to the drawings, and on how anthropological theories can help gain a broader understanding of the current situation.

Now this is a rare thing. In all my time at Copenhagen University, I can’t remember more than one similar meeting that sought to discuss current affairs in an anthropological perspective (that was September 11th). And it was immediately clear that this is the sort of thing that really brings out the Danish anthropologists. The lecture hall was packed with anthropologists and anthro students – with the usual 8 women for every man present, underlining to just what degree anthropology is a women’s field these days.

For this occasion, five anthropologists (all working at the department) had prepared short presentations on which the discussion was to be based.

First up was Inger Sjørslev, who discussed how language determines the value and sacredness of certain topics, how there are things which are sacred outside of the religious sphere, and how the silence surrounding things perceived as sacred creates even greater tension when it is broken.

The second presentation was by Morten Axel Pedersen (my advisor!), who argued that what he called "classic" globalization theory – such as the idea of the five "-scapes" (ethnoscapes, technoscapes, finanscapes, mediascapes, and ideoscapes) expounded by Arjun Appadurai – isn't capable of describing or containing a situation like the one we're facing now, where a crisis has made a jump of scale from the local to the global. He argues that we need new theories of globalization to describe this new phenomenon of jumps of scale (in Danish: "skalahop").
He argued that this crisis has caused a collapse not only in the distinction between Danish foreign and domestic politics but also within anthropology: There is only one big global field – but that field contains several scales, levels and horizons, much like the segmentary tribal structure described by Evans-Pritchard.

He then called for research into the way that these jumps of scale occur and into how the different sides of the conflict construct themselves and their opponents. He referred to an article by Ole Wæver, a Danish professor of Political Science, who has argued that this is not so much a clash of civilizations and religions as a clash between a secular and a religious fundamentalism. And he ended by problematizing the fact that we as secular anthropologists only represent one half of this debate.

Third up was Hans Christian Korsholm Nielsen, who has done fieldwork in Egypt and was actually in Egypt as the cartoon case grew around him in late January and early February. His presentation was the other extreme of anthropological discourse compared to Morten's very theoretical and abstract talk. HC talked in anecdotes – small situational references to discussions he'd had with his informants as the crisis grew. And he gave a sense of how these cartoons grew to be the central topic of discussion within a week, propelled by the media and the Friday prayer. He also noted that most Egyptians were shocked but not stupid: They all called for good manners in this kind of meeting of cultures.

Fourth was Anja Kublitz, who has been doing fieldwork among Palestinians in Denmark since September last year, when the cartoons were published for the first time. She talked about that last week of September as being a really bad week for her informants:
– first: there was the presentation of the Danish ministry of culture's new culture canon, which was introduced in a hostile tone (one of the central arguments for the canon was that it was important to mark certain values as Danish in order to counter the tendency of "another Denmark" with its muslim ways). Anja's informants quickly began calling this initiative the "culture cannon" to reflect this war-like rhetoric.
– second: there was the case of Louise Frevert, MP for the semi-racist Danish People's Party, who had published clearly racist texts on her personal webpage. She denied this, and the whole thing turned farcical when she excused herself by blaming her webmaster, a retired navy colonel.
– third: the Mohammed cartoons themselves. Which Anja’s informants found to be just the last element in a long row of Danish discriminatory initiatives against muslims in general.

What was worse was that all of these things took place during the muslim holy month of Ramadan – which neither Louise Frevert, Danish minister of culture Brian Mikkelsen, nor the arts editor of Jyllands-Posten had been aware of. It was basically like dissing Jesus on Christmas Eve – at the point in time when people of a given religion are at their most religious.

Still, the muslims in Denmark wanted to show that even though they’d been hurt by all of this, they were still very much willing to live in Denmark, to work towards reconciliation. They arranged a demonstration for peace and for tolerance which ended at the central square in Copenhagen where they had a prayer for peace.
The people arranging the demonstration had been very concerned that it would be looked upon as something aggressive and had made sure that only 3000 people attended – even though they could have mustered maybe 10000. They didn't want to intimidate the Danes. The demonstration took up only one lane of traffic and even stopped at all the red lights in order to create as little inconvenience as possible. Yet, even so, Danish media and Danish passersby managed to misinterpret the entire thing.

The slogan for the demonstration was "Islam er fred" (Islam is peace), but because not all of the participants spoke fluent Danish, some bystanders misheard it as "Islam er vred" (Islam is angry). A very unfortunate misunderstanding. Further, the demonstration ended with a prayer for peace – because, as the muslims reasoned, "a prayer is the most peaceful thing imaginable". Yet most of the Danes misunderstood this public act of faith, and one passerby even asked: "Are you going to war?" – as the whole act of prayer seemed so demonstrative and foreign to him.

In general, the Palestinian informants don't see the cartoons as something especially bad in themselves – they're simply a symptom of the bad climate for understanding that exists in Denmark today.

Finally, there was Mikkel Rytter, who talked about how this situation has come about in Denmark. He drew a timeline beginning in 1991 with the fall of the iron curtain, when US foreign policy needed a new enemy to focus on. It chose Islamic terrorism, a choice supported by the "self-fulfilling prophecy" of Samuel Huntington's "Clash of Civilizations".
In Denmark, nationalism rose to a new high in 1992 with the referendum against the Maastricht treaty and the victory in the European football Championship that summer.
In 1995, the Danish People’s Party was founded to capitalize on those currents of nationalism and localized fear, and it was well-supported by the 1997 campaign called “the Strangers” which ran in the Danish tabloid daily Ekstra Bladet.
Rytter argued that we can in this way see the effect of globalization in Denmark: as a negative and scary development resulting in an unknown inner enemy – the muslim immigrants – as seen by a majority of the population whose only contact with these immigrants is through the media. The positive effect of globalization, as seen and experienced by the cultural elite, is less in focus in this period.

This polarization becomes central in Danish interior politics. The liberal-conservative government elected in 2001 uses the warlike rhetoric of a "battle of values" and a "battle of culture", while the Danish People's Party compares its struggle against immigration to the Danish resistance in WWII. This results in a basic dichotomy between modernity and traditionality, between Danes and muslims (disregarding the fact that many of these muslims are in fact Danish citizens).
Rytter argued that many Danes look upon the muslims as an anachronism – and he suggests that we challenge this dichotomy through solid ethnography – to explode these notions.


Based on all of this, a discussion ensued which offered a fair few interesting insights and a few more anthropological anecdotes – especially on the questions of how to study and represent religion; how, whether and if anthropologists should take part in the public debate on the matter; and how this would impact the anthropological field as such. One conclusion was that it is more important than ever to study nationalism and religion – things that easily seem exotic to us cosmopolitan anthropologists – not only abroad but also in Denmark. And that it is also relevant to study how virtual media and telecommunications play a part in the escalation and "scale jumping" of crises like the current one.

Programming 102a

Writing a computer program is somewhat similar to writing a recipe. You need to do things in a certain order to ensure success.

In object-oriented programming languages such as Java, you focus on writing each chunk of code as a separate object. When you write a chunk of code like this, you shape these objects by assigning them

a) attributes, thus defining their state – which you can use to decide when the object should interact with the other objects, and in what ways it can be affected (it can't be affected in a way that you haven't defined).

b) behaviour, thus defining the ways in which the object can act. These are called the methods of the object. An object can contain many methods, and these are generally used to change the state of one or more objects.

But when you write code, you don’t write objects as such – you write classes – classes are the blueprints, the templates based on which the objects are instantiated. You can think of a class as a little car factory all rigged to produce cars according to a certain blueprint, and the objects as little cars produced by the factory.

In short, the essential bit is that objects are realisations of classes.

In a programming context, you can think of a database – eg. a bank's list of accounts. Here there is a pre-set bunch of information that is necessary for each account. In the bank's computer system, the template account is programmed as a class containing certain attributes: name, address, balance, birth date – and so on.

Instead of having to rewrite the whole class every time a person wants a new account, Object Oriented Programming allows the system to just instantiate a new object of the account class and fill in the necessary information.
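To make this concrete, here's a minimal sketch of how such an account class might look in Java – the class name, attributes and method are my own invention for illustration, not something from the course:

```java
// A blueprint: every Account object will have these attributes and methods.
public class Account {
    // attributes define the object's state
    String name;
    double balance;

    // a constructor fills in the necessary information for a new account
    Account(String name, double balance) {
        this.name = name;
        this.balance = balance;
    }

    // a method defines behaviour: it changes the object's state
    void deposit(double amount) {
        balance = balance + amount;
    }

    public static void main(String[] args) {
        // instantiating two objects from the same class - two cars from one factory
        Account a = new Account("Carl Smart", 100.0);
        Account b = new Account("Anna Quick", 50.0);
        a.deposit(25.0);
        System.out.println(a.name + " has " + a.balance); // Carl Smart has 125.0
    }
}
```

Each call to "new Account(...)" rolls a fresh object off the factory line, with its own name and balance.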

With me thus far? If not, try looking at the last 4 slides from this presentation.


As I said, each object consists of methods and attributes (and probably some more stuff they haven’t told us about yet).

Methods are used for grouping and naming sequences of statements (or commands) so that it is easy for the program to call upon that sequence merely by the method name.

Methods can take input, eg.:

System.out.println("Hello there!");

"System.out.println" is a call to the method "println", which is reached through the class "System". Thus, every time you want to call any of the methods reached through the "System" class, you have to write "System" before the method you want to use. The method "println" accepts the argument entered in the parentheses and prints it on the screen.

Methods are always called with zero or more arguments or parameters – which basically means that you always have to include the parentheses, even if you don't want to pass any parameters.

Finally, there’s the semi-colon at the end of the line. Java syntax demands the semi-colon in order for it to treat the line as a statement. It is in fact the semi-colon that turns an expression into a statement – and thus it is part of the statement.

Another central part of Java (and presumably most other programming languages) is identifiers. All of a program's specific elements that are defined by the programmer must have a name to separate and identify them from the other parts and, hopefully, to indicate the proper use and idea of the element. The identifier is the name of any programmer-defined element. In this example, "Happy" identifies the specific class that writes "Oh happy day!" on the screen:

public class Happy {
    public static void main (String[] args) {
        System.out.println("Oh happy day!");
    }
}

There are some rules on the naming of identifiers:
1) The first character must be a Java letter (these are lower case a-z, upper case A-Z, $ and _ and a few more).
2) The rest of the identifier can contain any number of Java letters and Java digits (the numerals 0-9), as long as there is no whitespace.
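A few hedged examples of what these rules allow and forbid (the names are made up):

```java
public class Identifiers {
    public static void main(String[] args) {
        int height = 180;      // fine: starts with a Java letter
        int _count = 1;        // fine: underscore counts as a Java letter
        int $price = 99;       // fine: so does the dollar sign
        // int 2fast = 0;      // illegal: may not start with a digit
        // int my height = 0;  // illegal: no whitespace allowed
        System.out.println(height + _count + $price); // prints 280
    }
}
```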

Java convention also demands that classes be named with an initial upper case letter (as with the "Happy" above), while packages, attributes, parameters, methods and variables (more on these later) are written with an initial lower case letter.

If you want to give an identifier a name with more than one word, you write the first letter of each new word in upper case – eg. "HappyDay" (a class) or "labelColor" (a parameter).

The rule for identifiers is that they should be short, concise and easily recognizable and distinguishable.

Another central part of computer programming is the literals – these are the atomic units of information used in the programming language, and they're used to supply specific data to programs – often as variables. A variable is a named slot or "box" in the computer's memory that can be assigned a literal value of some sort. There are several types of literals, and when you assign memory ("open a little box in the big warehouse of the computer's memory") you need to declare which type of literal you want to assign to that slot ("what kind of box you want, depending on what you want it to contain").

The rules of Java demand that each variable has a specified type. Other programming languages are more lax about this sort of thing, but Java is "strongly typed" and requires all variables to be declared.

int height;
double weight;
String name;

Where "int", "double" and "String" signify the types of the variables, and "height", "weight" and "name" are the names (identifiers) of the variables.

Thus, the type signifies the type of “box”, while the identifier signifies the name of the “box”. Whenever you want to get something from or put something into the box, you need to call it by its exact name.

So what types of literals are there? What kind of information can we store in our boxes of memory?

There are many. And they vary depending on how much space they take up in the computer's memory. It is worth remembering that computers used to have very little memory, and that it was necessary to worry a fair bit about "memory management" – ie. the way that the computer stored relevant data in its memory. This is less relevant today, but still worth noting.

So far, we have learned about the following types:

Integers are whole numbers (ie. without any decimals) such as 1, 43 and -17.

There are 4 types of integers, which can contain different sizes of numbers depending on how many bits (short for binary digit – the smallest piece of data, a 1 or a 0, that a computer recognizes) they take up in the computer memory. Bigger numbers take up more memory:

byte – 8 bits – can contain values from -128 to 127
short – 16 bits – can contain values from -32768 to 32767
int – 32 bits – can contain values from minus to plus 2.147 billion
long – 64 bits – insanely large numbers. No, really: Seriously big.
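If you want the exact limits, Java provides them as built-in constants – and you can also see what happens when an int runs out of room (a little sketch of my own, not from the course notes):

```java
public class IntLimits {
    public static void main(String[] args) {
        System.out.println(Byte.MAX_VALUE);     // 127
        System.out.println(Short.MAX_VALUE);    // 32767
        System.out.println(Integer.MAX_VALUE);  // 2147483647
        System.out.println(Long.MAX_VALUE);     // 9223372036854775807 - seriously big
        // adding 1 to the biggest int silently wraps around to the smallest
        int big = Integer.MAX_VALUE;
        System.out.println(big + 1);            // -2147483648
    }
}
```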

Just to complicate matters further, computers aren't limited to the conventional decimal numbers that we're used to. As noted above, the basic atomic unit of the computer is binary, and it handles octal (0-7) and hexadecimal (0-9, a-f) integers with similar ease. I find these different base-2, base-8 and base-16 number systems to be a serious mindfuck, but they are a central and integral part of computer science. Anyway, back to the types:
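In Java you can even write integer literals directly in these other bases – a 0x prefix means hexadecimal and a leading 0 means octal (a small sketch):

```java
public class Bases {
    public static void main(String[] args) {
        int hex = 0x1f;   // hexadecimal: 1*16 + 15 = 31
        int oct = 017;    // octal (note the leading zero): 1*8 + 7 = 15
        System.out.println(hex); // 31
        System.out.println(oct); // 15
    }
}
```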

Decimal numbers:
Are fractions. These are numbers like 0.1 or -4.5 or 438.75 or 3.14159.
They come in two variants, depending on how precise you want your fraction:

float – 32 bits – 7 significant digits
double – 64 bits – 15 significant digits

Due to the way that computers calculate fractions – something called floating point calculation – fractions and decimals are always approximations. Though if I claimed to understand why this is, I would be lying through my teeth.
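You can actually watch the approximation happen – the decimal fraction 0.1 has no exact binary representation, so tiny errors creep in (my own sketch, not from the course):

```java
public class Floating {
    public static void main(String[] args) {
        System.out.println(0.1 + 0.2);        // 0.30000000000000004 - not quite 0.3
        System.out.println(0.1 + 0.2 == 0.3); // false
        // float keeps fewer significant digits than double
        float f = 1.0f / 3.0f;
        double d = 1.0 / 3.0;
        System.out.println(f); // 0.33333334
        System.out.println(d); // 0.3333333333333333
    }
}
```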

Characters:
Are single keyboard characters such as a lower case 'a' or an upper case 'A', a 'C' or a ' ' (a whitespace) or some such. All of these characters have matching code numbers in a specific system. In the widespread ASCII system, the upper case 'A' is represented by the number 65, while the lower case 'r' is represented by the number 114.

ASCII is limited in that it only has room for 128 different characters (cf. the memory limitations discussed above), so nowadays most people use the Unicode system instead, which has room for 65536 different characters – plenty of room, not just for the Danish æ, ø and å but for all of the world's major alphabets such as Greek, Arabic, Cyrillic, Hebrew, and all sorts of far-Eastern variants. So far we've learned of one character type, called

char – 16 bits

Each char memory slot or "box" can only contain one character – which is in fact stored as its Unicode numerical representation.
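A small sketch showing the back-and-forth between characters and their numerical representations:

```java
public class Chars {
    public static void main(String[] args) {
        char c = 'A';
        System.out.println((int) c);       // 65 - the code behind the character
        System.out.println((char) 114);    // r - and back the other way
        char dansk = '\u00f8';             // the Danish 'ø' by its Unicode number
        System.out.println((int) dansk);   // 248
    }
}
```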

Boolean Values:
Are truth variables. A simple 0 or 1 – True or False. The strange name is apparently in honour of British mathematician George Boole and his “Boolean Algebra”. Again, esoteric mathematics are lurking in the background of the programming jargon though it isn’t necessary to understand the theory in order to be able to use it. We’ve heard of one kind of Boolean type thus far:

boolean – 1 bit (presumably)

Strings:
Are strings of arbitrary characters. Whereas the other types are so-called "primitive types", a string is something of a different category, of which I still know next to nothing. A string is always encapsulated in quotes, like this: "Hello World!" or this: "Hell yeah!"
Strings are also different in that when you declare them, you have to capitalize the first letter, like a class:

String – size varies?
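A hedged sketch of strings in use – note the capital S, and that the + operator joins strings rather than adding numbers:

```java
public class Strings {
    public static void main(String[] args) {
        String greeting = "Hello";             // note the capital S in String
        String target = "World";
        // the + operator joins strings instead of adding them
        String sentence = greeting + ", " + target + "!";
        System.out.println(sentence);          // Hello, World!
        System.out.println(sentence.length()); // 13
    }
}
```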

With all these different kinds of types – or boxes – we can assign values to these variables. Assignment is much like putting stuff in the boxes. It works like this (based on the variables we declared in the example above):

height = 180;
weight = 78.5;
name = "Carl Smart";

The variable name is always on the left, and the assigned value is always on the right.

You can declare a variable and assign it a value in one statement. This combination of declaration and assignment is called Initialisation:

int height = 180, age = 45;

This is a Good Thing to do, as computer memory is rarely completely empty. In many languages, allocating a bit of memory for your "box" doesn't automatically clear out whatever was there beforehand, so it might still contain some random value. (Java is actually stricter than most: the compiler refuses to let you read a local variable before it has been assigned.) Since variables can be changed several times within a method, it is good to ensure that each one contains exactly the value that you want it to contain.

It is also possible to declare constants – which work just like variables, except that they’re not, you know, variable. You declare a constant by prefixing your declaration with a “final”, like this:

final double PI = 3.14159;

It is Java convention that constants are written in all-caps so that you easily can set them apart from the variables.

Another central part of any programming language is expressions. These consist of operators and operands.

Operators are basic arithmetical operators:
+ addition
- subtraction
* multiplication
/ division
% remainder of a division (if you divide 123 by 10, you'll have a remainder of 3)
= assignment (as we saw it done above)
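A quick sketch of the operators at work – note that dividing one integer by another throws the fraction away:

```java
public class Operators {
    public static void main(String[] args) {
        System.out.println(7 + 3);    // 10
        System.out.println(7 - 3);    // 4
        System.out.println(7 * 3);    // 21
        System.out.println(7 / 3);    // 2 - integer division discards the fraction
        System.out.println(123 % 10); // 3 - the remainder after dividing by 10
        int height = 180;             // = assigns a value, as we saw above
        System.out.println(height);   // 180
    }
}
```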

Operands are the values of other expressions. The values in the boxes.

The operands are the input and the return value is the output of an expression.

Most often, an expression won't change the value of a variable (unless you actively assign the variable a new value), but will just read off the value. "Think of them as references, for the most part," our lecturer said.

A typical expression would look something like this:

1 + 1;


bodyMassIndex = weight / (height * height);

Where you assign the result of the expression to a new variable. Though the result would only be correct if the integer operand "height" were assigned a value in meters, which you probably wouldn't want to do, as that would make most people either 1 or 2 meters tall – with no decimals allowed.

But since the "weight" operand is a double, it is necessary to "promote" the other operand "height" to double status for as long as it takes to make the calculation. The result of the expression is then itself a double – so if the new variable "bodyMassIndex" has been declared as an int, you would have to convert ("cast") the result explicitly before assigning it.
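A sketch of the whole calculation (the variable names and values are mine), which also shows an int operand being promoted when the other operand is a double:

```java
public class Bmi {
    public static void main(String[] args) {
        double weight = 78.5;  // kilograms
        double height = 1.80;  // metres - declared as a double so we keep the decimals
        double bodyMassIndex = weight / (height * height);
        System.out.println(bodyMassIndex);    // about 24.2
        // had height been an int, it would be promoted to a double here,
        // because the other operand (weight) is a double:
        int h = 2;
        System.out.println(weight / (h * h)); // 19.625
    }
}
```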

Having introduced classes, methods, identifiers, primitive types and expressions – this is probably a good place to stop, even though this is only half a lecture.

I'm absolutely amazed at the number of new concepts and ideas that are introduced in programming. Even though it only requires basic math to use, there are so many layers of culture and tradition built into these languages that just trying to explain why things are the way they are is a solid effort.

One thing is understanding it, another thing is actually writing it and thinking in the proper manner. It is a quite fascinating thing to do. More about this soon.

Programming 101 – addendum

Before I go on with the lecture for week 2, I need to go back and explain something about compiling. When you compile your textfile with the .java extension, you usually do this from a Command Line Interface (CLI) or from an Integrated Development Environment (IDE).

The CLI is a basic part of most operating systems (the old Mac OS 9 being the exception), where you can enter commands directly to the computer. This was the one true way to interface with computers in the olden days, and you can read a homage to the CLI here.

The way to use the CLI to compile a program would go something like this:
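The screenshot hasn't survived here, but a reconstruction of such a session might look something like this (assuming the source file is called FirstApp.java, as the steps below suggest):

```
C:\> javac FirstApp.java    (step 2: compile the source file into bytecode)
C:\> java FirstApp          (step 3: run the compiled program on the virtual machine)
Hello CS 170
```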

Just like writing the actual code, using the CLI isn’t really very friendly.
You have to know the syntax of the command line as well. In the highlighted step 2, you use the program "javac" to compile your textfile.
In the highlighted step 3, you use the Java Virtual Machine, "java", to run the newly-compiled "FirstApp" (apparently, there's no need to add the extension ".class").
As you can see, running the program results in the computer printing the message “Hello CS 170” on the screen.

In the Computer Science tradition, it is a virtue for the computer to make as little "noise" as possible. Therefore, as long as the program executes correctly, it offers no more information than is demanded by the program being run. In this way, the only way we know that the program compiled correctly is that we regain the command prompt and that no error message appeared.

The compiler isn’t really very helpful when it actually comes to error messages, anyway. And when you build larger programs consisting of many small bits and pieces, it’s easy to lose track of them all on the command line. Even so, a lot of purists, including my friend Stefan, use just a text editor (such as the wondrously complicated Vim and Emacs) and a compiler directly from the command line.

Thus the IDE:

The IDE contains everything you need to write, compile and test your code. It uses a graphical user interface to separate the different elements. There's a text editor (on the left) and a project overview (on the right) that make it easy to keep track of the different bits of code, and of whether they've been compiled or not. You can compile a chunk of code simply by pressing the "compile" button, and run it by pressing the "run" button. Deceptively easy.

There are many different IDEs made for Java. Apparently, Eclipse is the most popular one, as it aims to be a universal IDE (ie. it also works with all sorts of other programming languages), but so far I've enjoyed using BlueJ, which attempts to keep the bells-and-whistles count low and the ease of use high.

Programming 101

A couple of weeks ago, I began a course in Introductory Programming at the IT University. I don't know a lot about programming, and I certainly don't have a lot of training or experience programming, so I'm starting at the very bottom and working my way up.

The course focuses on the programming language Java as the central element of our learning. My friend (and local computer guru) Stefan says that Java is a pretty decent first language – his teachers at the Computer Science department warned that learning a too easy or soft-around-the-edges language (whatever that might mean) first will give you bad habits that will make it even worse when you try to learn something more rigid.

Apparently, Java is a good mix of easy and rigid – so we'll see about that.

Since I'm interested in examining how people learn and use programming languages and how this affects their relationship with the computer, I thought it would be relevant to write about my own experiences learning Java. I expect that this will be sort of a weekly feature, describing what I've learned about programming in the past week, what difficulties have arisen, what I have produced, and the general atmosphere and tone of the course. I won't go into all the details, but even so I still expect this to be pretty lengthy.

Since I've already managed to get behind schedule on this weekly feature, there'll be a lot of stuff about programming here this week…


Programming 101

Apparently for the first time, the course is being taught in English. Though both lecturers are Danish, a fair few of the students at the ITU are various sorts of foreigners – quite possibly attracted to the class because of the relatively few English-language courses at the ITU.

88 people signed up for the course (I didn't sign up, because the ol' institute wouldn't accept it as relevant for anthropology – I suppose it still is a rather radical thought) and maybe 10 or so of these are female. The 88 are a motley bunch of full-time students and open university students with proper jobs and lives outside the university (and who can thus afford to pay the 6.250 Danish kroner the ITU charges for the course).

The ITU course system allows for people to switch courses within the first two weeks of the semester, so this number is likely to change. On the “Prerequisites” slide of the lecture, the lecturer put “Don’t Despair” underneath “User level computer skills”, “e-mail, browsers” and “some word-processing”, and generally this course has a reputation for being difficult.

The lecturer slyly answered this concern by saying "It's definitely doable" – just before launching into a lengthy discussion involving lots of acronyms, descriptions and metaphors under the heading, "So what is a computer anyway?"

“It’s not a hard disk – so don’t call it that. It’s layers upon layers of technology. From the hardware layer (the keyboard, monitor, memory, CPU, etc.) to the operating system layer (i.e. Windows or Linux) to the actual application layer (Word, Notepad, etc.).”

“The central element of the computer is the CPU which is the clever guy that can compare and manipulate numbers extremely quickly. The CPU only understands numbers which is also called machine language. Apart from that it is extremely stupid but because it is so fast, you get the impression of intelligence when it seems to be able to do two things at the same time.”

“It has no intelligence. It can’t improvise. It can’t figure out things for you. Don’t expect too much of computers – you will have to do most of the thinking.”

"A program is a collection of commands to the computer. Internal commands are basically just numbers. In the ASCII standard way of translating letters into numbers, capital A is represented by the number 65."

The lecturer is using so many new terms and acronyms that I expect somebody with just a basic level of computing knowledge would already have trouble digesting it all. Though he is keen to point out that you can ask questions at any time. Nobody raises their hand.

He continues to talk about “thin and thick clients”, different kinds of servers, LAN and WAN networks, backbones, hubs, switches and wireless networks. He points out how the Internet is just interconnected networks sharing nothing more than cables and common protocols.

From there he touches upon the FTP, telnet, World Wide Web, hypertext, HTML, browsers and how HTML and program code is just text and that it can’t do anything on its own. That the text needs to be interpreted by a program in order to be used by the computer.

Then he introduces two kinds of files – text files and binary files. Text files are "just text" and binary files are "just numbers". We will just be working with the text files, he assures us.

Both kinds of files have extensions that help the operating system to recognize and use the files in a certain way. At this point he stops awkwardly and says that sometimes people don't think about the extensions, because the operating system is so good at associating the files with certain programs.

At this point he is ready to introduce programming languages: “A programming language is a set of rules on how to issue or write commands to the computer. It is based on syntax and semantics like other languages. But it tends to be a lot stricter:”

This is correct:


This is completely wrong:


(as Java differentiates between upper and lower case letters)
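The two examples haven't survived here, but the point about strictness was presumably something along these lines (my reconstruction, not the lecturer's slide):

```java
public class CaseMatters {
    public static void main(String[] args) {
        // correct - Java accepts this:
        System.out.println("Hello");
        // completely wrong - the compiler rejects these,
        // since upper and lower case letters are different things to Java:
        // system.out.println("Hello");
        // System.Out.Println("Hello");
    }
}
```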

He mentions a bunch of programming languages such as C, C++, Haskell, Prolog, Fortran and LISP, and various schools, designs, algorithms, and different ways of doing things. Some are “functional” and “procedural”, whilst others – such as Java – are “object-oriented”.

He then declares that that really isn't all that relevant at this point. It isn't entirely obvious why he bothers to mention all of this here.

But back to the central stuff:

“But how do we make the transition from an arbitrary high level language to machine instructions – the only thing a computer can actually execute?”

You use a compiler. A compiler is a program that translates the programming language into binary machine language that the computer can understand. The compiling process fails if the syntax of statements written in the programming language is wrong, or if the references to other files (for instance libraries of code) aren’t right. The compiler is very picky about these things, and generates a fair few error messages if your syntax isn’t up to scratch.

Java isn’t like most other programming languages, and doesn’t just use a compiler. It uses both a compiler and something called the Java Virtual Machine:

First you have your text file containing the program you’ve written in Java that you want to run on the computer. In order for the computer to recognize it as a Java program, you need to name it with the extension “.java”.

You then use a compiler, for instance the one called “javac”, to compile the program into bytecode. Bytecode is not exactly “just numbers”, but it is a lot closer to pure machine language than Java itself.

This bytecode can then be run by a Java Virtual Machine, which is a sort of interpreter program that can translate the bytecode directly into the binary language that the local computer can understand. The reason why Java contains this extra step is that it allows Java to run on many different systems – actually as many as there are different versions of the Java Virtual Machine.
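The whole pipeline he described can be sketched with one small program (the class name “Greeter” is my own example, not the lecturer’s):

```java
// 1. Save this text as Greeter.java        – "just text"
// 2. Compile it:  javac Greeter.java       – produces Greeter.class (bytecode)
// 3. Run it:      java Greeter             – the Java Virtual Machine translates
//                                            the bytecode into instructions the
//                                            local computer can execute
public class Greeter {
    static String greeting() {
        return "Hello from bytecode!";
    }

    public static void main(String[] args) {
        System.out.println(greeting());
    }
}
```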

It has to be said that most computers use different chips and architectures – which, apparently, is the concrete, physical engineering layout of the interior of the computer. Depending on the computer and on how the individual operating system utilises the available resources, it can be very difficult to make a program run on more than one kind of platform.

Java is so lauded because, with its Virtual Machine, it is “platform independent”: code written in Java and compiled to bytecode (which gets the extension “.class”) works on most kinds of computers – and even on mobile phones and other electronic equipment.

Even though most programming languages have very tight demands on syntax, they’re much more lax when it comes to formatting. Our lecturer gives us several examples of how differently the same code can look and still execute exactly the same:


public class Happy { public static void main(String[] args) { System.out.println("Oh happy day!"); } }

Or this:

public class Happy {
    public static void main(String[] args) {
        System.out.println("Oh happy day!");
    }
}

Or even this:

public
class Happy { public static void main
(String[] args) { System.out.println
("Oh happy day!"); } }

This all depends on the use of whitespace: spaces, tabs and new lines. There are various conventions, “cultural variations” and norms around how to format your code. It is all optional, but it is good practice and it makes the code easy to read – which is vital when it comes to having other people understand and comment on your code.

“As for comments: It is central to use them on a higher level of abstraction: Comment the code so as to express motivations or the ideas behind a certain chunk of code.”

“A good comparison: people who write assembler code (very low-level computer code, much like Java’s bytecode) often write their comments in a higher-level computer language such as Java. In this way, it is easy to understand what they’re trying to achieve through the assembler code.”
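The point about commenting the motivation rather than the mechanics might look something like this (the example is mine, not his):

```java
public class Vat {
    // Bad comment – merely restates the code:
    //     multiply the price by 1.25 and return it
    //
    // Good comment – explains the idea behind the code:
    //     Danish VAT is 25%, so the gross price is the
    //     net price plus a quarter.
    static double grossPrice(double netPrice) {
        return netPrice * 1.25;
    }

    public static void main(String[] args) {
        System.out.println(grossPrice(100.0));
    }
}
```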

Because Java is platform independent, it is also the programming language of the Internet. That means that there are three kinds of Java programs:
– ordinary programs that react to keyboard input
- Graphical User Interface (GUI) programs that react mainly to mouse input and respond through dialogue boxes – and are dependent on a graphical Operating System.
– Applets – small programs that can run directly in a web browser such as Firefox or Internet Explorer.

He ends the lecture by showing us how differently the code behind even a simple program that writes one line of text looks for the three different kinds of Java programs:

Ordinary Java. GUI Java. Applet Java.
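From memory, the three versions looked roughly like this (the class names are mine; the GUI and applet variants are sketched in comments, since they need a graphical environment to run):

```java
// Ordinary Java: prints one line of text to the console.
public class ThreeWays {
    static String line() {
        return "Oh happy day!";
    }

    public static void main(String[] args) {
        System.out.println(line());
    }
}

// GUI Java: the same line shown in a dialogue box
// (depends on a graphical Operating System):
//
//     javax.swing.JOptionPane.showMessageDialog(null, "Oh happy day!");
//
// Applet Java: the same line drawn inside a web browser:
//
//     public class HappyApplet extends java.applet.Applet {
//         public void paint(java.awt.Graphics g) {
//             g.drawString("Oh happy day!", 20, 20);
//         }
//     }
```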

Damn, this turned out to be awfully long. And so far, I’ve been averaging 6 pages of notes per lecture. There is just so much that can be taken for granted, and if you examine it with just a tad more inquisitiveness (is that even a word?) you find lots and lots and lots.

Programming languages offer huge amounts of conventions, embedded ideas and cleverness to consider. And on top of that comes writing my own programs.

We weren’t really expected to be able to do a lot of programming at this point, as this is the introductory lecture. So there was no home assignment.

The new tower of Babel

Having been ill for a couple of days this week has left me completely zonked. This resulted in a rather bizarre case of insomnia which brought me through Samuel Delany’s Babel-17 last night. Uh, Spoiler alert!

Babel-17 reminds me a lot of Neal Stephenson’s Snow Crash. Both books use the idea of languages that function as programming languages on the human mind – both supposedly heavily inspired by the Sapir-Whorf hypothesis.

Also, both books share the annoying trait of having main characters who simply can do no wrong. The protagonist of “Babel-17”, the galaxy-renowned poet, linguist, cryptographer, space captain extraordinaire Rydra Wong is at 26 able to routinely translate stuff into Basque and a dozen other languages, shoot bad guys and reprogram herself with the mental programming language code-named Babel-17.

The central character of “Snow Crash” is called Hiro Protagonist (har, har) and is a master computer hacker, greatest swordsman on the planet, one of the inventors of the current instance of the internet as well as being suave and cool.

It’s just too much. Much better are the other characters, the curious young skater girl Y.T. in “Snow Crash” and Rydra Wong’s crew in “Babel-17” who all seem much more human and worthwhile compared to the demi-godly abilities and attitudes of the protagonists.

Anyway, the central idea of a symbolically precise language focused on exact expression of statements is obviously inspired by computer programming languages, which have also been linked to the Sapir-Whorf hypothesis from time to time.

There are thousands of computer programming languages – some of which have been used for almost 50 years and have as organic and vital a history of use as most living languages. Some people do think of these plentiful ways of interacting with the computer as a new tower of Babel – it would indeed be true Science Fiction if they managed to combine that into something biological that would “run” on people.

I guess most anthropologists have given up hope of finding those secret underlying patterns that shape the common human life experience. That was actually the main goal of most anthropologists up until and including the structuralists. Nowadays, I guess we feel lucky if we touch upon something that may have wider application than just those narrow fields of study that we have submerged ourselves in independently of each other.


Today I’ve been searching for various funds and grants to apply for in relation to my impending fieldwork. I’ve been looking at all sorts of grants: from “hotel owner Anders Månsson’s Grant” (which sounds like something from Monopoly) and “Queen Margrethe and Prince Henrik’s Fund” to various corporate research programmes. Eventually, as I widened my search even further, I went to look at the EU research funding frame programme, which has absolutely truckloads of money to dispense.

Since they didn’t really say who they expect to apply for this money, I read through various calls for proposals and FAQs regarding these (all of them prime examples of EU bureaucrat legalese). At its most readable, the bureaucratic might becomes human in its almost playful answering of questions [a pdf].

Only when I found a link to a Danish institution solely dedicated to the matter of applying for EU funds did I realize the scope of the project of getting a part of these EU funds. There’s an intricate guide to applying for funds, and there’s even a fund under the Danish Ministry of Research and Technology dedicated to supporting small and mid-sized companies economically with the expenses involved in preparing and sending in an application to the EU. This support covers half the total expenses – usually up to 100.000 Danish kr.!

How can sending in an application in any way be that expensive??! They routinely spend more than my entire proposed budget just supporting the writing of an application??!

I know it’s a lot of money – but even so: the bureaucracy involved boggles the mind.