"Critical code studies"

I'm interested in hearing what people who study programming languages think of the emerging field of "critical code studies" in the humanities.

Here are a couple of descriptions from a recent CFP and an essay by Mark Marino:

Critical Code Studies names the practice of explicating the extra-functional significance of source code. Rather than one specific approach or theories, CCS names a growing set of methodologies that help unpack the symbols that make up software.

Critical Code Studies (CCS) is an approach that applies critical hermeneutics to the interpretation of computer code, program architecture, and documentation within a socio-historical context. CCS holds that lines of code are not value-neutral and can be analyzed using the theoretical approaches applied to other semiotic systems...

CCS is largely distinct from the more ethnographic work in "software studies" by people like Christopher Kelty, whose book Two Bits has been discussed on LtU. Marino held a CCS working group session this spring, and there's a CCS workshop at ACM Hypertext this year. Some important texts for the field are Katherine Hayles's "Traumas of Code" and Rita Raley's "Code.surface || Code.depth".

I'm personally skeptical—not necessarily about the general idea, but about the current direction of the field—for a few reasons:

  1. A lot of CCS work is written in dialects of crit-theory jargon that I don't claim to speak fluently (and I'm a humanities grad student), but the parts I do understand often seem deeply confused or misguided. Here's a quotation from a post by Marino on the CCS blog, for example:

    Yet, somehow, I can't help but wonder if slower is not sometimes better? (Humanities folk can afford to ask such questions.) Could there not be algorithms that do a better job by including more processing cycles? Such a naive question, I know.

    You don't have to dig very far in the links above to find many other examples like this.

  2. The focus is very strongly on imperative programming. Haskell and Scheme score zero mentions on the CCS website, and Lisp appears once (in a bibliography). In my experience this is representative of other work in the field. An imperative-only approach doesn't seem like a very interesting or thoughtful way to tackle a "semiotics of code".
  3. As far as I can tell, not one of the three-dozenish scholars listed as associated with the CCS blog or the recent working group has a degree in CS or math (most are currently in new media or English departments). Maybe this is by design, given the goals stated above, but I'd still like to see more of an indication that this "growing set of methodologies" is of interest to people outside the humanities (if in fact it is).

Is there a place for "a semiotics for interpreting computer code" in the humanities? Do you PLT folks need help "unpacking the symbols that make up software"?

I had never heard of

I had never heard of Critical Code Studies before, but I looked at some of your links. The common thread in them seemed to be a focus on studying how people, both experts and non-experts, talk about computers and software in both technical and non-technical contexts.

If the goal of that study is to understand something about human interpretation, language, thought patterns, etc., then this sub-field is somewhat of a piece with other emergent foci that pop up, play out, and tend eventually to fade within humanities departments.

If the goal of the study is actually, as you suggest above, to help engineers improve their understanding and analysis of computer software, then the methods and approach don't seem like something any engineer would recognize as a contribution.

There is a recognized kind of engineering called human factors, and I can imagine a serious study of the human factors of literate programming. But that didn't seem to be the kind of thing that the writers of those papers were on about.

Relevance outside of humanities

When Hayles writes that "code is the unconscious of language" or even that

Code in this view acts as the conduit through which traumatic experience can pass from its repressed position in the traumatic aconscious to conscious expression.

she thinks, at least, that she's helping people who work with code to understand it more deeply. I doubt, however, that she'd claim that this sentence could help engineers "improve the analysis of computer software" in any literal sense.

Marino offers a "note of caution" to "computer scientists who discover this project and view it with skepticism and bemusement": he says they should be careful

not to regard the humanists and posthumanists as either infadels or kooks — but rather collaborators whose articulations enrich the discussions of code.

So yes, I think the CCS folks do see their work as having relevance outside of the humanities, for people who write code or think about programming languages. The implicit model is probably something like film theory, which has had an influence on the way films are made.

film theory vs. programming theory

Bad analogy.

Film is an art, film theory attempts to deconstruct this art in order to understand films' relationship with reality.

Programming is not an art. Programming theory does not attempt to deconstruct programs to understand their relationship with reality.

While I tend to agree with

While I tend to agree with you, the assertion that "programming is not an art" seems at odds with all of the people that claim that "programming is more like writing an essay than engineering a bridge", that hackers and painters have much in common, or that programming is not a science but a liberal art.

I agree that CCS and film

I agree that CCS and film theory aren't really analogous—I was just speculating about how CCS practitioners imagine the relationship between their work and the stuff they're theorizing about.

Programming theory does not attempt to deconstruct programs to understand their relationship with reality.

I actually think this is exactly what they claim to be doing, except that they've replaced the rusty deconstruct with the much shinier unpack.

seem deeply confused or

seem deeply confused or misguided. ... for example:

Yet, somehow, I can't help but wonder if slower is not sometimes better? (Humanities folk can afford to ask such questions.) Could there not be algorithms that do a better job by including more processing cycles? Such a naive question, I know.

I don't understand the context, but this statement seems spot on for most micro-optimizations. It's even a guiding principle for algorithm synthesis languages like Sketch: PL designers, instead of focusing on cooperative bug finding in illegible code, might be better off helping in the cooperative transcription of legible code into illegible code.
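To make that "legible to illegible" transcription concrete, here's a toy sketch (my own example, not taken from Sketch or the CCS materials): the same computation written as a readable reference version and as a hand-optimized, far less legible equivalent. Checking that the two agree is the kind of equivalence a synthesis tool is meant to establish automatically.

```python
def popcount_legible(x: int) -> int:
    """The obvious version: test each of the 32 bits in turn."""
    count = 0
    for i in range(32):
        if x & (1 << i):
            count += 1
    return count

def popcount_optimized(x: int) -> int:
    """A branch-free SWAR version: fewer operations, much less legible."""
    x = x - ((x >> 1) & 0x55555555)
    x = (x & 0x33333333) + ((x >> 2) & 0x33333333)
    x = (x + (x >> 4)) & 0x0F0F0F0F
    return ((x * 0x01010101) & 0xFFFFFFFF) >> 24

# The equivalence a synthesis/verification tool would establish for us:
for v in (0, 1, 0xFF, 0xDEADBEEF, 0xFFFFFFFF):
    assert popcount_legible(v) == popcount_optimized(v)
```

The legible version serves as the specification; the optimized one is what you'd actually want in a hot loop.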

The thread on "On the (Alleged) Value of Proof for Assurance", in regards to the usability of Z (~mathematical notation) seems to support the investigation of semiotics. When we propose an alternate semantics/model, syntax, or IDE, we might do so to facilitate machine handling of proofs, performance, etc. -- or, as in that thread and supporting this discussion, to address some sort of cognitive dissonance. I sure wish there were more knowledge about this. Done right (e.g., more in the vein of cog-sci experiments), it seems more useful for the typical language designer than most POPL papers nowadays.

Finally, domain-specific, rule, natural/sloppy, and logic languages would probably be interesting candidates, but I can't fault a researcher for targeting the biggest class of languages in their initial work.

Seconded

The thread on "On the (Alleged) Value of Proof for Assurance", in regards to the usability of Z (~mathematical notation) seems to support the investigation of semiotics.

I second this assessment. There is a fairly active cottage industry in the semiotics and cognitive science of mathematics, particularly as it relates to mathematics education, so an investigation of PL pragmatics and the psychology and semiotics of programming should provide a much-needed contrast. In addition, a number of philosophical issues have been raised about computer science (see this review article in the Stanford Encyclopedia of Philosophy), some of which could well be of interest to scholars in the humanities.

Lastly, there are some niches where code is considered an art form--such as the demoscene and the "hacker" cultures which arose in academia during the 1960s and 1970s. There are even uncanny resemblances between the style of demoscene animations and the most experimental kind of video art.

Probably not the kind of project CCS folks are moving toward

Given the context, I think this is probably an unreasonably charitable reading of Marino's argument.

Marino characterizes computer science as having efficiency in terms of "processing cycles" as its "fetish" or "ultimate standard". Your examples highlight the fact that this characterization isn't entirely accurate in the first place. Marino also doesn't define "do a better job" to mean anything so specific as "be more legible" or "have fewer bugs".

More broadly I think the kind of enterprise you're describing presupposes a level of training in CS/formal logic that the current crop of CCS folks don't seem very keen on acquiring.

After all, why bother with maths when you can just riff on the "heteronormative superstructure" implicit in the code of the AnnaKournikova worm?

I found the presentation

I found the presentation style unsuitable for consumption by scientists and engineers, and I agree that domain knowledge helps when examining a domain. However, that does not mean CCS has nothing to offer, and "optimization as a fetish" does sound somewhat right. I think most here agree that the current approach -- both for confirming their theories (e.g., not doing so) and the presentation style -- limits their impact. My assessment may be way off base, as I couldn't really motivate myself to get through the writing, and the intros didn't do anything to help.

Slower is not better because it's slower.

Well, I don't know the context either, but the first time I read it I interpreted it as "sometimes an algorithm is better because it uses more CPU cycles", which of course is nonsense.

Even in cryptography, where people attempt to devise algorithms that must use a certain amount of CPU time, an implementation that can perform the same computation using fewer resources is still better. (And more to the point, you generally want to assume your adversary has access to that better, faster implementation anyway.)

Of course, an implementation that uses more resources can still be better, if it's more reliable, more maintainable, simpler code, etc.
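One well-known case where deliberately spending more cycles is the whole point is key stretching: password-based key derivation is tuned to be slow so that brute-force guessing becomes expensive. Here is a minimal sketch using Python's standard-library `hashlib.pbkdf2_hmac`; the password, salt, and iteration counts are illustrative only, not recommendations.

```python
import hashlib
import time

password = b"correct horse battery staple"  # illustrative only
salt = b"not-actually-random-salt"          # use os.urandom() in practice

# Raising the iteration count makes each guess proportionally costlier
# for an attacker -- slowness here is a feature, not a bug.
for iterations in (1_000, 100_000):
    start = time.perf_counter()
    key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations)
    elapsed = time.perf_counter() - start
    print(f"{iterations:>7} iterations took {elapsed:.4f}s")
```

Even here the parent comment's point stands: given a faster implementation of the underlying primitive, the defender simply raises the iteration count.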

slower v. nonsense

If "sometimes an algorithm is better because it uses more CPU cycles" is nonsense....

then why is there a saying, among computing professionals, that "premature optimization is the root of all evil"?

Reasons are important

Sometimes an algorithm is better despite using more CPU cycles. The general reason is not that the algorithm is better suited to the problem, or a more elegant expression of it, but rather that the programmer's time was better used in finding a worse approximation that is good enough.
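A toy sketch of that trade-off (the example and numbers are mine, purely illustrative): the linear scan below burns more CPU cycles per query than the indexed version, but it took seconds to write and is perfectly good enough until the data or the query volume grows.

```python
def member_simple(items, x):
    """O(n) per query: more CPU cycles, almost no programmer time."""
    return x in items  # linear scan over a list

class MemberIndexed:
    """O(1) average per query, paid for with index-building code that
    must be written, tested, and kept in sync with the data."""
    def __init__(self, items):
        self._index = set(items)
    def __contains__(self, x):
        return x in self._index

data = list(range(1_000))
assert member_simple(data, 999)
assert 999 in MemberIndexed(data)
```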

Well put.

Well put, although even if you spend time and effort finding a better approximation that's a bit faster, you might do well to stick with what you had before.

reasons and words are important

Sometimes an algorithm is better despite using more CPU cycles.

What this comes down to is contention about the definition of "better" as it pertains to an algorithm.

Leaving aside "the programmer's time", for many problems there are a variety of algorithms available which make different trade-offs of space vs. time.

For example, consider regular expression pattern matching for a pre-specified regular expression. That is, a hard-coded matcher for that one pattern. If the pattern is length N and the string we are to match is length X then we have a choice that includes an O(N*X) solution and an O(X) solution.

Which is the "better" algorithm?

In some cases the O(N*X) solution is certainly the better choice "because" it uses less memory. Only, it is the essential nature of regular expression pattern matching that it cannot use less memory unless it also uses more cycles. So, we might not say "O(N*X) is the better choice *only* because it uses more CPU cycles", but we could say "because it chooses to trade time for space". Using more CPU cycles is a virtue of the algorithm that makes it a better choice in this territory.
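The same time-for-space bargain can be sketched with plain substring search (a simpler stand-in for the regular-expression case; the code and test strings are mine): the naive scanner uses O(1) extra space but O(N*X) time, while Knuth-Morris-Pratt spends O(N) space on a precomputed failure table to run in O(N+X).

```python
def find_naive(pattern, text):
    """O(N*X) time, O(1) extra space (assumes a non-empty pattern)."""
    n = len(pattern)
    for i in range(len(text) - n + 1):
        if text[i:i + n] == pattern:
            return i
    return -1

def find_kmp(pattern, text):
    """O(N+X) time, bought with an O(N) failure table (non-empty pattern)."""
    fail = [0] * len(pattern)  # the memory spent to avoid re-scanning
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    k = 0
    for i, c in enumerate(text):
        while k and c != pattern[k]:
            k = fail[k - 1]
        if c == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - k + 1
    return -1

assert find_naive("ababc", "abababcabab") == find_kmp("ababc", "abababcabab") == 2
```

Which one is "better" depends on exactly the surrounding considerations this comment describes.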

Is either the O(N*X) or O(X) algorithm "better" in any absolute sense?

Space and time are not even the only considerations. Nor are space, time, and programmer time.

The linguistic subtlety of when to and when not to use the term "better" is arguably important depending on how, in fact, it fits into the surrounding discourse. When the algorithmic complexity expert says that "The O(X) algorithm is better," other experts might have the presence of mind to read that strictly as "... is faster" but the discourse of practicing programmers, their bosses, their customers, and so forth is not always so careful and not always so easy to correct.

Even when everyone involved is speaking carefully, the decision making process about which is the better choice of algorithm in a given circumstance often depends on a number of factors which are not objectively in sight, but which are governed by the discourse surrounding the question.

In a post-mortem we might say "They chose a bad algorithm by trying too hard to minimize CPU cycles. They would have chosen better by picking an algorithm that used more CPU cycles." In cases like regular expressions, choosing the less speedy algorithm would then not be choosing the "better" algorithm in spite of its using more CPU cycles, but because that is what the nature of the problem required of a good choice.

Good points.

The word "better" does have many subtleties associated with it. Perhaps this is because in common (non-technical) usage it is usually making a comparison between two things in a total order. As you point out there are several possible axes on which we can compare algorithms; one reason to avoid "better" as a comparison is that each context will imply a different importance of the various axes and so in many cases the comparison is over a partial ordering.

Nice choice of example; it certainly didn't spring to (my) mind as a domain where using more cycles is an improvement. In a general setting it seems that you would need to know more than that one algorithm is O(N*X) and the other is O(X). In particular, lower bounds would be useful to ensure that you are not trading too much time in exchange for the space. In general, if there are external constraints on the axes being measured, then it would be nice to know if there is a Pareto front and where the algorithms lie relative to it - but that suggests an entirely different usage of "better" that is getting further from its common meaning.

I think it's a (pointed) joke

The style of the linked comments is like late-'80s or early-'90s bad lit-crit of the most pompous variety. If you factor out the effects of the jive filter, they say simple true things in anything but plain English.

It's true enough that the meaning of code (in the ordinary sense of "meaning") extends very far beyond any PLT meaning of the code. It's true enough that you can give philosophical and political readings to code. Indeed, those are important perspectives that go far beyond what those in the field typically do.

But, basically -- the ... wait let me drop into jargon... "the center of this formation of CCS discourse is defined by the conjunctive nexus of formerly separated discursive forms, joined here by an ironic and nihilistic comparison of discordant measures of value and meaning. On the one hand there is the objective fact of programmatic source code texts. On the other, the meaning assigned those texts by those who trade on them in the context of socio-economic play. On the third hand, they have meaning for those concerned with analyzing the meta-dynamics of socio-economic play. On the fourth hand, code contains meaning communicated solely among those programmers (workers) who physically manipulate it (of which comments are but the crudest example). "Meaning is a product of circulation through society." In short, many communities of discourse and a great many potent, generative, socially significant issues can be unified in a single, self-serving, perhaps paradoxical or perhaps parodic structure by calling attention to the complexities of technical vocabulary formation by, well, frankly, abusing it with institutional support.

For, indeed -- if you can hack this kind of CCS crap for personal gain just by building up a lot of jargon around kernels of truth... um... what do you suppose the programmers or their masters are up to? If you can get that far BSing around kernels of truth in lit-crit regarding code, then... well, how far can you get just BSing directly about code instead of the broader meaning of code? And isn't the code you're BSing about politically significant, just like this CCS crap?"

Oops, I guess I dropped out of jargon at the end there.

"Once the rockets go up, who cares where they come down? `That's not my department,' says Wernher von Braun."

-t

[edit: to really be true to the CCS style I should have included a bunch of effusive and gratuitous namedrops. To make up for my tactless omission I'll mention the invaluable contributions of Ehud Lamm for his radical contributions to the creation of novel and productive rhetorical forms.]

Sokal reloaded

Same impression here. When Sokal wrote his hoax for "Social Text", poststructuralism (and Derrida) was still alive. I don't see the point of this one.

I'd rather like to read why Curry-Howard prevents Armageddon or how capabilities are the new utopia and will form the next society. Return of the great tales, religions, ideologies. Maybe I have to write it myself.

BTW I've seen some philosophers recently jumping on the term "object oriented" for a set of realist ontologies. Not sure whether this will be a fad that infects academia or even the feuilleton, but lots of popular science is to be expected, and people shall not say they were not warned.

indeed

Indeed.

Though, it would be fun to read about how waterfall design is really a patriarchal response to chaos, or of the mercantile colonialism imposed by industrial Java's ManagerFactoryFactories.

It would only be too easy for a newcomer to take on the trappings of some programming noun kingdoms (there: hierarchy, domination), and pronounce with all the meaninglessness of some current practitioners...

it would be fun to read

it would be fun to read about how waterfall design is really a patriarchal response to chaos

Then you need to watch this YouTube video: The Rise and Fall of Waterfall

More cycles

Marino wrote: "Could there not be algorithms that do a better job by including more processing cycles?"

Well, there is one area where an implementation can do a better job by including more processing cycles: cryptographic implementations where you don't want power consumption to be a covert channel.

So this particular monkey on this particular typewriter did catch a fish by casting a very broad net (deconstruct that mixed metaphor!).
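The side-channel point can be sketched in a few lines (using timing rather than power consumption, and with an illustrative token of my own): an early-exit comparison leaks where two secrets first differ through its running time, while the constant-time version deliberately does the "wasted" work of examining every byte.

```python
def compare_leaky(a: bytes, b: bytes) -> bool:
    """Returns at the first mismatch: running time leaks its position."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def compare_constant_time(a: bytes, b: bytes) -> bool:
    """Examines every byte regardless: more cycles, smaller side channel.
    In real code, prefer the standard library's hmac.compare_digest."""
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y
    return diff == 0

token = b"supersecrettoken"  # illustrative secret
assert compare_leaky(token, token)
assert compare_constant_time(token, token)
assert not compare_constant_time(token, b"supersecrettokex")
```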

Critical Critical Code Studies Studies

This gives me an idea: I'll invent a new field called Critical Critical Code Studies Studies that will be devoted to constructing interpretations that make cryptic CCS pronouncements make sense. Now I just need to decide whether the acronym should be "CCCSS" or "C3S2".

a reply

Travis,

You seem to have genuine interest and insights into this emerging field. I invite you to come to our conference and begin to participate in our discussions. Your insights and vision can help shape critical code studies or code studies more generally to be something that you find to be useful.

Thank you so much for bringing my article to the attention of this community. I think it is time to open this discussion between humanities scholars and computer scientists more fully. To see a humanities scholar's essays circulating in this community is a good sign. At the same time, it is important that we not re-engage in the Science Wars.

I realize that I have brought some of the battle back with my tone and my elliptical writing style. I have written mostly to new media scholars who are well acquainted with critical theory (and use it regularly) and who also value interpretation itself in the mode of cultural studies. I presented the paper at DAC (Digital Arts and Culture), which tends to assemble a like-minded community. As a result, I took for granted a common set of assumptions regarding interpretation and criticism itself.

Unfortunately, my tone seems to bring out a less than civil attitude in my respondents, which, while understandable, is an obstacle to this conversation.

While I do not currently have time to answer every one of your critiques, I did want to offer a general response to some of the larger points, as I sense in your writing a sincere interest in the potential for critical code studies.

Imperative Programming.
So far I have published most of the CCS work, which amounts to a few paper presentations, two articles, and a handful of blog posts. It is too early to describe what CCS is, and you should not base your sense of the limits of the field on my preliminary work. In fact, I hope you and those gathered here will help decide what they want to make CCS into.

Over the course of 6 weeks this spring, we held an online working group in which over 100 scholars began their initial work on critical code studies. They came from countries in the Americas and Europe and from a wide range of disciplines. Many were "new media" scholars who also saw themselves as programmers. There were a few, certainly too few, computer science scholars. I would certainly like to see more.

While many of the discussions focused on imperative programming, Haskell and Scheme became more central topics in week 5 when Steve Ramsay presented his reading of a Live Coding performance in Impromptu.

Again, please don't limit CCS to what has been published so far. This is certainly just the beginning and even your discussion here is helping to shape it.

Sokal.
I understand why you raise Sokal in this instance. But I don't see his prank as dispositive. The lesson of Sokal for me is that humanities scholars and computer scientists need to work together more to ensure that the interpretive work of humanities scholars has a strong basis in actual science, that interpretation is based in the facts of the science.

Maybe there is also a lesson about the use of jargon. I don't know what to say there. I teach writing, and I know we need to use different jargon for different audiences. My audience, scholars in new media, generally understand the terminology of critical theory. They have read the texts I reference. I can speak to them in the domain-specific language they understand without elaborate explanations. That can make the writing difficult for outsiders. That is one bridge I would like to build.

You spend a lot of energy discussing (attacking) one post I wrote in about 15 minutes while reflecting on a lecture by a computer science friend of mine. I was meditating on this notion of efficiency, of the norms and values of the programming community. In the process, I asked my question about inefficiency. I find it instructive that in this conversation thread, this community finds that question both inane and insightful. (Someone makes a remark about a monkey getting it right -- referencing one of my favorite analogies, btw. Borges has a nice take on this.) I raised that question using ordinary humanities style critical thinking as part of my meditation. When reflecting on people's claims, it is conventional to ask about the opposite claim. Apparently it was a good question. You claim it is a sign of profound ignorance. I think the way I asked the question drew your irritation.

I am a humanities scholar, a digital humanities scholar, if you will, not a computer scientist. I have taken courses in programming and do some programming for my creative works. But I do work very closely with computer scientists, in consultation with them, to understand their paradigms and programs. My goal is to learn everything I can about the objects I am interpreting.

My area of expertise, on the other hand, is interpretation. (That does not mean you will like all of my interpretations.) I pursue the meaning in cultural objects by examining them in their context. I believe that the pursuit of meaning is what makes us human and also admit that we produce meaning when we interpret. I also have a counter-cultural disposition. That is the "critical" side. I am trying to name patterns I see in the interplay of institutions, technologies, societies. So my readings tend to favor the political margins, which is no doubt a turn off to more conservative readers.

Someone else on this list asserts that Derrida and post-structuralism are dead. One of those is sadly true. Post-structuralism, however, has given rise to much of the methodology and schools of critical interpretation that are very much alive today. Its basis tends to be a suspicion of authenticity and authority. You might see such suspicion even in this current thread, even in Sokal's alleged field-killing stunt. Deconstruction is a kind of unforgiving close reading that goes after the basic tenets of the objects it is critiquing (CCCSS, for example).

Believe me, I know works of critical theory can be hard to read and hard to grasp. Stuart Hall who founded Cultural Studies was himself apparently suspicious of convoluted theoretical language. But if your critique of my work is that it is insufficiently informed in the domain of computer science, I would hope that you would respect the theoretical concepts and constructs of my area, or rather in our area. Respect is certainly missing from perhaps all sides of this conversation. (And I will try to write more clearly and in more conversational diction now that I see my audience is much broader than I anticipated.)

Lately, I have been thinking that the larger field of studies is perhaps "code studies," one that can more easily bridge our two disciplines. The work that involves critical theory (of the literary and cultural tradition) could be called critical code studies.

Bridging disciplines

Thank you for this response, Mark.

My training has also been in the humanities, and I'm only at the beginning of my studies in PLT and functional programming, but I'm interested in both disciplines, and want to see bridges being built between them.

There's a long tradition in the humanities of "cross-disciplinary" projects that aren't much more than academic land grabs, by scholars who aren't particularly interested in dialog with or feedback from the sciences whose terminology / etc. they're appropriating. I think there are signs of this tendency in some of the materials I linked. I posted the links because I thought there might be interest here in this new body of work, but I wanted to make my reservations clear up front.

I'd love to see a CCS that doesn't make these mistakes, and my apologies if I've been too hasty in my criticism. I'll contact you personally off-forum to follow up.

When theorists describe a

When theorists describe a new abstract approach to a given problem domain, they usually provide, or are otherwise asked to provide, an example where their new approach can be seen to work well and deliver new insights for that problem domain.

What concrete examples of problem domain and interpretation would you like to hold out as successful applications of CCS that have yielded new insights?

On clarity

[...] Believe me, I know works of critical theory can be hard to read and hard
to grasp. Stuart Hall who founded Cultural Studies was himself [...]

Believe me, I know LtU readers are used to grasping theories that are difficult to read. I wouldn't
blame LtU readers for not being able to "grasp" whatever those works have to say.

[...] apparently suspicious of convoluted theoretical language. But if your
critique of my work is that it is insufficiently informed in the domain of
computer science, I would hope that you would respect the theoretical concepts and
constructs of my area, or rather in our area. Respect is certainly missing from
[...]

With all respect to the theoretical concepts and constructs of your area: yes,
you are insufficiently informed in the domain of Computer Science, as some other
readers have already pointed out.

To begin with: "code" can be a byproduct of a software design. "Code" can be
automagically generated from a formal model, for instance, so there is no
"socio-historical context" to study there.

"Code" can also be created by hundreds of different people, from all around
the world, from tens of different cultures, over a very long period of
time.

And it can then be maintained (manipulated, changed, rewritten) by hundreds of
different people, from all around the world, from tens of different cultures,
over an even longer period of time.

You may or may not be able to track the people, their culture, their location,
or their "socio-historical" context when looking at a single piece of code.
That single piece of code may have been written by tens of people, over
tens of years.

So I'd say that yes, any theory that sets out to "study code" and derive
results from those studies (whether "socio-historical" or otherwise) is
plainly wrong, simply because you won't have all the variables and boundary
conditions needed to study the problem.

[...] perhaps all sides of this conversation. (And I will try to write more clearly
and in more conversational diction now that I see my audience is much broader
than I anticipated.)[...]

Clarity would indeed help those of us in "the broader audience". After all, and
as you may know, clarity is one of the ways to measure the quality of code.

Clarity is also one of the qualities of good English writing, as avoiding fancy words and being clear are two of the rules in "The Elements of Style".

[...] Lately, I have been thinking that the larger field of studies is perhaps
"code studies," one that can more easily bridge or too disciplines. The work
that involves critical theory (of the literary and cultural tradition) could be
called critical code studies. [...]

Instead of changing the name, I'd suggest changing the target: maybe you
want to study the content of the mailing lists (and other communication
channels) people use to build software.

The way people organize to build software (both inside companies and in open
source projects) is probably a great area of interest. The success of software
projects is usually due to their internal human organization, the way members
communicate, and the way the project community is governed. And that's an area
that has not been studied in depth, as far as I know. All I can remember now
is this article by Eric S. Raymond.

For an interesting working case you may consider studying the ongoing process of
standardization of the Scheme programming language, where political and organizational issues seem to have entered the game.

So I'd say that yes,

So I'd say that yes, any theory that sets out to "study code" and derive
results from those studies (whether "socio-historical" or otherwise) is
plainly wrong, simply because you won't have all the variables and boundary
conditions needed to study the problem.

Historical linguistics has a rich understanding of how natural languages change based on analyzing usage. I would assume similar (but obviously different) techniques are applicable to PLs, though I'm still thinking about what the benefits would be of that particular angle.

I would caution those remarking on this thread to reread their messages in regards to civility -- a lot can be unintentionally misinterpreted. This is an interesting discussion, and, while out of the mainstream, I think data-driven / HCI (psychology) / sociological approaches to looking at end-to-end systems will only become more important. When analyzing adoption barriers of many PL domains like FP, type theory, logic, or verification, and designing GPLs, hopefully it's apparent that there's something missing. I don't see how alienating someone interested in helping is a net win. So, for an example of dealing with the community gap, constructive criticism helps (as in the above post, though I suspect Elements of Style is well-known to the literary community).

on the gravity of social construction

I would caution those remarking on this thread to reread their messages in regards to civility -- a lot can be unintentionally misinterpreted. This is an interesting discussion [...]

Agreed, and agreed, though it is very possible to err on the side of excess seriousness: a code reading could hardly be complete without an appreciation of the mischief with which many hackers hack. If code is play, then so is commentary.

On clarity, again

Historical linguistics has a rich understanding of how natural languages change based on analyzing usage. I would assume similar (but obviously different) techniques are applicable to PLs, though I'm still thinking about what the benefits would be of that particular angle.

The forces driving PL evolution are possibly completely different from those driving natural language evolution. Programming languages usually evolve to improve clarity, conciseness, accuracy, or other qualities of service (such as security, portability, or backwards compatibility). I don't think historical linguistics has anything to say about security or backwards compatibility, for instance, so I'm not sure the techniques of historical linguistics are of any use here.

[...]So, for an example of dealing with the community gap, constructive criticism helps (as in the above post, though I suspect Elements of Style is well-known to the literary community).

I mentioned Elements of Style because those of us in the scientific community appreciate clarity and accuracy in whatever language we use (programming, mathematical, or natural). The shorter and more accurate, the better.

So, for a start, I think we'd appreciate a short and concise definition of "critical code studies". Why "critical"? What is to be understood by "code"? What aspects of "code" are to be "studied" and how?

Those are questions whose (short and concise) answers would dramatically reduce the community gap. I haven't found the answers in the pointers above, though.

Definitions

Here's my take:

"Critical" here just means "coming out of the tradition of critical theory": i.e., based on some idea of dialectical critique.

For "code", the OED definition seems as good as any:

Any system of symbols and rules for expressing information or instructions in a form usable by a computer or other machine for processing or transmitting information.

Mark goes into more detail about what kinds of "code" are meant in this essay linked in the original post.

"What aspects" and "how" are obviously more complicated, in part because the field is so young. I tried to select links that give some indication of possible directions. I'd recommend in particular looking at Mark's "codology" essay—I probably should have highlighted it more explicitly in my original post.

But those definitions are

But those definitions are too vague for me to understand. The definition of code you suggest fits the UTF-8 specification, the firmware of a washing machine, and the bits of the AVI video format alike, for instance.

You can study the internals of the AVI video format in full detail, understand its source code representation in Haskell (or Scheme, or C, understanding why "Hello world" is in there), and learn about the benefits of portability and bandwidth consumption, if any. All that study won't give you a single clue as to why this video format is not under consideration for HTML 5.

What I mean is that the "sociological" aspects of "code" are not in the code itself, they're elsewhere. Studying them is no different from studying the sociological aspects of any other field.

I understand that "what aspects" and "how" are more complex to define. What about defining "what objectives" CCS is trying to achieve? Is that easier?

Not sure

What I mean is that the "sociological" aspects of "code" are not in the code itself, they're elsewhere.

Are you sure about this? In every case? I'm not sure at all.

This is hilarious.

This is hilarious. Seriously, you should look into writing an article about the emotional aspects of Laplace's equation. You could open up a whole new field "Critical Equation Studies". Perhaps that would help those engineers keep spacecraft in the air.

proofs

are something of a social construct, not a completely infallible and solely logical thing.

Yes, and the reasons why

Yes, and the reasons why spacecraft crash are also mostly social. I'm not saying that studying this is worthless. On the contrary, programming languages should be designed from the start from the human point of view instead of, as often happens now, from the theoretical point of view.

BUT I don't think that taking your existing lit-crit skills and applying them to code results in anything. For example take this on quicksort:

One of my early experiences with CCS may serve as a cautionary tale. When I proposed an earlier version of CCS to a group of programmers, one of them asked if I could prove CCS by applying these methodologies to a program with low natural language content, even in terms of input and output. The reviewer suggested Quicksort, an algorithm used for quickly ordering groups of numbers through a "divide and conquer" approach. In my early attempts at what Wardrip-Fruin would equate to interpreting a stop sign, I suggested Quicksort as a metaphor for social organization in communities, drawing out an analogy for the way a neighborhood street or even highway may serve to divide and conquer a demographic. However, though my analysis said something about neighborhood hierarchies, it offered little insight on Quicksort itself, nor did it draw from Quicksort a lesson about the society from which it came. My main error was analyzing Quicksort aside from its historical, material, social context. For an algorithm such as Quicksort, the code meets the social realm at the site of its implementation and in the context of the application in which it is used. I was not engaging the cultural origins of Quicksort within the history of computation or even a particular incarnation of Quicksort in a language (I was using pseudocode). Without discussing the human context of the Quicksort code in terms of authorship, use, development, circulation, or operations, I was left with little more than a process

I doubt that you'd be able to learn anything interesting about quicksort by approaching it this way. Sure, the history of quicksort is interesting. But trying to understand an implementation of quicksort by exploring its historical context, as you would with a piece of natural language, is not useful. You have the code and the math and the reasoning behind the code, and you have the history. Both are interesting, but they should be treated separately.

Actually, I would be surprised if you could find a programmer who wouldn't find that paragraph ridiculous, much as humanities people would probably find it ridiculous to apply algorithm-analysis methods to natural language (like applying Hoare logic to a poem, or asymptotic complexity analysis to a book).
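For concreteness, here is what "a particular incarnation of Quicksort in a language" (rather than pseudocode) can look like -- a minimal Python sketch of my own, not any historically situated implementation:

```python
def quicksort(xs):
    """Divide and conquer: split around a pivot, recurse on each side."""
    if len(xs) <= 1:
        return xs                                 # base case: already ordered
    pivot, rest = xs[0], xs[1:]
    smaller = [x for x in rest if x < pivot]      # elements below the pivot
    larger = [x for x in rest if x >= pivot]      # elements at or above it
    return quicksort(smaller) + [pivot] + quicksort(larger)
```

Beyond the identifier names, there is essentially no natural-language content here, which is presumably what made it such a pointed test case for CCS.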

subjectivity

i'm overall not sure what you are saying since you say the study is not worthless but then you seem to say it is.

presumably it depends on what is defined as the point of it all. you say applying lit-crit can't result in anything. yet it apparently doesn't result in nothing, since somebody wrote that, and other people read it, and now some people are even discussing it right here. what it results in is something. then each individual will have a response as to how valuable that result is. and even /what/ the result "is".

some folks have applied algorithmic type study to lit; the various codified meters used in poetry are an old, small, but real example of that. (i'll ignore e.g. biblical numeric study as silly.)
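for instance, even a crude scansion heuristic -- counting runs of vowels as a stand-in for syllables -- is about the smallest possible example of algorithmic study of verse. a toy sketch; real scansion is far subtler:

```python
import re

def rough_syllables(line):
    # crude heuristic: each maximal run of vowel letters counts as one
    # syllable.  wrong for plenty of english words, but enough to
    # illustrate "algorithmic type study" applied to a line of verse.
    return len(re.findall(r"[aeiouy]+", line.lower()))
```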

aside: natural language and PLT

some folks have applied algorithmic type study to lit; the various codified meters used in poetry are an old, small, but real example of that.

Montague grammar can describe the semantics of natural language using the formalisms which are commonly used in PLT. This approach is severely limited due to the inherent "fuzziness" and context dependence of natural language, and it clearly cannot address extra-linguistic "meaning" and interpretation at all. Nonetheless, it's viable enough to make formal domains such as programming, proof and specification a primary target for natural language generation. Here is a natural language framework for formal proof and the kind of text it can generate. There is much scope for improvement, and obviously this is not suitable as the primary or exclusive representation of a program, but having a plugin which could accurately translate the semantics of complex expressions into natural language might be helpful as part of a broader structure-oriented approach.

I agree with some of this,

I agree with some of this, e.g.,

For an algorithm such as Quicksort, the code meets the social realm at the site of its implementation and in the context of the application in which it is used. I was not engaging the cultural origins of Quicksort within the history of computation ... Without discussing the human context of the Quicksort code in terms of authorship, use, development, circulation... I was left with little more than a process

But other parts still make no sense to me -- lack of research or practitioner-level knowledge (as opposed to artist/dabbler) throws in some red herrings:

or even a particular incarnation of Quicksort in a language (I was using pseudocode)... or operations

where the latter might only matter in a teaching context (e.g., comparing languages / programming models), and thus I doubt it is what was meant.

I had an interesting discussion last night where semiotics came up with a somewhat analogous example: indentation. When reading code, badly indented code raises alarm bells. Either the programmer is not aware of coding conventions and thus is somewhat of a dangerously unpredictable party or is aware and the code is fresh and likely buggy (where the debugging process would involve cleaning up the code syntactically to ease legibility). Likewise, when writing code, I may be 'sloppy' syntactically (naming, spacing, etc.) to reflect that it is fresh and untrusted. Performing a checkin or a release entails cleanup. A technical solution is automatic formatting, but that misses the whole point.
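The indentation point can be made concrete: the two layouts below are, to the machine, the same program, which is exactly why automatic formatting "misses the whole point" -- whatever the layout communicates, it communicates only to people. A minimal Python sketch (hypothetical names):

```python
import ast

# Two layouts of the same expression: one "fresh and untrusted",
# one cleaned up for checkin.  The parser sees no difference at all.
messy = "r=f( a,b )+g(  c )"
clean = "r = f(a, b) + g(c)"

# Comparing the abstract syntax trees shows the machine's view:
same_program = ast.dump(ast.parse(messy)) == ast.dump(ast.parse(clean))
```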

In the case of quicksort, what is significant is 1) whether a hand-written sort is used and 2) whether it is a canonical one. The analogy might be with Virginia Woolf: we expect Hemingway in his seeming simplicity, so such surprising code is either the work of an idiot or a master (e.g., tuning).

However, again, I found the cited work to be illegible and lacking in rigor (irrespective of using appropriate terminology). Ideas are easy; distilling reliable (experimentally verified) and pertinent knowledge seems like the most reasonable way to engage the science (and applied science / engineering) community. Unfortunately, it's a pain, especially when unnecessary for the goals of your home community, but there it is.

Are you being sarcastic?

Spaceships do in fact crash for mostly social reasons! The Challenger disaster happened because upper-level managers did not want to hear "bad news" from the engineers. I heard some guru (don't remember his name) say that he never saw a software project fail for technical reasons. I agree with this; usually failure is social/economical. Of course, I'm not justifying CCS, but it has sparked an entertaining thread! Unfortunately, Mark's getting insulted kind of weakens his case. Even the most shallow cultural examination of this forum (or almost any CS forum) should have told him what kind of reaction to expect...

Actually when I wrote that I

Actually when I wrote that first comment I really thought it was a joke. But with the sheer number of pages dedicated to the subject, it's too elaborate to be a joke.

I'm not being sarcastic. Spaceships and software projects do fail for social reasons. However, you can't fix this by studying the equations or the code as you would study literature. You have to study the people and the incentives and the processes. OR if you want to solve a specific problem then you study the code in conventional CS ways.

They do seem to think that they can help programmers. I couldn't find how they think they can help programmers, or how they are going to evaluate their effectiveness, though. Perhaps somebody can step in and maybe even give an example of how it has helped.

Just a random thought: sometimes technical advances can solve social problems. For example if you are building a product with 100 people you are going to need an elaborate management structure. If you could improve individual programmer productivity 3 times you could perhaps reduce that to 10 people, because team productivity doesn't scale linearly with the number of people. If you are building a small product for a client on your own and you could improve productivity 3 times you could perhaps sit down with a client and build a rough version live. This would reduce communication problems tremendously, because he can immediately see whether what you are building is what he had in mind. I think programming tools are getting close to good enough to be able to do this in specific areas (e.g. web programming with Hobo).

I don't think "joke" is the right word here

Actually when I wrote that first comment I really thought it was a joke. But with the sheer number of pages dedicated to the subject, it's too elaborate to be a joke.

I don't think "joke" is the right word here. Despite its well-appreciated attention to clarity and constructive discussion, the author's reply sounded distinctly trollish; I don't see this as a negative thing, since such rhetorical devices can be useful to stimulate respectful and productive discussion, especially about hidden, non-trivial issues such as the cultural and socio-political aspects of "code" and PLs.

Moreover, critical studies seems to be focused on speculative "deconstruction" meant to expose hidden patterns of power, authority, institutions etc., so it seems that some amount of institutionalized trollishness would come with the territory. However much effort may have been placed into this subject, it's not clear that the results should be taken at face value.

The Emotional Aspects of Laplace Equations

It seems obvious to me that the "meaning" of the equations has no emotional aspect, because the meaning was specifically designed to exclude such content. However, the specific symbols conventionally chosen to represent the relationships pointed at (the meaning) may have emotional/cultural/mythic resonances that influence the direction, style, and creativity of future research and discourse. There could even be cultural significance in that the symbol sets and language chosen to communicate the relationships are more accessible to people from one culture (or gender) than another. Damn, I'm starting to sound like a deconstructionist! Look at programming paradigms for children: maybe there are programming/math symbol/language sets that would go over better in the Congo than others? I'm not sure Mark understands the distinction between meaning (or execution) and symbolic systems used to express that meaning, but I'm not sure he doesn't either.

I'm not sure Mark

I'm not sure Mark understands the distinction between meaning (or execution) and symbolic systems used to express that meaning, but I'm not sure he doesn't either.

He does. There is an essay on his website that devotes considerable space to discussing the relative merits of analyses that have something to do with the functionality of some given code and analyses that don't relate to that functionality and might apply more or less as well to stuff that only looks like working code.

Inside joke

Why do I have the feeling "The Emotional Aspects of Laplace Equations" will become an LtU inside joke for years to come?

Perhaps Wadler will write a paper with this as the title!

this is hilarious

Seriously, Jules. Seriously.

so, yeah, seriously

I think that your "elliptical" writing style is an obstacle. It's not my place to give you advice. But my advice is: talk normal.

I was the one who brought up Sokal, though not by name. Others caught my drift. I've long thought that the most interesting aspect of the paper is the over-reaching, nasty responses and rumors it provoked. He sure as heck did expose some intellectual dishonesty there - and I don't mean anything about Social Text.

Here in programming language theory land, we often talk about the meaning of programs. Often enough, we mean "meaning" in fairly precise, technical ways. (You have an interest in Scheme. See the appendices of R5RS and R6RS for two distinct approaches to ascribing meaning to a class of programs.)

And, here in programming language theory land, we also use other ordinary language words in strange ways. We talk about correctness, expressiveness, abstraction, objects, composition, and so forth.

Now, if those are all just "terms of art" that's one thing. For example, we could take the literature, look for all occurrences of a phrase like "denotational semantics" (which pertains to meaning), replace those with the word "snickerdoodle" - and the math would still check out. Fine.
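To make the "snickerdoodle" point concrete: a denotational semantics is just a function from syntax to mathematical meaning, and renaming that function changes nothing. A toy Python sketch of my own -- nothing like the R5RS appendix, only the shape of the idea:

```python
# A toy denotational semantics for a tiny expression language.
# Literals denote themselves; a compound (op, e1, e2) denotes the
# result of applying the operator's meaning to its parts' meanings.

def meaning(expr):
    if isinstance(expr, (int, float)):
        return expr                      # literals denote themselves
    op, left, right = expr               # compound expression
    ops = {'+': lambda a, b: a + b,
           '*': lambda a, b: a * b}
    return ops[op](meaning(left), meaning(right))

# The renaming: the math is indifferent to what we call the function.
snickerdoodle = meaning
```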

Yet, in the discourse around programs and programming languages, we manage to not confine our use of those terms to their strict technical sense. Two anecdotes to illustrate:

(1) I've fairly recently been "shouted at" (in the email sense of shouting) for proposing programming language features that would rob programs of "meaning". I was accused of suggesting a language design for "meaningless" programs. I had done no such thing, of course. Reading between the lines I figured out that I had proposed features that were not well modeled by my critic's favorite mathematical model of the meaning of programs. His theory of "meaning" of programs was incomplete and he dubbed programs unaccounted for by this theory "meaningless". Only, he did not come and say "there is no meaning assigned to such and such a program in my favorite mathematical model of meaning" -- in the rowdy debate and in the reception he got, he conflated his technical word "meaning" with the common language meaning (and got away with it).

(2) For an interesting source of "talking about programming and programs" you might want to examine the past few months of archives from the "Scheme standardization Working Group 1 (WG1)" mailing list. There is much banality, of course, but there are actually some debates there that are quite heated. The more heated debates concern concepts like modularity, safety, and barriers of protection. For example, one issue is whether or not when two programmers contribute code to a single program, one of the programmers ought to be able to prevent the other programmer from certain actions. (Technically: should a library be able to create values of novel type that code outside of the library can not dissect? Or should any part of the program be able to dissect the values it encounters, even if it didn't create them?)
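The technical question in (2) can be sketched in a few lines. Below, a hypothetical "library" mints values that outside code can reach only through the accessors the library chooses to export -- the closure plays the role of the novel, non-dissectible type. (Python's introspection can of course defeat this, which is itself part of the design debate.)

```python
def make_point_library():
    # The "library": values of a novel type, represented as closures
    # so their internals are not reachable from outside.
    def make_point(x, y):
        def point(accessor):
            if accessor == 'x':
                return x
            if accessor == 'y':
                return y
            raise ValueError("points cannot be dissected further")
        return point
    # The only exported ways to look inside a point:
    def point_x(p): return p('x')
    def point_y(p): return p('y')
    return make_point, point_x, point_y

make_point, point_x, point_y = make_point_library()
```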

What I think these kinds of debate have in common, and where I think CCS can possibly help, is in the area that joins programming language design and programming practices to human relations and power structures. Both anecdotes are examples of contention over authority. Who can and who can not write code that dissects a particular data structure in a particular program? Who is and who is not the Humpty Dumpty who gets to define "meaning"? And, though my anecdotes don't give a clear example, how do our discourses within the field relate to the significance of programs for the larger society of people who use computers?

The field is lousy with such debates and, for the most part, is unconscious to the point of zombie-hood in relating those debates to the socio-economic positioning of programmers and computer users.

Foucault, if he could be resurrected and dragged out of the library and away from his mostly historic studies - would have a field day.

My (again, presumed slightly rude, entirely inappropriate, not my place) advice: go look around over there.

And talk normal...

Power structures and authority in PLT

I suspect (but can't prove) that the discourses about "software engineering", "enterprise architectures", "programming in the large" and the like are tightly linked to unexplored and hard-to-characterize issues of power and authority. PLT researchers and the hacker community[1] both seem to be thoroughly skeptical about the merit of these technical discourses, although the stated rationales obviously differ, and both communities seem unable to describe their hidden assumptions and desiderata in any rigorous fashion. This is one area where CCS might be able to help.

[1] The hacker community adopts a distinctive approach to software development, authority in technical disputes and the authorship of source code. Eric S. Raymond's work has given a comprehensive description of their ethics and principles, and his The Art of Unix Programming surveys the 'hackish' approach to issues which might otherwise fall under "programming in the large".

aside: Raymond v. Hackers

Eric S. Raymond's work has given a comprehensive description of their ethics and principles

I rather strongly disagree. Too off-topic to debate here, perhaps. But I at least wanted to share that there is strong dissent regarding the validity of ESR's narrative.

YES

Ignoring any controversy around your note 1, this is spot on:

I suspect (but can't prove) that the discourses about "software engineering", "enterprise architectures", "programming in the large" and the like are tightly linked to unexplored and hard-to-characterize issues of power and authority. PLT researchers and the hacker community[1] both seem to be thoroughly skeptical about the merit of these technical discourses, although the stated rationales obviously differ, and both communities seem unable to describe their hidden assumptions and desiderata in any rigorous fashion. This is one area where CCS might be able to help.

There are many strands of political, oppressive/repressive/transgressive, insider/outsider, cultural and counter-cultural conversation woven through and around source code. Of course this should come as no surprise. What is it about enterprise Java that marks it so clearly as "corporate" discourse? How do counter-cultural practitioners and self-appointed elites use markers in their code to identify and distinguish themselves, find one another in the night? Is writing shit spaghetti-code web sites in PHP a self-conscious expression of defiance rather than "mere incompetence"? (Absolutely yes, in some cases.) Is it outsider art? (Also yes, in some cases.) Is there a "high culture" and a "low culture" of programming? (Have you ever talked to a "real programmer" about the hodge-podge of perl scripts and hacked-up C programs that so many of their machine learning colleagues are so fond of?)

I'm strongly in favor of opening an area of study which attempts to address these issues and bring them into the light. We could certainly call it CCS, although this may not be what is currently meant by that term. I'm strongly in favor of this despite (or perhaps because of) the fact that so many programmers will undoubtedly have a strong negative reaction to any discussion of these issues, and even deny the existence of any issues to discuss.

Of course programs are both mathematical artifacts and social creations situated in a particular context. By asking about PHP, for example, I'm not arguing that it doesn't suck, perhaps even in objective, quantitative ways. Shantytowns suck, too, in a lot of very objective ways, and so do lots of romance novels, but nonetheless they are worth discussing and have much to teach us.

However, I doubt there are many people in the world able to do this work well.

I'm strongly in favor of

I'm strongly in favor of opening an area of study which attempts to address these issues and bring them into the light.

Do you think there are any general insights which can be distilled about Enterprise Java which poor Rubyists have overlooked in their secession wars?

The problem with CCS in the form it is presented here is not so much that there is a sociology and a culture of programming which has been ignored; rather, it is already a matter of self-reflection, though not a systematic one, and I wonder whether there is an urgent need for more.

One thing has to be understood by any serious researcher in the field: there is a desire and drive to encode political issues back into technical ones, and there is no social dispute which can be isolated from that of its possible technological fixes. Or fixes of the kind of ergonomic UI design which leads to the most recent debates about Apple. So not only does technology have a political dimension, but it is also the other way round in a quite manifest way. I hoped that became clear during all the interpretative efforts to understand what "Web 2.0" means despite being a self-referential slogan which was meant to initialize and focus this process of interpretation (one of the best answers I heard was "more bandwidth", which nailed it in a good old materialist fashion).

This changes everything, including what criticism means. I'd recommend reading again the LtU discussion threads regarding the SEC proposal to use Python. It is a bit of a microscopic drama which illustrates a lot. Dürrenmatt would have enjoyed it.

"Code" Studies?

Take this Usenet exchange: link.

There's enough raw sociological material in there to choke an entire humanities seminar: the culture of Usenet and electronic communities in general; a community's boundary between "tough love" and "just plain mean"; the "meet you half way" theory of education; the "show love for something by parodying it" concept of obfuscated code; the relationship between code obfuscation and other forms of post-modern art, including "found art"; showing off for the sake of peer recognition; newbies vs. old guard; "insider" jokes as a way to enforce cultural boundaries; and probably plenty more.

The question is this: can't most of that stuff be found from a more sociological and anthropological point of view? Does literary criticism for code really help? The email linked is about 60% code by line count, but with the right background the sociological stuff can be inferred with only a glance at the code. The code itself, taken entirely out of context, is just bad ascii art.
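The split is easy to demonstrate: the two definitions below compute the same thing, and everything sociologically interesting about the second one lives entirely in its surface form. (A toy pair of my own, nothing like the linked post's artistry.)

```python
def sum_of_squares(numbers):
    """Plain, conventional style: says what it does."""
    return sum(n * n for n in numbers)

# "Show love for something by parodying it": the same function with
# the surface deliberately mangled.  The difference is social
# signalling, not computation.
s=lambda l:l and l[0]**2+s(l[1:])or 0
```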

More sociological point of view

You can find this "more sociological and anthropological point of view" in studies like Two Bits and Adrian Mackenzie's Cutting Code (and of course ESR's stuff, &c.). As I understand it, CCS is trying to do something slightly different.

My impression is that most people here agree that there's a clear value in the Two Bits approach, even if specific examples of it get some details wrong. I'm personally more curious about perspectives on the value of CCS, which seems to be an open question.

You can find this "more

You can find this "more sociological and anthropological point of view" in studies like Two Bits and Adrian Mackenzie's Cutting Code (and of course ESR's stuff, &c.). As I understand it, CCS is trying to do something slightly different.

Seems that way to me too. But perhaps somewhere in the anti-matter universe, someone is posting to a blog right now about how his software for detecting race conditions in multi-threaded applications might have a lot of applicability to gaining a deeper understanding of Shakespeare, and a bunch of other people are responding "Right on! Using that technology stuff for creating databases of Shakespeare's complete works that can be indexed by prosody at the sentence level might be really useful!"

Bad ascii art

Indeed, you'd think he'd at least have the decency to line up all the diamonds with the center axis of the Christmas tree...

Perhaps you could give a

Perhaps you could give a compelling example of a critical code study?

evolution of a haskell programmer

All in all an interesting read. The paper might raise more questions than it answers, but perhaps that is a good thing, both in general and in this case.

It just occurred to me that the famous "Evolution of a Haskell Programmer" can be considered a critical study of programming practice in Haskell, written in the language itself. As such it seems to qualify as a critical code study, one made accessible through humour.
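For readers who haven't seen it: the piece shows one function (factorial) rewritten at successive "evolutionary stages" of style. The conceit transposed, very roughly, to Python (the original's Haskell stages are far more elaborate and far funnier):

```python
from functools import reduce
from operator import mul

def factorial_freshman(n):
    # stage 1: explicit recursion, straight from the definition
    if n == 0:
        return 1
    return n * factorial_freshman(n - 1)

def factorial_programmer(n):
    # stage 2: iteration with an accumulator
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

def factorial_functional(n):
    # stage 3: a fold, edging toward point-free style
    return reduce(mul, range(1, n + 1), 1)
```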

Rhetoric, representation, and speech acts.

Some scattered thoughts.

I'm suspicious of the more out-there elements of critical theory, and for example found the linked pages from "Traumas of Code" either quite confused, too dense to evaluate, or both.

On the other hand, Marino's linked essay seemed pretty straightforward and reasonable to me. The quotation about efficiency seemed, in context, not bad, but certainly a bit naive (although it professed as much) -- written in response to an intro CS course, it seems more about how the course presented efficiency than how CS actually looks at it. It does raise valid issues -- for example, why the focus in so much CS is solely on algorithmic complexity (I've repeatedly seen folks speak as though this were the full extent of CS "theory"). Similarly, it raises the question of whether intro courses and others often overemphasize efficiency over other virtues of good code.

But more to the point, whether or not CCS turns out to be the way to tackle it, I think there are plenty of important questions to be asked from the standpoint of rhetoric, sociology, genre theory, and even some more down-to-earth critical theory.

After all, if we are to take seriously one of our favorite aphorisms ("Programs must be written for people to read, and only incidentally for machines to execute.") then lots of questions which are both literary and social arise very quickly -- within the syntax of a language, how do certain conventions arise and spread, when and why do languages change, what is passed down by convention and what arises from what sorts of necessity? What do certain codebases say about the networks that they are situated in? (Do three teams working on a compiler necessarily produce a three-pass compiler?)

Which is also to say that if CCS is just about the code itself, it will be fairly shallow.

CCS could probably look profitably at the linguistic turn in anglo-philosophy -- a program is something of a canonical "speech act", after all.

This goes both ways of course -- the historical link between PL formalisms and not only math but philosophy is quite strong. PL folks also return to the well every so often (even if in somewhat orthogonal ways -- productive misreadings, even :-)) when trying to think about new problems (cf. Rich Hickey's quotes from Whitehead while trying to characterize the problem of time and mutability in Clojure, or hacker culture's longstanding affinity for Zen).

I'd also like to note that the n-category cafe has a very nice give and take between working mathematicians and philosophers of science. One contributing editor, for example, is David Corfield who specializes in philosophy of math. Granted, folks like Corfield come, as I understand it, from a fairly rigorous anglo tradition, but still, I think LtU could profitably learn from that sort of interaction.