Rethinking Linguistic Relativity

We have discussed the Sapir-Whorf Hypothesis (also called the Linguistic Relativity Hypothesis) many times, and it is a topic worth revisiting if you are interested in how language influences thought. What we know about natural languages need not carry over to the way programming languages influence our programming style, but the possibility is still worth discussing.

For many years Sapir-Whorf was considered highly problematic, since empirical findings didn't seem to support the theory: no real cognitive differences attributable to language were found. More recently, linguistic relativity has become more respectable, and important work is now being done in this field.

I suggest reading Gumperz and Levinson's introduction to the book they edited, Rethinking Linguistic Relativity (1996, Cambridge University Press).

Also by Levinson is "Language and mind: Let's get the issues straight!", from the 2003 book Language in Mind: Advances in the Study of Language and Cognition (D. Gentner & S. Goldin-Meadow, eds.).

Life without numbers

In light of all the recent discussion (this one included) of the Sapir-Whorf Hypothesis, I thought I would post a link to an article that would be of interest to everyone:

The Piraha tribe was

The Piraha tribe was discussed here before.


Ah, sorry about that. I obviously missed it both here and on New Scientist.

No problem. It's better to

No problem. It's better to post something that was already mentioned than the other way around. I just wanted to alert you and others to the previous discussion.

origins of grammars

I found this comment, from this paper, interesting and somewhat relevant to this discussion:

"Chomsky's generative system of grammars, from which the ubiquitous context-free grammars (CFGs) and regular expressions (REs) arise, was originally designed as a formal tool for modelling and analyzing natural (human) languages. Due to their elegance and expressive power, computer scientists adopted generative grammars for describing machine-oriented languages as well. The ability of a CFG to express ambiguous syntax is an important and powerful tool for natural languages. Unfortunately, this power gets in the way when we use CFGs for machine-oriented languages that are intended to be precise and unambiguous."
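The ambiguity the quote refers to is easy to demonstrate concretely. Below is a minimal Python sketch (my own illustration, not from the paper) that brute-forces every parse of the string `a+a*a` under the ambiguous grammar E → E '+' E | E '*' E | 'a', showing that the string has two distinct parse trees: one grouping as (a+a)*a and one as a+(a*a).

```python
def parses(s):
    """Return all parse trees for s under the ambiguous grammar
    E -> E '+' E | E '*' E | 'a'.
    A tree is either the leaf 'a' or a tuple (op, left, right)."""
    if s == "a":
        return ["a"]
    trees = []
    # Try every operator position as the top-level split.
    for i, ch in enumerate(s):
        if ch in "+*":
            for left in parses(s[:i]):
                for right in parses(s[i + 1:]):
                    trees.append((ch, left, right))
    return trees

print(len(parses("a+a*a")))  # prints 2: the grammar is ambiguous
```

For a natural language, keeping both readings around is a feature; for a machine-oriented language, a compiler must pick exactly one, which is why precedence rules or an unambiguous grammar are needed.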

Language influences mind and programming language design!

...this would've probably been more relevant a few days ago, when I saw this post and registered...

For many years Sapir-Whorf

For many years Sapir-Whorf was considered very problematic, since empirical findings didn't seem to support the theory: no real cognitive differences that could be attributed to language were found.

Do note that this is not at all the way Levinson puts it in the second linked paper. Rather, he's claiming that the evaluation of Sapir-Whorf was through the prejudices of "naive nativist" thinking.

Ooops. I meant the third

Ooops. I meant the third link.

True, I presented a

True, I presented a different view. However, keep in mind that the two views aren't as incompatible as you might assume. The experiments that failed to show cognitive differences were designed to support a "nativist" view of cognition, which may be the reason why they failed.

Example: Many experiments dealt with color cognition (identifying and remembering colors), whereas current discussions spend a lot of time on the pragmatics of language use, e.g., deixis.

Languages are different!

I have often been impressed by the fact that Greek has many words and groups of words associated with concepts that don't occur in English. In order to understand many religious and philosophical issues it is necessary to adapt these Greek conceptual systems and awkwardly translate them into English. Surely there is a big difference between Greek and English.

Language vs. Vocabulary?

Although I think you have a point about the Greeks as thinkers, I can't help but think that English as a language is capable of borrowing words from the Greeks: logos, agape, etc. Many of our words stem from ancient Greek and Latin words, e.g. philosophy (philos = love and sophia = wisdom). Indeed, it is the ability to borrow words freely from other languages that gives English some of its innate expressiveness.

It is true that some of these borrowed words undergo subtle and not-so-subtle shifts in meaning, such that what it means to be a philosopher today is not the same thing as it was in Plato's time. But then, this is also true of the modern Greek language versus that of ancient Greece.

[Edit Note: I guess what I meant to say is that what you associate with the power of the Greek language probably has more to do with the collective experience (and wisdom) of Greek culture through time. Words are just collections of symbols, and their meaning can only be given in terms of other symbols. So, to know why the Greek language holds power is not to define its words in terms of other words, but rather to relate those words to Greek history.]

Language ambivalence

Certainly English borrows words, but this is a complex inter-language process made possible by multilingual scholars who take great pains to explain the subtle differences between terms. While this helps to explain a lot, it is still difficult to translate "logos" into English, and this influences the way English speakers think about the world without a scholarly gloss explaining the difference between "logos" and "word", for instance. I am surely no expert, but perhaps there is a point here. Most of us know more than one computer language, and probably know a little about other spoken languages. When we think about language we rarely think in terms of just one language. Computer languages tend to gather concepts from other languages, just as English gathers terms from Greek. Perhaps this explains the ambivalent attitude of most programmers toward computer languages.

But thinking that the language doesn't matter is a dangerous attitude. Computer languages have core differences that make the job easier or a lot harder. Choosing a computer language is a bit like choosing a coordinate system in physics. The wrong choice can make an easy problem very difficult.


I think it would be helpful for the current discussion to mention Quine's indeterminacy of translation thesis.

Also, it is useful not to treat vocabulary as a simple concept, with languages absorbing new words, complete with their exact meanings, from other languages. The issues this raises fall under semantic holism, another Quine contribution.

Notice that recent work on S-W doesn't concentrate on vocabulary, but rather on pragmatics.

The tri-fold conundrum

Well, my comments above can simply be taken as examples of the tri-fold conundrum you refer to. I am sure there are English terms and phrases that don't translate into ancient Greek! But isn't this the point of S-W? The language influences the way native speakers without multilingual scholarship think about the world. But they are thinking about the same world, so there must be some form of overlap or morphism between languages. Perhaps a little behavioral theory and coalgebra would help?

Some ramblings

Hopefully LtU can tolerate some uninformed ramblings...

People who argue against Sapir-Whorf often seem to say, "Of course it's possible to conceive of this concept even if your language doesn't have a word for it!" Which, in my understanding, isn't really the point of S-W.

I think the idea is just that the language(s) you speak influence the way you think, which seems to me very obvious (not that this makes it true...).

Biblical Hebrew had only two tenses, and they didn't necessarily correspond to past, present, future. English has more. Classical Greek verbs had:
- 7 tenses
- 3 voices
- 4 moods
I'm oversimplifying things here; for instance, Hebrew had an imperative and could express ideas similar to voices through verb stems, and English has ways to express those concepts as well. But the point is that every time a speaker of ancient Greek wanted to use a verb, he had to make distinctions that English and ancient Hebrew speakers rarely think, or thought, of.

A Java programmer is certainly (one hopes) capable of understanding the concept of higher-order functions, but he may not know about them yet, and is less likely to use them (or simulate them) than an ML programmer, who most certainly knows about higher-order functions and probably uses them extensively. You can't learn ML without also learning about higher-order functions; you can learn Java without ever doing so.
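To make the contrast concrete, here is a small Python sketch (my own illustration; the names are hypothetical): first a higher-order function used directly, in the style ML makes natural, and then the same behaviour "simulated" by wrapping the code in a single-method object, the encoding a pre-lambda Java programmer would reach for with an anonymous class.

```python
# ML-style: functions are values, so we can pass one in directly.
def twice(f, x):
    """Apply f two times to x."""
    return f(f(x))

print(twice(lambda n: n + 3, 10))  # prints 16

# Pre-lambda-Java style: no first-class functions, so the behaviour
# is wrapped in an object exposing a single method.
class AddThree:
    def apply(self, n):
        return n + 3

def twice_obj(f_obj, x):
    """Same as twice, but taking an object with an .apply method."""
    return f_obj.apply(f_obj.apply(x))

print(twice_obj(AddThree(), 10))  # prints 16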