In the Land of Invented Languages

Just finished reading In the Land of Invented Languages, by Arika Okrent. It's an accessible read covering many topics in language.

Natural languages may be less universal than music and less precise than programming languages, but they are far more versatile, and useful in our everyday lives, than either. Ambiguity, or fuzziness of meaning, is not a flaw of natural language but a feature that gives it flexibility and that, for whatever reason, suits our minds and the way we think. Likewise, the fact that languages depend on arbitrary convention or cultural habit is not a flaw but a feature that allows us to rein in the fuzziness by establishing agreed-upon meanings at different levels of precision. Language needs its "flaws" in order to do the enormous range of things we use it for.

Aside from this passage, the book barely mentions PLs at all. But since programming languages are, by definition, invented languages (and, no, Perl does not qualify as natural), I think there are many parallels to be drawn. Most language inventors don't do it for the money - creating PLs is not a path to untold wealth. And there are a thousand invented (and programming) languages, so the chance of success is rather slim (and mostly accidental). The book itself is an informal narrative that moves between personal experience, the people behind the languages, and a more critical analysis of the languages spotlighted. Although over 500 languages are listed in the appendix, only a dozen or so get in-depth coverage. The major periods covered:

  • Enlightenment: John Wilkins and his Philosophical Language are the main subject of this period. The 17th century saw the widespread adoption of mathematical conventions, and there was a belief that a language could be designed that removed ambiguity - words would convey meaning exactly as intended. That belief is still a central tenet in much PL design.
  • Idealism: Here we have Zamenhof and Esperanto trying to bring about peace, love and understanding by sharing a common language. A couple of world wars would tell us that such utopian visions were not quite achieved. But Esperanto has been the most successful invented language in terms of usage. Most of the languages of this period were designed to be easier to learn, and were mixtures of existing languages - rather than striking out in bold semantic/syntactic fashion. Of course, we have PLs that want to borrow features from many different sources and strive to be easy to learn. Then again, efforts to reduce the number of languages usually have the effect of just creating more languages.
  • Symbols: Charles Bliss and Blissymbolics, with the emphasis on non-oral language, in this section covering symbol languages and sign language. Visual PLs are what I thought of here.
  • Logic: Brown started Loglan as a roundabout thought experiment for Sapir-Whorf. But the question it really poses is: what if, instead of trying to get AI from programming languages, we used something like a programming language for speaking, writing and communicating in the large?
  • Esoteric: Klingon and other conlangs are discussed in this section, with the emphasis on language as art or puzzle. Esoteric PLs are similar in spirit.

Lots of fun tangential topics (Chinese writing, Hebrew, Tolkien, etc.), and the book covers some very colorful characters. Not sure PL designers are quite so eccentric, though I suspect that's only because we are still early in the game of PL evolution.


Are "natural languages" so natural?

Edo Nyland has a theory about occidental languages which would indicate that they are not so natural, but were actually designed, just like Esperanto.

Languages are fluid

Languages tend to develop in isolation for periods and then mix together in unpredictable fashion. I'm of the opinion that language can arise spontaneously, since our brains can create and manipulate language as the need arises. The hypothesis I would put opposite Sapir-Whorf is that people tend to influence language much more than language influences them (at least with language in a non-PL sense). (Okrent gives the example of deaf people negotiating a sign language when neither person can initially understand the other.) Not sure I could believe that there was one language at some point in time that suddenly got splintered (Tower of Babel).

Anyhow, as with all words, "invented" and "natural" do not necessarily convey exactly divisible concepts. The idea of invented (vs. natural) languages is touched on in an interview with Okrent:

All languages are in some sense invented in that people create them, but generally people do this as a group without being conscious of what they are doing. The invented languages I’m talking about are those where someone sat down and built a brand-new language from scratch.

Doesn't seem credible...

Well, as a person of faith, I too have trouble interpreting the pre-history stories in Genesis literally. I think that trying to look to scripture for answers to scientific questions or as a foolproof record of history is a big mistake. I don't think the original link is credible.

That's an interesting counterpoint to the Sapir-Whorf hypothesis; one that makes sense to me. Look at the evolution of language since the dawn of the internet. The use of smileys is fairly mainstream, for example.

Mother tongue

I think the hypothesis of a common language is interesting (it is mentioned in the above book a couple of times). But, from a scientific standpoint, a hypothesis has to be judged on its utility and testability. And, other than being a useful moral story about how language can separate us, I'm not sure it brings us much further than noting that many languages share common root words. Of course, the explanation may be as simple as economics - people tend to share and interact with one another over seemingly great times and distances. And languages have a tendency to die out or blend with others. So the hypothesis could just as easily be that these other cultures had their own languages but found it more profitable to standardize on the languages brought in by the Greeks.

(Edited, since I want to avoid religious entanglements. :)

Edo Nyland is a complete

Edo Nyland is a complete nutball and you're absolutely ridiculous for making that comment.

Fits in with the book...

...language design also tends to invite people with grand theories of everything. You have to have a certain level of naivete and idealism to think that something as large as language can be improved upon. But the book does make the point that these people do expend a good amount of energy pursuing and implementing their thoughts. So, even if they fail, we can usually learn from their failures.

My, my... we're grumpy today....

...but then that's almost the point of the book under discussion: people who invent conlangs are "mad dreamers".

As Larry Wall said in "man perl": "The three principal virtues of a programmer are Laziness, Impatience, and Hubris."

And what greater hubris can there be than to try to reinvent the very language in which we madly dream?

Alas, programming language design folk are no less mad, no less dreamy... the difference is we can at least find something, even if it is only a computer, to speak our new tongue... and then futilely try to convince the world of our Rightness.

Given how many programming languages exist and how few are in wide use... alas, LtU cannot afford to sneer at the conlangers. The hard stats are that we are madder (more languages) and dreamier (with less uptake) than they are.

But oh my, it's a fascinating game... digging deep into the heart of the meaning of meaning.

John, I happen to be a

John, I happen to be a programmer and a conlanger, one who frequents a forum that Edo Nyland used to hang around. The man is a nutball. Luckily he was gone long before I showed up, but you can still see his posts. He's truly insane and believes that there's this grand conspiracy of Basque monks. He's not a conlanger of any normal sort, most conlangers make conlangs for the fun and aesthetic of the act, not for some idealistic dream of blah blah blah. It's just the nutters who are the loudest and thus most visible.

I don't have much to

I don't have much to contribute to this website in terms of theoretical knowledge of programming languages, which is why I didn't register with a real name. I have been following it because it's been informative, and I like functional languages and keeping up with the bigger picture in general. However, I wanted to inform the regular contributors here, the ones pontificating uninformedly in response to your comment, that it, and the very mention of him here, are damaging to the credibility of this weblog.

So: guys, you look like you know a lot about programming languages, but when it comes to linguistics... Nyland is about as sane as the Time Cube guy. And about as coherent. And he is an in-joke among enthusiasts of constructed languages.

EDIT: relevant thread about the book.
Apparently she didn't bother informing the people whose languages she included examples of, which would have been nice of her (though it's fair use).


I haven't read the book, but from the critique we might summarize that there is something extremely commonsensical about natural language. Common sense is typically something we exercise effortlessly but can't explain, and normally wouldn't try to. We can't explain it because it is wired into our brains and not accessible to high-level analysis.

We can, however, speculate on what is going on here. It is all about analogy. Analogy permeates all human language and culture, and analogy is closely related to perception. In modern science and math there is a technology that attempts to deal with analogy. It goes by various names, such as system-state modeling and Cybernetics, and, to make this all really complicated, we also have second-order Cybernetics.

Is it possible to deal with this in a computer language if we really want to? There was a time when computing and computers were defined in terms of "analogy". This was the meaning of the expression "analog computer" before the word "analog" came to mean non-digital electronics.

Analogy is an operation upon Symbols?

Quoting from another book I'm almost finished reading, A History of Mathematics, the discussion of Boole's work yields:

If any topic is presented in such a way that it consists of symbols and precise rules of operation upon these symbols, subject only to the requirement of inner consistency, this topic is part of mathematics.

Perhaps in the 17th century there was the idea that symbol manipulation (analogy) could be precise - making language and mathematics overlap. Programming languages fall somewhere between mathematics and natural language. Though the mapping from language to machine can be precise, and can be considered part of mathematics, the mapping from language to human is less precise and relies on analogy.

[Edit: Intended as reply to Hank above]

What is the agenda?

Indeed, to construct "analog" or descriptive software we would use symbol manipulation languages such as Lisp or Prolog. My post is motivated by the fact that this is practically a lost art. Or, more correctly, an art that has been identified and studied but, as far as I know, is not used. Description and analogy are the very essence of language. The math, logic, or PL theory that make it happen are tools, and not the real purpose of this agenda.


To clarify this a little, there are fields that deal with representation. These fields do use symbol manipulation but don't seem to recognize analogy as an issue. This is to say that they view their field as a form of conventional programming. Symbols have precise meaning with a "formal" spin.

Analogy and Cybernetics view representation as descriptive in terms of a system-state abstraction. The product is semantic but not formal. The semantics of a system-state abstraction is well understood mathematically and is closely related to abstract algebra, or field theory in the continuous case.

We might summarize by saying that a formal view attempts to reduce everything to mathematics. The Cybernetic view is to describe in terms of a mathematically consistent model, a semantics.
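As a toy illustration of the kind of symbol manipulation meant here, a minimal sketch in Python (standing in for Lisp or Prolog; the rewrite rules and terms below are invented purely for this example):

```python
# Tiny symbolic rewriter: terms are nested tuples of symbols, and
# "meaning" comes only from the rewrite rules we choose to supply.
def rewrite(term, rules):
    """Rewrite a term bottom-up until no rule matches."""
    if isinstance(term, tuple):
        # First rewrite each subterm recursively.
        term = tuple(rewrite(t, rules) for t in term)
    changed = True
    while changed:
        changed = False
        for pattern, result in rules:
            if term == pattern:
                term, changed = result, True
    return term

# Rules encoding a crude analogy: dawn is to day as birth is to life.
rules = [
    (("analog-of", "dawn"), ("begins", "day")),
    (("analog-of", "birth"), ("begins", "life")),
]

print(rewrite(("analog-of", "birth"), rules))  # ('begins', 'life')
```

The point being that the symbols carry no intrinsic meaning; all the "semantics" lives in the rules we agree upon - a descriptive rather than formal use of symbols.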

Glad to see the book mentioned

I've had the pleasure of meeting Arika and so I had been anticipating the release of this book for a long time. I have been surprised to see it mentioned on so many different blogs that I read, but I guess it shouldn't have surprised me to see it here! My review (not from a programming languages point of view, nor from an Esperanto-speaking point of view, for that matter) is here.

devil's advocates: missing in action

Chris Rathman: Enlightenment: John Wilkins and his Philosophical Language are the main subject of this period. The 17th century saw the widespread adoption of mathematical conventions, and there was a belief that a language could be designed that removed ambiguity - words would convey meaning exactly as intended. That belief is still a central tenet in much PL design.

I've been lurking only, but that last sentence is very interesting -- it's a premise hard to say in words. :-) Central tenets are often subtle, unspoken assumptions in a worldview. I'm inclined to agree; it's what I find wrong with a lot of PL design: folks assume words (or code) convey exactly what's intended, or it's a reasonable goal.

I actually chased down Wilkins' writing on the Philosophical Language around 1980, when I was on a long artificial language binge, one I didn't give up completely for over ten years. I spent a lot of time framing questions about whether it's possible to make statements unambiguously without assuming unspecified context in one's language, or in the conventions of the folks using a language.

Of course, my eventual answer was "no." Applied to programming languages, one cannot expect a language to be completely self-consistent, in terms of getting unambiguous results when used, without bringing to bear constraints found only in the minds of the folks using the language (or, more obtusely, found only in the Popper-esque third-world artifacts built by those folks).

Note I make no scholarly claims; I'm just certain about it. Now the nature of optimism folks bring to language design is itself interesting, in so far as it's consistent with a broad pattern of optimistic bias on all fronts of design to improve the world. Each person and group over-rates effective results anticipated by some gross amount that's fascinating, to me anyway. It's as if folks now rarely construct arguments as a sanity check, supposing results might be without any value, to see how it stacks up against optimistic views of success. Playing devil's advocate is out of fashion.


Concerning Arika's new book.

I think that the choice, realistically, for the future global language lies between English and Esperanto, rather than an untried project. As a native English speaker I would prefer Esperanto.

It's unfortunate, however, that only a few people know that Esperanto has become a living language.

After a short period of 121 years Esperanto is now in the top 100 languages, out of 6,800 worldwide, according to the CIA factbook.

It is the 17th most used language in Wikipedia, and in use by Skype, Firefox and Facebook. Native Esperanto speakers (people who have used the language from birth) include George Soros, World Chess Champion Susan Polgar, Ulrich Brandenberg the new German Ambassador to NATO, and Nobel Laureate Daniel Bovet.

Further arguments have been made by Professor Piron, who was a translator with the United Nations in Geneva.

A glimpse of Esperanto can be seen at

It seems to be very similar

It seems to be very similar in scope to Umberto Eco's book The Search for the Perfect Language.