Lambda the Ultimate

Linguistic Universals and Particulars
started 7/4/2003; 4:48:19 AM - last post 7/11/2003; 6:36:24 AM
Ehud Lamm - Linguistic Universals and Particulars
7/4/2003; 4:48:19 AM (reads: 2994, responses: 33)
Linguistic Universals and Particulars
Reflections, retrospective and prospective, about the activities and results of linguistics.

This paper is about the study of natural languages, but I think some of the observations made will interest LtU readers nonetheless.

For one thing, the discussion is related to the Sapir-Whorf Thesis (and we should reflect on the fact that it is not mentioned by name).

More concretely, I am interested in starting a debate about points (vi) and (vii) in section I, as related to programming languages.

What are your thoughts?


Posted to general by Ehud Lamm on 7/4/03; 4:50:19 AM

Marc Hamann - Re: Linguistic Universals and Particulars
7/4/2003; 9:25:58 AM (reads: 1955, responses: 1)
As someone who had a formal background in Linguistics before becoming seriously interested in PLT, I have thought a lot about the similarities and differences between human languages and programming languages.

Certainly the thought processes that go into linguistics have helped me in studying PLT, but that is pretty abstract.

For the purposes of discussing Bach's paper (a nice piece IMO), I would have to say that most of the parallels between the two are sociological rather than theoretical. For example, there are "endangered" programming languages, and for similar sociological reasons, i.e. most people learn languages based on social utility.

The answer to Bach's vi and vii for PLs is going to be different for one fundamental reason: human languages are used between people who seem to be able to produce communication under the most adverse conditions, whereas PLs are also used between people and machines, which by necessity requires more theory to ensure proper communication.

Note that I said also. I think that we often forget when discussing PLs that source code is a means of communication between programmers (or from the programmer to himself) as much as with the machine. IMO, the arguments for and against explicit static data-typing frequently miss this fact.

A brief note on the Sapir-Whorf question: There is some doubt among people who have studied the work of Sapir and Whorf as to whether the extreme idea that is often circulated under their names was actually what they said. Sapir and Whorf were from the tradition that became stigmatized after the Chomskyan revolution, as described in the paper, so the exaggeration of their views may have been part of anti-descriptivist propaganda. This might account for Bach not using the term.

Chris Rathman - Re: Linguistic Universals and Particulars
7/4/2003; 10:58:05 AM (reads: 1921, responses: 1)
Didn't the Theory come before the actual implementation of programming languages? (thinking of Church, Turing, etc....)

andrew cooke - Re: Linguistic Universals and Particulars
7/4/2003; 12:18:59 PM (reads: 1913, responses: 5)
hmmm. section 1 is entitled "some truisms" and 1.vii answers "do languages need linguists?" positively. i can't help thinking that the real truism here is that no-one likes to think their own particular profession pointless - i am sure that languages manage quite well without linguists.

on the need for a theory: first, any useful language must convey information. how that information is conveyed is something we can study and theorize about. so theories of languages are always possible, but - rather like linguists - not necessary. second, a reductive theory of programming languages is useful because it allows generalisations to be made (what i take "reductive" to mean). this allows abstractions in programs that manipulate programs. without some kind of structure to the language ("reductive theory" at its most liberal meaning) manipulating programs is not possible, which makes compilers difficult to implement.
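
(a toy sketch in haskell of what i mean - the little expression type and the names are invented for illustration: once programs are structured data, other programs can generalise over them and rewrite them.)

  -- a made-up toy "language"; having this structure is exactly what lets
  -- programs manipulate programs (constant folding, in this case)
  data Expr = Lit Int | Add Expr Expr | Mul Expr Expr
    deriving Show

  fold :: Expr -> Expr
  fold (Add a b) = case (fold a, fold b) of
                     (Lit x, Lit y) -> Lit (x + y)
                     (a', b')       -> Add a' b'
  fold (Mul a b) = case (fold a, fold b) of
                     (Lit x, Lit y) -> Lit (x * y)
                     (a', b')       -> Mul a' b'
  fold e         = e

  -- fold (Mul (Lit 2) (Add (Lit 1) (Lit 3)))  ==>  Lit 8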

i think i'm stating the obvious here so i guess i'm wondering where i've gone wrong or what was expected...?

Ehud Lamm - Re: Linguistic Universals and Particulars
7/4/2003; 2:15:32 PM (reads: 1949, responses: 0)
Indeed. When linguistic relativity was discussed here in the past I was careful to note that the extreme formulation ("language controls thought") is far from being the only possible reading of this very fundamental idea.

Ehud Lamm - Re: Linguistic Universals and Particulars
7/4/2003; 2:21:35 PM (reads: 1940, responses: 4)
Well, quite a few people object to the analogy between programming languages and "real" languages. One of the claims I've heard often is that there really isn't a need for linguistic intuition (or style, or insight) in the design of good programming languages, and that this approach leads to syntactic sugar and misses out on the "important" stuff (like algebraic structure).

By the way, this is related to the discussion about the perl philosophy. Wall is quite fond of talking about linguistic insights (often wrong) that guide him in designing the language. Not many language designers are on record as making this kind of observation repeatedly and consistently.

Marc Hamann - Re: Linguistic Universals and Particulars
7/4/2003; 3:06:14 PM (reads: 1978, responses: 0)
I agree about Wall's "insights", but I suppose on the language design issue I would want to make a distinction between formal semantic structure and formal syntactic structure (a distinction found in Linguistics as well).

A parallel to be drawn is a mathematical theory and the notation used to describe it.

I tend to believe (perhaps heretically) that the power of abstract mathematics is not so much that it is "like the world" but that it is "like the way people think about the world". I mean by this that it takes basic concepts that humans use for reasoning (implication, for example) and extends them into a formal system that can break more complicated concepts into the smaller bits we can handle.

An important part of this is the ability of people to use symbols to encapsulate more complex ideas, and this finds expression as the notation used to describe the theory.

The semantics of the theory (or of the PL) must be "intuitive" in precisely the sense that a mathematician (programmer) can reason about the concepts and their combinations effectively.

The syntax (notation) of the theory (PL) must be "intuitive" in precisely the sense that the mathematician (programmer) can easily understand what semantics are being represented.

Unfortunately, "intuitive" is somewhat subjective. I, for example, find the standard notation for denotational semantics somewhat unintuitive, and hence I have to work harder to get the ideas than I think the ideas warrant. Someone else may disagree.

However, I (and people like me) might be less likely to use the "language" of DS because of this unintuitiveness, so you could argue that it is a "badly designed" language (for me anyway), even though the "algebra" of it is quite meaningful.

Isaac Gouy - Re: Linguistic Universals and Particulars
7/4/2003; 5:50:06 PM (reads: 1897, responses: 1)
Ever since it was used to describe UIs, "intuitive" seems to have become a euphemism. Mostly we just mean if you've already learned to do X then you shouldn't have too much trouble learning to do Y.

So if you have mathematical ability and training something's wrong if you can't figure out the semantics of the PL?

andrew cooke - Re: Linguistic Universals and Particulars
7/5/2003; 6:20:51 AM (reads: 1928, responses: 2)
ah ok. you're asking to what degree programming languages should share the same structures as natural languages (context in perl and adverbs in j, for example)?

one similarity, one that causes problems in both, is that when people use languages they don't treat them formally. instead they make ad-hoc extensions that complicate matters for linguists and programmers. i guess the need to write a compiler helps keep a programming language designer closer to the one true path. the flip side of this argument implies that perl's source processing should be a cross-linked nightmare. since it's pretty reliable (afaik), i wonder how they manage that in practice? are the lexing, parsing and initial decoration of the ast all mixed up together? wonder if they're using parse forests for perl6 (don't know anything about that, but sounds like it might help, at a cost in efficiency)? does this place a practical limit on just how DWIM perl can be - most rules must be based on local information, for example? i guess that's true of natural languages too, at least in practice, so maybe it's not as bad as it seems.

hmm http://www.perldoc.com/perl5.6/pod/perlguts.html#Compiled-code suggests that it's pretty standard. just how much of perl is context sensitive? looking at that outline of the process it seems pretty fixed, although i guess you could hide a lot of nastiness in the yacc level, if you were so inclined.

Ehud Lamm - Re: Linguistic Universals and Particulars
7/5/2003; 11:40:43 AM (reads: 1942, responses: 1)
you're asking to what degree programming languages should share the same structures as natural languages (context in perl and adverbs in j, for example)?

Well, that may be part of it, but I was asking a more general question. Are programming languages really languages? Do we gain something by using this analogy?

The linguistics may be different from that of natural languages; in fact I think this is one reason why "real" linguists should study programming languages. It will broaden their perspective.

Ehud Lamm - Re: Linguistic Universals and Particulars
7/5/2003; 11:42:19 AM (reads: 1852, responses: 0)
Didn't the Theory come before the actual implementation of programming languages? (thinking of Church, Turing, etc....)

I was thinking about linguistic theory, not computability theory...

Marc Hamann - Re: Linguistic Universals and Particulars
7/6/2003; 1:53:47 PM (reads: 1927, responses: 0)
Are programming languages really languages? Do we gain something by using this analogy?

Yes, because one of their functions is to communicate between people (programmers). What is gained is a body of knowledge that can be applied to the set of issues that they have in common.

this is one reason why "real" linguists should study programming languages. It will broaden their perspective.

Perhaps the reverse is true too. ;-)

Marc Hamann - Re: Linguistic Universals and Particulars
7/6/2003; 2:02:18 PM (reads: 1806, responses: 0)
Ever since it was used to describe UIs, "intuitive" seems to have become a euphemism. Mostly we just mean if you've already learned to do X then you shouldn't have too much trouble learning to do Y.

I agree with you largely, but in this case X is an almost universal human skill: language use. For good or ill, most people (even programmers) don't have that thorough a background in math. ;-)

So if you have mathematical ability and training something's wrong if you can't figure out the semantics of the PL?

A warning sign for the language designer at any rate.

If the semantics has not been thought through, there can simply be gotchas that are hard to avoid. If the syntax is poor at expressing the semantics, it can simply be hard to see what is intended.

Isaac Gouy - Re: Linguistic Universals and Particulars
7/6/2003; 3:33:00 PM (reads: 1738, responses: 2)
to communicate between people (programmers)
In some programming paradigms much of that communication happens through the names given to the types and values and operations defined during programming. An API can be baffling simply because the names of operations didn't follow the expected usage. Smalltalk users were particularly obsessed with finding "good names" ;-)

That communication is embedded in the programming language but isn't defined by the PL's syntax and semantics.

John Carter - Re: Linguistic Universals and Particulars
7/6/2003; 7:33:02 PM (reads: 1722, responses: 3)
So how many reflective Fortran programs have you seen relative to Scheme hygienic macros? Or programs that write postscript? Have you seen the Lisp in Lisp or Joy in Joy self-interpreting functions? Can you imagine the same for Java?

I think as soon as you consider reflective programming, you realise the Sapir-Whorf conjecture isn't a conjecture, but an obvious and well worn fact of everyday programming life.
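
(For concreteness, here is a minimal sketch of the "language in language" idea, in Haskell rather than Lisp but with the same flavour; the names and the tiny term type are invented. Programs are just ordinary data, and an evaluator for them is a few lines of the host language.)

  -- terms of a tiny lambda calculus, represented as plain data
  data Term  = Var String | Lam String Term | App Term Term
  data Value = Closure String Term Env
  type Env   = [(String, Value)]

  -- a few lines of host language suffice to interpret the embedded language
  eval :: Env -> Term -> Value
  eval env (Var x)   = maybe (error ("unbound " ++ x)) id (lookup x env)
  eval env (Lam x b) = Closure x b env
  eval env (App f a) = case eval env f of
                         Closure x b cenv -> eval ((x, eval env a) : cenv) b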

Marc Hamann - Re: Linguistic Universals and Particulars
7/6/2003; 8:52:40 PM (reads: 1759, responses: 1)
That communication is embedded in the programming language but isn't defined by the PL's syntax and semantics.

It is well-known among linguists that native speakers of a language are often oblivious to fundamental structural properties of their language, yet they often have elaborate (sometimes demonstrably wrong) beliefs about their language. (E.g., I've heard both French and German speakers declare theirs is the most logical.)

I think naming is simply one linguistic phenomenon that every programmer is aware of. Most have some theory about it, though different beliefs exist, some possibly demonstrably wrong. But that doesn't mean that there aren't other linguistic phenomena going on, beneath conscious thought, that affect communication.

If I took a large chunk of Lisp code and took away all the parentheses, could even the author of the code tell what it did without a great deal of difficulty? Clearly those parentheses DO communicate something to a reader.

Semantically, if a programmer is not familiar with the idea of continuations, will call/cc communicate anything to them, even if it were spelled out in its full form?
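
(For instance, here is a rough sketch of the flavour of call/cc, written with Haskell's Cont monad from the mtl library rather than Scheme; the example and the names are made up. Without the concept of "the rest of the computation" already in hand, I doubt it communicates much.)

  import Control.Monad (when)
  import Control.Monad.Cont        -- callCC captures "the rest of the computation"

  -- return the first element satisfying p, escaping early through the
  -- captured continuation instead of finishing the traversal
  findFirst :: (a -> Bool) -> [a] -> Maybe a
  findFirst p xs = flip runCont id $ callCC $ \escape -> do
    mapM_ (\x -> when (p x) (escape (Just x))) xs
    return Nothing

  -- findFirst even [1,3,4,5]  ==>  Just 4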

The semantics of the PL are the ideas of the language, and the syntax is the symbolic form that represents those ideas. Without them, no communication about the meaning of the program can happen, with another person, or with a computer.

Marc Hamann - Re: Linguistic Universals and Particulars
7/6/2003; 9:04:56 PM (reads: 1766, responses: 2)
I think as soon as you consider reflective programming, you realise the Sapir-Whorf conjecture isn't a conjecture, but an obvious and well worn fact of everyday programming life.

I think you are confusing convenient expression with possible expression.

Any Church-Turing powerful language can do any of the things you mention. For reasons of convenience and "cultural" inclinations, some of these things are horribly inconvenient to do in some of these languages, but are easy and encouraged in others.

There is a parallel in human languages. There is a word in German, "Schadenfreude", that has no exact English equivalent (unless you include the naturalized use of it in English ;-) ); however, I can easily explain to you that it means the pleasure one takes in another's misfortune. I am not prevented from thinking the idea because my language doesn't have the word; it just takes more words to express it.

Isaac Gouy - Re: Linguistic Universals and Particulars
7/7/2003; 5:31:01 AM (reads: 1695, responses: 2)
naming ... that doesn't mean that there aren't other linguistic phenomena going on
Syntax and semantics are rightly the focus of discussion on LtU - it's actually naming that's the other linguistic phenomenon ;-)

It seems we discuss PLs in isolation, yet we use them commingled with natural language and formal mathematical language.

Marc Hamann - Re: Linguistic Universals and Particulars
7/7/2003; 8:43:33 AM (reads: 1707, responses: 1)
Syntax and semantics are rightly the focus of discussion on LtU - it's actually naming that's the other linguistic phenomenon ;-)

My point exactly. ;-) I had understood you to say that the part of PLs that involves communicating with other programmers was mainly "good naming". I wanted to stress that syntax and semantics ain't just for machines. ;-)

It seems we discuss PLs in isolation, yet we use them commingled with natural language and formal mathematical language.

I do think that the "sociolinguistic" context of PLs does get neglected, and that there is frequent confusion between the social virtues or failings of PLs and their technical virtues or failings.

This leads to the common belief among aficionados of specific PLs that only crappy languages can be popular. ;-)

Ehud Lamm - Re: Linguistic Universals and Particulars
7/7/2003; 9:44:11 AM (reads: 1725, responses: 0)
I've heard both French and German speakers declare theirs is the most logical

Aha. This is discussed in the Language Myths book which was discussed at length on LtU.

Ehud Lamm - Re: Linguistic Universals and Particulars
7/7/2003; 9:49:34 AM (reads: 1732, responses: 0)
I do think that the "sociolinguistic" context of PLs does get neglected, and that there is frequent confusion between the social virtues or failings of PLs and their technical virtues or failings.

That's what LtU is for. To make people appreciate the diverse factors that are involved in "language success".

John Carter - Re: Linguistic Universals and Particulars
7/7/2003; 6:14:12 PM (reads: 1709, responses: 1)
I think you are confusing convenient expression with possible expression.

See The Sapir-Whorf Hypothesis

Then you will see I'm talking about "Weak determinism" and not confusing anything.

On the other hand it's only "weak" in the Mathematical sense. The tendency for programmers of non-reflective languages to choose non-reflective solutions is very strong.

Marc Hamann - Re: Linguistic Universals and Particulars
7/7/2003; 7:08:15 PM (reads: 1732, responses: 0)
See The Sapir-Whorf Hypothesis

Even the "Weak determinism" thesis seems pretty weak to me. (pardon the pun. ;-) )

People in England, Jamaica and Ireland (to take a small sample) all speak English, but the cultural values of each group are quite distinct. It isn't the structure of the language that gives each group its identity, but rather the culture (in the anthropological sense) that it expresses through its language.

The tendency for programmers of non-reflective languages to choose non-reflective solutions is very strong.

Is this because their thought patterns are disrupted by the language or because either a) the language makes it awkward to express it or b) the language community discourages it?

Frank Atanassow - Re: Linguistic Universals and Particulars
7/8/2003; 6:08:35 AM (reads: 1612, responses: 1)
Well, quite a few people object to the analogy between programming languages and "real" languages.

I used to believe this because I think programs should be unambiguous, and I thought natural language is ambiguous, hence unsuited for program development. But then I took a look at some of the linguistics literature, and particularly stuff related to Lambek calculus, and found that:

  • formal calculi like typed lambda-calculus have been quite successfully used to describe natural language semantics,

  • one can construe the ambiguity of, say, a sentence to be part of the meaning of the sentence. Then the ambiguity of natural language is not so dissimilar from the (intentional) ambiguity one gets from programming language features like polymorphism.
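
(A crude way to picture the second point in Haskell, nothing deeper than this: an overloaded expression has several readings, and the surrounding context selects one, much as context disambiguates a sentence.)

  -- the same "utterance" with two readings, resolved by context (the type)
  asInt :: Int
  asInt = read "42"

  asDouble :: Double
  asDouble = read "42"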

(I can post some references to interesting papers on formal calculi used in linguistics if people are interested.)

This is not to say that I think there is no unintended ambiguity in the utterances of most speakers, nor to say that I think programs should be written in natural language. Only that the differences between natural and formal language are much smaller than I imagined.

IMO, the problem with Larry Wall and Perl advocates who make claims about Perl's superiority based on some supposed relation to natural language is that I think that relation is very shallow and superficial, largely on the syntactic level. For example (I'm not sure about this), the context-sensitivity in Perl seems to be largely a mechanism for abbreviation rather than a semantic feature. At least it seems to be used that way.

All the claims about Perl's supposed linguistic underpinnings, in fact, are extremely vague and nebulous, not the sort of thing one expects to hear from a trained linguist. Linguists are scholars and scientists, and scholars are expected to back up claims with concrete evidence using modern tools and techniques. But I have never seen any serious analysis of Perl using such techniques, which I find suspicious.

Marc Hamann - Re: Linguistic Universals and Particulars
7/8/2003; 3:48:30 PM (reads: 1615, responses: 0)
(I can post some references to interesting papers on formal calculi used in linguistics if people are interested.)

Are these mainly related to Categorial Grammar (Lambek, Steedman, Carpenter, etc.)?

All the claims about Perl's supposed linguistic underpinnings, in fact, are extremely vague and nebulous, not the sort of thing one expects to hear from a trained linguist.

You might be surprised by the work of some trained linguists then. ;-)

My biggest critique of Perl's "natural language features", setting aside the fact that they have no sound basis ;-), is that they are precisely those features of natural language that REDUCE communication (vagueness, context dependency, terseness, etc.).

Natural languages are not used exclusively to communicate information, but also to process emotion and form social bonds, etc. It is in these areas that vagueness et al. are useful.

Since PLs are strictly for communicating information, only those features of natural languages that ENHANCE clarity and communication are of relevance.

Larry Wall might like the idea of conversing with a computer "buddy" using Perl, but I think the idea is premature until HAL 9000 is around. ;-)

Chris Rathman - Re: Linguistic Universals and Particulars
7/8/2003; 5:09:29 PM (reads: 1556, responses: 0)
I can't say that I like Perl, but I don't think Wall's argument has to do with the Englishness of Perl, so much as the manner in which spoken languages evolve. His argument is that Perl evolved in the wild, being a bottom up sort of design with a combination of several different idioms. This method of evolution results in the programming language having multiple ways to do the same thing.

Frank Atanassow - Re: Linguistic Universals and Particulars
7/9/2003; 6:18:37 AM (reads: 1552, responses: 1)
Are these mainly related to Categorial Grammar (Lambek, Steedman, Carpenter, etc.)?

Yes. Also this one:

Using Types to Parse Natural Language
Mark P. Jones, Paul Hudak, Sebastian Shaumyan

What I found really startling was that higher-orderness seems to occur naturally in English. For example, in the sentence (where brackets are used to indicate structure):

[I dislike], but [Dexter thinks that Warren likes] [these flowers].

the two clauses

I dislike X.
Dexter thinks that Warren likes X.

are used by currying them with respect to the noun phrase parameter X. Note that using a standard analysis you would be able to assign a type (S) to I dislike X but not I dislike alone.
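
(Here is a rough Haskell rendering of that analysis; the types and the toy "semantics" are invented purely for illustration. If the verb takes its object first, each bracketed clause is a partial application with a perfectly good type of its own, and "but" coordinates the two function-typed constituents before the shared object is supplied.)

  -- invented toy types: E for entities, S for sentence meanings
  type E = String
  type S = Bool

  dislike, likes :: E -> E -> S            -- object first, then subject
  dislike o s = (s, o) == ("I", "these flowers")
  likes   o s = (s, o) == ("Warren", "these flowers")

  thinks :: S -> E -> S                    -- "x thinks that p" (very crudely)
  thinks p _ = p

  -- the bracketed clauses, each curried on the missing noun phrase X
  iDislike, dexterThinksThatWarrenLikes :: E -> S
  iDislike x                    = dislike x "I"
  dexterThinksThatWarrenLikes x = thinks (likes x "Warren") "Dexter"

  -- "but" coordinates the two incomplete clauses; the object is supplied once
  but :: (E -> S) -> (E -> S) -> (E -> S)
  but f g x = f x && g x

  sentence :: S
  sentence = (iDislike `but` dexterThinksThatWarrenLikes) "these flowers"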

Steedman says about this:

This is a surprising property, because it seems to flout all received opinion concerning the surface structure of sentences, suggesting that a structure in which objects...dominate subjects is as valid as the standard one in which subjects dominate objects. (MIT Encyclopedia of Cognitive Sciences, "Categorial Grammar")

Marc Hamann - Re: Linguistic Universals and Particulars
7/9/2003; 9:53:46 AM (reads: 1583, responses: 0)
Using Types to Parse Natural Language

Interesting paper. Thanks Frank!

Categorial Grammar has moved into the position of "most promising" syntactic theory for me recently. (Steedman's most recent book tilted me over)

It is too bad that "mainstream" linguistics (at least in North America) has paid relatively little attention to it.

An interesting point from the paper bears on the "subconscious" aspects of notation, and it applies to PLs too.

They mention the problem with the parser identifying "New Haven" as a single constituent, but this is really an accident of English orthography. It is easy for people to forget that a "word" is not necessarily defined by spaces between letter sequences.

If one assumes some kind of orthographic lexing before inputting to the parser, this "problem" goes away.
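
(As a toy illustration of that lexing pass, in Haskell with invented names: merge known multi-word lexemes into single tokens before the parser ever sees them.)

  -- a made-up "orthographic lexing" pass: known multi-word lexemes become
  -- single tokens, so the parser never sees "New" and "Haven" separately
  multiWordLexemes :: [[String]]
  multiWordLexemes = [["New", "Haven"], ["New", "York"]]

  relex :: [String] -> [String]
  relex (w1:w2:rest)
    | [w1, w2] `elem` multiWordLexemes = unwords [w1, w2] : relex rest
  relex (w:rest)                       = w : relex rest
  relex []                             = []

  -- relex (words "I flew to New Haven")  ==>  ["I","flew","to","New Haven"]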

Isaac Gouy - Re: Linguistic Universals and Particulars
7/10/2003; 9:16:53 AM (reads: 1504, responses: 1)
There is an effect of culture on thought independent of language. We know this because both the coordinate Chinese speakers and the compound Chinese speakers group words differently from Americans regardless of language of testing. The differences between coordinate and compound speakers also indicate a culture difference independent of language. The compound speakers from Westernized regions are shifted in a Western direction - and to the same extent regardless of language of testing. There is also clearly an effect of language independent of culture - but only for the coordinate speakers from China and Taiwan. They respond very differently depending on whether they are tested in Chinese or in English.

A tentative answer to the Sapir-Whorf question as it relates to our work ... is that language does indeed influence thought so long as different languages are plausibly associated with different systems of representation.

p161-2 The Geography of Thought, Richard E Nisbett

(Coordinate bilingual speakers learned a second language relatively late in life; compound bilingual speakers learned a second language early and use it in many contexts.)

The study doesn't seem to have been published yet:
Ji, L., Zhang, Z., & Nisbett, R. E. Culture, Language and Categorization. Unpublished manuscript, Queens University.

The study is also referred to in "Spontaneous Attention to Word Content Versus Emotional Tone: Differences Among Three Cultures", and maybe "Is it culture, or is it language?".

Marc Hamann - Re: Linguistic Universals and Particulars
7/10/2003; 10:29:55 AM (reads: 1517, responses: 0)
There is an effect of culture on thought independent of language.

Thanks for the reference, Isaac.

Even though they support my position, I'm going to critique the findings anyway. ;-)

Referring to: "Spontaneous Attention to Word Content Versus Emotional Tone: Differences Among Three Cultures"

To be honest, I find these psych type experiments a little bit questionable at the best of times, but I will accept their evidence at face value and argue against their conclusions.

First, the mechanisms of intonational expression are known to vary widely from language to language and even from dialect to dialect. Within the unilingual groups, they don't really eliminate the possibility that US English and Japanese might have structural differences with respect to intonation that might make it harder to distinguish intonational content on individual words.

The fact that so-called "coordinate bilinguals" show similar results to cultural brethren does not necessarily exclude this either, since Hong Kong English and Filipino English are each highly influenced by the native languages of the region they are in. The intonational cues might be the same (I don't know that this is so, but the study doesn't eliminate it.)

In a desperate (and lame) attempt to keep this on topic to PLs ;-), I will suggest that this might be a lesson to us about examining the socio-cultural aspects of programming: the details are very hard to get at objectively, and often lurk in subtle subconscious factors that are difficult to isolate.

Isaac Gouy - Re: Linguistic Universals and Particulars
7/10/2003; 11:33:33 AM (reads: 1478, responses: 0)
Referring to: "Spontaneous Attention to Word Content Versus...
Notice that this is a very different study than Ji, Nisbett and Zhang's work - which seems to have been based on grouping concepts (and has nothing to do with spoken language). Maybe the book is in a nearby library ;-)

Oleg - Re: Linguistic Universals and Particulars
7/10/2003; 11:42:15 AM (reads: 1498, responses: 1)
What I found really startling was that higher-orderness seems to occur naturally in English.

Isn't that quite common? Many phrases that are used in apposition are actually higher-order functions:

  being humorous
  I thought,
  which I did not believe

For example, "being humorous" is \X.X is humorous. The phrase receives the meaning only when it is applied to X. That X can occur anywhere:

   "This book, being humorous, reads well"
   "I thought that, being humorous, the book will sell well"
In the latter phrase, "being humorous" refers to a book rather than to me. As we can see, the X for \X.X is humorous can come either before or after the phrase. BTW, the AGFL NLP parser [http://www.cs.kun.nl/agfl/npx/index.html] handled the latter phrase well.

Nominalizers -- phrases that convert a sentence into a noun phrase so we can make statements about it -- seem to be combinators. For example, "the fact that".

  "The fact that the commission issued that report is surprising."
Also, many phrases that add modality can be treated as higher-order functions. For example, "I fear" as in
  "Your remark, I fear, is a bit rough."

It seems that higher-order functions are particularly pervasive in Japanese. Many particles (e.g., node), numerous nominalizers (koto, mono), and modality phrases (nodesu, noka, to omoimasu) are all combinators. Furthermore, Japanese is essentially a postfix language like Forth or Joy. Every sentence (a simple sentence such as a verb or an adjective, or a noun phrase with the corresponding particle) may be considered as a function STACK -> STACK, where STACK is the context of the phrase or a discourse. It's interesting that Japanese has specific functions to explicitly manipulate the stack: the particles wa and mo. The former acts like a "pop" that removes the "top" of the stack and then pushes the new phrase onto it; "mo" adds to the context.
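
(A very rough Haskell rendering of that stack picture, with everything invented for illustration: the discourse context is a stack of topics, each phrase is a function from stack to stack, and a discourse is just the composition of its phrases.)

  type Topic = String
  type Stack = [Topic]

  -- "X wa": pop the current topic and push X in its place
  wa :: Topic -> Stack -> Stack
  wa x (_:rest) = x : rest
  wa x []       = [x]

  -- "X mo": add X alongside whatever is already in the context
  mo :: Topic -> Stack -> Stack
  mo x stack = x : stack

  -- phrases compose right to left; the rightmost is uttered first
  discourse :: Stack
  discourse = (mo "neko" . wa "inu" . wa "hon") []   -- ==> ["neko", "inu"]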

Frank Atanassow - Re: Linguistic Universals and Particulars
7/11/2003; 6:36:24 AM (reads: 1464, responses: 0)
I'm not an expert on this NL stuff, but:

Isn't that quite common?

I think you are missing the point. In a lexicalized categorial grammar, every word could be assigned a functional type. Even noun phrases can be lifted to 1->NP, because 1->NP and NP are interderivable. Even with the usual type assignments, though, you do not even need to consider something so exotic as being humorous; any verb, adjective, adverb, etc. denotes a function.

The point (as I understand it) is rather that the conventional way to analyze a sentence is hierarchical, by means of something like, say, a context-free grammar: S -> NP IV | NP TV NP. This imposes a global tree structure on sentences, so you can only talk about non-standard structures at the semantic, rather than syntactic level.

Also, since -> obeys the law a * b -> c iff b -> (a -> c), it lets you avoid adding an infinite number of rules to the grammar. For example, to analyze all phrases of the variety:

[I dislike], but [Dexter thinks that Warren likes] these flowers.

you would need to add a new syntactic category not only for NP/S, but also for every formula derivable from NP/S. And then there are things like `type-raising' which can be iterated any number of times.

But the point Steedman was making was that introducing function types allows surprising, non-standard analyses of the structure at the syntactic level. For example, the denotations of adjacent lexemes of function type can be concatenated by function composition, but moreover that function is denoted by a non-standard surface structure: witness [Dexter thinks that Warren likes] [flowers] vs. [Dexter thinks [that [Warren likes flowers]]].
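
(In Haskell terms, with toy types invented for illustration: the a * b -> c law is just currying, up to argument order, and the surprising constituent is ordinary function composition of the adjacent function-typed parts.)

  type NP = String
  type S  = Bool

  warrenLikes :: NP -> S            -- "Warren likes _"
  warrenLikes x = x == "these flowers"

  dexterThinksThat :: S -> S        -- "Dexter thinks that _" (crudely modelled)
  dexterThinksThat p = p

  -- adjacent function-typed constituents compose, yielding the non-standard
  -- constituent [Dexter thinks that Warren likes] with a type of its own
  dexterThinksThatWarrenLikes :: NP -> S
  dexterThinksThatWarrenLikes = dexterThinksThat . warrenLikes

  -- and the interderivability law corresponds, up to argument order, to
  --   curry   :: ((a, b) -> c) -> a -> b -> c
  --   uncurry :: (a -> b -> c) -> (a, b) -> c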

Marc Hamann - Re: Linguistic Universals and Particulars
7/11/2003; 7:56:47 AM (reads: 1493, responses: 0)
Isn't that quite common? Many phrases that are used in apposition are actually higher-order functions:

Frank's take is right.

Though it is not unusual to think about the semantics of phrases or words as functions, many of the more popular theories of syntax do NOT.

Categorial Grammar is the only one I know that explicitly and centrally regards syntactic structures as type functions.