Lambda the Ultimate

inactiveTopic Linguistic side effects
started 3/1/2003; 9:11:31 AM - last post 3/3/2003; 7:57:20 PM
Ken Shan - Linguistic side effects  blueArrow
3/1/2003; 9:11:31 AM (reads: 1966, responses: 5)
Linguistic side effects
I gave this talk yesterday at the New England Programming Languages and Systems symposium.
As a natural language semanticist, I strive to scientifically explain why "every student passed" entails "every diligent student passed", why "a man is mugged every 11 seconds" is ambiguous, and why "nobody asked any question" sounds better than "everybody asked any question". Making a linguistic theory is like specifying a programming language: one typically devises a type system to characterize what utterances are acceptable, and a denotational semantics to explain which statements entail which other ones. Along this connection, programming language research can inform linguistic theory and vice versa; in particular, computational side effects are intimately related to referential opacity in natural languages. In this talk, I will illustrate this link by using continuations and composable contexts to analyze quantification (as in "every student passed"). No prior knowledge of linguistics will be assumed. (This talk describes joint work with Chris Barker and Stuart Shieber.)
I am posting this item for two reasons: First, to solicit help from type theorists and programming language researchers interested in natural language. Second, to apologize as a Lambda editor for posting nothing over the past year and explain what I have been up to. I should now work on passing my quals...

Related work: Aarne Ranta's Grammatical Framework.
Posted to general by Ken Shan on 3/1/03; 9:13:48 AM

Ehud Lamm - Re: Linguistic side effects  blueArrow
3/1/2003; 10:10:21 AM (reads: 1108, responses: 0)
Welcome back...

Ehud Lamm - Re: Linguistic side effects  blueArrow
3/3/2003; 4:07:58 AM (reads: 1042, responses: 1)
I found the slides interesting, so I wonder if there's a write-up of these ideas available (a paper, tech report, etc.).

Oleg - Re: Linguistic side effects  blueArrow
3/3/2003; 7:54:24 PM (reads: 1025, responses: 0)
A textbook "Language, Proof and Logic" by John Barwise and John Etchemendy (in collaboration with Gerard Allwein, Dave Barker-Plummer and Albert Liu)
http://www-csli.stanford.edu/hp/LPL.html
(was) available online at http://books.pdox.net/Math/Language%20Proof%20and%20Logic.pdf

devotes several chapters to translation from English into the language of first-order predicate logic. The book explains in great detail the peculiarities of translating 'each' and 'some' sentences. Sentences such as "more than half of the students liked the class" cannot be expressed with the traditional quantifiers. The book explains why (the determiner "more than half" is irreducible) and what to do about it. The book also examines the difference between logical implication and conversational implicature.
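The irreducibility point can be made concrete by treating determiners as relations between sets, in the style of generalized quantifier theory. A small sketch over a finite model of my own invention (the names and sets below are illustrative, not from the book); note that "more than half" needs to *count*, which is exactly what the traditional quantifiers cannot do:

```python
# Determiners as relations between a restrictor set and a scope set,
# evaluated over a tiny finite model (illustrative data, not from LPL).
students = {"ann", "bob", "cat", "dan", "eve"}
liked = {"ann", "bob", "cat"}  # students who liked the class

def every(restrictor, scope):
    # "every A is B": A is a subset of B
    return restrictor <= scope

def more_than_half(restrictor, scope):
    # needs cardinality, hence irreducible to all/exists
    return 2 * len(restrictor & scope) > len(restrictor)

print(every(students, liked))           # False
print(more_than_half(students, liked))  # True: 3 of 5
```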

Incidentally, the book discusses the ambiguity of a sentence like the following:
Every minute a man is mugged in New York City.

There are two translations:

  a weak reading:
      all x. (Minute(x) -> exists y. (Man(y) & MuggedDuring(y, x)))

  a strong reading:
      exists y. (Man(y) & all x. (Minute(x) -> MuggedDuring(y, x)))

The weak reading says that every minute some man is mugged -- not necessarily the same man. The strong reading specifically says that there is one unfortunate man who is being mugged minute after minute. The readings are called 'weak' and 'strong' because the latter implies the former, but not vice versa.
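The implication between the readings can be checked mechanically over finite models. A sketch (models and relation names of my own choosing) that evaluates both readings and confirms that the strong one implies the weak one:

```python
# Two toy models of MuggedDuring(man, minute); illustrative data only.
minutes = range(3)
men = {"m1", "m2"}
mugged_weak   = {("m1", 0), ("m2", 1), ("m1", 2)}  # a different man each minute
mugged_strong = {("m1", 0), ("m1", 1), ("m1", 2)}  # the same unlucky man

def weak(mugged):
    # all x. (Minute(x) -> exists y. (Man(y) & MuggedDuring(y, x)))
    return all(any((y, x) in mugged for y in men) for x in minutes)

def strong(mugged):
    # exists y. (Man(y) & all x. (Minute(x) -> MuggedDuring(y, x)))
    return any(all((y, x) in mugged for x in minutes) for y in men)

for m in (mugged_weak, mugged_strong):
    assert not strong(m) or weak(m)  # strong implies weak

print(weak(mugged_weak), strong(mugged_weak))      # True False
print(weak(mugged_strong), strong(mugged_strong))  # True True
```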

In the above case, the weak reading is more natural (and the one that first comes to mind). The book points out the following "Saturday Night Live" joke that brings up the strong reading:
"Every minute a man is mugged in New York City. We are going to interview him tonight."

The joke turns on this ambiguity and on the low likelihood of the strong reading.

A more interesting example (a so-called "donkey sentence"):
"Every cube in back of a dodecahedron is also smaller than it."

The 'it' is especially troublesome. The translation therefore looks nothing like the original sentence:

      all x.(Dodec(x) -> all y.((Cube(y) & BackOf(y,x)) -> Smaller(y,x)))
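The translation can be evaluated directly over a toy blocks world. A sketch with a model of my own making, assuming BackOf(a, b) means "a is in back of b" and Smaller(a, b) means "a is smaller than b":

```python
# Illustrative blocks-world model, not from the book.
cubes = {"c1", "c2"}
dodecs = {"d1"}
back_of = {("c1", "d1"), ("c2", "d1")}  # both cubes sit behind d1
smaller = {("c1", "d1"), ("c2", "d1")}  # and both are smaller than it

def donkey(smaller_rel):
    # all x.(Dodec(x) -> all y.((Cube(y) & BackOf(y,x)) -> Smaller(y,x)))
    return all((y, x) not in back_of or (y, x) in smaller_rel
               for x in dodecs for y in cubes)

smaller2 = {("c1", "d1")}  # now c2 is behind d1 but not smaller than it
print(donkey(smaller), donkey(smaller2))  # True False
```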

"If you always do right, you will gratify some people and astonish the rest."

   all x.(RightThing(x) -> DoThing(you,x)) ->
      (exists y.(Person(y) & Gratify(you,y)) &
       all y.(Person(y) -> (Gratify(you,y) || Astonish(you,y))))

Oleg - Re: Linguistic side effects  blueArrow
3/3/2003; 7:57:20 PM (reads: 1016, responses: 0)
The talk discusses in detail the reasoning with contexts and call/cc. Perhaps I can point out two results that I have encountered recently:

  (call/cc ..... (call/cc call/cc))      <=v=> (lambda (p) (p p))
  (call/cc ..... (call/cc (call/cc id))) <=v=> (lambda (p) (p p))
where call/cc ... means any number of applications of call/cc (from 0 upwards). <=v=> means observational equivalence under a call-by-value strategy. In proving this theorem, I found reasoning with contexts a bit confusing, and a CPS translation more helpful.
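The CPS reasoning can be checked concretely. A sketch of my own (not from the cited proof), writing call/cc and its arguments in two-argument CPS style in Python, and observing that ((call/cc call/cc) p) delivers the same answer as ((lambda (p) (p p)) p):

```python
def callcc(f, k):
    # CPS-style call/cc: reify the current continuation k as a function
    # value and invoke f with it under the same continuation
    return f(lambda v, k2: k(v), k)

result = []
top = result.append  # the top-level continuation collects answers

def p(x, k):
    # a probe: answers whether its argument is p itself
    k(x is p)

def selfapp(q, k):
    # CPS rendering of (lambda (p) (p p))
    q(q, k)

# Evaluate ((call/cc call/cc) p): the continuation of the inner
# expression is "apply the resulting function value to p".
K = lambda fv: fv(p, top)
callcc(callcc, K)

# Evaluate ((lambda (p) (p p)) p) under the same top continuation.
selfapp(p, top)

print(result)  # [True, True]: both terms applied p to itself
```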

Since the self-application (lambda (p) (p p)) is the "core" of the fixed-point combinator, it follows that (call/cc call/cc) can be used to express Y. It was pointed out that Andrzej Filinski has already done that in 1994:
Recursion from Iteration
http://citeseer.nj.nec.com/filinski94recursion.html

A more recent work is by Masahito Hasegawa and Yoshihiko Kakutani "Axioms for Recursion in Call-by-Value"
http://www.kurims.kyoto-u.ac.jp/~hassei/papers/hosc.html
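The step from self-application to recursion can be sketched without call/cc at all, using the standard call-by-value fixed-point combinator Z, which is exactly the self-application (lambda (p) (p p)) wrapped in an eta-expansion so it does not diverge under strict evaluation (my own illustration, not Filinski's construction):

```python
# Z = (lambda (p) (p p)) applied to an eta-expanded copy of f's argument;
# the eta-expansion delays the self-application under call-by-value.
Z = lambda f: (lambda p: p(p))(lambda p: f(lambda v: p(p)(v)))

# Recursion obtained from the fixed point: factorial without 'def'.
fact = Z(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
print(fact(5))  # 120
```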

Incidentally, it seems I can derive Filinski's results using a somewhat different line of reasoning and motivation.

Ken Shan - Re: Linguistic side effects  blueArrow
4/19/2003; 11:18:53 PM (reads: 896, responses: 0)
Hello again. (:

There is now, finally, a write-up available.