Online Learning of Relaxed CCG Grammars for Parsing to Logical Form, Luke S. Zettlemoyer and Michael Collins. Empirical Methods in Natural Language Processing and Computational Natural Language Learning, 2007.
This paper isn't exactly PL, though Ehud has been okay with the odd foray into computational linguistics before. I thought it was interesting to see machine learning work make use of a typed formalism like categorial grammars to handle long-range dependencies, and it leaves me wondering whether it's possible to set these kinds of techniques onto program analysis problems. One neat thing about the CCG formalism is that you have parsing, which is described more or less as a typechecking problem, and separately you have the semantic constraints, which are basically lambda-calculus terms that build up a term in first-order logic. So I can imagine doing something like writing down how you're supposed to use a library as semantic constraints that add up to a Prolog program; then at compile time the compiler walks the typing derivation to construct the program, and runs it to figure out whether you're doing something dumb (according to the library designers). I don't know if that actually makes sense, but this work certainly prompted me to think.
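To make the parsing-as-typechecking idea concrete, here is a minimal sketch (my own toy construction, not code from the paper) of the two CCG combinators the formalism rests on: forward application, where a category X/Y consumes a Y to its right, and backward application, where X\Y consumes a Y to its left. Each lexical entry pairs a syntactic category with a lambda term, and combining categories applies the terms, so a successful derivation yields a first-order-logic-style formula as a side effect of "typechecking" the sentence. The lexicon and the geography example below are illustrative assumptions.

```python
class Cat:
    """A CCG category: either atomic (NP, S) or a function X/Y or X\\Y."""
    def __init__(self, atom=None, result=None, arg=None, slash=None):
        self.atom, self.result, self.arg, self.slash = atom, result, arg, slash

    def __repr__(self):
        return self.atom if self.atom else f"({self.result}{self.slash}{self.arg})"

    def __eq__(self, other):
        # Structural equality: compare the printed forms.
        return repr(self) == repr(other)

NP = Cat(atom="NP")
S = Cat(atom="S")

def fwd(left, right):
    """Forward application: (X/Y : f) + (Y : a) => (X : f a)."""
    (lcat, lsem), (rcat, rsem) = left, right
    if lcat.slash == "/" and lcat.arg == rcat:
        return (lcat.result, lsem(rsem))
    return None  # derivation fails: a "type error"

def bwd(left, right):
    """Backward application: (Y : a) + (X\\Y : f) => (X : f a)."""
    (lcat, lsem), (rcat, rsem) = left, right
    if rcat.slash == "\\" and rcat.arg == lcat:
        return (rcat.result, rsem(lsem))
    return None

# Toy lexicon: each word pairs a category with a lambda term that
# builds a string rendering of a first-order-logic term.
texas = (NP, "texas")
oklahoma = (NP, "oklahoma")
borders = (Cat(result=Cat(result=S, arg=NP, slash="\\"), arg=NP, slash="/"),
           lambda y: lambda x: f"borders({x},{y})")  # (S\NP)/NP

# Parse "Texas borders Oklahoma".
vp = fwd(borders, oklahoma)   # category S\NP, semantics λx.borders(x,oklahoma)
sentence = bwd(texas, vp)     # category S, semantics borders(texas,oklahoma)
print(repr(sentence[0]), sentence[1])
```

The point of the sketch is that the combinators only look at the categories, exactly like an application rule in a type system, while the logical form falls out of the attached lambda terms; the paper's contribution is learning the lexicon (categories plus terms) and relaxing the combinators for noisy input.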