CERT C Secure Coding Standard

From SC-L:

Hey, maybe this is also relevant for the curriculum thread.

When Are Two Algorithms the Same?

When Are Two Algorithms the Same? Andreas Blass, Nachum Dershowitz, Yuri Gurevich. February 2008.

People usually regard algorithms as more abstract than the programs that implement them. The natural way to formalize this idea is that algorithms are equivalence classes of programs with respect to a suitable equivalence relation. We argue that no such equivalence relation exists.

A bit more philosophical than usual, but the issue is quite relevant to discussions in the field. It is possible to stipulate any equivalence relation that is considered useful (e.g., equivalence up to local transformations), but the notion of a universally applicable relation is indeed problematic.

STEPS Toward The Reinvention of Programming: First Year Progress Report

Viewpoints Research have published the first year's status report on their previously discussed project.

Programmers At Work

Via Scott Rosenberg (whose book Dreaming in Code I think we mentioned here before), I learn that Susan Lammers is making the interviews from her 1984 book Programmers At Work available on the Web.
Here is how she describes the goals of her new site:
Who better than the LtU community to contribute to such a conversation? The first PAW interview posted to the site is the interview with Charles Simonyi, a man whose views of the future of programming, and programming languages, are mentioned here often (though not always with great enthusiasm).

Arc is released

Make of it what you will, but Arc is now officially released.
This part of Graham's announcement is a gem:
This sure made me smile...

Predictions for 2008

So, what are your predictions for 2008? Naturally, we are only interested in predictions related to programming languages... Three types of predictions are thus in order: (1) predictions about PLT research (directions, fads, major results); (2) predictions about programming languages (whether about specific languages, about families, etc.); and (3) predictions about industrial use of languages and language-inspired techniques (adoption, popularity).

OCaml Light: A Formal Semantics For a Substantial Subset of the Objective Caml Language

OCaml Light: a formal semantics for a substantial subset of the Objective Caml language.
From a team including Peter Sewell (Acute, HashCaml, Ott). I continue to believe that things are heating up nicely in mechanized metatheory, which, in the multicore/multiprocessor world in which we now live, is extremely good news.

By Paul Snively at 2007-11-26 18:33 | Functional | General | Implementation | Object-Functional | Semantics | Theory | Type Theory
Online Learning of Relaxed CCG Grammars for Parsing to Logical Form

Online Learning of Relaxed CCG Grammars for Parsing to Logical Form, Luke S. Zettlemoyer and Michael Collins. Empirical Methods in Natural Language Processing and Computational Natural Language Learning, 2007.
This paper isn't exactly PL, though Ehud has been okay with the odd foray into computational linguistics before. I thought it was interesting to see machine learning work make use of a typed formalism like categorial grammars to handle long-range dependencies, and it leaves me wondering whether it's possible to set these kinds of techniques onto program analysis problems. One neat thing about the CCG formalism is that you have parsing, which is described more or less as a typechecking problem, and separately you have the semantic constraints, which are basically lambda-calculus terms that build up a term in first-order logic. So I can imagine writing down how you're supposed to use a library as semantic constraints that add up to a Prolog program; then at compile time the compiler walks the typing derivation to construct the program, and runs it to figure out if you're doing something dumb (according to the library designers). I don't know if that actually makes sense, but this work certainly prompted me to think.

Gödel, Nagel, minds and machines

Solomon Feferman. Gödel, Nagel, minds and machines. Ernest Nagel Lecture, Columbia University, Sept. 27, 2007.

This is not directly PLT related, and more philosophical than what we usually discuss on LtU, but I think it will be of interest to some members of the community. While the historical details are interesting, I am not sure I agree with the analysis. It would be interesting to hear what others make of this. To make this item slightly more relevant to LtU, let me point out that both the LC and category theory are mentioned (although they are really discussed only in the references).

By Ehud Lamm at 2007-10-25 23:46 | General | History | Lambda Calculus
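The split the CCG item describes — syntactic categories constraining which combinations parse (a typechecking flavour), with lambda-terms as per-word semantics that compose into a first-order-logic form — can be illustrated with a toy sketch. This is an invented miniature, not the Zettlemoyer–Collins system: the lexicon, category encoding, and naive reduction loop are all assumptions made for illustration.

```python
# Toy CCG-style composition: each word carries a syntactic category and a
# lambda-term; forward/backward application combines adjacent constituents,
# and the lambda-terms compose into a first-order-logic string.

FWD, BWD = "/", "\\"  # functor categories: (slash, result, argument)

# Invented lexicon. "borders" gets (S\NP)/NP: it seeks an NP to its right,
# then an NP to its left, yielding a sentence S.
LEXICON = {
    "Texas":    ("NP", "texas"),
    "Oklahoma": ("NP", "oklahoma"),
    "borders":  ((FWD, (BWD, "S", "NP"), "NP"),
                 lambda obj: lambda subj: f"borders({subj},{obj})"),
}

def combine(left, right):
    """Try forward then backward application on (category, semantics) pairs."""
    (lc, ls), (rc, rs) = left, right
    if isinstance(lc, tuple) and lc[0] == FWD and lc[2] == rc:
        return (lc[1], ls(rs))      # X/Y  Y   =>  X  (apply left to right)
    if isinstance(rc, tuple) and rc[0] == BWD and rc[2] == lc:
        return (rc[1], rs(ls))      # Y  X\Y   =>  X  (apply right to left)
    return None

def parse(words):
    """Naive parser: repeatedly merge the first combinable adjacent pair."""
    items = [LEXICON[w] for w in words]
    while len(items) > 1:
        for i in range(len(items) - 1):
            merged = combine(items[i], items[i + 1])
            if merged:
                items[i:i + 2] = [merged]
                break
        else:
            raise ValueError("no parse")
    return items[0]

cat, sem = parse(["Texas", "borders", "Oklahoma"])
print(cat, sem)   # S borders(texas,oklahoma)
```

The point of the sketch is the division of labour: `combine` only consults categories (the "typechecking" side), while the logical form falls out of ordinary function application on the semantic side.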
Privacy and Contextual Integrity: Framework and Applications

Privacy and Contextual Integrity: Framework and Applications, A. Barth, A. Datta, J.C. Mitchell, and H. Nissenbaum. Proceedings of the IEEE Symposium on Security and Privacy, May 2006.
Contextual integrity is a part of a philosophical theory of privacy developed by the philosopher Helen Nissenbaum, and it's very neat to see it being applied to develop machine-checkable access-control formalisms.

By neelk at 2007-10-18 01:15 | General
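To give a flavour of what "machine-checkable" means here, below is a toy sketch — far simpler than the paper's temporal-logic framework, with norms, roles, and attributes all invented for illustration — of checking communications against context-relative norms: a transmission is permitted only when some norm covers it and no matching norm forbids it.

```python
# Toy contextual-integrity check (invented example, not the Barth et al.
# formalism): messages flow between roles within a context, and norms state
# whether a given (context, sender, receiver, attribute) flow is allowed.
from collections import namedtuple

Msg  = namedtuple("Msg",  "context sender_role receiver_role attribute")
Norm = namedtuple("Norm", "context sender_role receiver_role attribute allowed")

NORMS = [
    Norm("healthcare", "patient", "doctor",  "symptoms", True),   # appropriate flow
    Norm("healthcare", "doctor",  "insurer", "symptoms", False),  # norm violation
]

def permitted(msg, norms):
    """A message is permitted iff some norm matches it and none forbids it."""
    verdicts = [n.allowed for n in norms
                if (n.context, n.sender_role, n.receiver_role, n.attribute)
                   == (msg.context, msg.sender_role, msg.receiver_role, msg.attribute)]
    # no matching norm at all => conservatively forbidden
    return bool(verdicts) and all(verdicts)

trace = [
    Msg("healthcare", "patient", "doctor",  "symptoms"),
    Msg("healthcare", "doctor",  "insurer", "symptoms"),
]
print([permitted(m, NORMS) for m in trace])   # [True, False]
```

The paper's framework is considerably richer — norms there are expressed in linear temporal logic over whole communication traces, so obligations about *future* actions can be stated — but the core move, compiling a philosophical notion of appropriate information flow into a mechanically checkable policy, is the same.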