Linear Logical Algorithms
Linear Logical Algorithms, Robert J. Simmons and Frank Pfenning, 2008.
In my last post, I linked to a paper by Ganzinger and McAllester about specifying algorithms as logic programs, and a) admired how concise and natural the programs were, and b) was sad that the logic programs used some "non-logical" operations, such as deletion. So, what does it mean for an operation to be "non-logical", and why is it a bad thing? Roughly speaking, you can think of the analogy: non-logical operations are to logic programs what impure operations are to functional programs -- they are features which break some parts of the equational theory of the language.

Now, the Curry-Howard correspondence for functional programs says that types are propositions, and programs are proofs. It turns out that a different version of this correspondence holds for logic programs: in logic programming, a set of propositions is a program, and the execution of a program corresponds to a process of proof search -- you get a success when execution finds a proof of the goal. When you have nonlogical operations in your logic programming language, you've introduced operators that don't correspond to valid rules of inference, so even if your logic program succeeds, the success might not correspond to a real proof.

Deletion of facts from a database is a good example of a nonlogical operation. Regular intuitionistic and classical logic is monotonic: deduction from premises can only learn new facts; it can never disprove things you already knew to be true. Since deletion removes facts from the set of things you know, it can't have a logical interpretation in classical/intuitionistic logic. However, not all logics are monotonic, and in this paper Simmons and Pfenning show that if you take the language of propositions to be a fragment of linear logic, then all of the operations that Ganzinger and McAllester use actually do have a nice logical interpretation.

Logical Algorithms
Logical Algorithms, Harald Ganzinger and David McAllester. ICALP 2002.
So, there are two main styles of logic programming. The first is Prolog-style goal-directed, or backwards, search. The idea is that you have a set of rules and a goal, and you nondeterministically choose rules that might have proven that goal, trying to find a sequence of deductions that could have proven it. It's called backwards search since you are trying to reason backwards from the goal towards a full proof.

The other style is, naturally, called forwards search (confusingly, this is also called the inverse method in theorem proving). Here you have a goal, some rules, and a starting set of facts. You apply the rules to the facts you have, enlarging your database of facts and enabling more deductions. You keep doing this until either you discover the goal you were trying to prove in the database of facts, or the database saturates (ie, no more deductions are provable) and the goal is unprovable. The database is an implicit data structure, which you update as part of the search. This makes forwards search a particularly natural method when you're trying to compute closures -- graph algorithms, dataflow analyses, that kind of thing.

While we've discussed applications of forward logic programming before, I thought it might be good to link to a discussion of the methodology of specifying algorithms as forward logic programs and analyzing their complexity. The language in this paper permits deletion of facts from the database, which is unfortunately a non-logical operation -- in classical and intuitionistic logic, deduction can only increase the number of facts you know. But with a change of logic, it can be made logical. That'll be the next paper I post a link to. :)

History of Logic Programming: What went wrong, What was done about it, and What it might mean for the future
Carl Hewitt is speaking tomorrow at Stanford's CSLI CogLunch on the history of logic programming.
A paper is here, so LtU readers can offer their perspectives on the argument. By Ehud Lamm at 2008-05-07 23:03 | History | Logic/Declarative
Pure, Declarative, and Constructive Arithmetic Relations
Pure, Declarative, and Constructive Arithmetic Relations. Oleg Kiselyov, William E. Byrd, Daniel P. Friedman, and Chung-chieh Shan. FLOPS 2008. (source code)
We present decidable logic programs for addition, multiplication, division with remainder, exponentiation, and logarithm with remainder over the unbounded domain of natural numbers. Our predicates represent relations without mode restrictions or annotations. They are fully decidable under the common, DFS-like, SLD resolution strategy of Prolog or under an interleaving refinement of DFS...

So you've read The Reasoned Schemer and were excited that, unlike the built-in operations in Prolog, the arithmetic relations (over binary numbers) are fully relational. For example, addition can also be used for subtraction, and multiplication for factoring numbers and for generating all triples of numbers related by multiplication. Now comes this paper to explain the motivation behind some of the more arcane definitions needed to implement arithmetic in a fully relational style, and to prove their properties formally. The paper develops unary and binary arithmetic relations in pure Prolog (with no cuts, negation, or introspection).

LtU readers will also be interested in yet another embedding of pure Prolog into Haskell that the authors offer. It is not meant to be the most efficient or convenient Prolog implementation (it wasn't even intended to be an implementation of a logic system). It was explicitly designed to be easy to reason about, and so to help prove certain properties of SLD and similar evaluation strategies. The main difference of DefinitionTree from other embeddings of Prolog in Haskell has to do with the generation of fresh names for logic variables. In DefinitionTree, name generation is not an effect, and naming is fully decoupled from evaluation. The evaluation no longer needs to carry state for the generation of fresh names, hence the evaluator is easier to reason about equationally.

NEXCEL, a Deductive Spreadsheet
NEXCEL, a Deductive Spreadsheet, Iliano Cervesato. 2006.
This is a neat paper about using Datalog-style relations to extend spreadsheets with some deductive database features. It seems like Datalog represents a real sweet spot in the design space for logic programming -- I've seen a lot of people put it to effective use. By neelk at 2007-11-10 17:12 | DSL | Logic/Declarative
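To give a flavor of the bottom-up evaluation that Datalog systems perform, here is a toy fixpoint computation of transitive closure in Python. This is only an illustration of the forward-chaining idea discussed above, not of NEXCEL's actual engine, and the `edge`/`path` relation names are mine:

```python
def transitive_closure(edges):
    """Naive bottom-up evaluation of the Datalog program
         path(X,Y) :- edge(X,Y).
         path(X,Z) :- path(X,Y), edge(Y,Z).
       Facts are added until no new ones appear (saturation)."""
    path = set(edges)                          # base rule seeds the database
    while True:
        new = {(x, z) for (x, y) in path
                      for (y2, z) in edges if y == y2} - path
        if not new:                            # database saturated
            return path
        path |= new                            # add the derived facts

edges = {(1, 2), (2, 3), (3, 4)}
print(sorted(transitive_closure(edges)))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```

Real Datalog engines use semi-naive evaluation (only joining against facts that are new since the last round) and indexing, but the termination argument is the same: the database only grows, and it is bounded by the finite set of possible facts.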
The End of an Architectural Era (It’s Time for a Complete Rewrite)
The End of an Architectural Era (It’s Time for a Complete Rewrite). Michael Stonebraker, Samuel Madden, Daniel J. Abadi, Stavros Harizopoulos, Nabil Hachem, Pat Helland. VLDB 2007.

A not directly PL-related paper about a new database architecture, but the authors provide some interesting and possibly controversial perspectives:
The somewhat performance-focused abstract:
A critical comment by Amazon's CTO, Werner Vogels. By Manuel J. Simoni at 2007-10-19 13:46 | DSL | Implementation | Logic/Declarative | Ruby
Escape from Zurg: An Exercise in Logic Programming
Escape from Zurg: An Exercise in Logic Programming by Martin Erwig. Journal of Functional Programming, Vol. 14, No. 3, 253-261, 2004.
By Andris Birkmanis at 2007-09-01 15:04 | Functional | Logic/Declarative | Teaching & Learning
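The puzzle itself: Buzz, Woody, Rex and Hamm must cross a bridge within 60 minutes, at most two at a time, and whoever crosses must carry the group's single flashlight. Erwig solves it in Prolog and Haskell to compare the two styles; as a companion, here is a brute-force search sketch in Python (the function and move names are mine, the toys and times are from the paper):

```python
from itertools import combinations

TIMES = {'Buzz': 5, 'Woody': 10, 'Rex': 20, 'Hamm': 25}
LIMIT = 60

def solve(left, right=frozenset(), spent=0, moves=()):
    """Yield move sequences that get everyone across within LIMIT.
    A round is: a pair crosses with the flashlight, then (if anyone
    is still waiting) one toy brings the flashlight back."""
    for pair in combinations(sorted(left), 2):
        cost = spent + max(TIMES[t] for t in pair)   # slower toy sets the pace
        if cost > LIMIT:
            continue
        new_left = left - set(pair)
        new_right = right | set(pair)
        crossed = moves + (('over', pair),)
        if not new_left:                             # everyone is across
            yield crossed
            continue
        for t in sorted(new_right):                  # someone returns
            back = cost + TIMES[t]
            if back <= LIMIT:
                yield from solve(new_left | {t}, new_right - {t},
                                 back, crossed + (('back', t),))

for plan in solve(frozenset(TIMES)):
    print(plan)
```

This finds the two classic 60-minute plans: Buzz and Woody cross first, one of them brings the flashlight back, Rex and Hamm cross, the other of the fast pair returns, and Buzz and Woody cross again. The point of Erwig's paper is how differently the same search reads in Prolog and in typed functional style.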
Theory and Practice of Constraint Handling Rules
Theory and Practice of Constraint Handling Rules, Thom Frühwirth, Journal of Logic Programming, 1994.
In the last post, we had some requests for constraint programming, so here you go. Constraint solving programs are often essentially stateful algorithms, and I see CHR as a particularly nice way of handling all that state in a declarative way. (They have a very pretty semantics as proof search in linear logic, too.) By neelk at 2007-08-25 09:04 | Logic/Declarative | Semantics
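The standard introductory CHR example is the `leq` handler, whose rules are: reflexivity `leq(X,X) <=> true`, antisymmetry `leq(X,Y), leq(Y,X) <=> X = Y`, and transitivity `leq(X,Y), leq(Y,Z) ==> leq(X,Z)`. Real CHR matches rule heads against a constraint store with unification; the following Python sketch handles only ground constants, just to show the simplify-and-propagate flavor (the function name and representation are mine):

```python
def solve_leq(constraints):
    """Run the ground 'leq' rules to a fixpoint over a constraint store.
    Returns the remaining leq facts and the equalities derived by
    antisymmetry (as a map from each merged constant to its representative)."""
    store = set(constraints)
    eqs = {}                                 # union-find parent map
    def find(x):
        while eqs.get(x, x) != x:
            x = eqs[x]
        return x
    changed = True
    while changed:
        changed = False
        store = {(find(x), find(y)) for (x, y) in store}
        store -= {(x, y) for (x, y) in store if x == y}      # reflexivity
        for (x, y) in list(store):
            if (y, x) in store:                              # antisymmetry
                eqs[find(y)] = find(x)
                changed = True
        derived = {(x, z) for (x, y) in store
                          for (y2, z) in store if y == y2}
        if not derived <= store:                             # transitivity
            store |= derived
            changed = True
    return store, {v: find(v) for v in eqs}

store, eqs = solve_leq({('a', 'b'), ('b', 'c'), ('c', 'a')})
print(store, eqs)   # the cycle a<=b<=c<=a collapses: empty store, a=b=c
```

Note the CHR distinction this mirrors: reflexivity and antisymmetry are simplification rules (they remove constraints from the store, consuming state), while transitivity is a propagation rule (it only adds) -- which is exactly where the linear-logic reading mentioned above comes in.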
Cloning-Based Context-Sensitive Pointer Alias Analysis Using Binary Decision Diagrams
Cloning-Based Context-Sensitive Pointer Alias Analysis Using Binary Decision Diagrams, John Whaley and Monica S. Lam. PLDI 2004.
Binary decision diagrams are one of the coolest data structures of the last 20 years, and are also one of the basic enabling data structures underlying modern SAT solvers. Using them to implement Datalog is a really clever idea.

EDIT: I relied on memory instead of Google, and got it wrong about BDDs and SAT solving. Modern DPLL-based SAT solvers generally do not use BDDs, because when your solutions are far apart in the solution space, the representing BDD can get quite large. BDDs are widely used in verification, though, and are also still one of my favorite data structures. :) By neelk at 2007-08-08 08:12 | DSL | Implementation | Logic/Declarative
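To see why BDDs are canonical -- the property that lets equivalent boolean functions (and hence equal relations) be compared by a pointer check -- here is a minimal reduced ordered BDD in Python. It is a toy illustration of the data structure, not the paper's bddbddb implementation, and all names are mine:

```python
# Two invariants make ROBDDs canonical: nodes are hash-consed so equal
# subgraphs are shared, and a node whose two branches coincide is elided.
TRUE, FALSE = True, False
_nodes = {}                       # hash-consing table: (var, lo, hi) -> node

def node(var, lo, hi):
    if lo is hi:                  # redundant test: both branches equal
        return lo
    return _nodes.setdefault((var, lo, hi), (var, lo, hi))

def var(v):
    """The BDD for the single variable v."""
    return node(v, FALSE, TRUE)

def apply_op(op, a, b, memo=None):
    """Combine two BDDs with a boolean operator, sharing repeated work."""
    memo = {} if memo is None else memo
    if isinstance(a, bool) and isinstance(b, bool):
        return op(a, b)
    key = (id(a), id(b))
    if key in memo:
        return memo[key]
    # split on the smallest variable appearing at either root
    av = a[0] if not isinstance(a, bool) else float('inf')
    bv = b[0] if not isinstance(b, bool) else float('inf')
    v = min(av, bv)
    a_lo, a_hi = (a[1], a[2]) if av == v else (a, a)
    b_lo, b_hi = (b[1], b[2]) if bv == v else (b, b)
    r = node(v, apply_op(op, a_lo, b_lo, memo),
                apply_op(op, a_hi, b_hi, memo))
    memo[key] = r
    return r

AND = lambda x, y: x and y
OR = lambda x, y: x or y

# (x1 AND x2) OR x1 reduces to just x1; canonicity makes the
# equivalence check a pointer comparison.
f = apply_op(OR, apply_op(AND, var(1), var(2)), var(1))
print(f is var(1))   # True
```

The Datalog connection: a relation over k-bit columns is just a boolean function of its bit variables, so relational join, union, and projection become BDD operations like the `apply` above -- and the saturation loop stops exactly when an iteration returns a pointer-equal BDD.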
CLL: A Concurrent Language Built from Logical Principles
CLL: A Concurrent Language Built from Logical Principles by Deepak Garg, 2005.
In this report, we use both the Curry-Howard isomorphism and proof-search to design a concurrent programming language from logical principles. ... Our underlying logic is a first-order intuitionistic linear logic where all right synchronous connectives are restricted to a monad.

Yet another example of using monads to embed effectful computations into a pure FP. Another interesting part is the methodology of deriving a PL from a logic. By Andris Birkmanis at 2007-06-30 18:06 | Logic/Declarative | Semantics | Type Theory