Logic/Declarative

Linear Logical Algorithms

Linear Logical Algorithms, Robert J. Simmons and Frank Pfenning, 2008.

Bottom-up logic programming can be used to declaratively specify many algorithms in a succinct and natural way, and McAllester and Ganzinger have shown that it is possible to define a cost semantics that enables reasoning about the running time of algorithms written as inference rules. Previous work with the programming language LolliMon demonstrates the expressive power of logic programming with linear logic in describing algorithms that have imperative elements or that must repeatedly make mutually exclusive choices. In this paper, we identify a bottom-up logic programming language based on linear logic that is amenable to efficient execution and describe a novel cost semantics that can be used for complexity analysis of algorithms expressed in linear logic.

In my last post, I linked to a paper by Ganzinger and McAllester about specifying algorithms as logic programs, and a) admired how concise and natural the programs were, and b) was sad that the logic programs used some "non-logical" operations, such as deletion.

So, what does it mean for an operation to be "non-logical", and why is it a bad thing? Roughly speaking, non-logical operations are to logic programs what impure operations are to functional programs: features that break part of the equational theory of the language. Recall that the Curry-Howard correspondence for functional programs says that types are propositions and programs are proofs. A different version of this correspondence holds for logic programs: a set of propositions is a program, and executing the program corresponds to proof search -- the execution succeeds when it finds a proof of the goal.

When you have non-logical operations in your logic programming language, you've introduced operators that don't correspond to valid rules of inference, so even if your logic program succeeds, the success might not correspond to a real proof. Deletion of facts from the database is a good example. Ordinary intuitionistic and classical logic is monotonic: deduction from premises can only learn new facts; it can never invalidate things you already knew to be true. Since deletion removes facts from the set of things you know, it can't have a logical interpretation in classical or intuitionistic logic.
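To make that concrete, here is a tiny Prolog fragment (a generic illustration, not from either paper) using the standard dynamic-database builtins assertz/1 and retract/1. Whether fact(a) is provable depends on when you ask, which is exactly the kind of behavior a monotonic logic cannot express:

    % fact/1 is a mutable predicate in the Prolog database.
    :- dynamic fact/1.

    demo :-
        assertz(fact(a)),   % add fact(a) to the database
        fact(a),            % provable here...
        retract(fact(a)),   % delete it again
        \+ fact(a).         % ...and no longer provable here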

However, it turns out that not all logics are monotonic, and in this paper Simmons and Pfenning show that if you take the language of propositions to be a fragment of linear logic, then all of the operations that Ganzinger and McAllester use actually do have a nice logical interpretation.
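The key point is that linear implication consumes its antecedent. Schematically (this is just the intuition, not the exact rule format the paper uses), a rule saying "if C holds and A is in the database, delete A and add B" can be read as the linear-logic proposition

    !C ⊗ A ⊸ B

where the persistent premise C is marked reusable with the ! modality, while A is used up by the transition that produces B. Deletion stops being an extra-logical side effect and becomes exactly what it means to use a linear hypothesis.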

Logical Algorithms

Logical Algorithms, Harald Ganzinger and David McAllester. ICLP 2002.

It is widely accepted that many algorithms can be concisely and clearly expressed as logical inference rules. However, logic programming has been inappropriate for the study of the running time of algorithms because there has not been a clear and precise model of the run time of a logic program. We present a logic programming model of computation appropriate for the study of the run time of a wide variety of algorithms.

So, there are two main styles in logic programming. The first is Prolog-style goal-directed, or backwards, search. The idea is that you have a set of rules and a goal, and you nondeterministically choose rules whose conclusions match the goal, then recursively try to prove their premises, searching for a complete sequence of deductions. It's called backwards search since you reason backwards from the goal towards a full proof.

The other style is, naturally, called forwards search (confusingly, this is also known as the inverse method in theorem proving). The idea is that you have a goal, some rules, and a starting set of facts. You apply the rules to the facts you have, enlarging your database of facts and enabling more deductions. You keep doing this until either the goal you were trying to prove shows up in the database, or the database saturates (i.e., no new deductions are possible) and the goal is unprovable. The database acts as an implicit data structure that you update as part of the search, which makes forwards search a particularly natural method when you're trying to compute closures -- graph algorithms, dataflow analyses, that kind of thing.
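Graph reachability is the standard illustration. Written as inference rules (this is the generic textbook example, not code from the paper), transitive closure looks like this:

    % Facts: the edges of the graph.
    edge(a, b).
    edge(b, c).
    edge(c, d).

    % Rules, read bottom-up: whenever the premises are in the
    % database, add the conclusion.
    path(X, Y) :- edge(X, Y).
    path(X, Z) :- edge(X, Y), path(Y, Z).

A forward-chaining engine starts from the edge facts and keeps firing the rules until the database saturates; at that point path/2 holds exactly the reachability relation. (Prolog itself would run the same clauses top-down, goal-first; the bottom-up reading is what Datalog-style engines and the cost model in these papers are about.)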

While we've discussed applications of forward logic programming before, I thought it might be good to link to a discussion of the methodology of how to specify algorithms as forward logic programs and analyze their complexity.

The language in this paper permits deletion of facts from the database, which is unfortunately a non-logical operation -- in classical and intuitionistic logic, deduction can only increase the number of facts you know. But with a change of logic, it can be made logical. That'll be the next paper I post a link to. :)

History of Logic Programming: What went wrong, What was done about it, and What it might mean for the future

Carl Hewitt is speaking tomorrow at Stanford's CSLI CogLunch on the history of logic programming.

A paper is here, so LtU readers can offer their perspectives on the argument.

Pure, Declarative, and Constructive Arithmetic Relations

Pure, Declarative, and Constructive Arithmetic Relations. Oleg Kiselyov, William E. Byrd, Daniel P. Friedman, and Chung-chieh Shan. FLOPS 2008. (source code)

We present decidable logic programs for addition, multiplication, division with remainder, exponentiation, and logarithm with remainder over the unbounded domain of natural numbers. Our predicates represent relations without mode restrictions or annotations. They are fully decidable under the common, DFS-like, SLD resolution strategy of Prolog or under an interleaving refinement of DFS...

[The] attempts to define decidable multiplication even for the seemingly trivial unary case show the difficulties that become more pronounced as we move to binary arithmetic. We rely on a finite representation of infinite domains, precise instantiatedness analysis, and reasoning about SLD using search trees.

So you've read The Reasoned Schemer and were excited that, unlike Prolog's built-in arithmetic, the arithmetic relations (over binary numbers) were implemented as genuine relations, usable in any mode. For example, addition can also be used for subtraction, and multiplication for factoring numbers and for generating all triples of numbers related by multiplication. Now comes this paper to explain the motivation behind some of the more arcane definitions needed to implement arithmetic in a fully relational style, and to prove their properties formally. The paper develops unary and binary arithmetic relations in pure Prolog (with no cuts, negation, or introspection).
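To get a feel for the style, here is the textbook unary addition relation in pure Prolog (the paper's actual definitions, especially for multiplication and beyond, are considerably more careful; this is just the simplest instance of the idea):

    % Natural numbers in unary: z, s(z), s(s(z)), ...
    % add(M, N, K) holds iff M + N = K.
    add(z, N, N).
    add(s(M), N, s(K)) :- add(M, N, K).

    % ?- add(s(z), s(s(z)), K).     K = s(s(s(z)))        (1 + 2 = 3)
    % ?- add(M, N, s(s(z))).        enumerates every M, N with M + N = 2
    % ?- add(M, s(z), s(s(z))).     M = s(z)              (2 - 1 = 1)

Because there are no mode annotations, the same two clauses compute addition, subtraction, and the enumeration of all decompositions of a number.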

LtU readers will also be interested in yet another embedding of pure Prolog into Haskell that the authors offer. It is not meant to be the most efficient or convenient Prolog implementation (it wasn't even intended to be an implementation of a logic system). It was explicitly designed to be easier to reason about, and so to help prove certain properties of SLD or similar evaluation strategies. The main difference between DefinitionTree and other embeddings of Prolog in Haskell has to do with the generation of fresh names for logic variables. In DefinitionTree, name generation is not an effect, and naming is fully decoupled from evaluation. The evaluation no longer needs to carry a state for generating fresh names, which makes the evaluator easier to reason about equationally.

NEXCEL, a Deductive Spreadsheet

NEXCEL, a Deductive Spreadsheet, Iliano Cervesato. 2006.

Usability and usefulness have made the spreadsheet one of the most successful computing applications of all times: millions rely on it every day for anything from typing grocery lists to developing multimillion dollar budgets. One thing spreadsheets are not very good at is manipulating symbolic data and helping users make decisions based on them. By tapping into recent research in Logic Programming, Databases and Cognitive Psychology, we propose a deductive extension to the spreadsheet paradigm which addresses precisely this issue. The accompanying tool, which we call NEXCEL, is intended as an automated assistant for the daily reasoning and decision-making needs of computer users, in the same way as a spreadsheet application such as Microsoft Excel assists them every day with calculations simple and complex. Users without formal training in Logic or even Computer Science can interactively define logical rules in the same simple way as they define formulas in Excel. NEXCEL immediately evaluates these rules thereby returning lists of values that satisfy them, again just like with numerical formulas. The deductive component is seamlessly integrated into the traditional spreadsheet so that a user not only still has access to the usual functionalities, but is able to use them as part of the logical inference and, dually, to embed deductive steps in a numerical calculation.

This is a neat paper about using Datalog-style relations to extend spreadsheets with some deductive database features. It seems like Datalog represents a real sweet spot in the design space for logic programming -- I've seen a lot of people put it to effective use.

The End of an Architectural Era (It’s Time for a Complete Rewrite)

The End of an Architectural Era (It’s Time for a Complete Rewrite). Michael Stonebraker, Samuel Madden, Daniel J. Abadi, Stavros Harizopoulos, Nabil Hachem, Pat Helland. VLDB 2007.

A not directly PL-related paper about a new database architecture, but the authors provide some interesting and possibly controversial perspectives:

  • They split the application into per-core, single-threaded instances without any communication between them.
  • Instead of using SQL from an external (web app) process to communicate with the database, they envision embedding Ruby on Rails directly into the database.
  • They state that most data warehouse tasks rely on pre-canned queries only, so there is no need for ad-hoc querying.

The somewhat performance-focused abstract:

In two previous papers some of us predicted the end of "one size fits all" as a commercial relational DBMS paradigm. These papers presented reasons and experimental evidence that showed that the major RDBMS vendors can be outperformed by 1-2 orders of magnitude by specialized engines in the data warehouse, stream processing, text, and scientific data base markets.

Assuming that specialized engines dominate these markets over time, the current relational DBMS code lines will be left with the business data processing (OLTP) market and hybrid markets where more than one kind of capability is required. In this paper we show that current RDBMSs can be beaten by nearly two orders of magnitude in the OLTP market as well. The experimental evidence comes from comparing a new OLTP prototype, H-Store, which we have built at M.I.T. to one of the popular RDBMSs on the standard transactional benchmark, TPC-C.

We conclude that the current RDBMS code lines, while attempting to be a "one size fits all" solution, in fact, excel at nothing. Hence, they are 25 year old legacy code lines that should be retired in favor of a collection of "from scratch" specialized engines. The DBMS vendors (and the research community) should start with a clean sheet of paper and design systems for tomorrow's requirements, not continue to push code lines and architectures designed for yesterday's requirements.

A critical comment by Amazon's CTO, Werner Vogels.

Escape from Zurg: An Exercise in Logic Programming

Escape from Zurg: An Exercise in Logic Programming by Martin Erwig. Journal of Functional Programming, Vol. 14, No. 3, 253-261, 2004

In this article we will illustrate with an example that modern functional programming languages like Haskell can be used effectively for programming search problems, in contrast to the widespread belief that Prolog is much better suited for tasks like these.

...

The example that we want to consider is a homework problem that we have given in a graduate level course on programming languages. The problem was one of several exercises to practice programming in Prolog. After observing that many students had problems manipulating term structures in Prolog (after already having learned to use data types in Haskell) and spending a lot of time on debugging, the question arose whether it would be as difficult to develop a solution for this problem in Haskell. This programming exercise worked well, and we report the result in this paper.

(Haskell source code)

Theory and Practice of Constraint Handling Rules

Theory and Practice of Constraint Handling Rules, Thom Frühwirth, Journal of Logic Programming, 1998.

Constraint Handling Rules (CHR) are our proposal to allow more flexibility and application-oriented customization of constraint systems. CHR are a declarative language extension especially designed for writing user-defined constraints. CHR are essentially a committed-choice language consisting of multi-headed guarded rules that rewrite constraints into simpler ones until they are solved.

In this broad survey we cover all aspects of CHR as they currently present themselves. Going from theory to practice, we will define the syntax and semantics of CHR, introduce an important decidable property, confluence, of CHR programs and define a tight integration of CHR with constraint logic programming languages. This survey then describes implementations of the language before we review several constraint solvers -- both traditional and non-standard ones -- written in the CHR language. Finally we introduce two innovative applications that benefited from being written in CHR.

In the last post, we had some requests for constraint programming, so here you go. Constraint solving programs are often essentially stateful algorithms, and I see CHR as a particularly nice way of handling all that state in a declarative way. (They have a very pretty semantics as proof search in linear logic, too.)
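For a taste of what CHR programs look like, here is the classic partial-order (leq) solver that shows up in most CHR tutorials, written in SWI-Prolog's CHR syntax (a standard teaching example, not taken from the survey's text):

    :- use_module(library(chr)).
    :- chr_constraint leq/2.

    reflexivity  @ leq(X, X) <=> true.                   % simplify away trivial constraints
    antisymmetry @ leq(X, Y), leq(Y, X) <=> X = Y.       % multi-headed simplification
    idempotence  @ leq(X, Y) \ leq(X, Y) <=> true.       % drop duplicate copies
    transitivity @ leq(X, Y), leq(Y, Z) ==> leq(X, Z).   % propagation: add, don't remove

The query ?- leq(A, B), leq(B, C), leq(C, A). propagates leq(A, C), and antisymmetry then collapses everything to A = B = C. Simplification rules (<=>) remove constraints from the store and propagation rules (==>) add new ones, which is exactly the stateful flavor described above, handled declaratively.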

Cloning-Based Context-Sensitive Pointer Alias Analysis Using Binary Decision Diagrams

Cloning-Based Context-Sensitive Pointer Alias Analysis Using Binary Decision Diagrams, John Whaley and Monica S. Lam. PLDI 2004.

This paper presents the first scalable context-sensitive, inclusion-based pointer alias analysis for Java programs. Our approach to context sensitivity is to create a clone of a method for every context of interest, and run a context-insensitive algorithm over the expanded call graph to get context-sensitive results. For precision, we generate a clone for every acyclic path through a program's call graph, treating methods in a strongly connected component as a single node. Normally, this formulation is hopelessly intractable as a call graph often has 10^14 acyclic paths or more. We show that these exponential relations can be computed efficiently using binary decision diagrams (BDDs). Key to the scalability of the technique is a context numbering scheme that exposes the commonalities across contexts. We applied our algorithm to the most popular applications available on Sourceforge, and found that the largest programs, with hundreds of thousands of Java bytecodes, can be analyzed in under 20 minutes.

This paper shows that pointer analysis, and many other queries and algorithms, can be described succinctly and declaratively using Datalog, a logic programming language. We have developed a system called bddbddb that automatically translates Datalog programs into highly efficient BDD implementations. We used this approach to develop a variety of context-sensitive algorithms including side effect analysis, type analysis, and escape analysis.

Binary decision diagrams are one of the coolest data structures of the last 20 years, and are also one of the basic enabling data structures underlying modern SAT solvers. Using them to implement Datalog is a really clever idea.
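The flavor of the Datalog involved is an Andersen-style inclusion analysis. As a rough sketch (the relation names here are illustrative; the paper's actual rules also track calling context, among other refinements):

    % vP(V, H):      variable V may point to heap object H.
    % hP(H1, F, H2): field F of object H1 may point to H2.
    vP(V, H)      :- vP0(V, H).                                  % allocation sites
    vP(V1, H)     :- assign(V1, V2), vP(V2, H).                  % v1 = v2
    hP(H1, F, H2) :- store(V1, F, V2), vP(V1, H1), vP(V2, H2).   % v1.f = v2
    vP(V2, H2)    :- load(V1, F, V2), vP(V1, H1), hP(H1, F, H2). % v2 = v1.f

bddbddb encodes each relation as a BDD over bit-encoded domains and turns rules like these into BDD operations that it iterates to a fixed point.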

EDIT: I relied on memory instead of Google, and got it wrong about BDDs and SAT solving. Modern DPLL-based SAT solvers generally do not use BDDs, because when the solutions are far apart in the solution space, the BDD representing them gets quite large. BDDs are widely used in verification, though, and are still one of my favorite data structures. :)

CLL: A Concurrent Language Built from Logical Principles

CLL: A Concurrent Language Built from Logical Principles by Deepak Garg, 2005.

In this report, we use both the Curry-Howard isomorphism and proof-search to design a concurrent programming language from logical principles. ... Our underlying logic is a first-order intuitionistic linear logic where all right synchronous connectives are restricted to a monad.

Yet another example of using monads to embed effectful computations into a pure functional language. Another interesting aspect is the methodology of deriving a programming language from a logic.