Uniform Proofs as a Foundation for Logic Programming

Uniform Proofs as a Foundation for Logic Programming (Dale Miller, Gopalan Nadathur, Frank Pfenning & Andre Scedrov, 1989/1991) introduces the concept of uniform provability, the foundation of abstract logic programming, which in turn drives much of the more principled work in modern logic programming.

In particular, this paper (i) shows that logics supporting uniform provability support goal-directed proof search, and so may be used as the foundation of a logic programming language, and (ii) shows that the notion of uniform provability can be used to explore new forms of logic programming, in particular introducing the class of hereditary Harrop formulae which underlies lambda-Prolog. A classic.
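
Roughly, as I recall the paper, the first-order hereditary Harrop formulae extend Horn clauses by allowing implications and universal quantifiers inside goals. A sketch of the goal formulae G and program clauses D, with A ranging over atoms:

    G ::= true | A | G ∧ G | G ∨ G | ∃x G | ∀x G | D ⊃ G
    D ::= A | G ⊃ A | D ∧ D | ∀x D

A proof is uniform when a non-atomic goal is always decomposed by the rule for its top-level connective before the program is consulted; the paper's result, roughly, is that for these formulae such goal-directed search loses no provable goals.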

Abstract logic programming is explored in a more introductory manner in this paper.

If you find this paper

If you find this paper interesting, you'll also like Andreoli's paper on focused proofs in linear logic, which allow an even more aggressive reduction in nondeterminism. It works for intuitionistic logic, too.

Not much interest

I find it curious that there isn't much interest in logic programming from the programming community anymore. I've encountered very few professional uses of logic programming.

There have been many advances in the field of logic programming since Prolog, and it seems (maybe just from my perspective) that there has been renewed interest in the field over the last five or so years.

Does anyone imagine a logic programming language competing on the same playing field as PHP, Python, Ruby, and Perl?

Gavin

Take one

The idea of backtracking and rules is probably too novel to replace traditional methods in general-purpose programming. This is easy to understand on many levels. Basically, the perceived advantage is not enough to justify such a big change. This is why backtracking control and logic are often hidden behind a conventional style, as in Icon or Oz. Forward chaining is much more popular but enters the mainstream in disguise; I think Erlang is an example, and agent thinking in general seems to be one too. But these are just impressions. Reasonable people are allowed to disagree.

Resource predictability

I don't think the problem with logic programming has been novelty or difficulty of the concepts; in fact, many people have investigated Prolog at one point or another. The main obstacle to widespread adoption, I think, has been the difficulty of consistently writing in a logic programming language in a manner that isn't subject to search space explosion.

Concurrent logic programming has made this easier, though I don't think the problem is solved. When it is, I predict that we will see logic programming languages competing with languages such as Ruby for general-purpose scripting.

Why logic programming is not more widespread

the difficulty of consistently writing in a logic programming language in a manner that isn't subject to search space explosion

Once you get beyond the hump of writing toy programs, this is not difficult at all. There are plenty of good books that explain how to do it. For example, see Richard O'Keefe's book The Craft of Prolog or Sterling and Shapiro's The Art of Prolog.
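
For what it's worth, here is the kind of small rewrite those books teach, a hypothetical sketch in plain Prolog (assuming the usual lists library with permutation/2): both predicates define the same sorting relation, but the first explores up to n! permutations while the second does no search at all.

    % Generate-and-test: logically clear, but the search space is all permutations.
    perm_sort(List, Sorted) :-
        permutation(List, Sorted),
        ordered(Sorted).

    ordered([]).
    ordered([_]).
    ordered([X, Y|Zs]) :- X =< Y, ordered([Y|Zs]).

    % The same relation computed without search: ordinary insertion sort.
    insert_sort([], []).
    insert_sort([X|Xs], Sorted) :-
        insert_sort(Xs, Rest),
        insert(X, Rest, Sorted).

    insert(X, [], [X]).
    insert(X, [Y|Ys], [X, Y|Ys]) :- X =< Y.
    insert(X, [Y|Ys], [Y|Zs])    :- X > Y, insert(X, Ys, Zs).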

Concurrent logic programming has made this easier

Concurrent logic programming has no search. Modern logic programming systems combine both styles of logic programming, which gives the best of both worlds. Mozart is one such system (which implements the Oz language); other good examples are CIAO and Curry, which are also full-fledged programming systems.

Logic programming is a very different way of thinking than imperative programming. To do it well, you have to think at two levels. You reason about correctness at the logical level and you reason about efficiency at the operational level. This works very well as a separation of concerns. Kowalski's famous equation Algorithm = Logic + Control can be read this way. This is a profound example of Aspect-Oriented Programming, twenty years before the term was invented.

But most programmers don't know much about logic. I am more and more convinced that the real problem is with computer science education. Programmers do not get enough discrete mathematics before being let loose on a computer. In what engineering field do you have graduates who do not know the mathematics underlying the field? Only in computer science!
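
To make the two levels concrete, here is a standard textbook illustration (my sketch, not anything from the paper): both predicates below define the same reversal relation, the logic, but differ in control and hence in cost.

    % Naive version: append/3 re-traverses the growing result, so it is O(n^2).
    naive_reverse([], []).
    naive_reverse([X|Xs], Rs) :-
        naive_reverse(Xs, Ss),
        append(Ss, [X], Rs).

    % Accumulator version: the same relation, computed in O(n).
    acc_reverse(Xs, Rs) :-
        acc_reverse(Xs, [], Rs).

    acc_reverse([], Rs, Rs).
    acc_reverse([X|Xs], Acc, Rs) :-
        acc_reverse(Xs, [X|Acc], Rs).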

not easy to use

I think part of the problem is that logic programming isn't explained as well as even functional programming. Haskell is still a strange language for the majority of mainstream developers, but there are quite a few tutorials for it on the web. There are several books which explain not just how to program in it but also how features from functional programming are implemented. For example, there are countless blog entries about exactly what a closure or continuation means and how to use it (or fake it) in Java/Python/JavaScript, etc. I still don't completely understand, but after reading about the same topic five or six times, one eventually understands at least the basics.

I can only say why I'm interested. I think it was Guy Steele at an MIT scripting language conference (I can't find the video link now) who mentioned that people should look at relational programming. Such a recommendation can't be ignored. I have a great deal of interest in databases (millions of other Java/VB developers do as well). As soon as one realizes that there is a link between what Prolog does and what databases do, it is inevitable that a huge number of people will take an interest in logic programming (unfortunately that link isn't often made).
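
For readers who haven't seen that link made explicit, here is a minimal, made-up sketch: Prolog facts play the role of rows, and a conjunctive query is a join.

    % Two "tables" as facts.
    employee(alice, sales).
    employee(bob,   engineering).
    salary(alice, 50000).
    salary(bob,   60000).

    % Roughly: SELECT name, amount FROM employee JOIN salary USING (name)
    %          WHERE dept = 'engineering'
    % ?- employee(Name, engineering), salary(Name, Amount).
    %    Name = bob, Amount = 60000.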

It may be difficult for beginners to formulate complete programs in a logic programming language, but I've often wondered why there aren't little utilities (as ubiquitous as GREP or calc.exe) which allow people to solve little problems quickly using logic.

Lastly, I'll link to a question I posted recently :).

Use cases?

It may be difficult for beginners to formulate complete programs in a logic programming language, but I've often wondered why there aren't little utilities (as ubiquitous as GREP or calc.exe) which allow people to solve little problems quickly using logic.

What kind of problems would you imagine such a utility solving? Text files and simple math problems are ubiquitous, and the user interface is pretty simple for both. I just can't think of any day-to-day uses for a little logic solver, or even what that tool would look like.

if you build it they will come

I don't have specific use cases. But if there are problems a computer can solve easily, why not allow that functionality? I never thought I'd use any of the unix utilities until the minute I needed them. Perhaps people who are well versed in logic programming can offer appropriate use cases.

Use cases

Parsing can be much simpler in a declarative setting. Specifically, Definite Clause Grammars (DCGs) are very easy to write and read. They are, however, usually quite difficult to make efficient, and most systems don't have enough determinism detection, tabling, or whatever else is needed to make things tractable. Given good ways of solving these problems, I would think that DCGs could replace the more complicated regexps completely.
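
As a small, hypothetical illustration of how readable DCGs can be (and of how they go beyond regular expressions), here is the language a^n b^n, which no classical regexp can describe:

    % anbn(N) recognises n a's followed by n b's, returning the count N.
    anbn(0) --> [].
    anbn(N) --> [a], anbn(M), [b], { N is M + 1 }.

    % ?- phrase(anbn(N), [a, a, b, b]).
    % N = 2.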

Another example, which in fact I ran into today while programming, is the creation of equivalence classes. I ended up writing my own unification.
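
One folklore trick along these lines (my own sketch, not the code mentioned above): pair each item with an unbound logic variable naming its class, so that merging two classes is a single unification and a membership test is just ==/2.

    % Pair every item with a fresh variable naming its class.
    init_classes([], []).
    init_classes([I|Is], [I-_|Ps]) :- init_classes(Is, Ps).

    % Merging two classes unifies their class variables.
    merge(A, B, Pairs) :-
        member(A-C, Pairs),
        member(B-C, Pairs).

    % Two items are in the same class iff their class variables are identical.
    same_class(A, B, Pairs) :-
        member(A-CA, Pairs),
        member(B-CB, Pairs),
        CA == CB.

    % ?- init_classes([a, b, c], Ps), merge(a, b, Ps), same_class(a, b, Ps).
    %    succeeds, while same_class(a, c, Ps) fails.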

The example of databases is also a good one. The declarative relational approach has proven to be very robust and is probably the most widely used solution to persistence. A more expressive database system (one that could express, for instance, Datalog) could perhaps be popular. There are even ways of dealing with updates in a declarative fashion (see, for instance, Bonner & Kifer 1993: Transaction Logic).
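
The standard illustration of that extra expressiveness is transitive closure, which Datalog states in two rules while a plain non-recursive SQL query cannot express it at all. In Prolog syntax, as a sketch:

    % Base "table".
    parent(tom, bob).
    parent(bob, ann).

    % Recursive "view": the transitive closure of parent/2.
    ancestor(X, Y) :- parent(X, Y).
    ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).

    % ?- ancestor(tom, Who).
    % Who = bob ;
    % Who = ann.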

you mean like regular

You mean like regexes? Or rules, as they are called in Perl 6.