The Terms language

I have been working on a logic programming language
that I have published here [1].
In my opinion, it has a potential expressive power
superior to that of other logic languages I have studied [2],
it can reason efficiently (polynomial time),
and it is mathematically very simple (first order theory with a finite model).

I guess I must be wrong, but where am I wrong?

1.- http://pypi.python.org/pypi/Terms/
2.- Prolog, CLIPS (+ COOL), the Semantic Web OWLs, and various offshoots of those

Loves is Exists

I find this non-reflexive use of `is` a bit awkward. Why not "loves is a verb"?

Regarding expressiveness, do you think you could demonstrate a few more challenging programs: shortest routes in a graph, traveling salesman, sorting a list?

Can you express that john likes a particular traveling salesman problem?

in reply to Loves is Exists

> Loves is Exists

> I find this non-reflexive use of `is` a bit awkward. Why not "loves is a verb"?

Yes, I agree it is awkward. "loves is a verb" is actually valid. For consistency it should be "a loves is a exists", but I found that awkward too; I think I was mistaken. In any case, defining verbs by subtyping other verbs gives more structure than defining all verbs by typing "verb".

> Regarding expressiveness, do you think you could demonstrate a few more challenging programs: shortest routes in a graph, traveling salesman, sorting a list?

My understanding is that those problems are not hard to express but hard to solve, and that the hardness lies in the work the machine has to do to solve them, rather than in the work the programmer has to do to instruct the machine. There shouldn't be much difficulty in producing an ontology where we can speak of points in a 2D plane, distances, sequences of points,
lengths of those sequences, and comparisons of those lengths. Solving the problem should not be difficult either, though the solution would probably not be very efficient: I would use one of the known algorithms for it, implemented as a set of rules, in much the same way as I would in Prolog.
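To make that concrete, here is a minimal, naive Prolog sketch of the traveling salesman case, just to show the "known algorithm as a set of rules" style. It is a sketch only: the dist/3 facts and every predicate name are made up for illustration, and it is the brute-force algorithm, nothing efficient.

% Made-up symmetric distances between three cities.
dist(a, b, 3).
dist(a, c, 5).
dist(b, c, 2).

d(X, Y, D) :- dist(X, Y, D).
d(X, Y, D) :- dist(Y, X, D).

% Length of a tour given as a list of cities.
tour_length([_], 0).
tour_length([X, Y | Rest], Len) :-
    d(X, Y, D),
    tour_length([Y | Rest], RestLen),
    Len is D + RestLen.

% Shortest round trip through Cities, found by trying every permutation.
shortest_tour(Cities, Tour, Len) :-
    Cities = [Start | _],
    findall(L-T,
            ( permutation(Cities, [Start | Rest]),
              append([Start | Rest], [Start], T),
              tour_length(T, L) ),
            Pairs),
    sort(Pairs, [Len-Tour | _]).

A query such as ?- shortest_tour([a, b, c], Tour, Len). enumerates the permutations and keeps the cheapest round trip.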

In any case, the problems that complexity theory is interested in are not the problems Terms is devised to solve. Terms is directly concerned with expressivity. A way of measuring what I mean by expressivity would be how much of the natural-language description of a problem or situation can be directly and simply expressed in the language.
E.g., if you are gathering the requirements for a software project, how much of those requirements can be directly and simply translated to code: how similar are the expressions of the requirements in natural language and in code.

In general, the logic languages I see have a common limit in their expressivity, given by the way they tend to represent natural verbs. They cannot quantify over them, or they need special variables, etc. This is the limit first established by Bertrand Russell with his paradox over Frege's work.

What I find is that if we limit the natural verbs that are represented as logical predicates to just the copular verbs, and represent all the rest as individuals, we gain a lot of expressivity. One manifestation of this would be how we can use verb or fact variables in Terms. I don't see how to do that in other systems.
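As a crude Prolog analogy of what I mean by a verb variable (this is not Terms syntax; the generic fact/3 wrapper and the individual facts are made up for illustration): once the natural verbs are reified as ordinary terms, a plain first-order variable can range over them.

% Natural verbs reified as ordinary constants inside a generic fact/3 term.
fact(john, loves, mary).
fact(john, admires, mary).

% "Whatever john does to mary, pete does to mary too."
fact(pete, Verb, mary) :- fact(john, Verb, mary).

Asking ?- fact(pete, Verb, mary). then yields Verb = loves and Verb = admires, with Verb behaving as a verb variable.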

> Can you express that john likes a particular traveling salesman problem?

We could, if we take into account the above and understand that the expression of "a particular traveling salesman problem" would not carry any solution to it.

You speak generally of

You speak generally of expressiveness. I am curious what definition you are using for the word; I happen to favor Matthias Felleisen's definition of the word from On the Expressive Power of Programming Languages. This describes expressiveness on a lattice, as a partial order between languages, with regards to locality of expression.

I have worked with many logic languages, even implemented a few. They are not difficult - at least if you aren't too concerned with consistency (via termination guarantees), performance, or scalability. Of those I've studied, my recent favorites are probably Dyna (weighted logic), Dedalus (temporal logic), and Coq (constructive logic).

You mentioned that the Terms language "can reason efficiently (polynomial time)". I am curious what you mean by this, e.g. in the context of the traveling salesman problem.

About expressiveness, I

About expressiveness, I think we mean the same. I have been reading the Wikipedia article on "expressive power" and I concur; Matthias Felleisen's work is mentioned there as support.

About polynomial time, I mean that, given a knowledge base and a new sentence, the cost of integrating the new sentence depends very little on the size of the knowledge base, and directly on the number of consequences. Simply put, I am using a RETE network for the ruleset and an efficient structure for querying the factset. I was not talking about solving any possible problem with any given efficiency. I only wanted to convey that Terms is based on a very simple logic that can be solved efficiently, where "solved efficiently" means "we have a set of rules and a set of facts, and we want to add a new one and all the derived consequences, and take the same time (given generous spatial resources) independently of the number of rules or facts".
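To illustrate only the "add a sentence plus its derived consequences" part (not the RETE network itself), here is a toy forward-chaining sketch, assuming SWI-Prolog; add_fact/1, consequence/2, loves/2 and is_happy/1 are all made up for illustration.

:- dynamic loves/2, is_happy/1.

% One hand-written rule, stored as data: whoever loves someone is happy.
consequence(loves(X, _), is_happy(X)).

% Assert a new fact, then assert whatever follows from it under consequence/2.
add_fact(Fact) :-
    assertz(Fact),
    forall(consequence(Fact, New),
           ( call(New) -> true ; assertz(New) )).

After ?- add_fact(loves(john, mary)). the derived fact is_happy(john) is in the database too; a real implementation would also propagate consequences of consequences and index the facts and rules efficiently.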

In general, the logic

> In general, the logic languages I see have a common limit in their expressivity, given by the way they tend to represent natural verbs. They cannot quantify over them, or they need special variables, etc. This is the limit first established by Bertrand Russell with his paradox over Frege's work.

Do you mean the quantification of predicates or Russell's paradox?

With regards to quantification, can we not just use call or apply in Prolog? This comment comes from the Edinburgh Prolog parser, circa 1984:

I have added the rule X(...) -> apply(X,[...]) for Alan Mycroft.

and here we see that this gives us quantification of predicates.
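For instance, with made-up likes/2, admires/2 and relation/1 facts:

likes(john, mary).
admires(john, mary).

relation(likes).
relation(admires).

% Relation is an ordinary variable; call/3 applies it as a predicate.
holds(Relation, X, Y) :-
    relation(Relation),
    call(Relation, X, Y).

so ?- holds(R, john, mary). answers R = likes and R = admires.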

Now Russell's paradox is "Let w be the predicate: to be a predicate that cannot be predicated of itself. Can w be predicated of itself? From each answer the opposite follows." How can Terms manage to deal with this?

[edit: changed "How can any language manage" to "How can Terms manage"]

I mean quantification of

I mean quantification of predicates ;)

> With regards to quantification, can we not just use call or apply in Prolog?

That is, if I am not mistaken, second order, and what I propose is first order.

Concerning Russell's paradox, Terms does not deal with it: it is not possible to quantify over predicates in Terms. But only the copular verbs ("is of type", "is subtype of") are predicates in Terms. The facts provided in Terms actually have the implicit form:

(subject verbx object) is a exists.

With this, we just have an operator "()" that allows us to provide a tuple as an individual, in an otherwise simple first-order set-theoretic sentence. And "subject", "verbx", and "object" are common individuals... The point is that the pattern in parentheses matches the natural sentences very conveniently. With this scheme we lose the ability to quantify over the copular verbs, but they are just 2 (or 3), so it doesn't feel so necessary.
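A rough Prolog rendering of that implicit form (again, not actual Terms syntax; isa/2 and the individual facts below are made up for illustration) may help show how verb and fact variables become ordinary first-order variables:

% The copular verb "is of type" is the only real predicate here, written isa/2.
isa(loves, verb).
isa(fact(john, loves, mary), exists).

% F ranges over whole reified facts; V ranges over reified verbs.
known_fact(F) :- isa(F, exists).
verb_used(V) :- isa(fact(_, V, _), exists), isa(V, verb).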

Expressive power and negation.

Can you define what you mean by expressive power in a bit more detail? What things can you express in Terms that cannot be done in Prolog; where is Terms' advantage?

Here's some simple Prolog code in what I consider to be a logical programming style using negation. How would we write this in Terms?

%%%
%%% maximal/2 
%%% maximal(?Element, +List)
%%%
%%% Element is a maximal element of List if Element is in List and 
%%% there is no X in List distinct from Element such that 
%%% Element =< X.
%%%
maximal(Element, List) :- 
    member(Element, List), 
    \+ (member(X, List), 
        X =\= Element, 
        Element =< X).

> Can you define what you

> Can you define what you mean by expressive power in a bit more detail?

Naïvely: I would start by collecting a number of natural language patterns, such as "a _ is a _". I would try to provide a set with which one could specify, clearly and without ambiguity, the requirements of any software project. And then I would classify the expressivity of mechanical languages according to the subsets of those patterns that they adequately cover.

> What things can you express in Terms that cannot be done in Prolog; where is Terms' advantage?

Nothing; Terms might very easily be expressed in Prolog. The previous incarnation of Terms was implemented in CLIPS. Where Prolog would be first order logic, Terms would be a first order theory.

What I believe to be the advantage of Terms is the way it relates to the natural language. It is a very simple structure, but it can be taken as a model for a sizeable part of the natural language.

> How would we write this in Terms?

That's a tricky one. Terms has the foundation for an implementation of this. It would involve operations over the database, such as getting (in a rule) the number of sentences matching a given pattern, or the person who has been to the pole the most times. I'm not sure about the complexity of this.
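As a rough sketch of the kind of database operation I mean, here in Prolog rather than Terms (the visit/3 facts and all predicate names are made up for illustration):

visit(john, pole, trip1).
visit(john, pole, trip2).
visit(mary, pole, trip3).

% How many recorded trips to the pole a given person has made.
pole_visits(Person, Count) :-
    setof(P, Trip^visit(P, pole, Trip), Persons),
    member(Person, Persons),
    findall(T, visit(Person, pole, T), Trips),
    length(Trips, Count).

% Person has been to the pole at least as many times as anyone else.
most_pole_visits(Person) :-
    pole_visits(Person, Count),
    \+ ( pole_visits(Other, Higher),
         Other \== Person,
         Higher > Count ).

In Terms this would need equivalent aggregation over the factset, and that is the part whose complexity I am unsure about.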

At one point, I say: "the

At one point, I say: "the logic languages I see have a common limit in their expressivity...". Later on, I say "Nothing; Terms might very easily be expressed in Prolog".

That is somewhat contradictory. In the first case I was referring to the use I see of those languages, and in the second to the actual language.

An example of what I mean by "the use of those languages" can be seen in the case of the semantic web. The semantic web tries to draw a correspondence between the natural language and its constructs. It translates the natural verbs to predicates in the PL. And it ends up with a flavor, OWL Full, that, while it allows one to "treat classes as individuals", comes with the warning that it "is undecidable, so no reasoning software is able to perform complete reasoning for it." In OWL there is unrestricted comprehension (anonymous classes defined by predicates), and therefore classes correspond to predicates; it represents verbs as predicates/classes, and it wants to do things with the verbs that it cannot do with the predicates/classes: that is the limit I mean.

Expressive power, one last time

What I mean by expressive power is not formalizable,
since it depends on a comparison with the natural language.
I guess "expressive power" is not a fortunate term for it.
It is clear that what I proposed "naïvely" is not doable;
building that "set of patterns of natural language sentences"
would be akin to starting Logic all over again.
That is why I called it naïve.

About the application of that concept to logic languages,
it was not meant in an absolute way.
"Expressive power", in the manner I have used it,
is, in general, meaningless.
It is supposed to be a quantity, yet it has no measure,
not even a method of ordering.

In particular cases, however, I think it can be used.
There are fragments of the natural languages that can be formalized
(in the sense that we can formalize the way we naturally talk about sets),
and we can discern whether a particular logic language includes those formal structures.
I don't mean this as a plan to apply a coherent set of expressivity tests to place a language on some scale;
I mean this with the intent of discussing particular aspects of a language,
its adequacy for some task: "If we have a project with requirements that talk
about sets of things, what language can we use?"

In this sense, I think that the distinction between a logic language and its use
is somewhat diluted, since its adequacy for some task depends on how we can use it.

To finish, I will try to show what "expressive power" problem I am trying to solve in Terms,
and I will do it by analyzing a natural sentence: "if she wants something, she gets it".
It is a simple rule. We can formalize the natural use of rules: it is modus ponens.
So far, if we have a project that requires the system to include that sentence in its semantics,
we might consider using a logic language, say Prolog.
"She" can be an atom with no problem.
"gets" and "wants" can adequately be predicates.
Now we see that "something" and "it" behave as two occurrences of the same logical variable.
The problem is that she may want an apple, or to go to the cinema, or a verb.
And logical variables cannot range over all of atoms, predicates, and facts.
So we conclude that we cannot use Prolog, or we have to use it in some clever ad-hoc manner.
What I propose aims to be a general way of using Prolog, or any other language, for that task.
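To make the contrast concrete, here is a crude Prolog rendering of that proposal (none of it is Terms syntax; the generic fact/3 wrapper and the individual facts are made up for illustration):

% What she wants may be an object, an action, or another whole fact.
fact(she, wants, an_apple).
fact(she, wants, go_to(cinema)).
fact(she, wants, fact(john, paints, the_fence)).

% "If she wants something, she gets it": one ordinary variable covers all cases.
fact(she, gets, Thing) :- fact(she, wants, Thing).

Here "something" and "it" are just the one variable Thing, and it can stand for an object, an action, or another whole fact, because all of them are plain terms.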