## Multimap unification

I strongly believe that Prolog-like unification ('programming with holes') should be incorporated into every new programming language. One of my arguments is that unification is more general than pattern matching, yet just as natural.

In contrast, I find the unification of two sets hardly intuitive. Set unification is not only unintuitive but also NP-complete. Still, I believe that set unification, as a first-class operation, is an interesting programming construct.

Actually, I'm more interested in the unification of multimaps, because they extend sets. My intuition also tells me that multimap unification can be made more efficient in most use cases.
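To make this concrete, here is a minimal sketch of what multimap unification might look like, assuming a multimap is represented as an unordered list of Key-Value pairs (the `multimap_unify/2` name and the pair representation are my own, not an established construct):

    % Sketch: unification of two multimaps, each represented as an
    % unordered list of Key-Value pairs (duplicate keys allowed).
    % Every pair of M1 must unify with some pair of M2, and vice versa.
    multimap_unify([], []).
    multimap_unify([K-V|M1], M2) :-
        select(K-V, M2, M2prime),   % pick a unifiable pair from M2
        multimap_unify(M1, M2prime).

For example, `?- multimap_unify([a-1, b-X], [b-2, a-Y]).` binds X = 2 and Y = 1. When the keys are distinct and ground, a matching pair can be found by key lookup rather than blind search, which is presumably where the efficiency intuition comes from.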

I'm not sure, however. Are there any unification theorems I should consider that counter my intuition?


### Examples?

I'm having trouble understanding what you are talking about, since unification is a syntactic notion. Can you give some examples where you unify two sets?

Is this little Prolog program something like what you are thinking about?

    % select/3 (from library(lists)) removes one occurrence of X,
    % trying each position on backtracking.
    set_unification([], []).
    set_unification(S1, S2) :-
        select(X, S1, S1prime),
        select(X, S2, S2prime),
        set_unification(S1prime, S2prime).

I can use it to calculate B=1 when given the "sets" {1,2,3} and {3,2,B}.

    | ?- S1 = [1,2,3], S2 = [3,2,B], set_unification(S1, S2).
    S1 = [1,2,3]
    S2 = [3,2,1]
    B = 1

    yes
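A side note on cost: because the element to remove can be chosen in any order, the same answer is found along many redundant derivations. Collecting all solutions makes this visible (a sketch, assuming `set_unification/2` as defined above):

    % Each order of element choices is a separate success branch,
    % so the single answer B = 1 is reported more than once.
    ?- findall(B, set_unification([1,2,3], [3,2,B]), Bs).

The number of branches grows factorially with the set size, which is one way to see why set unification is expensive in general.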

### Nested sets

Your example is easy to follow. Here is another example that is less intuitive:

    | ?- S1 = [2,A,B], S2 = [3,C,D], set_unification(S1, S2).

Not considering all permutations (of the variables), we get the following result:

    S1 = [2,3,X]
    S2 = [3,2,X]
    A = 3
    B = X
    C = 2
    D = X

... which I find not immediately easy to understand.

Things get far more complicated when we allow nested sets, e.g.:

    a_unify(A, B) :-
        is_list(A),
        is_list(B),
        nested_set_unification(A, B).
    a_unify(A, B) :- A = B.

    nested_set_unification([], []).
    nested_set_unification(S1, S2) :-
        select(X, S1, S1prime),
        select(Y, S2, S2prime),
        a_unify(X, Y),
        nested_set_unification(S1prime, S2prime).

    | ?- S1 = [[1,X],[2,Y],S], S2 = [[1,A],[3,A],[3,4]], nested_set_unification(S1, S2).

    A = 2
    S = [3,4]
    S1 = [[1,2],[2,3],[3,4]]
    S2 = [[1,2],[3,2],[3,4]]
    X = 2
    Y = 3

I don't know about you, but the above example makes my head hurt. I can't think of more elaborate examples but I hope I made my point clear.

### I strongly believe that

> I strongly believe that prolog-like unification (programming with holes) should be incorporated in every new programming language.

I believe that prolog-like unification should not be a ubiquitous run-time semantics in a programming language. It should be restricted to compile-time use unless explicitly invoked, as it makes program reasoning more difficult. Do you have any arguments you'd be willing to share?

### Declarative expressiveness

Program reasoning might become harder, but we gain more declarative expressiveness.
Examples of more expressive systems are term-rewriting systems such as Maude, or Constraint Logic Programming (CLP), which can be found in many Prologs.

The paper 'Programming with Partially Specified Aggregates in Java' really resonates with my beliefs, because it discusses constraint solving and the unification (of sets!) using imperative Java.

### Unification has undesirable

Unification has undesirable properties when used in open concurrent or distributed systems. I believe that unification should not be ubiquitous in a general purpose programming language. Difficult properties ought to be obvious to the user, e.g. via modeling of unification within the language and type system.

### Undesirable properties

Are those undesirable properties you mention not all related to lazy evaluation?
I'm a proponent of total (strict) functional programming, and that doesn't rule out ubiquitous unification.

### Lazy evaluation was not

Lazy evaluation was not assumed. But I do assume a general purpose programming language with unification will support some forms of open, concurrent, or distributed unification - e.g. as expressed naturally in several languages with built-in unification, including concurrent variations of Prolog, and Oz/Mozart. (By open I mean plugins, scripts, mashups, untrusted or mutually distrustful modules or components.)

Your being a proponent of total functional programming doesn't seem relevant. Your OP claim was for "every new programming language". That would include programmatic control of effects, would it not? An issue for any general purpose programming language is feature interaction, including interaction of features (such as unification) with effects. Ignore feature interactions at your potential users' peril.

### Pattern matching

Pattern matching is a feature that is found in many modern programming languages. And yes, pattern matching interacts with other features, such as programmatic control of effects (mutability). But that's true for any feature.
Considering that unification is more powerful pattern matching, why shouldn't we replace pattern matching with unification?

### Considering that unification

> Considering that unification is more powerful pattern matching, why shouldn't we replace pattern matching with unification?

Unification is more expressively powerful than pattern matching, but it is not correct to therefore call it "more powerful pattern matching". There are differences in how unification is implemented and integrated. Reasoning about unification is much less localized than reasoning about pattern matching (where the developer must syntactically bring pattern and object together). These other properties may be cause to reject unification despite its power.

Besides, I have also begun to consider pattern matching a questionable PL feature. It doesn't generalize well - e.g. matching on a subgraph can be an NP-hard problem, or matching on a distributed value would generally require implicit communication between partitions. Pattern matching binds code to a concrete data structure, hindering reuse and extension (a more general alternative is to always use type-classes to extract information from a value). Pattern matching is a poor fit for many kinds of data - codata (generators, streams), linear types, etc.. Pattern matching is syntactically voluminous and difficult to compose compared to lenses and arrows or FORTH-like languages.

Pattern matching has become popular, yet it may be hindering developers in ways they do not readily recognize. I do not believe even that pattern matching "should be incorporated in every new programming language". Do you?

### Unification is more powerful than pattern matching

I wanted to say 'considering that unification is more powerful than pattern matching'.

Anyway, I hope you agree with me that pattern matching is way better than nested if-then-else. Having said that, multi-methods are also way better than if-then-else. However, given the choice, I would prefer pattern matching to multi-methods, because pattern matching is more explicit.

I do agree with your statement that 'pattern matching binds code to a concrete data structure, hindering reuse and extension'. This is true when it is used directly to destructure and match concrete data structures.

A better approach is to pattern match on generalised views or derivations (of concrete data structures). One may choose to (lazily) build such generalised views through lenses.
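In Prolog terms, a 'view' can be as simple as a predicate that exposes a derived shape of the data, so clauses match against the view rather than the concrete representation (a sketch; the `cartesian/3` view and the point terms are made up for illustration):

    % Points may be stored as cart/2 or polar/2 terms; the view
    % predicate presents both as cartesian coordinates.
    cartesian(cart(X, Y), X, Y).
    cartesian(polar(R, T), X, Y) :- X is R*cos(T), Y is R*sin(T).

    % Client code matches on the view, not the representation.
    quadrant_one(P) :- cartesian(P, X, Y), X > 0, Y > 0.

Here `quadrant_one(polar(1.0, 0.5))` succeeds without the caller ever knowing which concrete term was used.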

So I do believe that pattern matching "should be incorporated in every new programming language".

### Pattern Calculus

If you extend patterns to allow pattern variables, the resulting calculus, as described by Barry Jay in "Pattern Calculus", is no longer a PL feature but the whole basis of the language. It subsumes lambda abstraction, for example.

### Unification calculus?

Many thanks for the pointer to pattern calculus. It shows that 'polymorphic' pattern matching is feasible. To have patterns as first class is very powerful indeed. I wonder if pattern calculus can be extended to 'unification calculus'? Prolog certainly looks like a contender.

### Pattern matching with Strucjure

Just found this: Strucjure is a Clojure library based on OMeta (with some interesting additions).