Compositional let bindings

I have been working on compositional let bindings, and wanted to get some comments on what I have so far. Each program fragment is typed with a monomorphic input context (the free-variable requirements of the fragment) and a polymorphic output context (the definitions exported from the fragment). Lambda abstraction works as before, removing a variable from the input context of the body. Let is more interesting: it adds the defined variable to the output context, but we also want it to be usable in expressions. My first attempt is to have the let binding act as the identity function, so that apply does the necessary substitutions (symmetrically) from definitions in one fragment to requirements in the other. This is what a typing derivation looks like:

(x = \z . (z, z)) (x 1, x true)

1. [var]		{z : a} |- z : a
2. [var]		{z : a} |- z : a
3. [prd (1) (2)]	{z : a, z : b} |- (z, z) : (a * b)
4. [abs z (3)]		|- (\z . (z, z)) : (a -> (a * a))
5. [let x (4)]		|- {|- x : (a -> (a * a))} (x = (\z . (z, z))) : (b -> b)
6. [var]		{x : a} |- x : a
7. [lit]		|- 1 : Int
8. [app (6) (7)]	{x : (Int -> a)} |- (x 1) : a
9. [var]		{x : a} |- x : a
10. [var]		|- true : Bool
11. [app (9) (10)]	{x : (Bool -> a)} |- (x true) : a
12. [prd (8) (11)]	{x : (Int -> a), x : (Bool -> b)} |- ((x 1), (x true)) : (a * b)
13. [app (5) (12)]	|- {|- x : (a -> (a * a))} ((x = (\z . (z, z))) ((x 1), (x true))) : ((Int * Int) * (Bool * Bool))

Does this approach seem reasonable? It seems that I can implement all sorts of weird scoping rules, as the definitions compose upwards, which is probably not desirable but can be fixed by clearing the polymorphic output context where appropriate.
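For comparison, here is the same program written with an ordinary Haskell let; under plain HM let-polymorphism it is accepted with the type the derivation above ends with (the annotation just pins the numeric literal to Int):

-- the example above as an ordinary let under HM let-polymorphism
example :: ((Int, Int), (Bool, Bool))
example = let x = \z -> (z, z) in (x 1, x True)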


I don't understand how you got from 5., 12. to 13.

How come you can 'suddenly' unify the result types? I would expect:

5. [let x (4)]		|- {|- x : (a -> (a * a))} (x = (\z . (z, z))) : (b -> b)
12. [prd (8) (11)]	{x : (Int -> a), x : (Bool -> b)} |- ((x 1), (x true)) : (a * b)
13. [app (5) (12)]	|- {|- x : (a -> (a * a))} ((x = (\z . (z, z))) ((x 1), (x true))) : ((a * a) * (b * b))


There doesn't seem to be any information in the typings of 5. and 12. from which you could have derived the typing of your 13. (And shouldn't you have dropped the typing of 'x' from the context in 13.?)

And why not use the default let notation or write a semicolon?

Want let to be compositional and usable in expressions.

If we want let to be usable in expressions it must have a type. For example, what would:

\x . let y = \z . z; 

mean?

Why not use the default let? I want it to be compositional, so you can write just:

(let x = \z . z) 

So you can have definitions, and these fragments can be composed with those that use them.

(5) contains the polymorphic definition "|- x : (a -> (a * a))", which is generalised and unified with each of the requirements in (12), "x : (Int -> a)" and "x : (Bool -> b)". The type variables in the monomorphic environment are the same as the ones in the type, so a becomes (Int * Int) and b becomes (Bool * Bool) in the type (a * b). Type variables with the same letter are the same variable within a line; lines are independent of each other, and there is implicit universal quantification at the beginning of each line.

As for removing "x" from the polymorphic context, it probably should be removed. In a let block, mutually recursive definitions require propagation both forwards and backwards, so to get that behaviour the x might be dropped at the next let or abstraction instead.

Having said all that, composable definitions might make sense only at the top level, in which case they could be statements, and lets inside other things could have the normal structure... probably a better way of doing it. So we would have

x = \z . (z, z); 
y = (x 1, x true);

and it would be treated as a statement.

I don't think you can do this. Mixed mono- and polytypes

The type checker I once wrote generalizes monomorphic type variables when they are locally bound by a let definition. A type in an LC with let can hold both monomorphic and polymorphic variables as it is constructed bottom up.

I'll see, after the next football match, whether I can cook up an example with both monomorphic and polymorphic types. Maybe someone else?

Which bit?

I'm not sure which part you think is problematic. There is no problem with taking the left-hand part of the "let x = ..." and adding the generalised type to a polymorphic environment. Perhaps what you are missing is that the entries in that polymorphic environment have their own monomorphic environment (this is the equivalent of a type-scheme). It's not clear in the above example, but here is an example:

|- {{f : (a -> b), g : (a -> c)} |- x : (a -> (b * c))} (x = (\z . ((f z), (g z)))) : (d -> d)

Here the term "let x = \z . (f z, g z)" would put the following definition into the polymorphic environment: "{f : (a -> b), g : (a -> c)} |- x : (a -> (b * c))". If you then apply that definition to "(x 1, x true)" then you would get two different versions of "f" and "g" added to the monomorphic context, so that the type of "(let x = \z . (f z, g z)) (x 1, x true)" would be:

{f : (Int -> a), f : (Bool -> b), g : (Int -> c), g : (Bool -> d)}
|- {{f : (e -> f), g : (e -> g)} |- x : (e -> (f * g))}
((x = (\z . ((f z), (g z)))) ((x 1), (x true))) : ((a * c) * (b * d))

I didn't read it very carefully

My hunch, at the moment, is that you'd need a rule like "Every type variable in the polymorphic context is a monomorphic variable if, and only if, it also occurs in the monomorphic context."

That is, you've split between a mono- and polymorphic context, right?

They cannot be the same.

Each typing (monomorphic context and associated type) is always instantiated with fresh variables, like a Prolog clause. So there can never be a type variable shared between the polymorphic context and the monomorphic context. Entries in a monomorphic context are types. Entries in the polymorphic context are typings (that is, pairs of a monomorphic context and a type).

forall a. a -> b

The above intermediate type is one which should naturally arise during the bottom-up procedure of HM type checking when variable a was bound to a local variable and b to a global variable such that only the variable a could be generalized.

Since you claim to be equivalent to HM the same scenario should occur, and handled, in your type scheme.

Sorry, still watching football as I type this. I don't know HM that well; I never took much of an interest since it cannot handle bisimilar types. I'll give it a bit more thought for an LC example later.

First something simple then. How are you going to type the following?

\x. let y = x in y

Not Sure

I don't see where that occurs. In "let y = x" we simply have the polymorphic {x : a} |- y : a, which at that stage would allow both:

let x = id in let y = x in y

or

\x . let y = x in y

Example 1 of Wikipedia HM page

An example with mixed mono- and polymorphic variables is given on the wikipedia page on Hindley-Milner.

let bar = \x. 
    let foo = \y. x
    in foo
in bar

The intermediate type of foo is forall b. b -> a, since the type variable introduced by x cannot be generalized until the introduction of bar, where x goes out of scope.
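In Haskell the same fragment, with bar's eventual type written out, would be:

-- inside the body, foo's intermediate type is forall b. b -> a, with a still
-- monomorphic; only once x goes out of scope does bar get its general type
bar :: a -> b -> a
bar = \x -> let foo = \y -> x in foo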

monomorphic typings


(let bar = \x . (let foo = \y . x) foo) bar

1. [var]		{x : a} |- x : a
2. [abs y (1)]		{x : a} |- (\y . x) : (b -> a)
3. [let foo (2)]	|- {{x : a} |- foo : (b -> a)} (foo = (\y . x)) : (c -> c)
4. [var]		{foo : a} |- foo : a
5. [app (3) (4)]	{x : a} |- {{x : b} |- foo : (c -> b)} ((foo = (\y . x)) foo) : (d -> a)
6. [abs x (5)]		|- {{x : a} |- foo : (b -> a)} (\x . ((foo = (\y . x)) foo)) : (c -> (d -> c))
7. [let bar (6)]	|- {|- bar : (a -> (b -> a)), {x : c} |- foo : (d -> c)} (bar = (\x . ((foo = (\y . x)) foo))) : (e -> e)
8. [var]		{bar : a} |- bar : a
9. [app (7) (8)]	|- {|- bar : (a -> (b -> a)), {x : c} |- foo : (d -> c)} ((bar = (\x . ((foo = (\y . x)) foo))) bar) : (e -> (f -> e))

In (3) you can see foo has the type "{x : a} |- foo : (b -> a)" which is equivalent to:

forall a b . {x : a} |- foo : b -> a

So when the 'x' on the left gets unified with something, that constrains the 'a' on the right. 'b' is not constrained, so we can transform that into:

forall a . {x : a} |- foo : forall b . b -> a

Then if you 'forget' the context to turn it into an HM style type:

forall b . b ->  a

Fact 3.

No idea why it would have type 'c -> c'. You just don't make sense.

Curried Let

It's because we have curried the definition of let (but perhaps this should be a different arrow, as it has side effects?).

Nonsensical

And why would you curry the let except as a stop-gap solution to some other problem I pointed at?

Are you going to give the following a function type?

let y = 1 in y

forall a . a -> b

Okay, I see where this is relevant now, but I don't think it's a problem. When we instantiate the polymorphic types, we only generalise the free variables in the polymorphic type, respecting the monomorphic environment and type. You can see this here:

1. |- {{x : a} |- y : a} (y = x) : (b -> b)
2. |- {|- y : a} (\x . (y = x)) : (a -> (b -> b))

In typing (1) the definition of 'y' is polymorphic; however in (2), although it is in the polymorphic environment, the type variable assigned to 'y' is not free in the rest of the typing (it occurs in the type (a -> (b -> b)) itself), so it will not be generalised.

Two Ways

Here is the original algorithm, which is sufficient for 'lets' nested inside other scopes. Basically they cannot be separated (for example when importing definitions), so this way is fine:

1. [var]		{x : a} |- x : a
2. [var]		{y : a} |- y : a
3. [let y (1) (2)]	{x : a} |- (let y = x in y) : a
4. [abs x (3)]		|- (\x . (let y = x in y)) : (a -> a)

Note, this does not show the polymorphic context, which would be dealt with only for statements. The experimental compositional version gives (and this gives some more insight into what is actually happening inside the 'let' rule above):

1. [var]		{x : a} |- x : a
2. [let y (1)]		|- {{x : a} |- y : a} (y = x) : (b -> b)
3. [abs x (2)]		|- {{x : a} |- y : a} (\x . (y = x)) : (b -> (c -> c))
4. [var]		{y : a} |- y : a
5. [app (3) (4)]	{x : a} |- {{x : b} |- y : b} ((\x . (y = x)) y) : (c -> c)

Which gives the same type. The important thing to notice is that the typings automatically give the property you were asking about. For example:

{x : a} |- y : a

Is a polymorphic definition of 'y', but it is monomorphic in itself. It is used by replacing all the type variables with fresh ones, and then unifying.

Essentially the difference between a polymorphic definition and a monomorphic definition is that with a monomorphic one you freshen the variables once and use the same 'copy' in all the unifications with the different use sites, while with a polymorphic one you freshen the variables each time before unifying with a use site. If you consider the bidirectional nature of unification, you can see that this is exactly equivalent to the PolyVar and MonoVar rules in HM.
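Here is a rough Haskell sketch of just that freshening step (the representation and names are mine, purely illustrative, not the actual implementation):

import Data.List (nub)
import Data.Maybe (fromMaybe)

data Ty = TVar Int | TCon String | Arrow Ty Ty deriving (Show, Eq)

-- Replace every type variable in a type with a brand-new one, threading a
-- counter for fresh names. A polymorphic definition is freshened like this
-- before each unification with a use site; a monomorphic definition is
-- freshened once and the same copy is unified with every use site.
instantiate :: Int -> Ty -> (Ty, Int)
instantiate next t = (apply t, next + length vs)
  where
    vs  = nub (tvars t)
    sub = zip vs (map TVar [next ..])
    tvars (TVar v)    = [v]
    tvars (TCon _)    = []
    tvars (Arrow a b) = tvars a ++ tvars b
    apply (TVar v)    = fromMaybe (TVar v) (lookup v sub)
    apply (TCon c)    = TCon c
    apply (Arrow a b) = Arrow (apply a) (apply b)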

We can also look at:

1. let x = \z . z in let y = x in (y 1, y true)

2. (\x . let y = x in (y 1, y true)) (\z . z)

The first of these type checks, and the second does not.
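The same contrast in plain Haskell, where HM accepts the let-bound version and rejects the lambda-bound one:

-- accepted: x is let-bound, so its type generalises before y uses it
ok :: (Int, Bool)
ok = let x = \z -> z in
     let y = x in (y 1, y True)

-- rejected by HM: x is lambda-bound, so it stays monomorphic and cannot be
-- used at both Int and Bool
-- bad = (\x -> let y = x in (y 1, y True)) (\z -> z)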

Pass and Fail

Type inference succeeds when x is polymorphic:

(let x = \z . z) (let y = x) (y 1, y true)

1. [var]		{z : a} |- z : a
2. [abs z (1)]		|- (\z . z) : (a -> a)
3. [let x (2)]		|- {|- x : (a -> a)} (x = (\z . z)) : (b -> b)
4. [var]		{x : a} |- x : a
5. [let y (4)]		|- {{x : a} |- y : a} (y = x) : (b -> b)
6. [app (3) (5)]	|- {|- x : (a -> a), |- y : (b -> b)} ((x = (\z . z)) (y = x)) : (c -> c)
7. [var]		{y : a} |- y : a
8. [lit]		|- 1 : Int
9. [app (7) (8)]	{y : (Int -> a)} |- (y 1) : a
10. [var]		{y : a} |- y : a
11. [var]		|- true : Bool
12. [app (10) (11)]	{y : (Bool -> a)} |- (y true) : a
13. [prd (9) (12)]	{y : (Int -> a), y : (Bool -> b)} |- ((y 1), (y true)) : (a * b)
14. [app (6) (13)]	|- {|- x : (a -> a), |- y : (b -> b)} (((x = (\z . z)) (y = x)) ((y 1), (y true))) : (Int * Bool)

Type inference fails when x is monomorphic:

(\x . ((let y = x) (y 1, y true))) (\z . z)

1. [var]		{x : a} |- x : a
2. [let y (1)]		|- {{x : a} |- y : a} (y = x) : (b -> b)
3. [var]		{y : a} |- y : a
4. [lit]		|- 1 : Int
5. [app (3) (4)]	{y : (Int -> a)} |- (y 1) : a
6. [var]		{y : a} |- y : a
7. [var]		|- true : Bool
8. [app (6) (7)]	{y : (Bool -> a)} |- (y true) : a
9. [prd (5) (8)]	{y : (Int -> a), y : (Bool -> b)} |- ((y 1), (y true)) : (a * b)
10. [app (2) (9)]	{x : (Int -> a), x : (Bool -> b)} |- {{x : c} |- y : c} ((y = x) ((y 1), (y true))) : (a * b)
11. [abs x (10)]		|- (\x . ((y = x) ((y 1), (y true)))) : 

error unifying: Int Bool
: line 1, column 3.
(\x . ((let y = x) (y 1, y true))) (\z . z)
  ^------------------------------^


Fact 2.

I don't see how you can get from an equality between two expression variables to a function type for "y = x".

1. [var]         {x : a} |- x : a
2. [let y (1)]   |- {{x : a} |- y : a} (y = x) : (b -> b)

Typing let as a function

If we try to type 'let' as a function, we end up with something like:

let :: (name : Name) -> (rhs : Expr) -> (body : Expr) -> Expr

What it does is take the name and rhs and add them to the polymorphic environment; it then applies the substitution in the body.

So think of (let y = x) as a curried form of 'let'.
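As a very rough sketch (the AST and names here are made up purely for illustration), 'let' with the body argument last is just a constructor you can partially apply:

-- a toy expression type, only to illustrate the curried reading of let
data Expr = Var String | Lit Int | App Expr Expr
          | Lam String Expr | Let String Expr Expr

-- 'let' taking its body last: partially applying it to a name and a
-- right-hand side yields the fragment (let y = x), still waiting for a body
letC :: String -> Expr -> (Expr -> Expr)
letC = Let

fragment :: Expr -> Expr
fragment = letC "y" (Var "x")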

Doesn't make sense

No idea what you're doing. It would break your first example derivation. Neither do I get what the second supposed polymorphic context is supposed to do.

HM isn't that hard. If you can upgrade a monomorphic variable to a polymorphic one, you try that and signal the upgrade with a forall symbol. That makes that variable, for better or worse, reinstantiatable.

HM is presented as a logical system but from a PL perspective that's usually beside the point; it's just a unification scheme. You seem to toss random logical symbols around.

I've got no idea what you're doing.

Logical Symbols

Sure I wrote HM type inference first. It's not that I can't do HM, I am doing something better :-)

Really it's not that hard: the monomorphic context holds the requirements, and the polymorphic context the definitions. Think about:

y = x

"x" is free in this fragment, that means we need a definition of x from somewhere else. "y" is defined that means we can provide a definition of "y".

The logical symbols are not random, for example:

{y : a} |- x : a

relates directly to the Prolog:

x(A) :- y(A)

So a typing consists of a judgement with a monomorphic context and a type; nothing hard about that, surely, and it's a pretty standard use of the turnstile.

So when we extend that to polymorphic environments, where each symbol definition has a typing we end up with:

{{y : a} |- x : a, {z : a} |- w : a, ... }

So the only bit I am not sure about is, when extending a typing with a polymorphic environment, whether it should be on the left-hand side or the right.

Which is right:

{{y : a} |- x : a, {z : a} |- w : a, ... } {x : a, ... } |- E : a

or 

{x : a, ... } |- {{y : a} |- x : a, {z : a} |- w : a, ... } E : a

What might be confusing you is that I am omitting the braces if the context is empty. Would it be more readable if I left in the empty contexts "{}"?

See the Prolog program below (the complete program with associated definitions is in the Rank 0 Intersection Typing thread) for a formal definition of what I am doing with the monomorphic and polymorphic contexts. Perhaps they could be the same, and just have mono or poly entries in the same context? Seems easier to keep them separate though.

What is the meaning of this?


\x. let y = x

That is the question.

What the type of "let" is, is perhaps more interesting. The first argument of "let" is clearly a "Name" (kind?). The next argument is an expression that is to be bound to the Name, and the final argument is the body, or the scope in which the name is bound. We could say something like:

let :: (y : Name) -> (x : Expr) -> (z : Expr) -> (z [x/y] : Expr)

So "let y = x" would be a function from "z" to "z [x/y]", and "\x . let y = x" would be a function from "x" to "z -> z [x/y]". So the type should be something like:

(\x. let y = x) :: x -> z -> z[x/y]

What the algorithm gives is:

{} |- {{x : a} |- y : a} (\x . (y = x)) : (a -> (b -> b))

Note: I think this type is actually wrong.

Craziness

This seems to confuse meta-levels. Are you going to do the same thing with lambda? (It could also be viewed as a function that takes a name and an expression)

In my pet language, I have a system of extensible syntax that would allow 'let' to be added as a new construct with a type along the lines of your 'let' typing. But the way you're mixing the value level in with the construct/macro level seems kind of crazy to me.

Are you thinking about this as a sort-of macro system? Or module system? Or something else?

EDIT: Reworded some parts of this... previous version sounded hostile when I read it to myself.

Abstract not crazy

I disagree that this is crazy. I have not given precise syntax or semantics, as I am trying to communicate an idea. Why not collapse meta-levels? That is precisely what Prolog does:

% abstraction: type the body, split the bound variable's requirements out of
% the context, and unify them all into the argument type A
typing(Env, lam(Var, Body), Cxt, arrow(A, B)) :-
    typing(Env, Body, C1, B),
    split(Var, C1, Def, Cxt),
    unifyall(Def, A).
% application: type both sides and concatenate their requirement contexts
typing(Env, app(Fun, Arg), Cxt, B) :-
    typing(Env, Fun, C1, arrow(A, B)),
    typing(Env, Arg, C2, A),
    append(C1, C2, Cxt).
% variable with a definition in the polymorphic environment: use its typing
typing(Env, var(X), Cxt, Type) :-
    member(X, Env, ty2(Cxt, Type)).
% free variable: emit a requirement into the monomorphic context
typing(Env, var(X), cons(def(X, Type), nil), Type) :-
    not_member(X, Env).
typing(Env, let(C, Rhs, Body), Cxt, Type) :- ...

So right there we have "let(X, Rhs, Body) -> Env, Cxt, Type", based on the in/out modes of the Prolog program. If we call the output a "Typing" then Rhs and Body are also "Typings" so we can rewrite that:

let(X is a variable_name, Rhs is a typing, Body is a typing) -> Result is a typing

Then we can Curry and change our syntax a bit:

let :: (x : Name) -> (rhs : Typing) -> (body : Typing) -> (result : Typing).

Unless you are referring to my putting "Expr" instead of "Typing" above, in which case the typing is a property of an expression, so you could easily pass the expression and derive the typing.

There is a connection with macros here, in that you might define:

(let y = x) == (\z . let y = x in z)
(\x . (let y = x)) == (\x . (\z . let y = x in z))

And like a macro, any free use of 'y' in 'z' would be captured by the let definition (although not uses of 'x', as only 'y' would be in the polymorphic context). The implementation is different though, as it exports the definition of 'y' in a polymorphic context. I am uneasy about this, and think it might be a bad idea, which is why I wanted to discuss it here. It adds greatly to the complexity of the type inference algorithm, and it might be better to restrict the polymorphic context/environment to top-level definitions as is traditional. This avoids the problem, as you cannot have a 'lambda' outside a top-level definition, unless you have module arguments.

Not crazy

I just wanted to convey a strong negative reaction. My intuition is that this is a bad idea. I don't much care for that higher-order encoding in Prolog, either, though this seems worse. What happens here:

let f = \x. (let y = 1; y)

print (let y = 2; let z = f 0; y)

Probably Bad.

I agree it's probably a bad idea, but I want to explore the idea fully just in case. The best argument against it is that it is unnecessary, as any definition that is not top-level does not need to be composable anyway. I too have an intuitive dislike for the substitution with variable capture that happens, but I don't think there is anything technically wrong with it.

We get y = 2, and the following typing (I have defined some type constructors with an integer kind to make it easier to follow):

(f = (\x . ((y = one) y))) (((y = two) (z = (f zero))) y)

1. [var]		[] {} |- one : 1
2. [let y (1)]		[{} |- y : 1] {} |- (y = one) : (a -> a)
3. [var]		[] {y : a} |- y : a
4. [app (2) (3)]	[{} |- y : 1] {} |- ((y = one) y) : 1
5. [abs x (4)]		[{} |- y : 1] {} |- (\x . ((y = one) y)) : (a -> 1)
6. [let f (5)]		[{} |- f : (a -> 1), {} |- y : 1] {} |- (f = (\x . ((y = one) y))) : (b -> b)
7. [var]		[] {} |- two : 2
8. [let y (7)]		[{} |- y : 2] {} |- (y = two) : (a -> a)
9. [var]		[] {f : a} |- f : a
10. [var]		[] {} |- zero : 0
11. [app (9) (10)]	[] {f : (0 -> a)} |- (f zero) : a
12. [let z (11)]		[{f : (0 -> a)} |- z : a] {} |- (z = (f zero)) : (b -> b)
13. [app (8) (12)]	[{} |- y : 2, {f : (0 -> a)} |- z : a] {} |- ((y = two) (z = (f zero))) : (b -> b)
14. [var]		[] {y : a} |- y : a
15. [app (13) (14)]	[{} |- y : 2, {f : (0 -> a)} |- z : a] {} |- (((y = two) (z = (f zero))) y) : 2
16. [app (6) (15)]	[{} |- f : (a -> 1), {} |- y : 1, {} |- z : 1] {} |- ((f = (\x . ((y = one) y))) (((y = two) (z = (f zero))) y)) : 2

What about this?


let badid = (let y = 1; \x. x)

print (let y = 2; badid y) 

It seems like there should be a hygiene issue hiding somewhere.

Looks okay


(let badid = ((let y = one) (\x . x))) ((let y = two) (badid y))

1. [var]		[] {} |- one : 1
2. [let y (1)]		[{} |- y : 1] {} |- (y = one) : (a -> a)
3. [var]		[] {x : a} |- x : a
4. [abs x (3)]		[] {} |- (\x . x) : (a -> a)
5. [app (2) (4)]	[{} |- y : 1] {} |- ((y = one) (\x . x)) : (a -> a)
6. [let badid (5)]	[{} |- badid : (a -> a), {} |- y : 1] {} |- (badid = ((y = one) (\x . x))) : (b -> b)
7. [var]		[] {} |- two : 2
8. [let y (7)]		[{} |- y : 2] {} |- (y = two) : (a -> a)
9. [var]		[] {badid : a} |- badid : a
10. [var]		[] {y : a} |- y : a
11. [app (9) (10)]	[] {badid : (a -> b), y : a} |- (badid y) : b
12. [app (8) (11)]	[{} |- y : 2] {badid : (2 -> a)} |- ((y = two) (badid y)) : a
13. [app (6) (12)]	[{} |- badid : (a -> a), {} |- y : 1] {} |- ((badid = ((y = one) (\x . x))) ((y = two) (badid y))) : 2

It should be okay as it works compositionally, so the polymorphic definitions on one side are matched against the monomorphic requirements on the other.

So in terms of an inductive proof: let puts a definition in the polymorphic environment, variables put a requirement in the monomorphic environment, lambdas remove matching variables from the monomorphic environment, and apply removes variables from the monomorphic environment that match the polymorphic definitions of the other side. This intuitively seems sound. Of course details like a let and a lambda using the same variable name need to be dealt with. Also the question of how far up definitions should propagate is interesting, as it is good to allow forward definitions. The scope of lambdas is obvious, but for lets I am not sure.

Something HM doesn't accept

I don't know if this is a good thing or not, but here is a valid code fragment this can type that HM cannot:

((\x . (let y = x)) (\z . z)) (y 1, y true)

typing ::= "[", poly_context, "]", "{", mono_context, "}", "|-", term, ":", type;
poly_context ::= {"{", mono_context, "}", "|-", term, ":", type};
mono_context ::= {term, ":", type};

1. [var]		[] {x : a} |- x : a
2. [let y (1)]		[{x : a} |- y : a] {} |- (y = x) : (b -> b)
3. [abs x (2)]		[{} |- y : a] {} |- (\x . (y = x)) : (a -> (b -> b))
4. [var]		[] {z : a} |- z : a
5. [abs z (4)]		[] {} |- (\z . z) : (a -> a)
6. [app (3) (5)]	[{} |- y : (a -> a)] {} |- ((\x . (y = x)) (\z . z)) : (b -> b)
7. [var]		[] {y : a} |- y : a
8. [lit]		[] {} |- 1 : Int
9. [app (7) (8)]	[] {y : (Int -> a)} |- (y 1) : a
10. [var]		[] {y : a} |- y : a
11. [var]		[] {} |- true : Bool
12. [app (10) (11)]	[] {y : (Bool -> a)} |- (y true) : a
13. [prd (9) (12)]	[] {y : (Int -> a), y : (Bool -> b)} |- ((y 1), (y true)) : (a * b)
14. [app (6) (13)]	[{} |- y : (a -> a)] {} |- (((\x . (y = x)) (\z . z)) ((y 1), (y true))) : (Int * Bool)

I have tried changing the notation slightly to see if it helps. I have moved the polymorphic context to the far left of each line and used '[]' brackets, followed by the monomorphic context, then "|-", followed by the term and the type. I have also included the empty brackets even when a context is empty. I'd appreciate comments on whether this is more readable, or suggestions for better notation.

The type is inferable because the definition of "x" (the argument "\z . z") is supplied inside the left fragment before "y" is used in the right fragment.

Note: the original "Rank 0 Intersection Typing" algorithm would not accept this (it accepts only the same complete programs as HM); it is only this variant with the polymorphic context and curried let that has this difference.

Well. At least you gave a syntax. Polycontext?

Well. You gave a syntax. But the syntax was somewhat clear from the examples.

It still doesn't explain the double turnstile, neither do I have any intuition about what a polycontext should be. Since they are both contexts I'd prefer it if you wrote them both left of the turnstile with curly braces. And I'd prefer it if you replaced the turnstile in the polycontext with something else. Guesstimating that it is something like a constraint, I'd prefer a Haskell-like double arrow.

Then the example. Why in God's name does the y escape the scope? Are you going for Lisp?

Currying the let? Yeah, if you want to treat a let as a statement, but then I wouldn't know the semantics of it, and it likely won't pan out since it'd probably need a polymorphic identity type instead of a monomorphic type.

It isn't bad if you type better than HM. But HM is a 'trivial' scheme with a very limited application domain regarding the treatment of generalization; if you want to do it better it should solve some of the problems with HM.

As it stands, you don't have a scheme which does it better than HM but a mess.

Polymorphic Context

Let's break this down, and see where you think the problems are. The polymorphic context is for definitions. It's the compositional equivalent of HM's polymorphic environment. HM has type schemes in its polymorphic environment; we have monomorphic contexts and types.

Consider a code fragment:

f = x y 
x = z y

This 'defines' f and x. This 'requires' z and y. The definitions of 'f' and 'x' can be polymorphic, as, like in a 'let', we have the type to generalise. The types would be:

{x : a -> b, y : a} |- f : b
{z : c -> d, y : c} |- x : d

Combining these monomorphic contexts results in the definition of x in one being used in the other, which gives d = a -> b and a = c.

{y : a, y : b, z : b -> a -> c} |- f : c,
{y : d, z : d -> e} |- x : e

I am using the turnstile in the same way it is used for a single typing judgement, the monomorphic context on the left and the type on the right.

The polymorphic context is a map from variable name to typing, so the turnstile would occur in each typing inside the polymorphic context. So we have a set (the polymorphic context) of objects that are typings, under the operation of typing unification. That is, we can extend unification on types to unification on typings. This all seems straightforward to me so far. Do you think these typings would be better written with "=>"?

The polymorphic context is a map from variable name to typing

No idea why that would be the case and there is nothing in your syntax to warrant that claim.

Why do you call it a polymorphic context when there are no quantifiers? Am I to believe all quantifiers are implicit? If so, how? Are all variables quantified? Where?

Looks like bullshit to me.

How would you like it?

Of course it's a map from variable name to typing. What else would it be? The variable name is in the usual place, on the right of the "|-" and before the ":". I suppose you want me to add a second redundant copy of the variable name on each line? Why not get rid of the turnstile and colon now, which you didn't seem to like anyway, so we just have the type at the end.

{
    f : {y : a, y : b, z : b -> a -> c} c,
    x : {y : d, z : d -> e} e
}

Each of these is a typing, and as I explained before each variable is implicitly quantified (providing it is free in the rest of the typing). Something like this:

{
    f : forall a b c . {y : a, y : b, z : b -> a -> c} c,
    x : forall d e . {y : d, z : d -> e} e
}

It is fairly traditional to omit the quantifier at the beginning of the line; HM, System-I and System-E all do this. In a complete typing we also have a monomorphic context (the requirements) and the type of the overall term, so for a code fragment like this we get the following typing (and we could write requirements on the left and exports on the right, using the double arrow as you suggested):

f = x y
x = z y
q w

forall f g . {
    q : f -> g,
    w : f
} => g => {
    f : forall a b c . {y : a, y : b, z : b -> a -> c} c,
    x : forall d e . {y : d, z : d -> e} e
}

Here is an example that shows a non-free variable in a typing:

\y . (
    f = x y
    x = z y
    q w
)

forall a d e . {
    q : d -> e,
    w : d
} => (a -> e) => {
    f : forall b . {z : a -> a -> b} b,
    x : forall c . {z : a -> c} c
}  

The first of these, written on one line and without the foralls, would look like:

(f = x y) (x = z y) (q w) : {q : f -> g, w : f} => g => {f : {y : a, y : b, z : b -> a -> c} c, x : {y : d, z : d -> e} e}

Rearranging and adding turnstiles back into the typings:

{{y : a, y : b, z : b -> a -> c} |- f : c, {y : d, z : d -> e} |- x : e} {q : f -> g, w : f} |- (f = x y) (x = z y) (q w) : g

Using the turnstile is after all the usual way to write typings.

Indecent Proposals

No logician, I imagine, will accept any proposal with more than one turnstile. What you call a typing simply signifies that, given a number of facts, the type for a term is derivable. Apart from that, okay, syntactically you're allowed to do anything; your proposal with multiple turnstiles in a typing will just convince a logician you have a confused mental state.

The notation with explicit quantifiers is far more lucid than anything I've seen from you so far and seems to hint that you're doing standard HM with deferred unification. The burden is upon you to prove that. But, if so, that's mostly uninteresting. In practical applications of HM the order of unification is important mostly only for generating clear error messages at the right position; anyone can change the unification order. And deferring to the latest moment doesn't seem like a good strategy to me.

You're not solving any HM problem. When to generalise? In what order to unify? How to treat bisimilar, or structurally equivalent, types? How to expand HM to nominal or structural typing? These are interesting questions. A bottom-up procedure for unification, well, maybe, if you feel like writing a proof and a formal treatment of your system. It's a toy otherwise.

Inconsistent

Referring to page 2 of "The Essence of Principal Typings":

Given: a typable term M.
There exists a judgement A |- M : t representing all possible typings for M.

Principal typings allow compositional type inference...

Also see:

http://homepages.dcc.ufmg.br/~camarao/ml-has-pt.pdf

Which justifies the claim to principal typings.

Kfoury routinely uses the turnstile symbol in typings; see:

http://www.cs.cmu.edu/~jpolakow/systemE.pdf

Even a proof tree is a succession of turnstiles, and trees can be represented by bracket notation, so I just don't understand the objection to multiple turnstiles. It's just like:

A |- E : s -> t    A |- F : s
-----------------------------
         A |- E F : t

could be written:

((A |- E : s -> t) (A |- F : s) A |- E F : t)

But in any case the turnstile is obviously a distraction, so I'll get rid of it. I am not overly keen on having explicit foralls everywhere, and I think the implicit quantification is fairly obvious. So perhaps this is the best format:

(f = x y) (x = z y) (q w) : {q : f -> g, w : f} => g => {f : {y : a, y : b, z : b -> a -> c} c, x : {y : d, z : d -> e} e}

What do you think of the use of the double-arrow?

There are reasons why I want to defer unification, like incorporating first-class polymorphism and other things. Carlos Camarao and Lucília Figueiredo may have proved the principal typing property in the above paper. The changes needed for a compositional bottom-up polymorphic environment are fairly trivial.

Really most of this discussion has been about the notation, which is just changing a print statement, and to quote Bruce Lee: “It's like a finger pointing away to the moon. Don't concentrate on the finger or you will miss all that heavenly glory.”

I suppose you think the type systems of C++, C, Ada, Python etc. are toys? It seems somewhat inconsistent with your usual attitude to academic vs practical languages.

Do you think C++, C, Ada, and Python have HM typing?

Sorry. But if anyone claims C, C++, or Ada have HM typing that would just boil down to an academic rewrite of history. No idea about Python.

Again, HM is a trivial scheme which concentrates on where to generalise a type to a polytype. It's a dubious decision whether you should do that around a let (except as an academic abstraction), it doesn't handle structurally equivalent types (which are somewhat of a default in the Algol languages), it doesn't handle datatypes, and it is furthermore dubious whether it has any value at all besides as an academic treatment of what other languages did (better). It also doesn't seem to have much of a role outside of languages which have a REPL.

You keep arguing that you should be able to generalize the use of one turnstile to allow for more turnstiles in a notation. Most people will simply consider that an abuse of notation since it disagrees with standard sequent calculi. Well, I would agree, since it simply makes more sense to see your system as a sequent calculus, and it makes a lot more sense to conflate, in a standard manner, both contexts into one context.

There just isn't any reason to have three abuses of notation: a) multiple turnstiles, b) multiple contexts, and c) implicit foralls when in standard notation it is completely clear what you're doing.

(Moreover, in one of the examples you seem to confuse object and meta logic.)

No.

I did not claim they have HM typing; I don't know where you got that from.

In any case, I will change the notation just to keep you happy, although you are still staring at the finger. Then at least the discussion need not go in circles about notation.

I am not sure I understand your objection to let. We need some kind of scoped definition? Or is it just generalising lets that you object to? When you say it doesn't handle datatypes, I don't really understand what you mean, and you seem to be changing the subject somewhat.

Where do you think I confused object and meta-logic? Why is this even a problem, dependent types seem to collapse object and meta-logic into a single logical framework.

Syntax and logic

A |- E : s -> t    A |- F : s
-----------------------------
         A |- E F : t

could be written:

((A |- E : s -> t) (A |- F : s) A |- E F : t)

The above is a confusion of object and meta logic, something you didn't do until now. The first states a deductive rule over an object (sequent calculus) logic. The second is just a syntactic expression. (Of course, this all depends on a standard interpretation of symbols.)

I have no objection to let. It's a syntactic construct, equivalent to an abstraction, used to signal that a type should be generalized, well, usually. But if you write down some examples it's dubious that you'll always want to generalize around a let, and at the same time there are examples where you'd like to generalize an abstraction, maybe.

But I like strongly typed static systems with explicit type annotations. You can simply type more and there doesn't seem to be a good reason to generalize any monotype anywhere. So I am completely not interested in HM typing. It concentrates on a dubious triviality and doesn't interest me.

C++ auto

You realise of course that practically you do need type inference. C++ had to introduce 'auto' because nobody wants to type:

vector<vector<map<int, set<vector<string>>>>>::iterator it = thing.begin();

As for the "confusion", I am sorry but:

A |- E : s -> t    A |- F : s
-----------------------------
         A |- E F : t

Is just a syntactic expression. It's just a bunch of ASCII characters representing what is really a 'proof tree'. And there are many well-known ways to represent a tree syntactically, of which nested brackets is a perfectly valid one. Besides, that is just something that Gentzen made up. What about Jaskowski's notation, or that of Fitch?

I would accept the argument that the above is more conventional, traditional or whatever, but to give one particular notation magical properties, like that it represents object logic or meta logic, seems dogmatic.

I can see no good reason to use different notation for object logic and meta logic (apart from to make it clear to the reader which is which). In any case a dependently typed logical framework treats them uniformly with the same notation.

HM is hardly the only system

HM is hardly the only system which does type inference. And, again, it only showcases a scheme about generalizations around a let. Why would it be interesting? It doesn't do a lot and what it does it doesn't do that well.

It's maybe nice for functional scripting languages or a REPL but its importance is completely blown out of proportion.

I missed your point.

I understand where you are coming from now. I thought you were saying "without X your algorithm is just a toy implementation of HM", and I pointed out that C++ etc. also do not have "X". But it's HM itself that you have a problem with.

In my experience with trying to write generic programs in C++ and Ada, I have found their type systems to be inconsistent and limited in unnecessary ways, as you would expect of ad-hoc systems, especially where parametric and generic types are concerned.

Intersection type systems permit too much; they will just type anything that is syntactically valid (or infer the types).

So from my experience what I want is:

- A decidable, parametric, polymorphic, quantified type system
- type inference. I at least want type inference to be decidable, however if the user wants to write type-level programs that are not decidable that is also okay.

HM seems to sit at a sweet spot. It can generalise function definitions (and let is really just a local function definition), which you need, and lambdas make callbacks and higher-order programming easier.

Anything that is not monomorphic or a function-definition (which is polymorphic) can use higher-order types.

So I find HM to be the best solution to the problem that I have come across so far. All I am looking to do is to tweak it a little, so that I have principal typings and can do compositional type inference. The point is not to defer unification, but to do it symmetrically, eliminating left-to-right bias.

I think I have decided against the weird curried let though. I think the plain old let is simpler; trying to do this just adds too much complexity for very little gain. It was an interesting experiment, but I think the original version in the "Rank 0 Intersection Typing" thread is what I am going to go forward with. There would still be a polymorphic context, but it would only contain top-level definitions, and it would not be part of the typings.

Academic hype surrounding a triviality

We can discuss a triviality ad nauseam but I disagree there is a sweet spot anywhere in sight. Ad hoc is just a point of view, and there is every reason to find HM an ad-hoc system just as much as the type systems of C and Ada are.

They introduced a let since they couldn't type standard LC. How is that not a hack? Why would I even go for a user-defined 'generalize here' notation? And then, sometimes it generalizes too much, sometimes it doesn't generalize enough. (You might have solved that problem; that's why I find your scheme interesting. But you keep changing notation, therefore I don't get it yet.)

The let generalizes both around the introduction of a global variable and the introduction of local variables. Well, that's maybe interesting from an academic perspective but it is a dubious strategy from a PL perspective.

I believe in (FP) languages where global variables are explicitly typed and local variables don't generalize. No idea whether that will pan out but I don't buy much into HM since there isn't much to discuss in the first place.

As I said, HM is a hyped triviality where you might even wonder whether the scheme even does what it is supposed to do that well. Not interested.

C++ and Ada Generics

Just defining something simple like an Iterator is painful (or impossible) in Ada. C++ templates are untyped, but if they were to have a type system, you would want a quantified parametrically polymorphic type system.

I have no intention of limiting the type system to HM, but rather it will be a quantified, parametrically polymorphic type system with first class polymorphism, of which the HM subset will be able to use type-inference.

Let-polymorphism

I can see you want that. But I don't see a reason for the central feature of HM: let-polymorphism.

nested functions

If you want to define a local function for use within a code block, you want it generalised, the same way top-level functions are generalised. Without let polymorphism there is no way to define a polymorphic local function, or in fact any local function; you would have to pass it in as a lambda and add a type signature. You would need let even if it were monomorphic. In fact Haskell may do exactly that: remove generalisation from let and keep it monomorphic for inference unless explicitly typed.
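A small made-up Haskell example of the kind of local helper I mean; the where-bound function is used at two different types, so without let/where generalisation it would need an explicit signature (or be rejected):

showBoth :: Int -> Bool -> (String, String)
showBoth i b = (render i, render b)
  where
    -- render is used at Int and at Bool, so its inferred type must
    -- generalise to Show a => a -> String
    render x = show x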

You want it generalised

There is no reason to assume that. I have never seen a local definition where you want the type to be generalized; on the contrary, I have only seen local definitions where you don't want the type generalized.

You seem to be stuck on the academic 'id' example; in practice, it doesn't seem to occur. For good reason, if a function is 'generalizable' it is lifted to a global definition by most programmers.

Let can be monomorphic

So, if I did not generalise 'lets' it wouldn't make a lot of difference. You still need to have generalisation at the top level. Also, you can generalise when a let or lambda has an explicit type annotation, or when you have first-class polymorphism.

My only concern would be this violates the principle of minimum surprise. People will be confused when local functions behave differently from top level functions.

I think so

I really do believe it doesn't make much sense to generalize local lets. But I'd need a corpus of functional programs to show that.

Minimum surprise is probably better served without generalizing a let. If local definitions are never, or hardly, ever used in a generalizable fashion you'll probably catch code errors better since the monomorphic types match the usage of terms better.

Besides, I doubt most programmers are even aware a let is supposed to generalize.

The question is what a generalization is at the top level. An explicitly denoted type probably needs to follow the rules of (higher-ranked) types with explicit foralls. But I am somewhat unsure about that, but I don't have a better scheme than a higher-rank notation.

sound like a weird restriction to me

i woulda thought there would be times when i write a generic/type-parameterized function, and want to have little utility functions via let/where (i personally prefer where most of the time!) that therefore must also be generic. doesn't seem like something that should be outright prevented, if it is possible to support it. i mean as a Regular Joe programmer this would make me think the language kinda sucked, even if there are theoretically good reasons for it :-)

The compiler loves type specific functions

You pay for generalization every time a function is generalized instead of kept monomorphic. Even if you would later specialize a function to a concrete implementation, you pay substantially in running time.

let f = \x. x in f 5

The compiler loves that f has type int -> int. It can optimize functions like that very well. Furthermore, you pay for generalization. You pay for instantiation. You pay severely for the optimization pass to derive the monotype for f. You pay a lot.

Your idea of "little utility functions" is probably a dream. I don't see it happening, a lot, in practice.

So you have a feature you A) don't need, and B) pay severely for. Big no.

(At best it should be an option, not a default.)

Generalisation does not have to cost.

The compiler can easily output different machine code for different uses of a generic function, so there would not be a cost as long as the actual type is known at compile time. This is really the same as C++ templates. If you don't know the type at compile time then you have to use a generic definition anyway, and you are paying the cost for a reason. The cost is only that of a C++ virtual method call in any case.

In reality they won't be fully generic; as you point out, the only useful function with the type 'a -> a' is 'id'. But parametric types will occur often:

process :: Thing a -> Thing a -> Thing a

Or constrained generic like:

factorial :: Number x => x -> x

Or generic over the container type:

filter :: Container x => x Int -> x Int

There are many cases when writing a generic function where you may want to name something to make the code more readable.

In summary, it has zero runtime cost to generalise a function, and in fact template code often runs faster than ordinary function calls because template instantiation can happen inline, providing the type is statically determined. It will have a tiny cost for the compiler; instantiating types is pretty fast as it is. If it becomes a problem then the types could be packed into a contiguous memory block, so that instantiation is a single memcopy.
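GHC, for example, lets you ask for a type-specific copy of a generic function explicitly; a hedged sketch (using Num/Ord in place of the hypothetical Number class above):

-- a generic definition plus a compile-time request for an Int-specific copy,
-- so call sites at Int pay nothing for the generality
factorial :: (Num x, Ord x) => x -> x
factorial n = if n <= 1 then 1 else n * factorial (n - 1)
{-# SPECIALIZE factorial :: Int -> Int #-}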

In a local let I am paying for nothing

It's mostly the typing system's algorithm and the specialization where you pay. So, it's the compiler running time where you pay. This is one of the discrepancies between compilers which do 10kLoc/s and 10MLoc/s. Scala is very slow, Go is light-speed fast. It matters and the academic "shouldn't cost too much" attitude is one of the reasons most academic compilers don't perform.

If Stroustrup is right about Scala performance it will probably not scale beyond 20kLoc comfortably and die another graceful academic death.

A default let, where a let definition occurs every fifth line or so with "generalize here" semantics, is costly. OCaml is okayishly fast on source code, but I think that's partly because it tries to generalize lets but ends up with monotypes anyway. It still does a lot of work for nothing.

Template instantiation can maybe be made fast but is one of the reasons C++ compilation is slow. For every instantiation it probably goes over all the source files and instantiates a type term again. Hence also "auto". Nice for programmers, but it also saves a very costly instantiation which needs to be checked. Contrary to what you think, instantiation in C++ is very costly. And my bet is that "auto" is as much, or even more, a performance upgrade as a convenience mechanism.

Lastly, I was discussing local lets, not global definitions. Big difference and you turned this discussion into another straw-man argument. And when you don't generalize local lets and explicitly type global definitions I don't feel I am still discussing HM.

Don't care about compiler speed

I don't care about compiler speed, or at least I am happy that the implementation I am working on will be fast enough. The compositional nature of the type system will allow full incremental partial recompilation, so if you edit only one source file, just that file gets recompiled and re-linked. Of course there is a catch here, as this is a new language, I don't have 10 million lines of code to test on.

I don't care whether the type system is HM or not. I am trying to find the right type system for my language. If you persuade me let generalisation is a bad idea, then I won't have let generalisation.

At the moment the only argument appears to be compile time, and as I have said, that is not a big factor for me. Runtime performance and the abstraction power of the code are both important to me. You have not provided any reasons for not having let generalisation that affect my priorities and requirements.

Have it your way

A long time ago, when people were still experimenting with compilers and actually doing real stuff, I learned one law of compiler technology:

You shouldn't pay for the features you don't use.

If you think you can get away with it, be my guest but my observation on compiler writers who think they can go against pretty iron-clad compiler laws is pretty negative.

If I didn't convince you yet you have a feature you both pay for and don't need. Yeah well, it stops.

I agree

I totally agree that you shouldn't pay for a feature you don't use, but I really don't think the cost is that great in this case.

I think I will have to try to construct a test with a bunch of lets, where the variable gets used monomorphically, and see just how much difference it makes to type inference if I change the algorithm to generalise or not.

Would Hackage be sufficient for you?

I really do believe it doesn't make much sense to generalize local lets. But I'd need a corpus of functional programs to show that.

Would Hackage be sufficient for you? GHC hackers have their own motivations for not generalizing local lets, but they do show on the way that the feature is used seldom anyway.

See https://ghc.haskell.org/trac/ghc/blog/LetGeneralisationInGhc7 and especially the cited papers, like Sec. 4 of http://research.microsoft.com/en-us/um/people/simonpj/papers/constraints/let-gen.pdf. If you want more, see the journal version.

How's This

So does this notation look good?

\y . ((f = x y) (x = z y) (q w))

1. [var]		x : forall a . {x : a} a
2. [var]		y : forall a . {y : a} a
3. [app (1) (2)]	(x y) : forall a b . {x : (a -> b), y : a} b
4. [let f (3)]		(f = (x y)) : forall a . {f : forall b c . {x : (c -> b), y : c} b} (a -> a)
5. [var]		z : forall a . {z : a} a
6. [var]		y : forall a . {y : a} a
7. [app (5) (6)]	(z y) : forall a b . {y : a, z : (a -> b)} b
8. [let x (7)]		(x = (z y)) : forall a . {x : forall b c . {y : b, z : (b -> c)} c} (a -> a)
9. [app (4) (8)]	((f = (x y)) (x = (z y))) : forall a . {f : forall b c d . {y : b, y : d, z : (d -> (b -> c))} c, x : forall e f . {y : e, z : (e -> f)} f} (a -> a)
10. [var]		q : forall a . {q : a} a
11. [var]		w : forall a . {w : a} a
12. [app (10) (11)]	(q w) : forall a b . {q : (a -> b), w : a} b
13. [app (9) (12)]	(((f = (x y)) (x = (z y))) (q w)) : forall a b . {f : forall c d e . {y : e, y : c, z : (c -> (e -> d))} d, x : forall f g . {y : g, z : (g -> f)} f, q : (b -> a), w : b} a
14. [abs y (13)]	(\y . (((f = (x y)) (x = (z y))) (q w))) : forall a b c . {f : forall d . {z : (c -> (c -> d))} d, x : forall e . {z : (c -> e)} e, q : (a -> b), w : a} (c -> b)

I have combined the contexts, made it more obviously a map, and added explicit quantification. Of course you still get nested contexts where you have a typing inside the context; is this okay? We also have both imports (the free variables in a fragment) and exports (the definitions in a fragment) in the same context; is this okay?

Edit: So I think I have decided the added complexity is not worth it, and to go back to the original form of 'let'. I have also removed the term from each line, as it's somewhat redundant: it's implied by the rule in square brackets.

\y . (let f = x y in let x = z y in q w)

1.  [var x]           {x : a} a
2.  [var y]           {y : a} a
3.  [app (1) (2)]     {x : (a -> b), y : a} b
4.  [var z]           {z : a} a
5.  [var y]           {y : a} a
6.  [app (4) (5)]     {y : a, z : (a -> b)} b
7.  [var q]           {q : a} a
8.  [var w]           {w : a} a
9.  [app (7) (8)]     {q : (a -> b), w : a} b
10. [let x (6) (9)]   {q : (a -> b), w : a} b
11. [let f (3) (10)]  {q : (a -> b), w : a} b
12. [abs y (11)]      {q : (a -> b), w : a} (c -> b)

As all the type variables are quantified at the far left in this simpler version, can I get away without the foralls (I have tried to keep the other changes you suggested)?

Monotype and Polytype

I don't see why [var] would introduce a polytype anyway. You don't seem to make a distinction. It's either all monotypes or all polytypes.

HM introduces polytypes only around lets, it's not called let-polymorphism for nothing.

You probably can get the same behavior with monotypes only but that's another discussion.

var introduces a unique type variable.

The var rule introduces a unique type variable each time. Things become monomorphic by unification. An abstraction unifies all the uses of that term variable, making it monomorphic. A let binding unifies each use with a copy of the polymorphic type with fresh type variables. A definition introduces a polytype, as does a let binding.

You didn't even show you have mono- and polytypes

I have been waiting for people with more knowledge about type theory to get them to clean up your notation and reasoning but no luck so far.

So, I'll give it another shot. First, I've been critical about everything including HM so it is unfair to restrict my critique to notational issues only. But, having said that, we cannot even start to discuss your algorithm if the notation, and your reasoning about it, isn't clean to boot. I don't do catego^H^H^H^H discuss feelings.

If you don't introduce foralls somewhere around your types you don't have polytypes, PERIOD. The notation where a monotype is restricted probably can be understood to denote a polytype in a let rule from an HM perspective, but you have to discuss that. Not me.

I was just wondering whether polytypes can escape a let term; with an explicit notation that's easier to study. No idea, as I said I don't know HM that well. As I said before, HM is a trivial scheme where a (by forall) bound variable signals it can be reinstantiated, for better or worse.

Lastly, I would prefer you bring back the term in the derivation. And if you don't have polytypes, then drop the forall.

Is this okay.

So, is the following okay? It's for the original 'let' bindings, and has the terms and explicit foralls added back in:

1.  [var x]           x : forall a . {x : a} a
2.  [var y]           y : forall a . {y : a} a
3.  [app (1) (2)]     (x y) : forall a b . {x : (a -> b), y : a} b
4.  [var z]           z : forall a . {z : a} a
5.  [var y]           y : forall a . {y : a} a
6.  [app (4) (5)]     (z y) : forall a b . {y : a, z : (a -> b)} b
7.  [var q]           q : forall a . {q : a} a
8.  [var w]           w : forall a . {w : a} a
9.  [app (7) (8)]     (q w) : forall a b . {q : (a -> b), w : a} b
10. [let x (6) (9)]   (let x = (z y) in (q w)) : forall a b . {q : (a -> b), w : a} b
11. [let f (3) (10)]  (let f = (x y) in (let x = (z y) in (q w))) : forall a b . {q : (a -> b), w : a} b
12. [abs y (11)]      (\y . (let f = (x y) in (let x = (z y) in (q w)))) : forall a b c . {q : (b -> c), w : b} (a -> c)

It does not look as neat as the version above without terms and foralls, but I guess it's worth the extra characters if it is more understandable. Do I need the foralls?

Do you introduce polytypes around lets?

You don't gain anything by (outermost) quantifying all type statements. If you never introduce a polytype of course you don't need them.

The question in the back of my mind is whether a part of a type

{ x: a } f: a -> b

should be read as

{ x: a } forall b. f: a -> b

Explicitly denoting that b is the type variable which can be instantiated multiple times.

(But I am still watching soccer.)

No need

As there is no 'b' in the context, the forall b can be moved to the beginning without changing the meaning. As in:

forall a b . {x : a} f : a -> b

I could possibly also read it as

But the downside of moving all foralls outward is that it doesn't signal anything meaningful anymore, whereas an HM type does.

You don't seem to get the point that a forall should convey meaningful information. If it doesn't, then don't write it.

I could read your type as:

forall b. {x: a} a -> b

In your notation without foralls I still assume that you can only generalize the variables which are unbound in a context. Am I correct in that? You don't have to introduce a forall to make that notion explicit but it gives somewhat more readable types. And links your scheme to HM which you said is equivalent. Your words, not mine.

As I said about five times now: if a forall doesn't convey any meaningful information then don't write them.

Prenex Normal Form and Impossible Typings

Repeating something doesn't make it any more correct. See below:
https://en.wikipedia.org/wiki/Prenex_normal_form
As you can see, any formula can be converted to prenex normal form, with all the quantifiers moved to the front, without losing any information.

You can generalise all the variables; typings are equivalent up to alpha renaming, so:

{q : (b -> c), w : b} (a -> c)  === {q : (e -> f), w : e} (d -> f)

The type you give is impossible:

{x: a} a -> b

There is no way to get a 'b' in the type without introducing it in the context. (Variables can, however, occur in the context without occurring in the type.)

Irrelevant comment

A type system for a PL is basically a statement about how a unification scheme should work. I'll completely disregard your comment about prenex normal forms as too far-fetched, thank you.

That typings are equivalent up to alpha renaming was clear from your scheme. Interesting but probably very costly was my internal observation.

If you do a scheme which is equivalent to HM, then somewhere you'll abstractly introduce type variables which correspond to polymorphic variables, even if they only exist in an intermediate polytype.

So, can you explain with an example which variables correspond to polymorphic variables and how you decide that?

(Historical mandatory comment on the world games: Boring soccer games at the moment. Getting better though.)

PNF is relevant.

A type system is a mathematical thing and obeys laws like algebra. Addition is commutative whether you like it or not. In this case universal quantification can be moved into prenex normal form without changing the meaning or removing information from a type. Nothing far-fetched about it. You may as well say you don't like seeing equations in a factorised form.

We don't care whether a variable is mono or poly. We always add it with a fresh type variable. So let's look at typing "x x".

We find the first x and assign it a fresh type variable:

{x : a} |- x : a 

Or without the turnstile:

x : {x : a} a

We then encounter the next x:

x : {x : a} a

Note that as both of these are quantified types, the two 'a's are different variables. This becomes obvious after we use the apply rule:

x x : {x : a, x : a -> b} b

So each x has a different monomorphic type in the context. We don't need to decide if x is mono or poly until we know the definition of x. If we eventually encounter a lambda, all the required types of x in the context get unified:

\x . x x

Unify a fresh type variable, beta, with 'a' and with 'a -> b'. Note these 'a's are the same, as this is within one monomorphic context. The first unification gives beta = a, so the second requires a = a -> b, which obviously fails the occurs check.

However if we find:

\x : c -> c . x x

Unify a fresh copy of 'c -> c' with 'a', and another fresh copy with 'a -> b'. This succeeds with x having the type 'a -> a', as we effectively unify a and b.

Let is a bit of a red herring, as we could have a monomorphic let which uses a type annotation to make it polymorphic, as in:

let x = id in x x -- fails
let x : a -> a = id in x x -- passes
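
The unannotated monomorphic let is just sugar for an application, so against the Prolog from the earlier post it would be a one-clause addition (a sketch only; 'let_mono' is an illustrative name):

typing(Env, let_mono(Var, Def, Body), Cxt, T) :-
    typing(Env, app(lam(Var, Body), Def), Cxt, T).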

But then if you can generalise x, why would you not? Principal types and principal typings mean there is a most general form, which means that sort of generalisation is safe, and there are many functions, like filter, that can be generic, especially when combined with type classes.

You can see that an intersection type system passes 'uses' of variables up to the definition, and quantified type systems pass the definitions down to the uses (in the environment). This is kind of halfway between: definitions pass down and uses pass up to the point where they meet each other compositionally (i.e. generalisation happens at the point where the code fragment with the definition and the code fragment with the uses are combined).

Still looks like a mess

Well. Even if you can get it to work, which I doubt, I think it's going to be too expensive to use in practice.

As far as "why not generalize everything," see above post. Moreover, I can see where you have your polytypes. The way I look at HM, you have solved a non-problem with a non-solution.

I don't do mathematical philosophy. When it comes to syntactical systems, almost all "deep" insights immediately fail. Maybe you feel like doing another "Univalence Theorem?"

Anyway. I give up. Your goalposts shift every post. Test your ideas against a corpus or write a proof.

The Facts

- It is tested and works. The source for the type inference algorithm is on GitHub; you can download it if you want to test it yourself. As I am happy it works, there is no point in discussing this unless you have a counter example.

- I have not encountered any efficiency problems. If you are aware of any, please specify. The number of unifications is going to be the same as Algorithm W, and the C++ code with union-find does these pretty quickly.

- I don't see where you have actually given a reason to not generalise let bindings (apart from performance which I disagree with).

- If you don't understand why the forall can be moved to the front without changing the meaning, I am not sure why we are having this conversation. How can you be so pedantic about the use of the turnstile and so lax about the use of the universal quantifier? This seems inconsistent to me.

- I am not interested in writing a proof at this time. Why waste time proving something that is incomplete and may change? When I have a complete type system for my language, then is the time to write a proof. As for testing, I have done lots, but I am obviously interested in any counter examples.

- My goals have not changed, so if you can provide anything to support that comment I would be surprised. The whole thread is here, where have my goals changed?

The bullshit

- I don't buy that it works. You've changed the notation and implementation so many times that we are still not discussing anything.

- I have said countless times that your forall doesn't seem to signal any meaningful information, in which case it shouldn't be written. And I have asked you to explain the rationale behind how you decide whether a variable is handled monomorphically or polymorphically, which you have failed to do.

- You have ignored all critique except for some notational issues which you turned into a straw-man argument. Moreover, you have cherry-picked every argument you feel comfortable about without presenting anything except confused abductive remarks. You don't seem to know whether you're typing standard simply typed LC, LC+let, or LC+let with explicit annotations.

- A test where you run a type checker on short terms is meaningless. Your alpha renaming, which seems to occur everywhere, is one of the most costly schemes I can think of. And probably you'll need to do a renaming and a merge around every AST term. Moreover, your scheme holds a large number of implicit equalities around. You just don't seem to understand the basics of writing a type checker, both from a performance perspective as well as a theoretical perspective. I doubt it'll pan out but I also don't care.

- You come with mathematical philosophical claims that the algorithm should work. I know math is a religion to you but algorithms don't suddenly start working by praying at them.

You haven't provided an algorithm, you have changed the algorithm a number of times, and you have tested against a number of examples and probably broken those tests during your changes. We're not discussing anything except your idea that it probably should work, which seems to be grounded in religion.

(Obligatory note. Another day of crazy games at soccer. Kick the vampire!)

I'm in the wrong room.

This isn't an argument, it's just contradiction.

Here is the algorithm, now tell me something useful (note this has not changed from the earlier post on Rank 0 intersection types):

% eq/2: syntactic equality; neq/2: its negation, checked by asking an
% answer source for eq(X, Y) and observing that it produces no answer.
eq(X, X).

neq(X, Y) :-
    answer_source(X, eq(X, Y), Solver),
    get(Solver, np),
    stop(Solver).

% member/3 looks a Key up in a context (a cons/nil list of def(Key, Val));
% not_member/2 checks the Key has no entry, again via an answer source.
member(Key, cons(def(Key, Val), Tail), Val).
member(Key, cons(Head, Tail), Val) :-
    member(Key, Tail, Val).

not_member(Key, List) :-
    answer_source(X, member(Key, List, X), Source),
    get(Source, np),
    stop(Source).

% append/3 concatenates two contexts.
append(nil, L, L).
append(cons(H, T1), L, cons(H, T2)) :-
    append(T1, L, T2).

% unifyall/2 unifies every type in the list with T.
unifyall(nil, T).
unifyall(cons(T, Tail), T) :-
    unifyall(Tail, T).

% split/4 partitions a context into the types required for X and the rest.
split(X, nil, nil, nil).
split(X, cons(def(X, T), Tail1), cons(T, Tail2), Tail3) :-
    split(X, Tail1, Tail2, Tail3).
split(X, cons(def(Y, T), Tail1), Tail2, cons(def(Y, T), Tail3)) :-
    neq(X, Y),
    split(X, Tail1, Tail2, Tail3).

% typing(Env, Term, Cxt, Type): Term has Type with monomorphic requirements
% Cxt, given an environment Env of ty2(Cxt, Type) entries for known variables.
typing(Env, lam(Var, Body), Cxt, arrow(A, B)) :-
    typing(Env, Body, C1, B),
    split(Var, C1, Def, Cxt),
    unifyall(Def, A).
typing(Env, app(Fun, Arg), Cxt, B) :-
    typing(Env, Fun, C1, arrow(A, B)),
    typing(Env, Arg, C2, A),
    append(C1, C2, Cxt).
typing(Env, var(X), Cxt, Type) :-
    member(X, Env, ty2(Cxt, Type)).
typing(Env, var(X), cons(def(X, Type), nil), Type) :-
    not_member(X, Env).
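
To make this concrete, queries against it look roughly like this (a sketch of the output only: it assumes unification with the occurs check switched on, e.g. set_prolog_flag(occurs_check, true) in SWI-Prolog, and an engine that provides answer_source/get/stop, or \+ used in their place for neq and not_member):

?- typing(nil, lam(x, var(x)), Cxt, Type).
Cxt = nil,
Type = arrow(_A, _A).

?- typing(nil, lam(x, app(var(x), var(x))), Cxt, Type).
false.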

In this post I have been discussing a modification to that algorithm that provides polymorphic types from definitions. I don't have the Prolog for the modified algorithm, but the algorithm above has not changed at all, so I do not understand what makes you think it has changed. All I have changed is how the debug traces are printed, because you made suggestions on how they could be improved.

This type system is for typing a language I am developing, so the syntax is whatever I want it to be; however, founding the semantics on something with a solid mathematical foundation like lambda calculus makes sense to me. I can have let or not have let, and it can be monomorphic or polymorphic; I haven't decided. You can view these as variations on the core type system, and I can extend the algorithm to correctly infer the types for each of these possibilities. So think of it more as a family of algorithms rather than a single one. I will be extending it with more features, some experimental, in the future. Really you have to try expressing some algorithms and problems in the language to tell whether a feature is making things better or worse.

Your question about whether a variable is monomorphic or polymorphic puzzles me. Do you not understand intersection types?

Example 1. of HM Wikipedia page

As I stated before HM has intermediate types with both polymorphic and monomorphic variables. You guesstimated that your scheme is equivalent to HM.

In your scheme, a type for id like a->a can be used by copying it multiple times (if somehow typed magically; I don't understand where it dropped from in one of your examples). That's, I guesstimate, the same as instantiating a polymorphic type. But since you don't have quantifiers around it, a type like that, I expect, is treated either as fully polymorphic (copied) or as monomorphic (unified).

Which begs the question what your scheme does with an LC term which has a mixed monomorphic and polymorphic type. If it doesn't handle that case well, you've likely got an exploitable hole.

I get the idea behind intersection types a bit, but it reminds me too much of the design decision between structural and nominal typing. And as far as I know, most structural schemes tend to explode on normal programs. So I have an "okay academically, but will likely never pan out" feeling about it and can't get myself to become interested.

Compositional Inference 101

If you don't understand how intersection types work, then I can see this might all be a bit mysterious. If you are struggling with the notation and the concept then perhaps reading Kfoury and Wells' Principality and type inference for intersection types using expansion variables would help (you can find more at http://www.macs.hw.ac.uk/~jbw/itrs/bibliography.html). Let's look at how id would be typed. First, applying id in a code fragment without a definition:

(id 1, id True, id x)

A1.  [var id]         {id : a} a
A2.  [lit 1]          {} Int
A3.  [app (1) (2)]    {id : (Int -> a)} a
A4.  [var id]         {id : a} a
A5.  [var true]       {} Bool
A6.  [app (4) (5)]    {id : (Bool -> a)} a
A7.  [var id]         {id : a} a
A8.  [var x]          {x : a} a
A9.  [app (7) (8)]    {id : (a -> b), x : a} b
A10. [prd (6) (9)]    {id : (Bool -> a), id : (b -> c), x : b} (a * c)
A11. [prd (3) (10)]   {id : (Int -> a), id : (Bool -> b), id : (c -> d), x : c} (a * (b * d))

Although we are using a multi-set for the monomorphic context, the type of id is in effect a rank-1 intersection type, so you could write this using '&' for intersection (and I will add back the turnstile and term so this makes sense as a line by itself):

{id : (Int -> a) & (Bool -> b) & (c -> d), x : c}   |-   (id 1, id True, id x) : (a * (b * d))

Now the definition of 'id' from another code fragment would look like this:

id = \z . z

B1. [var z]        {z : a} a
B2. [abs z (1)]    {} (a -> a)
B3. [def id (2)]   {} => {id : {} (a -> a)}

Note the contexts so far in '{}' are requirements, so the code fragment needs a definition for the variable. 'def' is not an expression but a statement, and statements (and blocks of statements) can export definitions (the part after the '=>'). As this type system is compositional, we should be able to type the following program using only the information in A11 and B3:

id = \z . z;
(id 1, id True, id x)

B3. [def id (2)]      {} => {id : {} (a -> a)}
A11. [prd (3) (10)]   {id : (Int -> a), id : (Bool -> b), id : (c -> d), x : c} (a * (b * d))

Which can be done by unifying the type of 'id' exported from B3 with each of the required types of 'id' in A11. We use a different copy of the type of 'id' with fresh variables for each unification:

unify (a1 -> a1) (Int -> a)   =>   a1 = Int, a = Int
unify (a2 -> a2) (Bool -> b)   =>   a2 = Bool, b = Bool 
unify (a3 -> a3) (c -> d)   =>   a3 = c, d = c

applying these to A11 gives:
{id : (Int -> Int), id : (Bool -> Bool), id : (c -> c), x : c} (Int * (Bool * c))

and deleting 'id' from the requirements gives:
{x : a} (Int * (Bool * a))

This operation is the statement composition rule, but in effect this is the same thing that happens with:

let id = (\z . z) in (id 1, id true, id x)
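
I don't have the Prolog for this written yet, but for the let form a minimal sketch on top of the algorithm above might look like the following ('let' and 'instantiate_uses' are illustrative names, not code from the earlier post). copy_term takes a fresh copy of the exported typing for each use, and unifying the copied context back against the original keeps the variables that occur in the definition's own requirements monomorphic:

typing(Env, let(Var, Def, Body), Cxt, T) :-
    typing(Env, Def, DefCxt, DefTy),
    typing(Env, Body, BodyCxt, T),
    split(Var, BodyCxt, Uses, Rest),
    instantiate_uses(Uses, DefCxt, DefTy),
    append(DefCxt, Rest, Cxt).

% unify each required type of Var with a fresh instance of the definition's type
instantiate_uses(nil, _, _).
instantiate_uses(cons(Use, Tail), DefCxt, DefTy) :-
    copy_term(ty2(DefCxt, DefTy), ty2(DefCxt, Use)),
    instantiate_uses(Tail, DefCxt, DefTy).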

If instead the definition of 'id' had been monomorphic, something like:

(\id . (id 1, id true, id x))

The abstraction rule would apply to 'A11':

A11. [prd (3) (10)]   {id : (Int -> a), id : (Bool -> b), id : (c -> d), x : c} (a * (b * d))

Assign a fresh type variable a1 for id, and unify it with each use of id:
unify a1 (Int -> a)   =>   a1 = Int -> a
unify a1 (Bool -> b)   =>   a1 = Bool -> b 
unify a1 (c -> d)   =>   a1 = c -> d

Which would obviously fail to unify.

Shifting Goalposts

I don't have to understand intersection types (which I already stated I am not interested in) that well to understand your scheme. Wells' project is dead, which confirms my belief that their representation is both too costly and explodes on normal programs.

You asked for comments, you got them. My first task was to get your notation clean such that at least we have something to talk about. It's better than what you started with but still isn't clean enough to be discussed meaningfully. I again suggest that you use Wells' notation, which means reintroducing the term in a derivation.

I already informally, despite your notation, somewhat got what you do around an abstraction and a let. For the third time, I want to see it run on Example 1 of the HM Wikipedia page. (You did it once, but you were mucking around with a curried let then.)

I am starting to suspect you don't run your algorithm on the example I requested since you cannot type it.

Easily Done.

Sure, the only reason I didn't have it above was to keep the size of the post down after discussing all the 'id' stuff. I removed the term from each line because I find it more readable, and I suspect I am the one that is going to be using it the most :-), but just for you this is (I think) the notation you were referring to:

let bar = \x . let foo = \y . x in foo in bar

1. [var x]            x : {x : a} |- a
2. [abs y (1)]        (\y . x) : {x : a} |- (b -> a)
3. [var foo]          foo : {foo : a} |- a
4. [let foo (2) (3)]  (let foo = (\y . x) in foo) : {x : a} |- (b -> a)
5. [abs x (4)]        (\x . (let foo = (\y . x) in foo)) : {} |- (a -> (b -> a))
6. [var bar]          bar : {bar : a} |- a
7. [let bar (5) (6)]  (let bar = (\x . (let foo = (\y . x) in foo)) in bar) : {} |- (a -> (b -> a))

As you can see, we get the same type as HM, "{} (a -> b -> a)", which written like Wikipedia is "forall a b . a -> b -> a", but as previously discussed the universal quantifier at the beginning is unnecessary.

Might pan out but not convinced

Well. If you ask for comments you cannot maintain the position that your notation is for internal use only, can you? It's, on the contrary, I find, your task to present a maximally readable notation for a broad audience (of two, at the moment).

I personally would prefer if you write it like this. There doesn't seem to be a reason to deviate from the standard (but I can work with the other notation too.)

4. [let foo (2) (3)]  {x : a} |- (let foo = (\y . x) in foo) : (b -> a)


In the above typing I expect that {x:a} signifies that a should be treated differently from b, which isn't bound in the context. A simple copying and substitution mechanism on types probably breaks an invariant here. (For example, I expect you cannot simply copy a in the result type; I'll give it more thought later.)

But now we're at the point where your notation is readable but your algorithm is underspecified. Even if there is a problem, you can probably always get it to work; possibly, if necessary, at the cost of an (expensive) analysis of the context and type.

But might I ask in what direction you want to take this? If it's just for fun, be my guest. There seems to be a renewed interest in intersection types, but I am a bit wary of that. Academia neither does, nor documents, critical thought or negative results; it's a bit of a story-telling place. Which means that the same dead-horse idea will often pop up every ten years or so, and this looks like one of those ideas.

Again, if it's just for fun, be my guest, but I think it's unlikely it'll pan out. Even if you can get it to be theoretically correct, I find it very unlikely it'll work out in a pragmatic PL setting.

(I guess I should try and find a counterexample but, well, football is more interesting at the moment.)

Wells' Notation

Wells puts the term at the beginning, and I think it's reasonable to understand it as, "you can derive this type given this term and context". So if it's acceptable, I would prefer that to having it in the middle.

The algorithm is given in Prolog, a logic language, which pretty much amounts to a formal specification. I am not sure how much more precisely it could be specified.

I am not sure what you mean by 'copy a'? In any case there are many optimisations that can be done to this, and we all know what Knuth says about premature optimisation. I think there are some clever ways you can make it so that you don't have to copy or alpha-rename to instantiate a new unique type.

I am happy for this to be just for my own education and amusement, but I wouldn't object to other people using it if they wanted.

Oh. I informally look at a

Oh. I informally look at a polytype, which you don't seem to have, as something in the context which needs to be copied and substituted. No idea where I got that from, probably from writing a type-checker somewhere; or just a mental model developed during programming.

I don't find Prolog readable enough to accept as a specification. Then again, most academic type systems are probably flawed somewhere, so an implementation is a good thing.

In the types you give there seem to occur variables which should play different roles during the analysis. Maybe by compositionality you have it that all variables in a type can be treated equivalently, but that seems unlikely to me. Which would imply an exploitable hole.

Maybe you should ask someone else about it at this point. There are type theorists here and I am watching soccer. Very good game, atm.

Examples for Validation and Optimisation

If I wanted to look at optimisation and performance, the first thing I would need is some examples to test on. For SAT there is SATLIB as well as a few other similar places where I have been able to find examples. Is there something similar for lambda calculus?

If not, are there any good ways to procedurally generate lambda examples?

Procedurally generate lambda examples

This is a fine idea^tm. It shouldn't be too hard to code a generator by hand. And I personally would like to see algorithms accompanying papers run against a corpus.

Maybe you can even exploit something like, what was it, QuickCheck?, for it; but that's maybe too wild.

But no, I don't think it exists.

Random Lambda Term Generator

I generated the following lambda term with a simple generator using de Bruijn numbers, so that valid variable references are always produced. I then ran it several times until it produced a large term. I realise it is a bit long, but I wanted to post it in case anyone else wanted to try type inference on it for comparison with other systems.
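
The generator itself is only a few lines. A rough sketch of the idea (this is SWI-Prolog using random_between/3 from library(random), not the exact code I ran; Fuel bounds the size and Depth counts the enclosing lambdas):

% pick a constructor at random; when out of fuel, force a variable
random_term(Fuel, Depth, Term) :-
    (   Fuel =< 0
    ->  Kind = 0
    ;   random_between(0, 2, Kind)
    ),
    build(Kind, Fuel, Depth, Term).

build(0, Fuel, Depth, Term) :-
    (   Depth > 0
    ->  Max is Depth - 1,            % any index below Depth is a valid reference
        random_between(0, Max, I),
        Term = var(I)
    ;   build(1, Fuel, Depth, Term)  % nothing in scope yet: emit a lambda instead
    ).
build(1, Fuel, Depth, lam(Body)) :-  % abstraction brings one more variable into scope
    F1 is Fuel - 1, D1 is Depth + 1,
    random_term(F1, D1, Body).
build(2, Fuel, Depth, app(Fun, Arg)) :-
    F1 is Fuel - 1,
    random_term(F1, Depth, Fun),
    random_term(F1, Depth, Arg).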

Here's the example I used:

((\t0 . t0) ((\t0 . (((((\t1 . (\t2 . ((\t3 . t1) ((t2 (\t3 . (\t4 . t3))) (t1 t
1))))) (\t1 . ((t0 (\t2 . ((\t3 . ((\t4 . t4) (\t4 . t2))) t1))) (\t2 . t0)))) (
\t1 . (t0 (\t2 . (\t3 . ((\t4 . t0) ((\t4 . ((((((\t5 . (((\t6 . (t5 t1)) ((\t6 
. (\t7 . t1)) (t5 (\t6 . ((\t7 . t2) (((\t7 . (((\t8 . t6) t5) t2)) (\t7 . (t2 (
t1 ((\t8 . t1) t6))))) (t5 ((t5 t4) t6)))))))) ((t2 (\t6 . t0)) ((\t6 . ((\t7 . 
(\t8 . (t0 (\t9 . ((\t10 . t3) ((\t10 . t3) (((\t10 . (\t11 . t4)) ((\t10 . (\t1
1 . (\t12 . ((t5 (t6 (((t3 t12) t4) (t11 t2)))) (((\t13 . ((((((\t14 . ((\t15 . 
(\t16 . (\t17 . ((((t2 (((((t0 (\t18 . (\t19 . t9))) t2) (\t18 . t15)) ((t14 (\t
18 . t13)) (\t18 . (\t19 . (\t20 . (((t12 (\t21 . ((((t20 ((\t22 . (\t23 . t9)) 
(t18 (\t22 . ((t11 t12) (t7 t12)))))) t13) ((\t22 . ((\t23 . t23) (t15 (\t23 . (
\t24 . (((((t1 (\t25 . t3)) (t10 (\t25 . (\t26 . (\t27 . t14))))) t20) t4) t17))
)))) (\t22 . (((t15 (((t7 (((\t23 . t5) t10) (((\t23 . ((\t24 . ((((t17 t4) (\t2
5 . t17)) t7) ((((\t25 . (((t18 ((t18 ((t21 t25) ((((\t26 . ((((\t27 . (\t28 . t
5)) ((\t27 . ((\t28 . (\t29 . ((((((\t30 . ((t11 ((\t31 . (t27 (((t19 (\t32 . ((
\t33 . (\t34 . t27)) t30))) (t1 t13)) t4))) t10)) t5)) (((\t30 . t21) ((t20 t13)
 ((\t30 . t15) ((\t30 . t22) t8)))) (\t30 . (\t31 . (\t32 . t2))))) t2) (\t30 . 
(\t31 . (\t32 . ((((\t33 . t9) (\t33 . ((\t34 . (\t35 . ((\t36 . (\t37 . (\t38 .
 (t15 (\t39 . t2))))) ((((t20 ((\t36 . t22) (t34 t17))) ((\t36 . t2) ((\t36 . (\
t37 . (t29 (\t38 . t25)))) (t21 t30)))) (\t36 . (t23 (t25 (\t37 . t10))))) ((\t3
6 . (t24 (\t37 . t16))) t25))))) ((\t34 . (\t35 . t15)) t2)))) (\t33 . ((\t34 . 
(\t35 . (\t36 . (\t37 . ((t1 ((\t38 . (\t39 . (\t40 . (\t41 . t29)))) (t24 (t8 (
\t38 . t20))))) t23))))) (t29 (\t34 . t8))))) (\t33 . t14)))))) t12) (t3 t4)))) 
(t16 (t23 t2)))) t6)) (t16 (\t27 . ((t26 (\t28 . (\t29 . t24))) (\t28 . ((t3 (\t
29 . ((((\t30 . t4) ((t11 t10) ((\t30 . t16) ((((\t30 . t17) ((t15 (\t30 . ((\t3
1 . t30) (\t31 . ((\t32 . ((\t33 . ((\t34 . t22) ((t32 t26) t8))) (((\t33 . (\t3
4 . t23)) (\t33 . ((\t34 . (\t35 . t19)) (\t34 . t6)))) ((((\t33 . t11) (((\t33 
. t24) t19) (((\t33 . t25) ((\t33 . ((\t34 . t26) t3)) t21)) t17))) (\t33 . ((((
\t34 . (\t35 . ((\t36 . (t31 (\t37 . (\t38 . t24)))) t30))) (((t21 (\t34 . ((t25
 (\t35 . (t1 ((\t36 . t19) ((((t11 t15) t2) ((\t36 . ((\t37 . t16) t23)) ((\t36 
. (t2 t0)) t28))) (t35 (\t36 . t29))))))) ((t11 (t34 t7)) t21)))) (t14 t29)) t20
)) ((t19 (t27 (\t34 . (((t14 (\t35 . (\t36 . t17))) (\t35 . t10)) (\t35 . (t31 (
t35 (t14 t3)))))))) (t25 (\t34 . (t16 (t2 ((t30 (\t35 . (\t36 . t8))) ((\t35 . (
t5 (t25 ((t27 (\t36 . ((\t37 . t23) t23))) (t8 t0))))) (t2 (t27 ((((\t35 . ((((\
t36 . t36) t26) (t11 (\t36 . ((((\t37 . (((\t38 . t2) t2) t19)) t10) ((\t37 . (\
t38 . (t13 (((\t39 . (t14 t28)) (\t39 . t23)) t35)))) t10)) (t30 (\t37 . (t25 t1
))))))) ((t30 t22) t12))) (\t35 . (((\t36 . (t2 ((t25 (t10 ((((((((\t37 . ((t2 t
36) (\t38 . (\t39 . t22)))) (t30 (\t37 . ((\t38 . (t37 ((\t39 . ((\t40 . (((t14 
t1) t6) (\t41 . t0))) (\t40 . t9))) (\t39 . t11)))) (\t38 . ((\t39 . t36) t24)))
))) t3) (\t37 . t17)) (\t37 . ((((t8 t25) (\t38 . ((t38 t9) t10))) (t29 (\t38 . 
(\t39 . t22)))) (\t38 . (\t39 . (t35 t7)))))) (((((\t37 . (\t38 . ((\t39 . (((t3
8 (t13 (((t16 t10) (\t40 . (\t41 . (((((\t42 . (t19 t27)) ((\t42 . (t20 (t39 t19
))) t20)) t4) ((t36 t10) (\t42 . ((t20 (\t43 . (\t44 . ((\t45 . ((t4 t16) (((\t4
6 . ((t17 ((\t47 . ((\t48 . t12) (t37 (\t48 . (t24 t25))))) ((\t47 . (t11 t22)) 
(t10 t43)))) (t41 t42))) (t10 (\t46 . (t31 t1)))) t7))) t38)))) (((t21 ((\t43 . 
(t23 (\t44 . ((\t45 . (\t46 . t5)) t2)))) t41)) (\t43 . t18)) (\t43 . t42)))))) 
(\t42 . ((((\t43 . t37) ((\t43 . (((\t44 . t40) ((t1 (\t44 . (\t45 . t28))) t34)
) t38)) ((t30 ((\t43 . (\t44 . (\t45 . ((t7 ((t22 (((\t46 . t33) (t6 t11)) (\t46
 . ((t2 (\t47 . t13)) ((t15 ((t34 t35) t28)) (\t47 . t7)))))) (\t46 . (\t47 . t4
2)))) ((t13 (\t46 . (t35 t6))) (\t46 . t40)))))) t14)) (\t43 . ((t5 (\t44 . t32)
) t0))))) (((\t43 . (((\t44 . (\t45 . t37)) (\t44 . ((\t45 . (((((t36 ((t11 (\t4
6 . t11)) t15)) t18) t32) (\t46 . t13)) (t45 (\t46 . ((t8 (t18 (\t47 . t42))) (t
46 (\t47 . t35))))))) (((t17 t15) ((\t45 . (t33 t33)) (\t45 . (((t37 t36) ((\t46
 . (((((t32 t40) t25) t25) (t11 t17)) ((\t47 . t4) t32))) (((t40 t41) ((((t36 t0
) ((\t46 . t26) t45)) t11) t10)) t18))) (t14 (\t46 . t11)))))) ((((\t45 . t44) t
5) (\t45 . t18)) t31))))) ((\t44 . (\t45 . ((\t46 . (\t47 . t26)) ((((((\t46 . (
t0 t28)) t42) ((\t46 . ((\t47 . (((\t48 . (t19 ((\t49 . t0) ((\t49 . t16) (((t0 
(\t49 . (\t50 . t13))) (((t38 (t5 (\t49 . (\t50 . ((\t51 . ((\t52 . (\t53 . (t0 
(\t54 . (t34 t11))))) ((t1 ((t46 (((t45 t49) t31) ((t21 (\t52 . t28)) t21))) (((
\t52 . (\t53 . t35)) t16) (\t52 . t29)))) ((t19 t2) t24)))) ((t36 ((t21 (\t51 . 
(\t52 . (\t53 . (t18 t19))))) (\t51 . (((((\t52 . (t31 (\t53 . t35))) (((t26 (t9
 ((\t52 . ((((\t53 . t21) (\t53 . ((\t54 . ((t48 ((\t55 . ((((\t56 . ((t10 t45) 
t12)) (t46 t31)) (\t56 . (\t57 . ((\t58 . (\t59 . t49)) t14)))) (\t56 . ((\t57 .
 t41) t35)))) (\t55 . (((\t56 . (\t57 . (((\t58 . (\t59 . t0)) t10) (t41 (((((t3
7 (\t58 . (\t59 . (t9 t45)))) t41) (\t58 . ((\t59 . (((\t60 . (t39 (((\t61 . ((\
t62 . (t23 (((t58 ((t39 ((((((\t63 . (((t39 t3) t37) t17)) t3) t26) t58) t49) ((
(\t63 . t37) t62) t7))) (\t63 . t9))) (\t63 . ((t49 ((\t64 . t39) t53)) t33))) (
(\t63 . t62) t35)))) (t21 (t4 t37)))) (t51 ((\t61 . ((t22 ((t2 (t7 ((\t62 . (t39
 t45)) t18))) t40)) t8)) (t7 (\t61 . (((\t62 . (\t63 . (\t64 . (\t65 . (\t66 . (
((\t67 . t44) (\t67 . t65)) (\t67 . ((((\t68 . ((\t69 . (((t21 (\t70 . (\t71 . t
69))) t59) (t16 (\t70 . t10)))) t59)) t53) t15) (((t34 t1) ((\t68 . (((\t69 . (\
t70 . (\t71 . ((t36 (\t72 . t53)) t41)))) t26) t37)) t47)) (\t68 . (\t69 . (\t70
 . (t53 t25))))))))))))) t32) t0)))))) ((t49 t41) t36)))) t4) ((\t60 . ((\t61 . 
((t55 ((((t55 t60) (\t62 . t49)) t58) (\t62 . (t43 (\t63 . (\t64 . t33)))))) t18
)) (\t61 . t28))) ((\t60 . ((((\t61 . ((\t62 . t9) (t9 (\t62 . t25)))) t34) t40)
 t42)) t44)))) (\t59 . ((((\t60 . (t2 (\t61 . (((\t62 . (\t63 . ((t0 t6) t54))) 
t54) ((t26 t30) t15))))) (\t60 . t52)) (\t60 . ((\t61 . t60) (\t61 . t19)))) t29
))))) (\t58 . ((\t59 . ((t28 (((\t60 . t23) (t50 t27)) (\t60 . (\t61 . t17)))) (
t51 ((\t60 . ((((\t61 . t31) ((\t61 . (\t62 . (\t63 . t8))) (\t61 . ((\t62 . ((\
t63 . t13) t55)) (((\t62 . (\t63 . (\t64 . t49))) t58) ((\t62 . ((t36 ((t16 (\t6
3 . t29)) t11)) t11)) (\t62 . t9))))))) ((\t61 . t48) t21)) (\t61 . (\t62 . (\t6
3 . (\t64 . (((\t65 . ((t65 (((\t66 . (\t67 . (t35 ((\t68 . (\t69 . ((\t70 . (((
(\t71 . (\t72 . ((((\t73 . ((\t74 . t50) (\t74 . (t5 t66)))) (\t73 . t5)) ((t15 
t53) t22)) (\t73 . (\t74 . (\t75 . ((\t76 . (((\t77 . (\t78 . ((((\t79 . t5) t44
) (\t79 . (\t80 . t37))) t77))) (\t77 . ((\t78 . t46) t35))) (((((t13 (\t77 . (t
76 t19))) t4) ((\t77 . (((\t78 . ((t1 (((\t79 . t37) ((t44 (t10 ((\t79 . ((\t80 
. (t54 (\t81 . ((\t82 . t46) t75)))) t39)) (t74 t66)))) t67)) t10)) t28)) (t14 t
60)) t25)) (\t77 . (t61 (t49 t11))))) ((\t77 . (\t78 . (t48 (\t79 . t44)))) ((\t
77 . (\t78 . (\t79 . t25))) (\t77 . t26)))) t15))) t26))))))) t51) (\t71 . t9)) 
(t0 (\t71 . t4)))) (\t70 . (\t71 . (t30 ((\t72 . (\t73 . t3)) (\t72 . t38)))))))
) t54)))) ((\t66 . (\t67 . ((\t68 . (\t69 . (\t70 . (((\t71 . t57) t60) ((\t71 .
 (\t72 . t70)) t28))))) (\t68 . ((\t69 . t60) t45))))) ((\t66 . (\t67 . t27)) t6
4))) (\t66 . t64))) (t64 (\t66 . ((\t67 . (((\t68 . ((\t69 . t46) ((\t69 . t56) 
((\t69 . ((t26 t65) (((t18 t69) ((\t70 . (t68 t29)) ((t0 t11) (t13 t19)))) (\t70
 . ((t21 t47) (t52 ((\t71 . t61) (\t71 . ((\t72 . t16) (\t72 . (\t73 . t49))))))
))))) (\t69 . (\t70 . (\t71 . t69))))))) ((t23 (\t68 . (((t49 (\t69 . t21)) ((\t
69 . t36) t65)) ((t20 ((((t56 (\t69 . (\t70 . (t11 (t61 t63))))) (\t69 . (t62 ((
t3 t65) ((t6 (\t70 . (\t71 . (t42 t68)))) (((((\t70 . ((\t71 . t34) ((t13 (\t71 
. (\t72 . ((\t73 . (\t74 . (\t75 . t14))) (((t33 (t11 ((\t73 . (((\t74 . (((\t75
 . t52) ((t50 (((t48 ((\t75 . (\t76 . t65)) ((\t75 . (\t76 . t27)) t22))) t3) t6
3)) (\t75 . t27))) (t8 t49))) t24) t49)) (t28 (\t73 . (((((\t74 . (t36 t49)) ((t
61 (\t74 . t3)) ((\t74 . ((t28 (\t75 . (t40 t37))) (\t75 . t57))) (\t74 . ((t43 
t45) (t47 ((\t75 . ((\t76 . ((t74 (((\t77 . t3) (((t33 (((\t77 . (\t78 . t66)) (
(\t77 . t70) t34)) t2)) (\t77 . ((\t78 . (\t79 . (\t80 . t16))) (((t65 t59) (t51
 t15)) (t17 (\t78 . (\t79 . t21))))))) t56)) t75)) t32)) t59)) (\t75 . t3)))))))
) (\t74 . (((\t75 . (\t76 . t28)) ((t60 t49) ((\t75 . t48) (\t75 . t70)))) t3)))
 t56) ((\t74 . t68) (\t74 . t21)))))))) ((\t73 . ((\t74 . (((((((\t75 . (t63 (\t
76 . t16))) (((\t75 . (\t76 . (\t77 . (\t78 . ((t73 (\t79 . ((\t80 . ((\t81 . ((
\t82 . ((\t83 . ((((\t84 . (((((\t85 . (\t86 . t78)) (t45 (t60 (t14 (t25 t47))))
) (t14 (\t85 . t53))) ((((\t85 . t42) (\t85 . t12)) (t79 t25)) t63)) (t67 t27)))
 t58) t14) (t1 ((\t84 . t82) (\t84 . (\t85 . t63)))))) (((\t83 . t17) t60) (\t83
 . t16)))) (\t82 . t11))) t33)) ((\t80 . t40) t17)))) (\t79 . (\t80 . t36)))))))
 t48) t25)) t49) t73) ((\t75 . ((t13 t8) t53)) (\t75 . (\t76 . t16)))) (t54 t21)
) (t14 (\t75 . (((\t76 . ((\t77 . (t68 (\t78 . (t13 t40)))) (\t77 . (t34 (\t78 .
 t11))))) (t44 t0)) (\t76 . ((t34 (\t77 . (((\t78 . t77) (t46 (t68 t39))) (t69 (
\t78 . ((\t79 . (\t80 . t14)) (\t79 . t46))))))) (t56 t19)))))))) (t34 (\t74 . (
\t75 . t69))))) (\t73 . ((\t74 . (((\t75 . t32) t47) (\t75 . t38))) ((\t74 . (\t
75 . t17)) (t3 t46)))))) t9))))) (\t71 . t43)))) (\t70 . ((\t71 . ((t27 ((\t72 .
 (\t73 . ((t66 (\t74 . t58)) ((t48 (((t62 (\t74 . (((t13 t33) (\t75 . ((t16 (\t7
6 . (\t77 . (\t78 . t63)))) (\t76 . (t72 t41))))) (\t75 . ((((t63 t36) (t56 t51)
) t9) t12))))) (((\t74 . (t29 ((\t75 . (\t76 . (t2 t49))) (t62 ((\t75 . t64) (\t
75 . t64)))))) ((\t74 . (t49 (\t75 . (\t76 . t68)))) (\t74 . (((\t75 . (t56 (t17
 t67))) ((\t75 . t60) ((((t49 t11) (\t75 . (\t76 . ((\t77 . (t63 t66)) (\t77 . (
(\t78 . (\t79 . t39)) t40)))))) ((\t75 . (\t76 . (((t76 t17) (t2 t39)) t29))) ((
((\t75 . (\t76 . (\t77 . (\t78 . (\t79 . (t21 (t5 (\t80 . t24)))))))) (\t75 . (\
t76 . (t61 (\t77 . (t12 (\t78 . (\t79 . (t49 t20))))))))) (t15 t41)) (\t75 . (\t
76 . (t36 ((\t77 . (\t78 . ((((\t79 . t52) (\t79 . t77)) (((t29 ((t63 (\t79 . t5
1)) t27)) (\t79 . (t9 (\t80 . ((t44 (\t81 . (\t82 . t23))) ((\t81 . (\t82 . ((t2
1 (\t83 . (t37 (\t84 . t29)))) t51))) ((\t81 . t37) t7))))))) ((t75 (\t79 . (\t8
0 . ((\t81 . (((\t82 . ((t40 t13) (\t83 . (t5 ((\t84 . t25) (t71 (\t84 . (\t85 .
 (t12 (((((t52 t72) (((t11 (t20 t60)) t26) (\t86 . (t43 t4)))) (((\t86 . ((\t87 
. (\t88 . t19)) t39)) (\t86 . t49)) t8)) (t42 ((\t86 . ((\t87 . (((\t88 . ((\t89
 . t85) ((\t89 . ((\t90 . (\t91 . (\t92 . (t82 (t50 (\t93 . (\t94 . (\t95 . ((\t
96 . t61) t6))))))))) (((\t90 . t72) (\t90 . (t45 t71))) t51))) t25))) t33) (\t8
8 . t76))) t80)) t20))) ((\t86 . t16) (\t86 . t68)))))))))))) (t71 (\t82 . (\t83
 . (t22 ((\t84 . t41) (((t48 (\t84 . (t37 (\t85 . (\t86 . (\t87 . (t82 t74))))))
) t1) (\t84 . ((\t85 . ((\t86 . (\t87 . (((\t88 . (((t9 t47) (t59 ((t72 ((\t89 .
 (t13 t42)) t17)) (\t89 . t63)))) (t43 (((t60 ((\t89 . (\t90 . t85)) ((t19 t67) 
(\t89 . t19)))) t11) ((((\t89 . t41) t56) (\t89 . t13)) (\t89 . t29)))))) (t16 (
(\t88 . (\t89 . (\t90 . (t77 (\t91 . (\t92 . (\t93 . t15))))))) t31))) t68))) (t
10 t67))) ((t23 t56) (\t85 . (t72 t46)))))))))))) (\t82 . (\t83 . (\t84 . (t37 (
(((\t85 . (\t86 . (\t87 . ((\t88 . ((\t89 . t89) ((t48 t84) t6))) (t85 ((t65 ((\
t88 . (t24 (t26 t7))) (\t88 . (\t89 . (t79 (\t90 . (\t91 . t86))))))) t13)))))) 
t0) t69) (t40 t4)))))))) (t57 (\t81 . (\t82 . t7))))))) (\t79 . ((t75 (\t80 . t7
0)) t43))))) (\t79 . (t14 t26))))) ((\t77 . ((t0 t42) t18)) (t59 t64))))))))) ((
(t8 t58) (\t75 . ((((\t76 . (t33 t42)) t44) (\t76 . ((\t77 . (t74 (\t78 . (\t79 
. (\t80 . t22))))) ((t58 t72) (\t77 . (((t28 (\t78 . ((\t79 . (((\t80 . t37) (\t
80 . t73)) (((\t80 . (t19 t40)) (((\t80 . t46) (\t80 . t22)) (t24 t2))) t69))) (
t38 t20)))) t63) ((\t78 . (\t79 . t38)) t13))))))) t20))) (\t75 . t32))))) t62))
)) t26)) t25)) (((\t74 . t8) t42) ((\t74 . (t25 t71)) ((t69 t10) t16))))))) t49)
) ((\t72 . (\t73 . t36)) t9))) (t55 t36)))) (\t70 . t51)) (t32 t12)) t63)))))) (
\t69 . t3)) t60)) ((t3 t38) (\t69 . (\t70 . t44))))))) t37)) t15)) ((\t67 . (\t6
8 . ((\t69 . (\t70 . t13)) (\t69 . (((\t70 . t68) (\t70 . t41)) (t23 (\t70 . (\t
71 . (\t72 . t64))))))))) t10)))))) t56) t21))))))) ((t34 (\t60 . t6)) (t16 ((((
\t60 . t54) (\t60 . t19)) t43) (\t60 . t53)))))))) (\t59 . t27)))) (\t58 . t8)))
))) t6) t9)))) (((((\t55 . ((t12 t24) ((\t56 . (t2 t36)) (\t56 . (\t57 . t19))))
) ((t1 (\t55 . t8)) t4)) t40) t51) t42))) t37))) (\t53 . t45)) t23)) t18))) ((t2
7 (\t52 . (\t53 . (\t54 . (((t47 ((\t55 . t33) t54)) (t54 ((t26 t19) (\t55 . (t5
0 (\t56 . (\t57 . (\t58 . (\t59 . t30))))))))) (\t55 . ((t31 (\t56 . (\t57 . (t3
7 t22)))) t43))))))) t24)) t22)) ((\t52 . (\t53 . (\t54 . t10))) (\t52 . (t17 (\
t53 . (t51 ((t5 (((\t54 . (t28 t37)) (\t54 . (\t55 . (t3 (\t56 . ((\t57 . (t29 (
\t58 . t15))) (((\t57 . t19) t42) (\t57 . ((t22 t23) (\t58 . (t36 t42)))))))))))
 ((t23 (t19 (((\t54 . (t3 t29)) t23) (t29 ((((t31 (t28 t36)) ((t30 (((t33 t41) t
24) ((t7 (((t1 ((((t32 t0) ((t51 ((t16 (((((\t54 . (\t55 . t8)) (t31 (t0 (\t54 .
 (\t55 . (\t56 . t17)))))) t28) (\t54 . (\t55 . (\t56 . (t46 t52))))) (\t54 . t5
1))) ((\t54 . t22) (t17 (t13 (t14 (\t54 . t22))))))) ((t52 ((\t54 . ((\t55 . t40
) t9)) (\t54 . (\t55 . (((\t56 . ((t36 (\t57 . ((\t58 . (t33 (((\t59 . (((\t60 .
 t46) t31) ((\t60 . (\t61 . (\t62 . (t49 (((\t63 . (\t64 . (\t65 . (t16 ((\t66 .
 ((\t67 . t25) ((\t67 . (\t68 . t56)) t47))) (\t66 . t28)))))) t32) (\t63 . ((\t
64 . t42) t23))))))) (\t60 . (t21 t53))))) (((\t59 . (t20 ((((((\t60 . t60) ((\t
60 . ((\t61 . ((t7 (\t62 . t34)) t9)) ((((t4 ((t33 (\t61 . t8)) ((t3 t54) (\t61 
. t37)))) (t38 (t33 t53))) t16) (((\t61 . (\t62 . ((\t63 . (\t64 . (\t65 . ((\t6
6 . t10) (\t66 . (((\t67 . ((t0 (\t68 . (t67 ((t60 (t34 ((t20 t51) t62))) (t49 (
\t69 . (\t70 . ((\t71 . t6) (t18 ((\t71 . (((t63 (\t72 . t41)) (t33 t5)) t58)) (
\t71 . (t45 ((((\t72 . ((t28 t9) (t25 (((t24 (\t73 . (\t74 . ((\t75 . t8) (\t75 
. ((\t76 . (\t77 . (\t78 . (\t79 . t31)))) ((((t9 (\t76 . (((t29 t11) t58) t18))
) ((\t76 . t67) ((\t76 . t42) ((\t76 . t21) (\t76 . ((\t77 . t55) (t28 t28))))))
) ((\t76 . (((((t56 (\t77 . t47)) (((t15 (((((\t77 . t21) (\t77 . t1)) t58) t48)
 t50)) ((\t77 . t33) t20)) (\t77 . (t60 ((\t78 . t12) (\t78 . ((\t79 . (((t71 ((
t52 ((t74 t71) ((t58 (\t80 . ((\t81 . (\t82 . t46)) ((\t81 . t3) t44)))) t60))) 
(t73 t33))) (\t80 . t77)) t59)) (\t79 . (\t80 . t6))))))))) ((\t77 . (\t78 . (\t
79 . (t67 ((\t80 . t57) t15))))) ((t64 t21) t3))) ((\t77 . t24) t7)) t39)) (t74 
(((t52 t5) ((\t76 . (\t77 . (((t5 (t7 ((\t78 . t8) (t28 t25)))) t73) (((\t78 . (
t32 t40)) ((t14 ((\t78 . t48) ((\t78 . (\t79 . t74)) (((\t78 . (\t79 . t75)) t4)
 (t15 ((((\t78 . (\t79 . (t25 t69))) ((\t78 . t54) t55)) t8) t12)))))) ((\t78 . 
(\t79 . (\t80 . (((\t81 . ((\t82 . ((t79 t75) t47)) ((t40 (\t82 . t11)) (\t82 . 
(\t83 . (\t84 . t77)))))) (t62 ((\t81 . t39) (\t81 . (\t82 . t13))))) t48)))) t1
2))) (\t78 . (\t79 . ((((\t80 . (t51 t22)) ((\t80 . t68) (\t80 . t31))) t32) (((
\t80 . (t35 ((\t81 . (t42 ((((\t82 . t70) t6) ((\t82 . (\t83 . (\t84 . ((t9 (\t8
5 . t67)) ((\t85 . t38) t24))))) (\t82 . t25))) t11))) ((\t81 . ((\t82 . (\t83 .
 t73)) (\t82 . t49))) t20)))) t27) ((t21 (t33 t64)) (\t80 . t5)))))))))) (\t76 .
 (\t77 . ((t31 (t0 t71)) (t29 ((\t78 . ((t76 t23) t33)) (\t78 . (\t79 . ((\t80 .
 t23) (\t80 . t16))))))))))) (\t76 . ((t68 ((t61 t73) ((((\t77 . ((\t78 . t26) (
(t53 (\t78 . ((((t18 (((t58 (\t79 . ((\t80 . (((\t81 . t26) (\t81 . (((\t82 . ((
t76 t33) t18)) t50) (\t82 . (\t83 . (\t84 . (\t85 . t8))))))) (\t81 . ((((\t82 .
 t54) t81) ((t76 ((\t82 . ((\t83 . t14) ((\t83 . ((t14 t8) t26)) t42))) t23)) t7
6)) ((\t82 . (t0 (\t83 . t68))) t29))))) ((\t80 . t64) t43)))) t48) (t48 t41))) 
t49) ((t11 t75) t27)) t25))) t50))) t29) ((\t77 . (t62 (\t78 . t52))) t23)) (t29
 (\t77 . ((t20 (((\t78 . t45) t37) t22)) ((t40 (t65 t6)) t69))))))) (((\t77 . t3
3) (t20 t46)) (\t77 . ((\t78 . t49) t52))))))))) ((((((\t76 . (\t77 . (t40 (\t78
 . (\t79 . (\t80 . t24)))))) (\t76 . t18)) (((\t76 . (t16 (\t77 . t71))) (((t27 
t40) (\t76 . t49)) t36)) ((\t76 . t8) t14))) (\t76 . ((\t77 . t31) t8))) ((t72 (
(\t76 . (((\t77 . t66) t61) (t27 t4))) (t40 (t22 (((\t76 . t26) (\t76 . ((((t45 
(\t77 . ((t76 (\t78 . (((((((t73 (\t79 . (t76 t75))) (\t79 . (((((((\t80 . ((((\
t81 . t27) (\t81 . (\t82 . (t14 ((((\t83 . (\t84 . t25)) t68) t20) t0))))) t56) 
(t54 t65))) t78) (\t80 . ((t16 (\t81 . (\t82 . ((t78 (t26 (\t83 . (t22 (\t84 . (
\t85 . t9)))))) t33)))) t79))) (t48 t6)) (t36 ((((((t26 t1) (((\t80 . t55) (t24 
(\t80 . (((\t81 . (\t82 . (\t83 . t16))) (t20 ((t47 t16) (\t81 . (t27 (\t82 . t1
9)))))) t65)))) (t53 t52))) ((\t80 . t50) (t57 ((((\t80 . ((\t81 . t16) (\t81 . 
t22))) (((\t80 . ((\t81 . (\t82 . t30)) (\t81 . t40))) t68) t12)) ((\t80 . (\t81
 . (((\t82 . t74) t19) t4))) t24)) (\t80 . (\t81 . t29)))))) (\t80 . t3)) (t46 t
0)) (\t80 . (t36 t50))))) ((((\t80 . t0) t47) t37) t52)) (t70 (((\t80 . t34) (\t
80 . t46)) t77))))) t72) t14) (\t79 . (t27 t5))) (t14 t49)) (t60 (((t6 (((\t79 .
 ((\t80 . (t41 (\t81 . (t34 t6)))) (t66 t47))) ((\t79 . ((t14 ((\t80 . (\t81 . (
t48 t61))) (\t80 . (\t81 . ((\t82 . t60) (\t82 . (\t83 . (\t84 . (\t85 . (\t86 .
 (t82 (t13 t48)))))))))))) t56)) (\t79 . (\t80 . (\t81 . ((\t82 . (t52 (t70 (t19
 (\t83 . (t66 (\t84 . (\t85 . t22)))))))) t21)))))) (\t79 . (t4 (\t80 . t75)))))
 (\t79 . ((t1 ((\t80 . ((\t81 . t47) ((((\t81 . (\t82 . ((t13 t11) (((t34 (t34 t
58)) t32) (t42 ((t73 (\t83 . t69)) ((\t83 . t62) (\t83 . t77)))))))) t12) (t67 (
\t81 . t58))) t46))) (\t80 . ((\t81 . t8) ((\t81 . ((\t82 . ((\t83 . t48) (t39 (
(((t72 (\t83 . ((((\t84 . t5) t53) (t11 t34)) (t11 (\t84 . (t48 (t2 (t45 ((\t85 
. (\t86 . (t7 t59))) (\t85 . (\t86 . (t61 (\t87 . (t29 ((\t88 . t53) t53))))))))
))))))) t12) (\t83 . (((\t84 . (\t85 . ((\t86 . (t39 ((\t87 . (\t88 . t75)) t75)
)) (t56 ((((\t86 . (\t87 . ((t35 t64) t74))) (t30 (\t86 . t13))) t61) t18))))) (
\t84 . (t57 t8))) t75))) (\t83 . (\t84 . t8)))))) t4)) ((t77 (t63 (((t65 (\t81 .
 (t21 (((t43 (((\t82 . t11) ((t68 (t40 ((t31 (t55 (\t82 . (\t83 . t60)))) ((\t82
 . (t61 t45)) (((\t82 . t50) t69) t70))))) t20)) (\t82 . t14))) (\t82 . (\t83 . 
(t22 (((\t84 . t81) (t42 (t72 t28))) (\t84 . (t56 t79))))))) (\t82 . t60))))) (\
t81 . ((\t82 . (\t83 . t22)) (\t82 . t45)))) t55))) t2)))))) (t14 ((\t80 . ((\t8
1 . t43) t51)) (t34 t61)))))) t38))))) (((t25 (t14 (((((\t78 . ((t6 (t35 (((t11 
t8) ((t51 (t34 (t1 (t42 t65)))) t22)) ((t13 t60) (\t79 . t79))))) t48)) t20) ((\
t78 . (((((t31 t27) (\t79 . (t71 t0))) (((\t79 . t8) (t31 ((\t79 . (\t80 . t63))
 (((\t79 . (t21 (\t80 . (\t81 . t0)))) (((((\t79 . ((\t80 . t35) (\t80 . (\t81 .
 (((\t82 . ((\t83 . (t77 t9)) t79)) ((\t82 . ((t41 (((t80 ((\t83 . (\t84 . t48))
 (\t83 . (\t84 . (\t85 . (\t86 . t48)))))) (t14 (t76 (\t83 . (\t84 . t33))))) ((
t70 ((t32 t36) (t15 t12))) (\t83 . t8)))) (\t83 . t80))) t63)) t63))))) (\t79 . 
t73)) t27) (t36 t11)) (\t79 . (\t80 . (t20 t78))))) t71)))) (t60 (((\t79 . (t8 (
\t80 . t78))) (t6 (\t79 . t38))) t26)))) t56) (\t79 . (t36 ((\t80 . t59) (t71 t3
7)))))) ((\t78 . t11) t14))) (t52 t56)) t10))) (\t78 . (\t79 . (\t80 . ((((\t81 
. (\t82 . ((t60 t28) (\t83 . t81)))) (\t81 . (t44 t17))) (\t81 . (\t82 . (t58 t7
2)))) t7))))) (t10 t9))))) t56) ((\t77 . (\t78 . t58)) (\t77 . (t46 t12)))) (\t7
7 . (t20 ((\t78 . (t46 ((t53 t65) (t32 t58)))) ((((((\t78 . ((t1 ((\t79 . t18) t
32)) (((((\t79 . (t25 (t43 t63))) t5) (\t79 . ((\t80 . (t76 (\t81 . t4))) (\t80 
. t17)))) (((t11 t47) t28) (\t79 . ((\t80 . ((\t81 . ((\t82 . t12) t21)) (\t81 .
 t8))) (t17 t12))))) (t40 (\t79 . t53))))) ((\t78 . (t74 t23)) (\t78 . (\t79 . (
\t80 . t33))))) t56) (\t78 . t39)) t70) (t70 ((\t78 . (t75 t30)) t64))))))))) ((
(\t76 . t53) (\t76 . (((t56 t69) (\t77 . t46)) (t38 (\t77 . (\t78 . (t6 t30)))))
)) (\t76 . ((\t77 . t64) t25)))))))) t72)) ((\t76 . (\t77 . (((\t78 . t10) t35) 
t47))) (\t76 . (((\t77 . ((\t78 . t4) (((\t78 . t47) (\t78 . t62)) t48))) (t34 (
(((t66 (\t77 . (\t78 . (((t5 (\t79 . t63)) t12) t49)))) (\t77 . t76)) (\t77 . (t
22 (t65 (\t78 . (\t79 . (\t80 . (t17 (((t60 (t49 t55)) ((\t81 . (t78 (t27 t23)))
 t17)) t75))))))))) (((\t77 . t6) t61) t56)))) (t28 (t16 (t63 ((\t77 . ((t6 (((t
68 t17) t63) ((((((\t78 . (t13 (\t79 . (\t80 . (\t81 . (\t82 . ((t39 t79) (((t30
 (\t83 . (\t84 . t79))) (((\t83 . (\t84 . (\t85 . (\t86 . t28)))) t27) ((\t83 . 
(t40 (\t84 . (\t85 . t48)))) (\t83 . t45)))) ((\t83 . (t52 (\t84 . (\t85 . t68))
)) (\t83 . t79)))))))))) ((\t78 . (t73 ((\t79 . ((t26 (\t80 . ((t9 (\t81 . (\t82
 . (\t83 . t53)))) (\t81 . (\t82 . ((\t83 . t29) (((\t83 . (\t84 . t52)) t82) t2
))))))) (\t80 . t39))) t14))) t1)) t10) t58) t32) (t57 (\t78 . (t53 (\t79 . t55)
)))))) t1)) t39))))))))))))))) t26) t59)))) ((t52 t60) t15)) t12) (\t72 . t35)))
))))))))))) (\t68 . t3))) ((t64 ((\t67 . (t44 (\t68 . (\t69 . t35)))) (\t67 . ((
t27 (\t68 . t68)) t13)))) (((t14 (\t67 . (\t68 . t57))) (\t67 . t9)) (t14 ((\t67
 . t17) ((\t67 . ((((\t68 . t64) (t10 t20)) t38) (\t68 . (\t69 . (\t70 . t67))))
) (\t67 . t54))))))) ((((((\t67 . (\t68 . t50)) t12) (t9 (\t67 . (t32 (t12 t20))
))) t65) (((\t67 . (\t68 . (t4 (((t68 (t19 t38)) t34) (\t69 . t18))))) (\t67 . t
36)) t15)) (\t67 . (\t68 . ((\t69 . (((\t70 . (t4 t7)) t21) t45)) t46)))))))))) 
t24))) (t45 t5)) t42)))) (((t24 (\t60 . (\t61 . t61))) (\t60 . (\t61 . (\t62 . (
(\t63 . (((\t64 . ((t29 t11) t53)) (\t64 . t29)) (\t64 . t0))) (\t63 . t16))))))
 t56))) (\t60 . (\t61 . (((t60 (t54 (\t62 . t56))) ((\t62 . (\t63 . t33)) (\t62 
. t31))) t11)))) (t41 t45)) ((\t60 . (\t61 . t44)) t8)) (\t60 . ((\t61 . ((\t62 
. (\t63 . t31)) t36)) (t25 (((\t61 . (\t62 . (t60 (\t63 . t18)))) (t38 (\t61 . (
(\t62 . (((\t63 . t14) (t30 t54)) ((\t63 . (((\t64 . t44) t42) t14)) (\t63 . ((t
26 t16) (t50 (t5 (\t64 . t57)))))))) (\t62 . (\t63 . t19)))))) ((t28 (\t61 . t21
)) (((t37 ((t0 (t51 (\t61 . t7))) t32)) (((((((t5 t59) t31) (((t20 (\t61 . (t37 
t8))) ((\t61 . ((\t62 . ((\t63 . (t61 ((t60 (\t64 . ((((t7 t20) ((\t65 . t17) ((
(t13 ((t47 ((\t65 . t9) (\t65 . t34))) (t47 t47))) t11) (\t65 . (\t66 . (\t67 . 
t13)))))) t5) t32))) t52))) t23)) ((t6 ((\t62 . (t36 t39)) t20)) ((((t47 t19) ((
\t62 . (\t63 . ((\t64 . (t31 ((\t65 . (\t66 . t35)) ((t30 (\t65 . (t54 t19))) t1
1)))) t43))) (\t62 . (\t63 . (\t64 . (\t65 . (t8 t1))))))) t32) t33)))) (t9 t40)
)) (((\t61 . (\t62 . t20)) (((t54 t57) t20) t50)) (t3 (\t61 . t38))))) (t7 (\t61
 . ((\t62 . (t57 (\t63 . t25))) t59)))) (\t61 . t41)) t26) ((\t61 . ((t50 ((\t62
 . t20) (\t62 . (((t11 (((\t63 . (t22 (t5 ((\t64 . t44) (t25 ((\t64 . (t60 t12))
 t47)))))) ((\t63 . t24) ((((t14 ((\t63 . (t35 ((\t64 . (\t65 . (((t17 t50) t16)
 ((t42 t42) t48)))) (\t64 . ((t47 ((((t61 t36) (\t65 . t62)) (((t44 ((\t65 . ((\
t66 . t23) ((\t66 . (\t67 . (\t68 . (\t69 . ((\t70 . (\t71 . (t35 t0))) t58)))))
 t36))) t34)) t53) t44)) ((((t27 t14) t46) (\t65 . (\t66 . t32))) (\t65 . t28)))
) ((t26 t6) (\t65 . (\t66 . (\t67 . (t32 (\t68 . (((((\t69 . t37) (\t69 . ((((\t
70 . (t17 (t65 ((t66 t59) t0)))) ((\t70 . t23) ((\t70 . (\t71 . (\t72 . (t60 (\t
73 . (\t74 . (\t75 . ((((\t76 . t23) t11) t62) (\t76 . (((((t40 t51) (\t77 . t13
)) t15) (t62 t11)) (t73 (((\t77 . ((\t78 . (\t79 . (t0 t76))) t1)) ((t46 (t76 t9
)) t29)) ((\t77 . (\t78 . (t38 t13))) t43))))))))))))) (\t70 . (\t71 . (t68 ((\t
72 . (\t73 . ((\t74 . (\t75 . ((\t76 . t16) t20))) t68))) t48))))))) (\t70 . (\t
71 . t42))) (\t70 . t28)))) ((\t69 . (t5 t67)) (\t69 . (\t70 . t65)))) (\t69 . (
t43 t21))) t34)))))))))))) t16)) (\t63 . t12)) t57) (\t63 . ((((t50 t18) (\t64 .
 t1)) t14) (t19 t39)))))) t17)) t52) ((t41 (\t63 . t54)) (\t63 . (\t64 . (\t65 .
 ((t3 t58) t18))))))))) ((t22 (((t61 t29) (t15 ((t6 t42) t15))) ((t31 (\t62 . t3
3)) t12))) (\t62 . (\t63 . (\t64 . (t42 t11))))))) (\t61 . ((\t62 . ((\t63 . (\t
64 . (\t65 . (\t66 . t5)))) t36)) (\t62 . (((\t63 . (\t64 . (\t65 . (\t66 . (t16
 t49))))) ((\t63 . ((\t64 . (\t65 . (\t66 . (t56 (\t67 . ((\t68 . t52) (\t68 . (
\t69 . t17)))))))) (\t64 . (\t65 . (\t66 . ((t60 t11) t56)))))) t37)) t26)))))))
 ((t20 (\t61 . (\t62 . (\t63 . (\t64 . t25))))) ((\t61 . (\t62 . t35)) (t45 (\t6
1 . t28))))))))))))) ((\t59 . (\t60 . ((\t61 . ((t50 t47) t37)) t36))) t20)) (\t
59 . (\t60 . (((t16 (\t61 . (((\t62 . t6) (t10 (t60 ((t50 t8) (\t62 . (\t63 . (\
t64 . t1))))))) (\t62 . t39)))) (t33 ((\t61 . t59) (\t61 . t20)))) ((((((t10 t22
) (\t61 . t49)) (\t61 . t43)) (((\t61 . t15) t51) t44)) (((t58 t40) (t47 (\t61 .
 (t31 t31)))) t2)) t44)))))) (\t59 . t42)))) ((\t58 . t4) t17)))) (t30 (\t57 . (
t57 t52))))) (\t56 . ((\t57 . t44) (\t57 . ((t18 (\t58 . (t21 ((\t59 . (((\t60 .
 t28) ((\t60 . t7) t10)) t26)) t17)))) t14))))) (\t56 . (\t57 . t50))))))) t9)))
 (((\t54 . t3) t15) (\t54 . t1))) ((((((t16 (\t54 . ((t1 (\t55 . (t22 t17))) (\t
55 . (t30 t4))))) t18) (\t54 . ((\t55 . t52) (\t55 . (t28 ((\t56 . (t38 t10)) t1
6)))))) (((\t54 . (t17 (\t55 . ((\t56 . (\t57 . t15)) ((t4 ((\t56 . (\t57 . (t57
 (t48 (\t58 . t6))))) t19)) (t25 ((\t56 . (\t57 . (\t58 . t3))) t50))))))) (\t54
 . t39)) (\t54 . ((\t55 . (\t56 . t14)) (((\t55 . (\t56 . t52)) (\t55 . t10)) ((
\t55 . t55) ((\t55 . t32) ((\t55 . t32) (t20 (\t55 . (\t56 . t6))))))))))) t2) (
\t54 . ((((t44 t43) (\t55 . t47)) ((t29 t3) ((\t55 . (\t56 . t35)) (\t55 . ((t3 
(\t56 . (t11 (((t34 t38) (t38 t29)) t7)))) t53))))) t13))))) (t48 t53)) (\t54 . 
(t33 ((\t55 . (t1 t19)) t34))))) t3))) (t48 (\t54 . t0)))) (t27 (\t54 . t11))) (
\t54 . ((((((t8 t51) (\t55 . t21)) (\t55 . t54)) ((\t55 . (((t52 t6) t14) (((((\
t56 . (((t49 (t34 t18)) (\t57 . t35)) (t50 t21))) ((\t56 . t19) (\t56 . ((((t2 (
(\t57 . (((((\t58 . t57) ((t17 (\t58 . t0)) t19)) t8) t35) (\t58 . ((\t59 . t38)
 (\t59 . t31))))) t40)) t5) t3) (t0 (\t57 . (\t58 . (\t59 . t45)))))))) t41) ((\
t56 . t18) t38)) (t15 (\t56 . (t28 (\t57 . t49))))))) t35)) (\t55 . t32)) (\t55 
. ((\t56 . t29) (\t56 . (\t57 . ((\t58 . ((\t59 . (\t60 . ((t48 t17) (t57 ((((\t
61 . t10) ((t43 t35) (\t61 . (((t34 t2) (((\t62 . (t25 t32)) ((\t62 . t21) (\t62
 . t25))) ((\t62 . (t56 t56)) (t47 t50)))) t5)))) (\t61 . t33)) ((\t61 . t58) (t
6 t25))))))) ((\t59 . (t42 (t28 t11))) ((\t59 . (\t60 . t45)) (\t59 . (((\t60 . 
(\t61 . (t50 (\t62 . (((\t63 . (t3 t13)) t45) (t30 t17)))))) t15) (\t60 . (t53 t
10)))))))) t12)))))))))))) (\t54 . ((((\t55 . t39) (((t18 ((\t55 . ((\t56 . (t55
 ((\t57 . (((t5 ((t4 t51) t16)) (\t58 . (((t0 (\t59 . t19)) t14) ((\t59 . (t43 (
t1 t30))) t18)))) ((\t58 . t31) (\t58 . (\t59 . t58))))) (t51 (t13 t36))))) ((\t
56 . (\t57 . (\t58 . t25))) t13))) ((\t55 . (t35 (t14 ((\t56 . t25) t6)))) (\t55
 . t37)))) (t12 (\t55 . (\t56 . t23)))) ((t40 t40) (\t55 . (\t56 . (((\t57 . t25
) (\t57 . t2)) t20)))))) ((\t55 . (((t13 (t26 (((\t56 . (\t57 . t31)) t32) (\t56
 . t50)))) t52) (\t56 . (\t57 . (\t58 . (t45 (t9 ((\t59 . t39) ((\t59 . (t17 (\t
60 . t8))) (\t59 . (t13 (\t60 . (\t61 . ((\t62 . (((\t63 . (t61 t32)) t6) t55)) 
((((\t62 . (\t63 . t7)) ((t8 t12) (\t62 . t0))) (\t62 . ((\t63 . (\t64 . t40)) (
\t63 . (\t64 . t64))))) (\t62 . t22)))))))))))))))) t46)) t39))))) ((\t54 . t38)
 t49)))))))) (\t52 . (\t53 . t3))) (\t52 . (((\t53 . t26) (t43 t30)) (\t53 . (((
(t22 t31) t21) ((\t54 . (\t55 . t17)) (\t54 . (t1 (t0 (\t55 . t19)))))) (\t54 . 
((\t55 . t39) t0)))))))))) (((\t51 . t7) ((((\t51 . t32) t6) (\t51 . t40)) t40))
 t14))))))) ((\t49 . (t37 ((t26 (((\t50 . ((\t51 . (t5 (\t52 . (\t53 . t39)))) t
34)) t41) (((t45 (\t50 . t18)) t2) t40))) t12))) (\t49 . t33))) t33)) (\t49 . (t
13 (\t50 . (\t51 . t10))))))))) t1) t27)) (t18 (t4 (t19 (\t47 . (t20 ((\t48 . (t
42 t40)) t32)))))))) t35)) (t30 ((\t46 . t1) (\t46 . ((t27 (t43 (\t47 . t45))) t
9))))) t33) ((\t46 . (t38 t28)) t6))))) ((\t44 . t40) (\t44 . (\t45 . ((((\t46 .
 (t21 ((\t47 . (\t48 . t16)) t0))) (((((t39 t17) (t16 (t33 ((\t46 . t33) ((\t46 
. (\t47 . t5)) (\t46 . (t21 (t1 ((\t47 . (t30 (\t48 . ((((\t49 . (t47 ((t11 (\t5
0 . (t14 t38))) ((t43 t7) (\t50 . t26))))) (\t49 . (\t50 . (\t51 . ((t3 (\t52 . 
t50)) t45))))) (\t49 . (t13 t47))) (\t49 . (t13 t44)))))) ((t42 (\t47 . t8)) ((\
t47 . t30) (\t47 . t39)))))))))))) t30) (\t46 . t16)) (\t46 . (((\t47 . t29) ((\
t47 . (t30 (\t48 . t32))) (\t47 . t3))) t41)))) (t14 (t24 ((t37 (\t46 . ((t11 ((
\t47 . (t2 t22)) ((\t47 . t4) ((t7 t35) (\t47 . t39))))) ((\t47 . t1) (\t47 . t4
6))))) (t15 (\t46 . (\t47 . (((\t48 . (t20 ((\t49 . (t2 t20)) (((\t49 . (t21 t36
)) (\t49 . ((\t50 . t47) t6))) t32)))) (\t48 . (\t49 . t33))) (\t48 . (t31 t3)))
))))))) t22))))))) t11) t4)) ((\t43 . t43) ((\t43 . ((t31 ((t13 t12) t26)) ((\t4
4 . t42) (t3 ((((\t44 . ((\t45 . (\t46 . (\t47 . (t1 t22)))) (\t45 . (\t46 . t17
)))) (\t44 . (((\t45 . (\t46 . (\t47 . t23))) (\t45 . (((((t5 (t18 (t1 t14))) (\
t46 . ((\t47 . (\t48 . ((\t49 . (\t50 . (\t51 . (\t52 . ((\t53 . ((\t54 . t43) (
((((\t54 . ((\t55 . (\t56 . t41)) (\t55 . (\t56 . ((\t57 . t25) (t31 (((\t57 . (
((\t58 . (\t59 . t15)) (\t58 . (\t59 . (t52 t20)))) t36)) ((\t57 . t47) t23)) (t
33 ((\t57 . t13) t43))))))))) (\t54 . ((t40 (\t55 . (((t38 t31) (\t56 . (\t57 . 
(((\t58 . (((\t59 . t12) ((t17 ((\t59 . (\t60 . ((t55 t18) (\t61 . (\t62 . (\t63
 . t21)))))) (t19 t12))) (t18 t0))) (\t59 . t59))) (\t58 . (\t59 . (\t60 . (((\t
61 . ((\t62 . (\t63 . (t26 (t12 t11)))) t21)) ((\t61 . t11) t25)) t50))))) (\t58
 . ((\t59 . (((((\t60 . t53) t28) (t34 (\t60 . (\t61 . (t10 (\t62 . (t24 (\t63 .
 (((\t64 . t5) (t23 t4)) t9))))))))) (((t13 (\t60 . (((\t61 . (\t62 . ((t11 (\t6
3 . t4)) t62))) t43) (((\t61 . (t27 t40)) ((\t61 . t46) t19)) t23)))) t2) ((t6 t
22) ((((\t60 . ((((\t61 . (\t62 . t11)) (t28 (((t2 (\t61 . t59)) (\t61 . (((\t62
 . ((\t63 . t61) t14)) t42) (\t62 . t0)))) (\t61 . ((t25 (t42 t23)) t50))))) (\t
61 . t24)) (\t61 . ((\t62 . t19) ((\t62 . t35) t61))))) (((\t60 . ((\t61 . t49) 
((\t61 . t21) (\t61 . (\t62 . (\t63 . (t14 (\t64 . t37)))))))) t24) (((\t60 . ((
(t42 t22) (\t61 . t58)) (t5 (((t57 t52) (t31 (\t61 . (\t62 . t29)))) (\t61 . (\t
62 . ((t11 (\t63 . t40)) (t31 (t9 ((\t63 . (((((t50 ((((\t64 . (\t65 . (\t66 . t
37))) (\t64 . (\t65 . t26))) ((\t64 . (t60 (((\t65 . (\t66 . (\t67 . t27))) t48)
 ((((t37 ((\t65 . ((\t66 . t40) (\t66 . (((((((\t67 . (t4 (((\t68 . ((t4 t10) t1
)) t27) (\t68 . ((\t69 . (\t70 . t16)) (\t69 . (\t70 . t29))))))) ((t34 t20) (t3
7 (\t67 . (t20 t45))))) ((((t48 t15) t27) (t3 t28)) t15)) (t13 t11)) t46) ((t39 
t4) t26)) (\t67 . ((\t68 . t58) ((\t68 . (t44 t19)) t31))))))) ((((\t65 . (t50 (
t5 t51))) (\t65 . (\t66 . ((((t1 (\t67 . (((\t68 . (\t69 . ((\t70 . t23) t25))) 
(\t68 . t63)) (\t68 . (\t69 . (\t70 . ((((((\t71 . (\t72 . (\t73 . ((\t74 . ((t1
9 ((t50 ((\t75 . t0) (\t75 . (\t76 . t74)))) t51)) (\t75 . t71))) (\t74 . t12)))
)) t33) t56) ((\t71 . (((t55 (\t72 . t67)) (\t72 . t18)) (\t72 . t53))) t62)) (\
t71 . t36)) ((\t71 . t57) t19)))))))) (\t67 . (t57 ((t39 (\t68 . (\t69 . ((t38 (
(t53 (t39 (\t70 . ((\t71 . t51) ((t2 ((\t71 . (\t72 . (\t73 . (\t74 . (t30 t30))
))) t62)) (\t71 . ((\t72 . t11) t34))))))) (((\t70 . ((t39 t70) (\t71 . (((\t72 
. t53) t5) (t19 t0))))) (t10 (\t70 . (t6 t67)))) (t19 t7)))) (\t70 . ((\t71 . ((
(\t72 . (\t73 . t43)) t59) ((\t72 . t27) (\t72 . ((\t73 . (\t74 . (t30 (((\t75 .
 (((((\t76 . ((\t77 . t67) ((t60 t15) (\t77 . t51)))) (\t76 . (\t77 . (\t78 . t6
1)))) (\t76 . (\t77 . t52))) (t31 t1)) t21)) t8) (\t75 . ((((t32 t50) (\t76 . (t
66 (t46 (t63 (t15 t11)))))) (\t76 . ((\t77 . t59) ((t27 t36) t64)))) t25)))))) (
\t73 . (t38 t72))))))) t10)))))) t9)))) t47) (t56 t1))))) (((((t49 ((t13 t29) t2
9)) ((\t65 . t46) t16)) t10) (t56 (\t65 . (\t66 . t37)))) t10)) t35))) ((((\t65 
. (\t66 . t24)) (\t65 . t2)) t61) t28)) (\t65 . ((\t66 . ((((\t67 . (\t68 . t45)
) t45) (\t67 . (\t68 . t65))) (\t67 . ((\t68 . t30) (\t68 . ((\t69 . (t43 (\t70 
. ((\t71 . ((\t72 . (\t73 . t48)) t16)) (t69 t35))))) t16)))))) (((((\t66 . t56)
 t60) (t36 (t14 (((\t66 . t11) t33) ((t9 (\t66 . t5)) (\t66 . t62)))))) t53) (\t
66 . ((\t67 . ((\t68 . ((\t69 . ((t24 (\t70 . t29)) (t44 t65))) (\t69 . (((t22 (
\t70 . (t36 (\t71 . (((t41 (((\t72 . t35) ((t45 (\t72 . t28)) ((\t72 . t45) (\t7
2 . t13)))) t47)) (\t72 . (\t73 . t28))) ((t33 (t51 t55)) (\t72 . t6))))))) (\t7
0 . (((t12 (\t71 . ((\t72 . ((((t13 t39) ((\t73 . (\t74 . t7)) ((\t73 . t73) (t1
9 (t9 t22))))) t56) (\t73 . t16))) (t41 (t17 ((\t72 . ((\t73 . (((t41 (\t74 . ((
t42 (\t75 . (\t76 . t39))) (t63 (\t75 . t35))))) t59) (\t74 . (t10 t25)))) ((t29
 (\t73 . (\t74 . (\t75 . (\t76 . ((\t77 . t57) (\t77 . (\t78 . (\t79 . t32))))))
))) (t57 (\t73 . (\t74 . (((\t75 . t66) (\t75 . ((\t76 . t32) ((((t25 t50) (\t76
 . t52)) t60) ((\t76 . t15) (\t76 . (((t2 (\t77 . t49)) t18) t21))))))) (\t75 . 
(((t43 t48) (t51 (\t76 . t60))) (\t76 . t66)))))))))) (t30 t8))))))) (t65 t23)) 
(t6 t13)))) (\t70 . t8))))) ((\t68 . t51) (t17 (\t68 . (\t69 . (t19 t4))))))) ((
\t67 . ((t16 (t0 (\t68 . t39))) t6)) (\t67 . t37)))))))) ((t23 t2) (t21 (t50 t9)
)))))) t9)) (\t64 . (\t65 . (\t66 . t39))))) (\t64 . (\t65 . t2))) ((\t64 . (\t6
5 . ((\t66 . ((t40 (t18 t39)) (\t67 . (\t68 . (\t69 . (t68 ((t54 t55) (\t70 . (\
t71 . (\t72 . ((\t73 . t73) (\t73 . ((\t74 . t52) t67))))))))))))) ((t64 (\t66 .
 (t9 (\t67 . t15)))) (t9 t65))))) t15)) t5) t6)) t3)))))))))) (\t60 . t37)) t53)
)) t59) t11)))) t37)) ((t44 t25) t55))))))) (\t56 . t14)))) ((\t55 . ((t52 (t51 
(\t56 . (\t57 . t48)))) (((\t56 . (t48 (((t42 (\t57 . t34)) ((t51 t8) ((t20 t24)
 t8))) ((t31 t14) t54)))) t44) t48))) (\t55 . t29))))) (t31 (\t54 . (t21 (((((\t
55 . ((t4 (((((\t56 . (((((\t57 . (\t58 . (\t59 . t49))) (\t57 . t5)) t51) t3) (
(\t57 . t47) (\t57 . ((\t58 . t8) ((((t44 t29) (t44 ((t16 (((((\t58 . (\t59 . (\
t60 . ((t7 t39) (t38 t15))))) (\t58 . t46)) (((\t58 . t52) ((\t58 . t39) ((t42 (
\t58 . (t51 (\t59 . (t12 t46))))) t53))) t52)) (\t58 . t29)) t39)) t13))) (\t58 
. t31)) t30)))))) (t6 (\t56 . (\t57 . t35)))) ((((\t56 . (t50 t31)) t50) t4) (((
t46 t9) (\t56 . (t44 t26))) t44))) ((\t56 . t55) (\t56 . (((t36 (\t57 . t11)) (\
t57 . (\t58 . t20))) (\t57 . t55))))) (\t56 . t27))) (\t56 . ((((((\t57 . (\t58 
. (((t12 t55) ((t32 (\t59 . ((\t60 . (t12 ((t8 (\t61 . t33)) (((\t61 . (t11 t2))
 t60) t38)))) t8))) ((\t59 . (t32 t35)) (\t59 . t39)))) t56))) (t53 t14)) (t21 (
(((t49 (\t57 . (\t58 . (\t59 . ((\t60 . (t43 (t15 (\t61 . (t30 t34))))) (\t60 . 
t59)))))) (t56 ((t18 (t56 (\t57 . t0))) t30))) (\t57 . (\t58 . (\t59 . (\t60 . (
(\t61 . t57) (\t61 . (((\t62 . t35) t39) ((\t62 . t25) ((t2 (\t62 . (((\t63 . ((
((\t64 . t47) ((t1 (\t64 . (t37 t5))) ((\t64 . (\t65 . (t45 (t35 (\t66 . (\t67 .
 ((\t68 . (t18 (t32 (t50 (\t69 . (t47 (\t70 . (\t71 . t58)))))))) (\t68 . (\t69 
. (\t70 . (\t71 . (\t72 . (\t73 . ((\t74 . (t40 ((t62 (\t75 . (\t76 . ((((t45 (\
t77 . (\t78 . ((t35 (t19 t34)) t8)))) t23) (t45 (((\t77 . t13) (t23 t24)) t32)))
 (\t77 . ((\t78 . t8) (\t78 . ((((\t79 . ((t56 t47) t77)) t35) (\t79 . (t55 (\t8
0 . (((\t81 . ((t16 (t58 t71)) t30)) t64) t33))))) ((((\t79 . (t21 t22)) (t63 ((
(\t79 . (t55 t56)) t13) (\t79 . t58)))) (t29 t18)) (\t79 . ((\t80 . (t18 ((\t81 
. (((t11 t18) t38) (\t82 . (t65 (t37 (\t83 . (\t84 . t79))))))) (t4 (t76 (\t81 .
 (t21 t45))))))) t6))))))))))) ((t51 (\t75 . (\t76 . ((((t57 t62) ((t65 (\t77 . 
(\t78 . (\t79 . t19)))) (\t77 . t43))) (((\t77 . t27) ((\t77 . (t14 (\t78 . t42)
)) (\t77 . ((t37 (\t78 . ((\t79 . t67) (t41 t11)))) t17)))) t62)) (\t77 . (\t78 
. (t20 (\t79 . t45)))))))) t63)))) (\t74 . t67))))))))))))))) (\t64 . (t49 (t34 
(t4 t10))))))) t9) t10)) t48) (\t63 . t53)))) (\t62 . (((\t63 . (\t64 . (\t65 . 
(t15 t40)))) t34) (\t63 . (t58 t55)))))))))))))) t48))) ((\t57 . ((((t39 ((\t58 
. (\t59 . t33)) ((t34 (\t58 . t58)) t6))) ((t3 ((t16 ((t16 t54) t16)) t0)) (((\t
58 . (\t59 . (t20 (\t60 . (((t35 ((\t61 . t41) (t52 t22))) (\t61 . (t53 ((\t62 .
 (\t63 . (t62 (t7 (\t64 . t53))))) (\t62 . t10))))) t46))))) ((\t58 . ((\t59 . t
44) (t27 t15))) t48)) t42))) ((((\t58 . (t8 (t16 t26))) t49) (t16 t52)) t36)) t3
3)) t36)) t55) (\t57 . ((\t58 . t22) ((\t58 . ((\t59 . t25) (\t59 . (\t60 . (t37
 (\t61 . (\t62 . (t11 (t38 (((\t63 . (\t64 . t16)) t33) t45)))))))))) ((\t58 . (
(\t59 . (\t60 . t60)) t20)) t1)))))))) (\t55 . ((t41 t31) (((t35 t20) (t28 ((\t5
6 . t35) t36))) (t30 t31))))) (\t55 . t50)) (\t55 . (t1 (\t56 . (\t57 . (\t58 . 
(t18 t51))))))) t20))))) ((\t54 . t32) t12)) (\t54 . t49)))) t51))))) t29))) (t1
7 t42)))) t10) (\t46 . (t19 (t18 (\t47 . (t42 (\t48 . (\t49 . ((\t50 . (\t51 . t
33)) t7))))))))) t2))) (\t45 . (\t46 . (((((\t47 . t4) ((((\t47 . (\t48 . ((((t4
3 ((\t49 . (((\t50 . ((((t15 ((\t51 . t20) t16)) (\t51 . (t0 t45))) ((t33 (((t8 
(\t51 . t6)) t21) t33)) t16)) (t25 (t38 (\t51 . t18))))) ((\t50 . t22) ((\t50 . 
t28) t45))) t18)) t47)) (t14 (\t49 . (\t50 . t24)))) (\t49 . (\t50 . t39))) ((((
t15 (\t49 . t11)) t31) t39) t40)))) t5) (\t47 . (\t48 . t40))) ((\t47 . (\t48 . 
(((((\t49 . (t8 t15)) (\t49 . ((\t50 . ((((\t51 . t5) t6) (((\t51 . (\t52 . t4))
 (\t51 . t11)) (t21 (\t51 . (\t52 . t18))))) ((\t51 . ((\t52 . (\t53 . (\t54 . t
30))) t1)) t28))) (\t50 . (t4 (((t38 ((\t51 . t42) t35)) (\t51 . (\t52 . (t48 (\
t53 . (\t54 . (t35 ((t6 t6) t53)))))))) (((t38 ((t28 (((\t51 . (t45 t36)) t35) t
45)) (t22 (((t41 t44) (\t51 . t41)) (\t51 . ((\t52 . (t46 t5)) t40)))))) (\t51 .
 ((\t52 . ((t7 t37) (\t53 . (((t20 t49) (((\t54 . ((\t55 . t2) t29)) (((t24 (t31
 (\t54 . t34))) ((t52 (\t54 . (t23 t25))) (\t54 . t36))) ((\t54 . (\t55 . t48)) 
((\t54 . t28) (\t54 . (\t55 . (t26 t51))))))) (\t54 . ((t35 t22) (\t55 . t8)))))
 t3)))) (\t52 . t26)))) t13))))))) ((((\t49 . t41) t41) ((t35 t2) t46)) t36)) (\
t49 . (t28 t24))) t0))) t19))) (\t47 . t43)) (\t47 . t3)) ((t35 t20) (\t47 . (\t
48 . ((((t21 (t41 t33)) t19) ((\t49 . t2) (t47 (\t49 . t49)))) (\t49 . t47))))))
))))) t16) t25))))) (\t43 . t40))))))))) (t21 t20)))) t36) (\t40 . t6))) (t2 t14
)))) t20) t7) (t15 t3)) (\t37 . (((t6 t26) t12) ((t28 (\t38 . (t38 (\t39 . ((\t4
0 . ((\t41 . ((t39 ((\t42 . (\t43 . (\t44 . (\t45 . t38)))) t1)) t13)) (\t41 . (
(\t42 . ((\t43 . (t17 (\t44 . (((((\t45 . ((\t46 . t44) t40)) ((\t45 . t44) (\t4
5 . t32))) t15) t0) ((\t45 . t38) t32))))) (t22 (\t43 . ((t21 (\t44 . (\t45 . (t
27 (\t46 . t28))))) (\t44 . (\t45 . t43))))))) ((\t42 . (t42 t13)) (\t42 . ((t20
 (((t10 (\t43 . ((\t44 . (t4 (t11 ((t26 (t9 t11)) (\t45 . (\t46 . (((t43 ((t0 (\
t47 . t39)) (t2 (t25 (\t47 . ((t7 ((\t48 . (\t49 . t41)) t16)) t31)))))) t3) ((t
22 t14) (((\t47 . (\t48 . (\t49 . ((\t50 . (t10 (((\t51 . (t21 ((\t52 . (t21 (t3
1 t42))) ((\t52 . (t26 t5)) t38)))) (((\t51 . ((\t52 . t38) (((t37 (t47 (t13 ((\
t52 . (\t53 . t6)) (\t52 . (\t53 . (\t54 . (t43 ((((\t55 . (\t56 . (t29 (t15 (\t
57 . t0))))) ((t2 (t1 (t38 (\t55 . (\t56 . (\t57 . (\t58 . ((\t59 . ((\t60 . (\t
61 . t55)) ((\t60 . (t59 (((\t61 . t12) t25) ((\t61 . t43) ((t0 (\t61 . (t15 t17
))) t54))))) (\t60 . t28)))) ((\t59 . ((t41 (((t38 t39) t58) t18)) (((((\t60 . (
t57 (t10 (t17 (\t61 . (\t62 . t10)))))) (\t60 . t54)) t26) t57) ((((t19 (\t60 . 
t58)) (\t60 . (((\t61 . t31) (t43 ((((t28 t55) (((\t61 . ((t46 (\t62 . (\t63 . (
(\t64 . (\t65 . t10)) t52)))) (\t62 . t20))) t57) (\t61 . (\t62 . ((\t63 . (((\t
64 . t49) t17) ((t8 (((\t64 . (t34 t41)) (\t64 . (\t65 . (\t66 . (\t67 . (\t68 .
 (((\t69 . (\t70 . (t27 (\t71 . ((t55 (t16 t32)) t48))))) t25) t65))))))) (((\t6
4 . (((\t65 . (\t66 . ((\t67 . (\t68 . (\t69 . (\t70 . (t10 t47))))) (\t67 . t66
)))) t36) (t18 (\t65 . (\t66 . t54))))) t37) t17))) t50))) (((\t63 . (t14 (\t64 
. t59))) t32) (\t63 . (\t64 . (\t65 . t9))))))))) t39) (((\t61 . (t17 ((t3 (t35 
t20)) (\t62 . (\t63 . t62))))) t18) t30)))) (\t61 . ((t45 t1) ((t19 ((\t62 . (((
(t58 t52) ((\t63 . (\t64 . ((((\t65 . (t49 t33)) (\t65 . (t48 (((\t66 . (\t67 . 
t7)) (t31 t60)) ((\t66 . ((\t67 . (t20 t64)) (t25 t34))) (t24 (\t66 . ((\t67 . t
56) (\t67 . t53))))))))) ((((\t65 . (\t66 . (\t67 . t40))) (t56 (\t65 . (((((t19
 t49) ((\t66 . (((((t36 (\t67 . (t55 (\t68 . (\t69 . ((t29 ((\t70 . t62) t27)) (
\t70 . ((t24 t41) (\t71 . t30))))))))) ((t13 t10) (\t67 . ((\t68 . (\t69 . (\t70
 . t26))) (t19 (((t15 ((\t68 . (t39 (t0 t68))) (t11 ((\t68 . ((t18 t66) (\t69 . 
t35))) t29)))) t35) t67)))))) t38) ((\t67 . t38) t36)) ((\t67 . (\t68 . (\t69 . 
(\t70 . t46)))) t65))) (\t66 . t46))) t43) ((\t66 . t53) (\t66 . (\t67 . t60))))
 t44)))) (((\t65 . (\t66 . (\t67 . t61))) t8) ((t36 t50) (((\t65 . t32) ((\t65 .
 t10) t9)) t6)))) (\t65 . (((\t66 . t47) ((\t66 . (\t67 . (\t68 . (t11 ((((\t69 
. (\t70 . (t22 (\t71 . (\t72 . t9))))) ((\t69 . t30) t27)) t55) ((t59 t68) (\t69
 . (\t70 . ((\t71 . (\t72 . (\t73 . (\t74 . t68)))) (\t71 . t2)))))))))) t23)) (
((\t66 . t63) (\t66 . ((\t67 . (\t68 . (t65 (t30 t13)))) (\t67 . t31)))) (t15 (t
50 t19))))))) (t62 (\t65 . ((\t66 . t7) t44)))))) t26)) ((\t63 . (t15 (((\t64 . 
t2) (\t64 . (((\t65 . t5) t36) t56))) (((t53 t58) (\t64 . (((t51 (t46 t14)) ((\t
65 . ((((t59 (\t66 . (t60 (((t22 t18) (\t67 . t15)) (t1 t28))))) ((\t66 . t7) ((
(t38 t52) (t16 (t57 t38))) ((\t66 . t35) t6)))) (t22 (((\t66 . (\t67 . t28)) t23
) (\t66 . (\t67 . (t27 (\t68 . (\t69 . t26)))))))) (((\t66 . ((t55 t10) ((\t67 .
 (\t68 . t4)) t60))) ((\t66 . (\t67 . (\t68 . ((((\t69 . (\t70 . (\t71 . (\t72 .
 t40)))) t52) (\t69 . (((t61 (t3 ((t57 ((\t70 . (\t71 . t57)) ((\t70 . (\t71 . (
(t64 t52) t69))) (\t70 . (\t71 . t69))))) t9))) (\t70 . t31)) ((\t70 . t9) (\t70
 . (t2 (t35 (t69 (\t71 . (\t72 . (\t73 . ((\t74 . ((\t75 . t28) (t27 ((\t75 . (\
t76 . (\t77 . (t53 t60)))) t25)))) (\t74 . ((\t75 . t15) (\t75 . (\t76 . (((\t77
 . ((\t78 . t48) (((t25 (t34 (((t72 t47) (\t78 . t9)) (((t71 ((\t78 . (t17 t22))
 t46)) (t59 (\t78 . (((t7 ((((t69 (\t79 . t64)) (\t79 . ((\t80 . (t37 (\t81 . t4
5))) (\t80 . (\t81 . t14))))) t10) (\t79 . (\t80 . t22)))) t72) t44)))) t7)))) t
4) (t13 ((t18 (t8 t24)) ((\t78 . (((((((\t79 . t72) (\t79 . t40)) (t3 (t17 ((t28
 ((t77 (\t79 . (t73 t10))) (t58 (((\t79 . ((\t80 . t34) ((((((t32 t45) t19) t37)
 t78) t69) t72))) (\t79 . t64)) t68)))) t70)))) t13) ((t60 t41) (\t79 . t35))) (
t35 (\t79 . (t70 ((\t80 . ((\t81 . (t54 t35)) t11)) ((t75 ((\t80 . (t25 (t4 (((\
t81 . ((\t82 . t32) (((t22 t66) (((t16 ((\t82 . (((t6 t61) t18) t63)) (\t82 . (t
28 (\t83 . t80))))) t60) t35)) t1))) t3) (t7 t17))))) (((\t80 . t38) ((\t80 . (t
79 t15)) ((\t80 . ((\t81 . (\t82 . t72)) (t26 (\t81 . (\t82 . (((t4 t8) (t63 ((\
t83 . t19) (\t83 . (t7 (\t84 . (t58 (\t85 . (t13 t8))))))))) (\t83 . (\t84 . t61
)))))))) (\t80 . (\t81 . t35))))) (t0 t39)))) t7)))))) t32)) (((t2 ((\t78 . (\t7
9 . t77)) t22)) (t40 ((\t78 . t10) t39))) (((\t78 . t8) t35) (\t78 . (t77 t51)))
))))))) ((\t77 . t59) t71)) t54))))))))))))))))) (\t69 . (\t70 . (((t63 t6) (((t
59 (\t71 . (\t72 . (t60 (t69 (\t73 . ((\t74 . t27) t16))))))) t12) (\t71 . t17))
) t9))))))) t56)) t29))) t61)) (\t65 . (t45 ((\t66 . (\t67 . (\t68 . t7))) t20))
)))) (t42 t3))))) t27)) (\t63 . t52))) (\t62 . (t38 (((t48 t6) t20) ((((\t63 . t
43) t32) (((t46 (t61 t54)) (\t63 . (\t64 . t29))) t43)) t48)))))) (t40 ((\t62 . 
t52) ((t20 (\t62 . ((t60 t5) (t43 t1)))) t17))))))))) (((\t60 . t45) (\t60 . t45
)) t25)) (\t60 . t3))))) (\t59 . (\t60 . (\t61 . (((\t62 . t16) t20) t31))))))))
))))) (\t55 . (((\t56 . t43) t25) (\t56 . (((\t57 . t30) t19) ((t20 (t26 (t10 ((
\t57 . t50) (\t57 . (t14 (t54 ((t57 (\t58 . t54)) t10)))))))) t53))))))) t31) t4
4))))))))) ((\t52 . t16) ((((\t52 . (t43 (\t53 . (t41 (\t54 . ((\t55 . t20) (\t5
5 . t42))))))) (\t52 . t28)) t31) ((t27 (t40 t44)) (\t52 . (t39 (((t21 (\t53 . t
32)) t4) t52))))))) (\t52 . (\t53 . (\t54 . ((((t13 (((t6 t9) (t36 (t20 t21))) (
(((\t55 . (\t56 . ((\t57 . ((((\t58 . ((\t59 . (\t60 . (t46 (t25 (\t61 . (t46 ((
\t62 . (\t63 . ((\t64 . (\t65 . t60)) (\t64 . (t53 (\t65 . t6)))))) t16))))))) (
t7 t12))) (t8 t24)) ((\t58 . (((\t59 . t10) (\t59 . t42)) t6)) ((t10 (\t58 . (t3
9 t4))) t54))) t8)) t27))) t30) (\t55 . (\t56 . (\t57 . t5)))) t26))) ((\t55 . (
(t53 (((t46 t29) (t33 t40)) t35)) (t25 t19))) (\t55 . (\t56 . (\t57 . (t27 ((\t5
8 . t19) (\t58 . ((\t59 . ((t0 ((\t60 . (t31 t31)) (\t60 . t45))) t59)) (t17 (\t
59 . (\t60 . (\t61 . t1))))))))))))) (((t26 (\t55 . t17)) (\t55 . t24)) t40)) t5
0))))))) (\t51 . t10)) ((((\t51 . (\t52 . ((((\t53 . t36) t26) ((\t53 . (t35 ((\
t54 . (t32 t47)) t28))) (\t53 . (\t54 . t37)))) t9))) (t8 ((t15 (\t51 . t20)) t1
0))) ((\t51 . (\t52 . (\t53 . (\t54 . ((\t55 . (\t56 . (((\t57 . (\t58 . t21)) (
t55 t26)) (\t57 . t32)))) (((\t55 . t44) (\t55 . t25)) (((((t7 t54) ((((((\t55 .
 t47) ((\t55 . (((\t56 . (t14 t19)) ((t26 (\t56 . (t12 (t45 ((\t57 . t31) (\t57 
. (((((\t58 . t9) (\t58 . t40)) ((\t58 . t50) (t14 t42))) t4) (t56 t3)))))))) ((
\t56 . (t16 t41)) t25))) t27)) t29)) (t37 (t41 t49))) t27) ((\t55 . (t29 t7)) (\
t55 . ((\t56 . (\t57 . (\t58 . (\t59 . (t19 ((\t60 . (t28 ((t23 t15) ((\t61 . ((
\t62 . t31) t17)) t34)))) t59)))))) (t21 t53))))) (((\t55 . t44) (\t55 . t13)) (
t42 t21)))) (\t55 . t3)) (\t55 . (\t56 . t17))) t21))))))) (\t51 . (\t52 . (t35 
(\t53 . (t4 (\t54 . (\t55 . t5))))))))) (t24 (((\t51 . (((\t52 . t14) t26) ((\t5
2 . t18) t6))) t29) t34))))) ((t0 (t23 (\t51 . (\t52 . t4)))) (\t51 . (t11 t38))
)))) ((t8 (((\t50 . (\t51 . t10)) ((\t50 . (t9 t11)) (\t50 . (((\t51 . t8) (\t51
 . (t33 ((\t52 . (\t53 . (t9 t34))) (t43 (\t52 . ((\t53 . t27) ((\t53 . ((\t54 .
 (t41 t6)) t48)) (t50 (\t53 . (t48 ((\t54 . t5) t18)))))))))))) (t23 t49))))) t2
8)) ((\t50 . t17) (\t50 . (\t51 . t16)))))))) (t41 (\t47 . t19))) (((t34 (t44 (\
t47 . (\t48 . (\t49 . t18))))) (t32 t31)) (t17 t28))))))))))) (\t44 . t4)))) (((
(((t29 (\t43 . t39)) (\t43 . (\t44 . ((\t45 . ((\t46 . t1) (t7 (t13 (t38 t23))))
) t42)))) t25) (\t43 . t2)) t26) (\t43 . t40))) t36)) t34))))))) ((\t40 . t13) (
(t14 (\t40 . (\t41 . t34))) t6))))))) (t37 t5)))))) t28) t4))) t25))) (t29 ((\t3
6 . (t27 (\t37 . (\t38 . (t4 t13))))) (t35 ((\t36 . ((\t37 . (((\t38 . t8) t4) (
((\t38 . (\t39 . (\t40 . t24))) t28) (\t38 . t17)))) (\t37 . t8))) (\t36 . ((\t3
7 . (t14 (t36 (t2 ((\t38 . (\t39 . ((\t40 . t0) (t27 t38)))) (t2 t21)))))) (\t37
 . (\t38 . (\t39 . (\t40 . ((t17 (\t41 . (\t42 . ((\t43 . t29) t3)))) (t39 (((t1
6 t25) t34) t14)))))))))))))) t14))) t28) t23))))))))))) ((\t34 . ((\t35 . t35) 
t18)) (((t27 (t15 ((\t34 . t26) (\t34 . ((t14 t24) t32))))) t8) t11))))) (\t33 .
 t18))))) t12))))) t0)) (t18 ((\t30 . (\t31 . (t19 (((t21 (t27 t30)) t5) (\t32 .
 (\t33 . (\t34 . (\t35 . t11)))))))) (\t30 . (t2 t0))))) ((t12 t24) (\t30 . (\t3
1 . t1))))))) t21) (\t30 . (((\t31 . t5) ((\t31 . (((\t32 . (((t5 t13) t28) (t12
 ((t2 (\t33 . (\t34 . t5))) t13)))) (\t32 . (t14 t4))) (((((\t32 . (\t33 . (\t34
 . t6))) t23) t13) (\t32 . t30)) t6))) ((t19 (t5 t8)) (t29 ((\t31 . t7) t14)))))
 (t8 t1)))))) t1)))))) t20)) t24) (\t26 . (t15 t23))) ((\t26 . t1) (\t26 . ((((\
t27 . (\t28 . ((t1 (t15 t24)) (t6 ((t2 ((t0 (\t29 . (((t26 (\t30 . (\t31 . ((\t3
2 . (t24 t10)) (\t32 . t26))))) ((t29 t18) t18)) (\t30 . (\t31 . t14))))) (\t29 
. (\t30 . (\t31 . (t8 t12)))))) t3))))) t22) (t19 (\t27 . t11))) t14)))))) t24))
 t3) t15)) ((t13 t16) ((t21 t5) t0))) (t23 t7)) t24))) (((\t24 . t3) ((t18 (((((
(t11 ((((\t24 . t23) (((\t24 . t3) (\t24 . t16)) t0)) (\t24 . t1)) (\t24 . t11))
) t4) (\t24 . t15)) (\t24 . t18)) ((t1 (\t24 . (\t25 . t5))) t19)) (\t24 . t0)))
 t10)) t7))) (t13 t18)) t17))) (\t23 . (t4 t14))) (t13 t14))) t5) ((t19 (\t23 . 
(\t24 . (\t25 . t24)))) (((\t23 . (t0 t22)) t5) (\t23 . t9))))))) ((\t22 . (\t23
 . t0)) t6)))) t3) (t20 t19))))))) (((\t18 . t4) (t14 ((\t18 . t5) t17))) t16)))
 (\t18 . t13)) t5) ((\t18 . (\t19 . ((t12 t13) ((\t20 . t9) (\t20 . t9))))) ((((
(((((t16 (t16 (t14 (t12 t13)))) t9) (t13 t7)) (t4 t6)) ((\t18 . ((((\t19 . ((\t2
0 . t3) (\t20 . (\t21 . t9)))) ((t15 (t6 (t11 (t11 (t4 ((t11 (\t19 . t14)) t10))
)))) (\t19 . (\t20 . (\t21 . (\t22 . t10)))))) ((t14 (\t19 . (\t20 . (\t21 . t8)
))) (t4 (((\t19 . (\t20 . (\t21 . t5))) (\t19 . t17)) (t9 (t9 t14)))))) ((\t19 .
 t11) t17))) t10)) (\t18 . t15)) (((\t18 . ((\t19 . (t2 t7)) t18)) t17) (t17 t5)
)) t13) t16)))))) (((\t15 . t2) (t5 ((t6 (\t15 . t10)) ((((\t15 . ((\t16 . (t9 (
t13 t10))) ((\t16 . t14) ((\t16 . (\t17 . t3)) ((t3 t12) (\t16 . (t15 (t8 t5))))
)))) ((\t15 . (\t16 . (\t17 . t8))) t14)) (\t15 . (\t16 . (\t17 . ((t4 (\t18 . t
4)) t12))))) (t6 t3))))) t3))) t5) ((t5 t3) ((t3 (\t14 . t2)) (((\t14 . (t2 ((\t
15 . t3) t7))) (\t14 . t3)) (\t14 . t8))))) t0) ((\t14 . (\t15 . ((\t16 . (((\t1
7 . t11) t15) (\t17 . ((((\t18 . t4) t11) (((\t18 . (t18 ((((\t19 . t19) t13) (t
2 t9)) t2))) (\t18 . t5)) t2)) (\t18 . t4))))) t3))) (\t14 . ((\t15 . (\t16 . t7
)) t3)))) (\t14 . t2))) t2) (t10 (\t13 . (((\t14 . ((((\t15 . t2) ((t1 t6) (\t15
 . (\t16 . ((\t17 . (((\t18 . t5) (t3 t5)) ((\t18 . t13) ((\t18 . t0) (\t18 . (\
t19 . (((t17 (\t20 . (t6 (t5 t6)))) (\t20 . (\t21 . (t12 (\t22 . t22))))) (\t20 
. (t0 t1))))))))) t11))))) (\t15 . t2)) (t7 ((\t15 . (\t16 . ((t12 ((\t17 . t13)
 (\t17 . t9))) ((((t2 t7) t16) (((t12 (\t17 . t5)) (\t17 . ((\t18 . (((t14 (t8 (
t17 (\t19 . (t4 t3))))) (\t19 . t13)) t13)) (t10 t4)))) ((((\t17 . t10) t8) (t12
 (t15 (\t17 . t9)))) t6))) (t4 ((((\t17 . ((((((t7 (\t18 . t10)) (t2 ((((\t18 . 
(t12 (t13 ((t13 ((t11 (\t19 . (\t20 . (((t7 t8) (t14 (\t21 . t4))) t3)))) (\t19 
. t11))) t9)))) t4) (t2 ((t12 t13) t7))) t8))) t14) (\t18 . (((\t19 . (\t20 . (\
t21 . (\t22 . (\t23 . t14))))) t7) t5))) (\t18 . ((\t19 . (\t20 . (\t21 . t3))) 
(\t19 . (t5 (\t20 . (\t21 . t10))))))) ((\t18 . (\t19 . t13)) (\t18 . ((t6 t4) (
\t19 . t6)))))) (\t17 . t17)) t10) ((\t17 . t5) (\t17 . ((t6 t10) (\t18 . (\t19 
. (\t20 . t5)))))))))))) (t5 ((\t15 . (t4 (\t16 . (\t17 . ((t9 (t8 ((\t18 . (t4 
(t12 (t7 t14)))) (\t18 . (\t19 . t1))))) ((\t18 . ((t13 (t5 (\t19 . (\t20 . ((t2
0 (t11 (\t21 . t20))) t2))))) (\t19 . (t6 (t1 t5))))) (\t18 . ((\t19 . t2) (\t19
 . t11))))))))) t0)))))) (((((\t14 . (\t15 . t4)) t13) (((\t14 . (\t15 . (\t16 .
 t8))) (\t14 . t13)) (((\t14 . (\t15 . (((\t16 . t6) (\t16 . t5)) (t13 t8)))) (\
t14 . t0)) ((\t14 . ((\t15 . (t13 (t2 (t14 ((\t16 . ((\t17 . t8) (\t17 . (\t18 .
 (t12 ((((t11 (((\t19 . t1) (\t19 . t16)) (t6 (\t19 . t6)))) ((\t19 . t15) (\t19
 . (t16 (\t20 . (\t21 . t0)))))) t18) t16)))))) t1))))) (t10 t13))) ((\t14 . t6)
 t9))))) t1) (\t14 . t12))) (((((t13 (t1 ((\t14 . (\t15 . t7)) (t0 t1)))) (t5 t1
)) (((t11 (\t14 . (\t15 . (\t16 . (\t17 . (\t18 . ((\t19 . t13) ((t6 t14) t17)))
))))) (t12 (\t14 . t7))) ((t6 t3) (\t14 . (\t15 . (t5 (\t16 . (\t17 . (\t18 . ((
\t19 . t17) t16)))))))))) ((\t14 . (t10 t7)) t9)) (\t14 . (\t15 . (\t16 . (t8 (t
3 (\t17 . ((t17 (\t18 . (\t19 . ((t18 (\t20 . t9)) t4)))) (\t18 . t7))))))))))))
))))) (\t10 . t4))) ((t0 t7) (t5 ((\t10 . (t8 t2)) ((\t10 . t4) ((\t10 . t10) t9
)))))))))))) (t4 t1))) (\t6 . (\t7 . t3)))))) ((((t3 (\t5 . t1)) t4) (t4 (t1 t0)
)) t1)) (t4 ((t0 t2) t1))) (t3 (((\t5 . (\t6 . (\t7 . (\t8 . (\t9 . t5))))) ((\t
5 . t2) (\t5 . (t3 t5)))) ((\t5 . (\t6 . t4)) (\t5 . t5))))) (\t5 . (t4 (\t6 . t
2)))) (\t5 . t3))) ((\t4 . t2) ((\t4 . (t1 (\t5 . ((t5 (\t6 . t5)) ((\t6 . (\t7 
. ((\t8 . ((\t9 . (\t10 . ((t5 t4) (t7 ((\t11 . ((\t12 . (\t13 . (((t0 (((((\t14
 . t6) (\t14 . (((t0 t9) t8) ((t7 ((((\t15 . t1) t12) ((t14 (((((t3 (\t15 . (\t1
6 . (t14 (\t17 . (\t18 . ((\t19 . (t12 t10)) (((((\t19 . (\t20 . ((\t21 . ((\t22
 . (\t23 . (\t24 . (\t25 . t18)))) (t7 t21))) t6))) t1) ((\t19 . t8) (\t19 . (\t
20 . (\t21 . ((\t22 . ((((((\t23 . (((t6 t16) (t2 (t13 t5))) (\t24 . t19))) (t16
 (((\t23 . t4) (\t23 . (\t24 . (((\t25 . (\t26 . t1)) (((t24 t21) t0) (\t25 . (t
15 (\t26 . (t19 (\t27 . t5))))))) (((((t7 (t6 (\t25 . t23))) t15) (\t25 . (\t26 
. ((t14 (\t27 . t18)) (t8 t13))))) (((((\t25 . ((((t4 t4) ((t11 (\t26 . (\t27 . 
((\t28 . t3) (\t28 . ((\t29 . (t2 ((\t30 . (\t31 . (t27 ((\t32 . t31) t29)))) (t
8 ((\t30 . t24) t5))))) ((\t29 . (\t30 . (\t31 . t3))) t15))))))) (\t26 . t2))) 
(\t26 . ((((((\t27 . ((t21 (\t28 . (\t29 . ((t18 (\t30 . (\t31 . (t15 t7)))) (\t
30 . (\t31 . (\t32 . t23))))))) t4)) (t5 t16)) t0) t21) ((\t27 . (\t28 . ((\t29 
. t17) (t25 t10)))) t24)) t9))) t25)) (t14 (t3 ((t21 (\t25 . (\t26 . ((\t27 . (\
t28 . (((((((t15 t27) (((t15 t15) ((\t29 . ((\t30 . t18) t14)) (((\t29 . (t6 t16
)) (t23 t10)) t16))) t18)) (\t29 . (\t30 . t18))) (t19 (\t29 . (\t30 . ((t0 t3) 
t27))))) (\t29 . t21)) ((t21 ((t0 (t4 (\t29 . (((\t30 . (\t31 . t26)) t28) t14))
)) ((((\t29 . (t8 (\t30 . t21))) (\t29 . (t2 ((\t30 . t26) t15)))) (\t29 . (\t30
 . (t16 (\t31 . t27))))) (t25 t4)))) t7)) (\t29 . t1)))) t1)))) t7)))) (\t25 . t
18)) (t3 t7)) ((\t25 . t17) (t1 (t13 (\t25 . (\t26 . (t13 (t17 (t2 t23))))))))))
 t17))))) ((\t23 . t21) ((\t23 . t13) ((t4 t11) t3)))))) t10) (\t23 . ((t5 (((t1
 (\t24 . (t11 (\t25 . (\t26 . (t12 t15)))))) (t18 t11)) t15)) t16))) t8) (t1 ((\
t23 . t8) t16)))) (\t22 . (((t14 (t7 t18)) t10) t13)))))))) t14) (((((\t19 . ((\
t20 . (\t21 . (\t22 . ((\t23 . t19) (t11 (\t23 . t19)))))) (t4 (\t20 . (\t21 . (
\t22 . t4)))))) ((t6 (((\t19 . t19) (t3 (t3 (\t19 . t7)))) (\t19 . ((t7 ((t18 t1
5) (t15 (t16 (\t20 . (((t19 ((t3 t2) (t10 (((t11 ((\t21 . t9) t1)) ((\t21 . t21)
 ((((t4 (t17 ((t15 (t19 (t1 t8))) t2))) (\t21 . t16)) t7) t18))) (\t21 . (\t22 .
 (\t23 . (\t24 . t3)))))))) t5) (\t21 . (\t22 . (((t22 (((\t23 . (\t24 . t22)) (
t22 ((\t23 . ((\t24 . ((\t25 . t18) (t24 ((((\t25 . (\t26 . ((t18 t13) (t8 t7)))
) t24) t14) (\t25 . (\t26 . (t13 (\t27 . t1)))))))) (\t24 . (\t25 . t24)))) (t5 
(t22 t6))))) (t10 t2))) (((t16 (\t23 . (\t24 . t0))) (\t23 . ((((t20 ((((\t24 . 
((\t25 . (\t26 . t6)) (\t25 . ((t24 t25) (\t26 . ((\t27 . (\t28 . (\t29 . (\t30 
. ((\t31 . (\t32 . t17)) ((((\t31 . t15) t28) (\t31 . (((t22 ((t21 (t18 t31)) (\
t32 . t3))) (((t16 (\t32 . t17)) (t25 t21)) ((\t32 . t31) ((((\t32 . t4) t4) t9)
 t22)))) ((t7 t11) (\t32 . t13))))) (\t31 . ((t3 (\t32 . t6)) t7)))))))) ((\t27 
. ((t24 t23) t0)) t6))))))) (((\t24 . t20) t1) ((t8 (t1 t1)) t8))) ((\t24 . t2) 
(\t24 . t10))) t0)) t8) t20) (\t24 . t16)))) t11)) ((((\t23 . t15) t14) (((\t23 
. (\t24 . (t18 (t20 t18)))) t5) t6)) (\t23 . (((\t24 . (t1 t16)) ((\t24 . ((\t25
 . t1) t21)) (t18 t5))) ((\t24 . ((t17 t24) (\t25 . ((\t26 . (\t27 . (\t28 . t24
))) ((t8 (((\t26 . t8) (\t26 . (t21 (\t27 . (((((t21 t4) (t11 t13)) (\t28 . ((\t
29 . (t25 (\t30 . (((\t31 . t5) t12) (t6 (t6 t21)))))) (t5 ((\t29 . t21) t2)))))
 (\t28 . (\t29 . (\t30 . (\t31 . t13))))) (\t28 . t16)))))) t20)) t13))))) (\t24
 . (t10 t13))))))))))))))) (t10 (\t20 . (t10 t6))))))) t16)) (((((\t19 . (\t20 .
 ((\t21 . ((\t22 . ((t0 (((\t23 . t2) (t19 t6)) t20)) t0)) (t14 (\t22 . (\t23 . 
(\t24 . (t1 (((\t25 . (\t26 . t18)) t23) t13)))))))) t3))) (((\t19 . (\t20 . ((\
t21 . t5) ((\t21 . t16) (\t21 . ((((\t22 . (t8 t12)) ((\t22 . (t12 (\t23 . t13))
) t2)) ((\t22 . t14) t16)) (\t22 . t6))))))) t18) (\t19 . (t6 ((\t20 . t7) t18))
))) (t10 t3)) t7) t12)) ((((t6 ((\t19 . (t19 ((t0 t1) t12))) (\t19 . t11))) t16)
 t13) ((\t19 . t14) (\t19 . (\t20 . t20))))) ((\t19 . ((\t20 . (\t21 . ((((\t22 
. ((\t23 . t12) (t17 t17))) ((t16 t2) t15)) (t16 (\t22 . (t1 t13)))) (((\t22 . (
\t23 . (\t24 . (((t22 (t17 (t10 ((t12 (t11 t19)) (t23 t16))))) ((\t25 . (\t26 . 
(\t27 . (\t28 . (t15 (\t29 . (\t30 . (t18 (t11 ((\t31 . (t29 t28)) (((t27 t11) (
\t31 . (t17 t27))) (((t15 (\t31 . t13)) (\t31 . (t29 (((t20 t21) (\t32 . (\t33 .
 (\t34 . (t23 t15))))) (\t32 . (\t33 . (\t34 . t17))))))) t11)))))))))))) (\t25 
. t1))) t16)))) (\t22 . (\t23 . (\t24 . (\t25 . t15))))) ((\t22 . t16) ((\t22 . 
t13) t13)))))) (t19 (\t20 . t0)))) (t7 ((t4 t10) (((\t19 . t19) (\t19 . (\t20 . 
t4))) t15))))))))))))) t13) (\t15 . t7)) (\t15 . t1)) (\t15 . t1))) t1)) t11)) (
\t15 . (\t16 . ((t2 (t13 ((\t17 . (\t18 . (\t19 . (t10 (\t20 . ((t6 t9) t1))))))
 (\t17 . t16)))) ((\t17 . t5) t2)))))))) ((\t14 . (\t15 . t5)) t7)) t1) ((\t14 .
 (t1 (t12 t2))) (\t14 . t8)))) (\t14 . (\t15 . (((((t13 (t10 t9)) t9) t3) (\t16 
. ((\t17 . t5) (((\t17 . (((\t18 . (t10 (((t18 (\t19 . (t6 ((\t20 . (((\t21 . ((
((\t22 . t21) (\t22 . t8)) t2) t6)) (t20 (\t21 . (\t22 . (((\t23 . (\t24 . ((t23
 t18) (\t25 . (((\t26 . (\t27 . (\t28 . t13))) (((\t26 . (((\t27 . t1) t23) ((t1
5 t18) t25))) t17) ((\t26 . ((\t27 . ((((\t28 . (\t29 . ((t15 (\t30 . (((t23 (\t
31 . t24)) (\t31 . ((((\t32 . (\t33 . t30)) ((\t32 . t21) (\t32 . t24))) ((((\t3
2 . (\t33 . t15)) (((t16 t8) t14) (\t32 . (\t33 . t19)))) t12) (\t32 . t0))) t2)
)) (t17 t0)))) ((((\t30 . (t7 (t12 t13))) t18) (((\t30 . (\t31 . (\t32 . (((t15 
(\t33 . t10)) (\t33 . (\t34 . (\t35 . (t2 (((t22 t23) t9) (\t36 . (t33 (\t37 . (
((\t38 . t19) (\t38 . (\t39 . t29))) (t3 (t4 (\t38 . (t5 t14)))))))))))))) (t7 (
t17 ((((\t33 . (t26 (\t34 . t18))) t21) t29) (\t33 . t9)))))))) (t4 t19)) ((\t30
 . t21) t27))) (\t30 . t6))))) (t23 (\t28 . ((\t29 . (\t30 . (\t31 . (t1 t4)))) 
(\t29 . (\t30 . (\t31 . ((\t32 . t3) t1)))))))) t10) (((\t28 . (((t10 (t16 (\t29
 . t28))) (\t29 . t4)) (t24 ((t18 ((\t29 . t14) t18)) (((\t29 . t14) ((\t29 . t1
1) ((\t29 . (t7 ((\t30 . (\t31 . (t11 t7))) (\t30 . t18)))) t11))) t0))))) (((\t
28 . t13) t24) t2)) (t0 t15)))) t4)) (t25 t10)))) (t8 t17)))))) (t11 t5)) t17)))
)) t7)) (t13 (t8 t6)))))) t12) t2))) ((t6 (t4 (\t18 . t15))) ((\t18 . (t16 t9)) 
(\t18 . (\t19 . (t6 t3)))))) (\t18 . t2))) t14) (((t1 (\t17 . ((\t18 . (t13 ((\t
19 . (((((t12 ((t15 t11) t2)) (t4 (t17 (t13 ((\t20 . t8) (((\t20 . (\t21 . (((\t
22 . ((t5 (t17 ((\t23 . (t18 (((((t20 (\t24 . (\t25 . ((\t26 . (t10 (t22 t24))) 
t24)))) t20) (\t24 . (\t25 . t1))) ((t6 (\t24 . (\t25 . (t19 (\t26 . (t21 (\t27 
. (\t28 . (\t29 . t29))))))))) (((\t24 . t0) (t3 (\t24 . (\t25 . t10)))) (\t24 .
 (((t22 (\t25 . t22)) (\t25 . t7)) t5))))) (\t24 . t7)))) (\t23 . (\t24 . t9))))
) t12)) t20) (((\t22 . (\t23 . (t10 ((\t24 . t3) (((\t24 . t21) t21) (\t24 . ((t
17 t0) t8))))))) (t12 (\t22 . (\t23 . (\t24 . (\t25 . (\t26 . (\t27 . (\t28 . t1
3))))))))) ((\t22 . t19) ((t19 (\t22 . ((t15 t6) (\t23 . t13)))) (((\t22 . t3) (
(t7 t17) t1)) t19))))))) t0) t0)))))) (\t20 . (\t21 . t10))) (\t20 . ((\t21 . (\
t22 . ((t20 t22) t15))) t12))) t7)) (t10 (t3 t17))))) t16))) t10) (t6 ((\t17 . t
12) t16))))))) (\t16 . (t15 ((t13 (\t17 . t14)) t7))))))) (\t14 . (t4 (t1 (\t15 
. ((\t16 . t14) t9)))))))) (\t12 . (t8 ((\t13 . (\t14 . t13)) t8))))) (((((\t11 
. ((((\t12 . (\t13 . (\t14 . (\t15 . (\t16 . (\t17 . (\t18 . (((t8 t15) (((\t19 
. (\t20 . (\t21 . t3))) t2) t2)) t17)))))))) (\t12 . t11)) (\t12 . t9)) t6)) (t5
 t8)) ((t10 t6) t10)) ((\t11 . t9) t3)) t1)))))) (\t9 . ((\t10 . t8) t3)))) t3))
) t2))))) (\t4 . (t3 ((\t5 . (((\t6 . ((\t7 . (\t8 . (((((t7 (\t9 . ((t9 (\t10 .
 (t1 (\t11 . t5)))) (\t10 . (t8 t3))))) (\t9 . ((((\t10 . ((((\t11 . ((\t12 . (\
t13 . ((t7 (\t14 . t3)) (\t14 . t8)))) t11)) t1) (t4 (\t11 . t5))) t6)) t5) t2) 
t7))) (((\t9 . (t6 t1)) (t5 (\t9 . (((\t10 . ((\t11 . t7) (\t11 . t0))) t9) t4))
)) (((((t4 (\t9 . ((\t10 . (\t11 . (t0 t0))) t2))) t8) (\t9 . (\t10 . ((\t11 . t
10) t9)))) ((\t9 . (\t10 . (\t11 . (\t12 . (\t13 . ((((t10 t8) t6) (\t14 . t14))
 t1)))))) (\t9 . (t8 t1)))) (\t9 . t9)))) t5) t1))) (t3 (((t5 (((\t7 . t4) t3) (
t2 t6))) t6) (\t7 . t4))))) t3) (\t6 . (\t7 . (\t8 . t2))))) t0)))))))))))) ((t0
 t0) t0)) (\t1 . (\t2 . (t2 t2))))) (((\t0 . (\t1 . (t1 t0))) (\t0 . ((\t1 . t0)
 (((\t1 . (\t2 . (\t3 . (\t4 . ((\t5 . t4) t1))))) t0) (((\t1 . t0) t0) (\t1 . t
1)))))) (\t0 . (t0 ((\t1 . ((\t2 . t1) (t1 t1))) t0))))))

So my code in C++ using union-find and compositional inference takes 1.09 seconds on my MacPro to complete type inference on this term.

For comparison "ghc -v2" to show the stages, hangs on type checking and after about 10 minutes crashed the laptop (probably due to memory exhaustion).

Ocaml does better and finishes very quickly (about 0.6 seconds), but it produces no output, and the type looks wrong to me.

Here is the type I get:

((((((a -> ((((a -> (b -> (((c -> ((((a -> d as f) -> (e -> (((f -> g as t) -> (
(h -> (i -> (e -> j as o)) as p) -> (e -> j as q) as m) as k) -> (k -> (k -> l a
s l) as r) as j) as g) as v) -> m as e) -> j as u) as h) -> n as i) -> o as n) a
s d) as bq) -> (((i -> o as bn) -> ((h -> n) -> (((f -> g) -> (p -> q)) -> j)) a
s br) -> (e -> (k -> r) as by) as bt) as s) -> ((h -> s as bs) -> b)) -> ((t -> 
m) -> r) as bo) as a) -> d as c) -> u) -> n as b) -> (((((((((v -> ((((w -> ((x 
-> ((y -> ((w -> (y -> (z -> ((y -> ba) -> (((w -> bb) -> (bc -> (bd -> ((y -> (
(y -> (be -> ((z -> bf) -> ((bd -> (bg -> (bh -> ((bd -> (bg -> bi as bp) as bk)
 -> bi) as bi))) -> bi) as bj)) as be) -> bj) as bg) -> (bh -> (bk -> bi as cb) 
as bl) as bm))) as bd) -> (bg -> bl as bw)) as bf) as bh)) as z) -> bf as ba) as
 bc) -> (bd -> bm)) as y) -> ba as bb) as x) -> ((z -> bf) -> (bk -> bi as bx)) 
as w) -> (y -> ba)) -> (bc -> e))) -> ((bn -> ((bo -> bp) -> ((z -> bf) -> (a ->
 bo)))) -> ((bq -> (br -> (e -> j))) -> q))) -> (s -> (bs -> b))) -> (bq -> bt) 
as bu) -> ((bu -> bv) -> ((x -> (a -> bo)) -> bw) as bv)) -> r) -> (e -> ((a -> 
((s -> bx) -> ((i -> o) -> r))) -> by))) -> (((h -> ((bz -> ca) -> bt) as bz) ->
 b as ca) -> ((u -> (bx -> (((c -> u) -> n) -> cb))) -> s))) -> b)) -> (((t -> (
(cc -> q as cc) -> q) as ce) -> cd) -> (ce -> cd)) as cd)

But ocaml just gives:

val f : 'a -> 'a as 'a

This doesn't seem right to me, as I find it hard to believe such a complicated term can be reduced to that typing. Ocaml seems to give the same type to all terms over a certain size/complexity, which puzzles me.

This is just a simple lambda term with no let-generalisation yet, but looking at the performance results so far, I am not sure I need to do any optimisation yet :-) The source for the term generator is available at https://github.com/keean/Lambda-Gen

Here is a shorter example with typing:

(\t0 . (\t1 . (\t2 . ((t1 t1) t1))))

1. [var t1]          {t1 : a} |- a
2. [var t1]          {t1 : a} |- a
3. [app (1) (2)]     {t1 : (a -> b), t1 : a} |- b
4. [var t1]          {t1 : a} |- a
5. [app (3) (4)]     {t1 : (a -> (b -> c)), t1 : a, t1 : b} |- c
6. [abs t2 (5)]      {t1 : (a -> (b -> c)), t1 : a, t1 : b} |- (d -> c)
7. [abs t1 (6)]      {} |- ((a -> (a -> b) as a) -> (c -> b))
8. [abs t0 (7)]      {} |- (b -> ((a -> (a -> c) as a) -> (d -> c)))

This corresponds to the type given by ocaml for this:

let t = (fun t0 -> (fun t1 -> (fun t2 -> ((t1 t1) t1))))

val t : 'a -> ('b -> 'b -> 'c as 'b) -> 'd -> 'c

Very cool you did the experiment! Short question

Lovely experiment. I had more in mind that the generator would just produce lots of small terms where most would be rejected but feeding large terms is nice too.

Uh, Brazil-Chile is playing; I'll respond after the match, but a short question first. Did you disable the occurs check in Ocaml? If so, turn it back on; apart from sloppily typing terms you'd like to be programs, it doesn't do a lot.

I would call it a publishable result

I would have expected that most more complex terms would be rejected outright by both type checkers.

But since you blew both compilers out of the water, I would call it a publishable result - the start of a note. But academics might disagree with that.

Still, if I got it right, you have tested without occurs checks and it would be better if you tested with that; the discrepancy between your typing and Ocaml's is striking though.

A second, right? I am not sure how to interpret that; it's a lack of data, which means more people should run more tests.

I like the shift register

I've just looked at your code and the shift register made me laugh. Bit overkill though. And there are likely better options.

I was thinking about something which would mostly produce simply typed LC terms, but that didn't make much sense, except that you could see what the cost of more expensive type systems would be on large terms.

It is more likely that the discrepancy between your type system and ocaml's will be explained by the mistake being in your system. Absent the occurs check, I think that's even likely, since as the terms become larger I would expect the chance that you end up with the trivial type to grow. That's also interesting from an experimental perspective.

Anyway. Since a type system restricts the language of (closed) LC terms to those which are sensible to either a human or a machine, it's interesting to see what percentage of those terms experimentally constitute programs. If academics do rather far-fetched studies of phase transitions on random k-SAT or of prime number distributions, I don't see why this wouldn't be of equal importance. Maybe you get a negative result back, though.

The other thing is ghc's behavior. If it really blows up on random terms you've likely shown the compiler is too fragile. Compilers need to be very fast and very robust since A) as they grow, people end up writing million-LoC programs, and B) the language often ends up as the target of other compilers. Lots of bugs and hidden assumptions were discovered in various C compilers because of B). At minimum, I think you've shown that Haskell can't be used as a target. But I don't know the option you used.

Still looks you've got something interesting.

(Btw. A heads up on disabling the occurs check. Some genius here stated that would be sufficient to prove bisimilar types; well, that ain't true. So don't use it or explain where and why you needed it.)

(We also got here because of the idea of a corpus of terms. Most people write test cases or bootstrap because of that, but that has drawbacks too. If you get some users, that helps. And if 'Olegs' start writing abusive programs it becomes even more interesting. Generating problems is a bit far-fetched, but writing only 'id' examples to test a type system is even worse.)

PRNG

The PRNG used is one of the best known, see: http://xorshift.di.unimi.it. It is very fast (so not overkill in that sense) and also has very good statistical properties (it has no systematic failures on the BigCrush tests). Having done some statistical analysis of PRNGs for Monte Carlo simulations, I can't bring myself to use anything with unknown properties. The built-in rand function in C/C++ is fairly bad.
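For anyone who wants to see what that family looks like, here is a minimal C++ sketch of the xorshift128+ member described on that page (the generator in the repository may be a different member of the family, so treat this as an illustration rather than the project's code):

#include <cstdint>

// xorshift128+ (Vigna): two 64-bit words of state, which must not both be zero.
struct XorShift128Plus {
    uint64_t s[2];
    uint64_t next() {
        uint64_t x = s[0];
        const uint64_t y = s[1];
        s[0] = y;
        x ^= x << 23;            // shift-register steps; (23, 17, 26) is one
        x ^= x >> 17;            // published parameter set for this generator
        x ^= y ^ (y >> 26);
        s[1] = x;
        return x + y;            // the "+" that scrambles the output
    }
};

Seeded with two non-zero words, next() can then drive whatever random choices a term generator needs (which node kind to produce, which variable to pick, and so on).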

I didn't know which one it was

I just recognized it as a shift register, so I didn't know the cycle length.

Shorter Example of Differences.

Here is the shortest example I can find so far where ocaml (with -rectypes) and my inference algorithm give different types:

(\t0 . ((((((t0 (\t1 . t0)) (\t1 . ((\t2 . t1) (\t2 . (t1 ((\t3 . (\t4 . t3)) (t
0 t1))))))) t0) (\t1 . t0)) (t0 (t0 (t0 (t0 t0))))) ((t0 ((\t1 . t0) ((t0 t0) t0
))) t0)))

ocaml gives this type:

val t : ('a -> 'a as 'a) -> 'a

I get the following derivation:

1.  [var t0]            t0 : {t0 : a} |- a
2.  [var t0]            t0 : {t0 : a} |- a
3.  [abs t1 (2)]        (\t1 . t0) : {t0 : a} |- (b -> a)
4.  [app (1) (3)]       (t0 (\t1 . t0)) : {t0 : ((a -> b) -> c), t0 : b} |- c
5.  [var t1]            t1 : {t1 : a} |- a
6.  [abs t2 (5)]        (\t2 . t1) : {t1 : a} |- (b -> a)
7.  [var t1]            t1 : {t1 : a} |- a
8.  [var t3]            t3 : {t3 : a} |- a
9.  [abs t4 (8)]        (\t4 . t3) : {t3 : a} |- (b -> a)
10. [abs t3 (9)]        (\t3 . (\t4 . t3)) : {} |- (a -> (b -> a))
11. [var t0]            t0 : {t0 : a} |- a
12. [var t1]            t1 : {t1 : a} |- a
13. [app (11) (12)]     (t0 t1) : {t0 : (a -> b), t1 : a} |- b
14. [app (10) (13)]     ((\t3 . (\t4 . t3)) (t0 t1)) : {t0 : (a -> b), t1 : a} |
- (c -> b)
15. [app (7) (14)]      (t1 ((\t3 . (\t4 . t3)) (t0 t1))) : {t0 : (a -> b), t1 :
 ((c -> b) -> d), t1 : a} |- d
16. [abs t2 (15)]       (\t2 . (t1 ((\t3 . (\t4 . t3)) (t0 t1)))) : {t0 : (a -> 
b), t1 : ((c -> b) -> d), t1 : a} |- (e -> d)
17. [app (6) (16)]      ((\t2 . t1) (\t2 . (t1 ((\t3 . (\t4 . t3)) (t0 t1))))) :
 {t0 : (a -> b), t1 : c, t1 : ((d -> b) -> e), t1 : a} |- c
18. [abs t1 (17)]       (\t1 . ((\t2 . t1) (\t2 . (t1 ((\t3 . (\t4 . t3)) (t0 t1
)))))) : {t0 : (((a -> b) -> c) -> b)} |- (((a -> b) -> c) -> ((a -> b) -> c))
19. [app (4) (18)]      ((t0 (\t1 . t0)) (\t1 . ((\t2 . t1) (\t2 . (t1 ((\t3 . (
\t4 . t3)) (t0 t1))))))) : {t0 : ((a -> b) -> ((((c -> d) -> e) -> ((c -> d) -> 
e)) -> f)), t0 : b, t0 : (((c -> d) -> e) -> d)} |- f
20. [var t0]            t0 : {t0 : a} |- a
21. [app (19) (20)]     (((t0 (\t1 . t0)) (\t1 . ((\t2 . t1) (\t2 . (t1 ((\t3 . 
(\t4 . t3)) (t0 t1))))))) t0) : {t0 : ((a -> b) -> ((((c -> d) -> e) -> ((c -> d
) -> e)) -> (f -> g))), t0 : b, t0 : (((c -> d) -> e) -> d), t0 : f} |- g
22. [var t0]            t0 : {t0 : a} |- a
23. [abs t1 (22)]       (\t1 . t0) : {t0 : a} |- (b -> a)
24. [app (21) (23)]     ((((t0 (\t1 . t0)) (\t1 . ((\t2 . t1) (\t2 . (t1 ((\t3 .
 (\t4 . t3)) (t0 t1))))))) t0) (\t1 . t0)) : {t0 : ((a -> b) -> ((((c -> d) -> e
) -> ((c -> d) -> e)) -> (f -> ((g -> h) -> i)))), t0 : b, t0 : (((c -> d) -> e)
 -> d), t0 : f, t0 : h} |- i
25. [var t0]            t0 : {t0 : a} |- a
26. [var t0]            t0 : {t0 : a} |- a
27. [var t0]            t0 : {t0 : a} |- a
28. [var t0]            t0 : {t0 : a} |- a
29. [var t0]            t0 : {t0 : a} |- a
30. [app (28) (29)]     (t0 t0) : {t0 : (a -> b), t0 : a} |- b
31. [app (27) (30)]     (t0 (t0 t0)) : {t0 : (a -> b), t0 : (c -> a), t0 : c} |-
 b
32. [app (26) (31)]     (t0 (t0 (t0 t0))) : {t0 : (a -> b), t0 : (c -> a), t0 : 
(d -> c), t0 : d} |- b
33. [app (25) (32)]     (t0 (t0 (t0 (t0 t0)))) : {t0 : (a -> b), t0 : (c -> a), 
t0 : (d -> c), t0 : (e -> d), t0 : e} |- b
34. [app (24) (33)]     (((((t0 (\t1 . t0)) (\t1 . ((\t2 . t1) (\t2 . (t1 ((\t3 
. (\t4 . t3)) (t0 t1))))))) t0) (\t1 . t0)) (t0 (t0 (t0 (t0 t0))))) : {t0 : ((a 
-> b) -> ((((c -> d) -> e) -> ((c -> d) -> e)) -> (f -> ((g -> h) -> (i -> j))))
), t0 : b, t0 : (((c -> d) -> e) -> d), t0 : f, t0 : h, t0 : (k -> i), t0 : (l -
> k), t0 : (m -> l), t0 : (n -> m), t0 : n} |- j
35. [var t0]            t0 : {t0 : a} |- a
36. [var t0]            t0 : {t0 : a} |- a
37. [abs t1 (36)]       (\t1 . t0) : {t0 : a} |- (b -> a)
38. [var t0]            t0 : {t0 : a} |- a
39. [var t0]            t0 : {t0 : a} |- a
40. [app (38) (39)]     (t0 t0) : {t0 : (a -> b), t0 : a} |- b
41. [var t0]            t0 : {t0 : a} |- a
42. [app (40) (41)]     ((t0 t0) t0) : {t0 : (a -> (b -> c)), t0 : a, t0 : b} |-
 c
43. [app (37) (42)]     ((\t1 . t0) ((t0 t0) t0)) : {t0 : a, t0 : (b -> (c -> d)
), t0 : b, t0 : c} |- a
44. [app (35) (43)]     (t0 ((\t1 . t0) ((t0 t0) t0))) : {t0 : (a -> b), t0 : a,
 t0 : (c -> (d -> e)), t0 : c, t0 : d} |- b
45. [var t0]            t0 : {t0 : a} |- a
46. [app (44) (45)]     ((t0 ((\t1 . t0) ((t0 t0) t0))) t0) : {t0 : (a -> (b -> 
c)), t0 : a, t0 : (d -> (e -> f)), t0 : d, t0 : e, t0 : b} |- c
47. [app (34) (46)]     ((((((t0 (\t1 . t0)) (\t1 . ((\t2 . t1) (\t2 . (t1 ((\t3
 . (\t4 . t3)) (t0 t1))))))) t0) (\t1 . t0)) (t0 (t0 (t0 (t0 t0))))) ((t0 ((\t1 
. t0) ((t0 t0) t0))) t0)) : {t0 : ((a -> b) -> ((((c -> d) -> e) -> ((c -> d) ->
 e)) -> (f -> ((g -> h) -> (i -> (j -> k)))))), t0 : b, t0 : (((c -> d) -> e) ->
 d), t0 : f, t0 : h, t0 : (l -> i), t0 : (m -> l), t0 : (n -> m), t0 : (o -> n),
 t0 : o, t0 : (p -> (q -> j)), t0 : p, t0 : (r -> (s -> t)), t0 : r, t0 : s, t0 
: q} |- k
48. [abs t0 (47)]       (\t0 . ((((((t0 (\t1 . t0)) (\t1 . ((\t2 . t1) (\t2 . (t
1 ((\t3 . (\t4 . t3)) (t0 t1))))))) t0) (\t1 . t0)) (t0 (t0 (t0 (t0 t0))))) ((t0
 ((\t1 . t0) ((t0 t0) t0))) t0))) : {} |- ((((a -> (((a -> b) -> (a -> b) as e) 
-> (b -> ((e -> b as f) -> (c -> (f -> d) as d))) as c) as a) -> b) -> c as b) -
> d)

Unless I can think of a better way, I guess I have to go through the derivation by hand to see if there are any bugs. I have pushed an updated version of the lambda-generator with parameterised syntax so that ocaml can be generated directly.

Generating LC terms is a really bad idea. Or is it?

Well, I am starting to regret a bit that I supported this idea. But it seems to deliver results, and I am pleased with that, so let's follow it a bit further. But I'll excuse myself for not listing a few academic counterarguments before you started down this route.

This part of the type seems wrong in the last derivation: "(a -> (((a -> b) -> (a -> b) as e)"

I am not sure what -rectypes does, but I am pretty sure it stops the unification when the occurs check signals that the type could be infinitely expanded. Note that this is a small, somewhat philosophical, difference between Algol/Ada/Pascal/C checkers, where everything known about a type is postulated and a possible recursive relation is proved, and an ML-like checker, which derives a recursive relation. A small but significant difference, which you can read in a number of ways.

Anyway, in the last type "e" doesn't occur in the preceding type therefor doesn't signal a recursive relation. Something is either wrong in the presentation or the derivation of your recursive types. Most likely, for some reason it thinks "e" occurs somewhere it doesn't so you have a bug in the occurrence check.

Rational Tree Unification

I am using rational tree unification, which does not need an occurs check. A rational tree can have a self-reference in a leaf node, and rational tree unification is decidable and sound (within the context of the objects that inhabit the universe of rational trees). RT unification was chosen as the default unification for Prolog-II because it is a big improvement over unification in a Herbrand universe with no occurs check (although it is not actually sound for deduction in a Herbrand universe, a post-unification cycle-check can be used to check the soundness of deductions).

So in my checker, the types are internally rational trees, and the only operation performed on them is RT unification. So as long as there are no 'bugs' this produces correct type graphs. For printing there is a final mu-conversion step, and depending on how you perform the cycle check and de-looping you get slightly different types, as the tree can reference other parts that don't immediately form a loop (A references an ancestor of B, and B references an ancestor of A), and the cycles can be broken in different places.
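To make that printing step a bit more concrete, the mu-conversion is, in outline, something like the following C++ sketch (simplified, not the exact code): walk the graph depth-first, name a node the first time a back-edge reaches it, and append the 'as' binder when the walk returns to that node.

#include <iostream>
#include <map>
#include <string>

struct Node { Node *dom = nullptr, *cod = nullptr; };   // null children = type variable

struct Printer {
    std::map<Node*, std::string> name;   // nodes that needed a name
    std::map<Node*, bool> onPath;        // nodes on the current DFS path
    int fresh = 0;

    std::string var(Node* n) {           // allocate a fresh letter on first use
        if (!name.count(n)) name[n] = std::string(1, char('a' + fresh++));
        return name[n];
    }

    std::string show(Node* n) {
        if (onPath[n] || !n->dom) return var(n);          // back-edge, or a plain variable
        onPath[n] = true;
        std::string s = "(" + show(n->dom) + " -> " + show(n->cod) + ")";
        onPath[n] = false;
        if (name.count(n)) s = "(" + s + " as " + name[n] + ")";   // a cycle passed through here
        return s;
    }
};

int main() {
    Node a; a.dom = &a; a.cod = &a;                       // the one-node graph for the trivial type
    std::cout << Printer{}.show(&a) << "\n";              // prints ((a -> a) as a)
}

Which node gets the binder for cycles that pass through several nodes depends on where the walk enters the cycle, which is exactly the freedom in breaking cycles mentioned above.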

So let's look at this type:

((((
    a -> (((a -> b) -> (a -> b) as e) -> (b -> ((e -> b as f) -> (c -> (f -> d) as d))) as c) as a
) -> b) -> c as b) -> d)

The thing to note is that the whole "(a -> b) -> (a -> b) as e" is inside a block referenced 'as a', so 'a' contains "a -> (((a -> b) -> (a -> b) as e) -> (b -> ((e -> b as f) -> (c -> (f -> d) as d))) as c) as a" and this clearly contains an 'e'. So "(a -> b) -> (a -> b)" obviously contains an 'e'. There is no way I am aware of to break all the cycles so that this does not happen. If we expand 'b' inside 'e' one more time we have the same problem at the next expansion.

However this is just a 'print' format of the internal rational-tree, and I am open to suggestions as to how this could be done better. I have written the mu-conversion so that it generates the same types as ocaml, for the types where ocaml actually generates what seems a reasonable type. For example, a slightly distorted Y combinator:

ocaml gives:

let t = fun f -> (fun x -> f (fun y -> x x y)) (fun x -> f (fun y -> y x y))

val t :
  (((('b -> ('a -> 'd as 'c) as 'b) -> 'a -> 'd as 'a) -> 'd) -> 'c) -> 'c

Notice ocaml generates "a -> d as c" which is the same as above and looks like it does not contain 'c', apart from there is a 'c' inside 'a'. I get:

\f . (\x . f (\y . x x y)) (\x . f (\y . y x y))

1.  [var f]            f : {f : a} |- a
2.  [var x]            x : {x : a} |- a
3.  [var x]            x : {x : a} |- a
4.  [app (2) (3)]      (x x) : {x : (a -> b), x : a} |- b
5.  [var y]            y : {y : a} |- a
6.  [app (4) (5)]      ((x x) y) : {x : (a -> (b -> c)), x : a, y : b} |- c
7.  [abs y (6)]        (\y . ((x x) y)) : {x : (a -> (b -> c)), x : a} |- (b -> 
c)
8.  [app (1) (7)]      (f (\y . ((x x) y))) : {f : ((a -> b) -> c), x : (d -> (a
 -> b)), x : d} |- c
9.  [abs x (8)]        (\x . (f (\y . ((x x) y)))) : {f : ((a -> b) -> c)} |- ((
d -> (a -> b) as d) -> c)
10. [var f]            f : {f : a} |- a
11. [var y]            y : {y : a} |- a
12. [var x]            x : {x : a} |- a
13. [app (11) (12)]    (y x) : {x : a, y : (a -> b)} |- b
14. [var y]            y : {y : a} |- a
15. [app (13) (14)]    ((y x) y) : {x : a, y : (a -> (b -> c)), y : b} |- c
16. [abs y (15)]       (\y . ((y x) y)) : {x : a} |- ((a -> (b -> c) as b) -> c)
17. [app (10) (16)]    (f (\y . ((y x) y))) : {f : (((b -> (a -> c) as a) -> c) 
-> d), x : b} |- d
18. [abs x (17)]       (\x . (f (\y . ((y x) y)))) : {f : (((b -> (a -> c) as a)
 -> c) -> d)} |- (b -> d)
19. [app (9) (18)]     ((\x . (f (\y . ((x x) y)))) (\x . (f (\y . ((y x) y)))))
 : {f : ((a -> b) -> c), f : ((((d -> (a -> b) as d) -> (e -> f) as e) -> f) -> 
(a -> b))} |- c
20. [abs f (19)]       (\f . ((\x . (f (\y . ((x x) y)))) (\x . (f (\y . ((y x) 
y)))))) : {} |- (((((a -> (b -> d as c) as a) -> (b -> d) as b) -> d) -> c) -> c
)

So even for complex recursions in smaller types I get exactly the same types as ocaml. At some complexity level either I have a bug that does not affect simpler types, or ocaml does. I cannot see anything wrong in the compositional derivations above, and I cannot get anything similar showing the type derivation from ocaml.

A lacking substitution in either the LHS or RHS

This somehow felt like the most likely mistake during the last football match. I am not familiar with rational tree unification, but I am a bit cautious about the types Ocaml derives with recursive types.

I'll look it up.

Union-Find

With union-find, unification works directly on the type graphs without generating substitutions. Nodes in the type graph can be considered to be members of equivalence classes, and unification merges two equivalence classes. Currently the implementation of unification throws an exception when two terms do not unify, but the previous type checker I wrote just went ahead and merged the equivalence classes, and you then perform a validity check on the equivalence classes at the end (a valid equivalence class has all the non-variables having identical types). I am working through the derivation by hand at the moment, and the types I am generating look correct so far.
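In outline the union-find layer is just parent pointers with path compression, roughly like this (a simplified sketch, not the exact code):

struct Node {
    Node* parent = this;          // union-find representative pointer
    Node* dom = nullptr;          // arrow children; both null means "type variable"
    Node* cod = nullptr;
};

Node* find(Node* n) {             // walk to the class representative, compressing the path
    while (n->parent != n) {
        n->parent = n->parent->parent;
        n = n->parent;
    }
    return n;
}

void unite(Node* a, Node* b) {    // merge two equivalence classes
    find(a)->parent = find(b);
}

// The deferred validity check mentioned above would then walk each class and
// confirm that all its non-variable members have identical types.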

Seems reasonable

What I forgot to say: your notation seems reasonable. Silly of me, I should have thought of the outward cycles.

OCaml type looks wrong

I have manually validated the type derivation from my compositional algorithm, and it looks correct to me. Step (48) requires the unification of:

t0 : ((a -> b) -> ((((c -> d) -> e) -> ((c -> d) -> e)) -> (f -> ((g -> h) -> (i -> (j -> k)))))),
t0 : b, 
t0 : (((c -> d) -> e) -> d), 
t0 : f, 
t0 : h, 
t0 : (l -> i), 
t0 : (m -> l), 
t0 : (n -> m), 
t0 : (o -> n),
t0 : o, 
t0 : (p -> (q -> j)), 
t0 : p, 
t0 : (r -> (s -> t)), 
t0 : r, 
t0 : s,
t0 : q

unifies to (note: variables may be renamed from above):
((a -> (((a -> b) -> (a -> b) as e) -> (b -> ((e -> b as f) -> (c -> (f -> d) as d))) as c) as a) -> b) -> c as b

leading to typing this term:
(\t0 . ((((((t0 (\t1 . t0)) (\t1 . ((\t2 . t1) (\t2 . (t1 ((\t3 . (\t4 . t3)) (t
0 t1))))))) t0) (\t1 . t0)) (t0 (t0 (t0 (t0 t0))))) ((t0 ((\t1 . t0) ((t0 t0) t0
))) t0)))

as:
((((a -> (((a -> b) -> (a -> b) as e) -> (b -> ((e -> b as f) -> (c -> (f -> d) as d))) as c) as a) -> b) -> c as b) -> d)

Each of these definitions of 't0' corresponds to one of the occurrences of 't0' in (47). Maybe I am missing something obvious, but I don't see how ocaml gets:

let t = (fun t0 -> ((((((t0 (fun t1 -> t0)) (fun t1 -> ((fun t2 -> t1) (fun t2
-> (t1 ((fun t3 -> (fun t4 -> t3)) (t0 t1))))))) t0) (fun t1 -> t0)) (t0 (t0 (t0
(t0 t0))))) ((t0 ((fun t1 -> t0) ((t0 t0) t0))) t0)));;

val t : ('a -> 'a as 'a) -> 'a

as the type of this term. Does anyone have anything to support ocaml's typing of this term?

Is it possible to trivialize further by an expansion?

If you derive this type

((((a -> (((a -> b) -> (a -> b) as e) -> (b -> ((e -> b as f) -> (c -> (f -> d) as d))) as c) as a) -> b) -> c as b) -> d)

I wonder whether you can trivialize it further by substituting one of the LHSs for a RHS variable, and then trivialize the type again.

Just Notation

The actual type is a rational tree, and the type as written down is just a notation for that tree. So in effect all the 'b's (including the 'as b' ) are the same graph node. So what you have is a graph topology, and that cannot be simplified - unless there are some extra constraints that have not been included?

Informal reasoning about t0=t1, sub(t0, t1) t2, and Eq(t0, t1)

I didn't find enough information about rational trees to make an informed comment on your types. And, honestly, you're at a point of specialization I don't feel much like investigating further since there is no pay off for me anywhere.

So, all my comments from now are uninformed wild guesses. But hypothesizing, there is an informal relation between a type constraint t0=t1, a substitution in a type t2 sub(t0, t1) t2, and an equivalence relation on types Eq(t0, t1).

Without any information, I assume you have an equivalence relation Eq on type trees, where the equivalence on the graph (the graph being a representational implementation issue) can be read as an equality constraint and used as a substitution. But I am not entirely sure about it.

But as I said, you're at a point of specialization I feel uncomfortable about discussing or investigating further.

Shorter Example

Found a shorter example, ocaml gives:

let t = (fun t0 -> (fun t1 -> (fun t2 -> ((((fun t3 -> (t3 (fun t4 -> (fun t5 ->
 t2)))) t0) t0) ((t2 (fun t3 -> t1)) (((((fun t3 -> t3) t2) t1) t1) (t2 (t0 (fun
 t3 -> (t0 t2))))))))));;

val t : ('a -> 'a as 'a) -> 'a -> 'a -> 'a

I get:

(\t0 . (\t1 . (\t2 . ((((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) t0) ((t2 (\t3 . t1))
 (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2))))))))))

1.  [var t3]            t3 : {t3 : a} |- a
2.  [var t2]            t2 : {t2 : a} |- a
3.  [abs t5 (2)]        (\t5 . t2) : {t2 : a} |- (b -> a)
4.  [abs t4 (3)]        (\t4 . (\t5 . t2)) : {t2 : a} |- (b -> (c -> a))
5.  [app (1) (4)]       (t3 (\t4 . (\t5 . t2))) : {t2 : a, t3 : ((b -> (c -> a))
 -> d)} |- d
6.  [abs t3 (5)]        (\t3 . (t3 (\t4 . (\t5 . t2)))) : {t2 : a} |- (((b -> (c
 -> a)) -> d) -> d)
7.  [var t0]            t0 : {t0 : a} |- a
8.  [app (6) (7)]       ((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) : {t0 : ((a -> (b -
> c)) -> d), t2 : c} |- d
9.  [var t0]            t0 : {t0 : a} |- a
10. [app (8) (9)]       (((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) t0) : {t0 : ((a ->
 (b -> c)) -> (d -> e)), t0 : d, t2 : c} |- e
11. [var t2]            t2 : {t2 : a} |- a
12. [var t1]            t1 : {t1 : a} |- a
13. [abs t3 (12)]       (\t3 . t1) : {t1 : a} |- (b -> a)
14. [app (11) (13)]     (t2 (\t3 . t1)) : {t1 : a, t2 : ((b -> a) -> c)} |- c
15. [var t3]            t3 : {t3 : a} |- a
16. [abs t3 (15)]       (\t3 . t3) : {} |- (a -> a)
17. [var t2]            t2 : {t2 : a} |- a
18. [app (16) (17)]     ((\t3 . t3) t2) : {t2 : a} |- a
19. [var t1]            t1 : {t1 : a} |- a
20. [app (18) (19)]     (((\t3 . t3) t2) t1) : {t1 : a, t2 : (a -> b)} |- b
21. [var t1]            t1 : {t1 : a} |- a
22. [app (20) (21)]     ((((\t3 . t3) t2) t1) t1) : {t1 : a, t1 : b, t2 : (a -> 
(b -> c))} |- c
23. [var t2]            t2 : {t2 : a} |- a
24. [var t0]            t0 : {t0 : a} |- a
25. [var t0]            t0 : {t0 : a} |- a
26. [var t2]            t2 : {t2 : a} |- a
27. [app (25) (26)]     (t0 t2) : {t0 : (a -> b), t2 : a} |- b
28. [abs t3 (27)]       (\t3 . (t0 t2)) : {t0 : (a -> b), t2 : a} |- (c -> b)
29. [app (24) (28)]     (t0 (\t3 . (t0 t2))) : {t0 : ((a -> b) -> c), t0 : (d ->
 b), t2 : d} |- c
30. [app (23) (29)]     (t2 (t0 (\t3 . (t0 t2)))) : {t0 : ((a -> b) -> c), t0 : 
(d -> b), t2 : (c -> e), t2 : d} |- e
31. [app (22) (30)]     (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2))))) : 
{t0 : ((a -> b) -> c), t0 : (d -> b), t1 : e, t1 : f, t2 : (e -> (f -> (g -> h))
), t2 : (c -> g), t2 : d} |- h
32. [app (14) (31)]     ((t2 (\t3 . t1)) (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3
 . (t0 t2)))))) : {t0 : ((a -> b) -> c), t0 : (d -> b), t1 : e, t1 : f, t1 : g, 
t2 : ((h -> e) -> (i -> j)), t2 : (f -> (g -> (k -> i))), t2 : (c -> k), t2 : d}
 |- j
33. [app (10) (32)]     ((((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) t0) ((t2 (\t3 . t
1)) (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2))))))) : {t0 : ((a -> (b ->
 c)) -> (d -> (e -> f))), t0 : d, t0 : ((g -> h) -> i), t0 : (j -> h), t1 : k, t
1 : l, t1 : m, t2 : c, t2 : ((n -> k) -> (o -> e)), t2 : (l -> (m -> (p -> o))),
 t2 : (i -> p), t2 : j} |- f
34. [abs t2 (33)]       (\t2 . ((((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) t0) ((t2 (
\t3 . t1)) (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2)))))))) : {t0 : ((c 
-> (d -> ((e -> f) -> (g -> (a -> g as b) as a)))) -> (h -> (b -> i))), t0 : h, 
t0 : ((j -> k) -> (e -> f)), t0 : (((e -> f) -> a) -> k), t1 : f, t1 : (e -> f),
 t1 : g} |- (((e -> f) -> a) -> i)
35. [abs t1 (34)]       (\t1 . (\t2 . ((((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) t0)
 ((t2 (\t3 . t1)) (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2))))))))) : {t
0 : ((d -> (e -> ((f -> a as a) -> (a -> (b -> a as c) as b)))) -> (g -> (c -> h
))), t0 : g, t0 : ((i -> j) -> a), t0 : ((a -> b) -> j)} |- (a -> ((a -> b) -> h
))
36. [abs t0 (35)]       (\t0 . (\t1 . (\t2 . ((((\t3 . (t3 (\t4 . (\t5 . t2)))) 
t0) t0) ((t2 (\t3 . t1)) (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2)))))))
))) : {} |- ((((((a -> b as c) -> a as b) -> a as a) -> (c -> (a -> c))) -> (d -
> (b -> a)) as d) -> (a -> ((a -> c) -> a)))

Looking at the 'annot' file output by ocaml, I can see that t0, t1, t2, t3, t4 and t5 have all been given the type 'a by ocaml, which doesn't make any sense, as we can clearly see the term (t0 t2), which would mean t0 should at least have a type of the form a -> b.

The trivial type a=a->a

Well, that's supported by the trivial type, which you can informally read as "any value is also a function", and I find it reasonable to expect more complex terms to have the trivial type.

I still think you should try to expand some of the equalities by hand and see whether that gives you a set of equations such that more, or most, variables unify.

(I am still watching soccer and now experimenting with how to make a Caipirinha.)

(Shouldn't the Ocaml type trivialize further?)

No type variables left.

So what I have noticed is that there are no type variables left in the type produced by my algorithm, and so a, b, c and d are points in a topology made only of arrows. So I suspect I am just missing some simplification steps in the unification algorithm, required for cyclic types. The easy option would be to turn the occurs check back on, as I only disabled it to experiment with these randomly generated lambda terms. I want to understand the problem a bit better first though, and see what the fix looks like.
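For reference, the occurs check that was switched off is only a small graph walk; roughly (a sketch only, with a visited set because the graphs here may already be cyclic):

#include <set>

struct Node { Node *dom = nullptr, *cod = nullptr; };   // null children = type variable

// Does the variable node v occur anywhere inside the type t?
bool occurs(Node* v, Node* t, std::set<Node*>& seen) {
    if (t == v) return true;
    if (!t || !t->dom || !seen.insert(t).second) return false;   // variable, or already visited
    return occurs(v, t->dom, seen) || occurs(v, t->cod, seen);
}

With an occurs check enabled, a unifier refuses to bind a variable to a type that contains it, so none of these cyclic types would be produced in the first place.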

Not sure

No idea really. The ocaml type looks plain wrong, since if a = a -> a then typing the term with "a" should be enough; it looks trivially minimizable, unless I understood everything wrong.

I am not sure. You also trivially have that your end result type equals itself, and I thought that by that equality and an expansion on one side you could possibly unify more variables and end up with the trivial type too. It depends on how robust your algorithm is, and whether it maintains some constraint or invariant such that extra unifications aren't generated by expanding over an equality between two "identical" types.

I didn't give it much thought, but I randomly expanded "a" in your type in a text editor as a test, and that didn't give anything extra.

No idea. Ocaml looks plain wrong and I don't understand your typing algorithm. Maybe you should ask (email) someone.

Uh, topology?

More shifting goalposts? What is a topology? The last time I saw people use that nomenclature in relation to types it went rather wrong. You have syntax here, let's not confuse stuff.

Homotopy Type Theory

Well, I'm not talking about Homotopy Type Theory, although that looks interesting. I am just talking about the structure of the regular tree.

Type Graph

I have used graphviz to visualise the type graph of the following lambda expression:

(\t0 . (\t1 . (\t2 . ((((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) t0) ((t2 (\t3 . t1)) (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2))))))))))

Here is the dot description:

digraph type {
	arrow0;
	arrow1;
	arrow2;
	arrow3;
	arrow4;
	arrow5;
	arrow5 -> arrow3 [color=red];
	arrow5 -> arrow4 [color=green];
	arrow4 -> arrow5 [color=red];
	arrow4 -> arrow3 [color=green];
	arrow3 -> arrow4 [color=red];
	arrow3 -> arrow3 [color=green];
	arrow6;
	arrow7;
	arrow7 -> arrow3 [color=red];
	arrow7 -> arrow5 [color=green];
	arrow6 -> arrow5 [color=red];
	arrow6 -> arrow7 [color=green];
	arrow2 -> arrow3 [color=red];
	arrow2 -> arrow6 [color=green];
	arrow8;
	arrow9;
	arrow9 -> arrow4 [color=red];
	arrow9 -> arrow3 [color=green];
	arrow8 -> arrow1 [color=red];
	arrow8 -> arrow9 [color=green];
	arrow1 -> arrow2 [color=red];
	arrow1 -> arrow8 [color=green];
	arrow10;
	arrow11;
	arrow11 -> arrow7 [color=red];
	arrow11 -> arrow3 [color=green];
	arrow10 -> arrow3 [color=red];
	arrow10 -> arrow11 [color=green];
	arrow0 -> arrow1 [color=red];
	arrow0 -> arrow10 [color=green];
}

You can convert to a pdf with "dot -Tpdf", to view.
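Producing such a listing from the internal graph is only a small traversal; roughly (a simplified sketch, not the exact code, using green for the domain edge and red for the co-domain edge as in the graphs here):

#include <iostream>
#include <map>
#include <vector>

struct Node { Node *dom = nullptr, *cod = nullptr; };

void emitDot(Node* root) {
    std::map<Node*, int> id;
    std::vector<Node*> order;
    std::vector<Node*> work{root};
    while (!work.empty()) {                          // number every reachable node once
        Node* n = work.back(); work.pop_back();
        if (!n || id.count(n)) continue;
        id[n] = (int)order.size();
        order.push_back(n);
        work.push_back(n->dom);
        work.push_back(n->cod);
    }
    std::cout << "digraph type {\n";
    for (Node* n : order) {
        std::cout << "\tarrow" << id[n] << ";\n";
        if (n->dom) std::cout << "\tarrow" << id[n] << " -> arrow" << id[n->dom] << " [color=green];\n";
        if (n->cod) std::cout << "\tarrow" << id[n] << " -> arrow" << id[n->cod] << " [color=red];\n";
    }
    std::cout << "}\n";
}

int main() {
    Node a; a.dom = &a; a.cod = &a;                  // one self-referential arrow node
    emitDot(&a);                                     // a single vertex with two self-loops
}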

What puzzles me is that this graph does not look reducible to me. I have read that lambda terms should be reducible to "D = D -> D", but that is not what is happening here. Is that work ignoring outer cycles?

Looks so

I would expect that, as random terms grow larger, the unifications in an HM-like system without an occurs check will at some point "overconstrain" the type, and you end up with the trivial type.

The ocaml type seemed to support that position, and it was therefore likely, but not a given, that your system has a bug, as in lacking a number of unifications.

But it is not a given. Maybe your type is right and the ocaml type is wrong. Maybe they are just different systems.

Solved it.

I have solved it, or at least I know how to modify the algorithm to produce the same results as ocaml. Here is the new derivation for the above term:

1.  [var t3]            t3 : {t3 : a} |- a
2.  [var t2]            t2 : {t2 : a} |- a
3.  [abs t5 (2)]        (\t5 . t2) : {t2 : a} |- (b -> a)
4.  [abs t4 (3)]        (\t4 . (\t5 . t2)) : {t2 : a} |- (b -> (c -> a))
5.  [app (1) (4)]       (t3 (\t4 . (\t5 . t2))) : {t2 : a, t3 : ((b -> (c -> a)) -> d)} |- d
6.  [abs t3 (5)]        (\t3 . (t3 (\t4 . (\t5 . t2)))) : {t2 : a} |- (((b -> (c -> a)) -> d) -> d)
7.  [var t0]            t0 : {t0 : a} |- a
8.  [app (6) (7)]       ((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) : {t0 : ((a -> (b -> c)) -> d), t2 : c} |- d
9.  [var t0]            t0 : {t0 : a} |- a
10. [app (8) (9)]       (((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) t0) : {t0 : ((a -> (b -> c)) -> (d -> e)), t0 : d, t2 : c} |- e
11. [var t2]            t2 : {t2 : a} |- a
12. [var t1]            t1 : {t1 : a} |- a
13. [abs t3 (12)]       (\t3 . t1) : {t1 : a} |- (b -> a)
14. [app (11) (13)]     (t2 (\t3 . t1)) : {t1 : a, t2 : ((b -> a) -> c)} |- c
15. [var t3]            t3 : {t3 : a} |- a
16. [abs t3 (15)]       (\t3 . t3) : {} |- (a -> a)
17. [var t2]            t2 : {t2 : a} |- a
18. [app (16) (17)]     ((\t3 . t3) t2) : {t2 : a} |- a
19. [var t1]            t1 : {t1 : a} |- a
20. [app (18) (19)]     (((\t3 . t3) t2) t1) : {t1 : a, t2 : (a -> b)} |- b
21. [var t1]            t1 : {t1 : a} |- a
22. [app (20) (21)]     ((((\t3 . t3) t2) t1) t1) : {t1 : a, t1 : b, t2 : (a -> (b -> c))} |- c
23. [var t2]            t2 : {t2 : a} |- a
24. [var t0]            t0 : {t0 : a} |- a
25. [var t0]            t0 : {t0 : a} |- a
26. [var t2]            t2 : {t2 : a} |- a
27. [app (25) (26)]     (t0 t2) : {t0 : (a -> b), t2 : a} |- b
28. [abs t3 (27)]       (\t3 . (t0 t2)) : {t0 : (a -> b), t2 : a} |- (c -> b)
29. [app (24) (28)]     (t0 (\t3 . (t0 t2))) : {t0 : ((a -> b) -> c), t0 : (d -> b), t2 : d} |- c
30. [app (23) (29)]     (t2 (t0 (\t3 . (t0 t2)))) : {t0 : ((a -> b) -> c), t0 : (d -> b), t2 : (c -> e), t2 : d} |- e
31. [app (22) (30)]     (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2))))) : {t0 : ((a -> b) -> c), t0 : (d -> b), t1 : e, t1 : f, t2 : (e -> (f -> (g -> h))), t2 : (c -> g), t2 : d} |- h
32. [app (14) (31)]     ((t2 (\t3 . t1)) (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2)))))) : {t0 : ((a -> b) -> c), t0 : (d -> b), t1 : e, t1 : f, t1 : g, t2 : ((h -> e) -> (i -> j)), t2 : (f -> (g -> (k -> i))), t2 : (c -> k), t2 : d} |- j
33. [app (10) (32)]     ((((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) t0) ((t2 (\t3 . t1)) (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2))))))) : {t0 : ((a -> (b -> c)) -> (d -> (e -> f))), t0 : d, t0 : ((g -> h) -> i), t0 : (j -> h), t1 : k, t1 : l, t1 : m, t2 : c, t2 : ((n -> k) -> (o -> e)), t2 : (l -> (m -> (p -> o))), t2 : (i -> p), t2 : j} |- f
34. [abs t2 (33)]       (\t2 . ((((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) t0) ((t2 (\t3 . t1)) (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2)))))))) : {t0 : ((c -> (d -> ((e -> f) -> (g -> ('a -> g as 'b) as 'a)))) -> (h -> ('b -> i))), t0 : h, t0 : ((j -> k) -> (e -> f)), t0 : (((e -> f) -> 'a) -> k), t1 : f, t1 : (e -> f), t1 : g} |- (((e -> f) -> 'a) -> i)
35. [abs t1 (34)]       (\t1 . (\t2 . ((((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) t0) ((t2 (\t3 . t1)) (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2))))))))) : {t0 : ((d -> (e -> ((f -> 'a as 'a) -> ('a -> ('b -> 'a as 'c) as 'b)))) -> (g -> ('c -> h))), t0 : g, t0 : ((i -> j) -> 'a), t0 : (('a -> 'b) -> j)} |- ('a -> (('a -> 'b) -> h))
36. [abs t0 (35)]       (\t0 . (\t1 . (\t2 . ((((\t3 . (t3 (\t4 . (\t5 . t2)))) t0) t0) ((t2 (\t3 . t1)) (((((\t3 . t3) t2) t1) t1) (t2 (t0 (\t3 . (t0 t2)))))))))) : {} |- (('a -> 'a as 'a) -> ('a -> ('a -> 'a)))

I think this can be classed as a bug. It was a small change to unification that makes no difference in the non-cyclic case. What I was doing was unifying the sub-terms of an application, but leaving the arrows distinct, and terminating unification by bisimulation (i.e. it finds a cycle when it tries to unify the same pair of nodes for a second time). What I am doing now is unifying the arrow itself (i.e. choosing a representative node for the arrow's equivalence class). I got this right when I wrote a previous type inference algorithm in Haskell that used graphs, equivalence sets and constraints, a bit like this http://dspace.library.uu.nl/handle/1874/23951. When writing this algorithm I had assumed that it made no difference, and of course with the occurs check enabled it didn't. Re-reading that paper reminded me of the constraint-based graph solver I wrote, and it became immediately clear that I needed a representative node for every equivalence class, including arrows.

So mystery solved, and I can also improve the performance of the cycle detection in unification, as now we only need to check for reaching the same node again (whichever node we chose as the representative node).
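
To make the change concrete, here is a minimal sketch of what I mean (the names and representation are illustrative only, not the actual code in my repository): the two arrow nodes are merged into one equivalence class before their sub-terms are visited, so going round a cycle just rediscovers a shared representative and stops.

struct Node {
    enum Kind { Var, Arrow } kind;
    Node* link = nullptr;   // union-find link; nullptr means representative
    Node* dom = nullptr;    // arrow domain (green / starboard)
    Node* cod = nullptr;    // arrow co-domain (red / port)
    explicit Node(Kind k) : kind(k) {}
};

// Follow links to the representative of the equivalence class.
Node* find(Node* n) {
    while (n->link) n = n->link;
    return n;
}

// Unify two possibly cyclic type graphs; returns false on a clash.
// No occurs check: cycles terminate because the arrows are merged into one
// class *before* their sub-terms are visited, so a cycle finds that both
// sides already share a representative.
bool unify(Node* a, Node* b) {
    a = find(a);
    b = find(b);
    if (a == b) return true;                         // already the same class
    if (a->kind == Node::Var) { a->link = b; return true; }
    if (b->kind == Node::Var) { b->link = a; return true; }
    b->link = a;                                     // merge the arrow classes
    return unify(a->dom, b->dom) && unify(a->cod, b->cod);
}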

Here's an updated type graph:

digraph type {
	arrow0;
	arrow1;
	arrow1 -> arrow1 [color=green];
	arrow1 -> arrow1 [color=red];
	arrow2;
	arrow3;
	arrow3 -> arrow1 [color=green];
	arrow3 -> arrow1 [color=red];
	arrow2 -> arrow1 [color=green];
	arrow2 -> arrow3 [color=red];
	arrow0 -> arrow1 [color=green];
	arrow0 -> arrow2 [color=red];
}

red = co-domain (port), green = domain (starboard).

Look at it after the match

I look at it after the current match.

But I doubt you do a bisimulation proof (like an Algol/C/Pascal-like type checker does). I haven't seen it in academic studies, but I am a sloppy reader.

The way I have seen it work in a small number of type checkers for imperative languages is that they keep a set of type equivalences defined over type terms, which are often entries in the symbol table. If you have an equality from a type definition, e.g. Eq(t0, t1 -> (t2, t0)), it is added as a fact, sometimes pedantically called a postulate, to the type equality table. If, later during type checking, you try to prove (by inserting) the same equality and it is found in the Eq table, it is trivially discharged as a correct type.

This is a bisimulation proof: a fact is true since it was postulated previously. Most academic type checkers don't work that way, so I doubt yours gives bisimulation proofs.
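
Roughly, in code, the technique I am describing looks like this (a sketch, not taken from any particular compiler; the names are illustrative): each goal Eq(a, b) is postulated into the table before descending, so meeting the same goal again discharges it.

#include <set>
#include <utility>

struct Ty {
    enum Kind { Int, Arrow, Named } kind;
    const Ty* dom = nullptr;   // Arrow domain
    const Ty* cod = nullptr;   // Arrow co-domain
    const Ty* def = nullptr;   // Named: its definition from the symbol table
};

using EqTable = std::set<std::pair<const Ty*, const Ty*>>;

bool equal(const Ty* a, const Ty* b, EqTable& eq) {
    if (a == b) return true;
    if (eq.count({a, b})) return true;   // previously postulated: discharged
    eq.insert({a, b});                   // postulate the goal as a fact
    if (a->kind == Ty::Named) return equal(a->def, b, eq);
    if (b->kind == Ty::Named) return equal(a, b->def, eq);
    if (a->kind != b->kind) return false;
    if (a->kind == Ty::Arrow)
        return equal(a->dom, b->dom, eq) && equal(a->cod, b->cod, eq);
    return true;                         // Int equals Int
}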

What it did and what it will do.

What it did, but probably will not do any more, is keep a set of products of node addresses, so unifying nodes X and Y would enter (X, Y) in the set. It iterates through the two type trees in parallel, and obviously stops if a unification fails. It also stops if it encounters a pair to be unified that is already in the visited product set.

This seems to be bisimulation, as we have two state machines that descend each type in parallel, and continue if the states of each are the same (the nodes unify or are in the visited product set).
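
As a rough sketch of that old termination test (reconstructed from the description above with illustrative names, not the repository code): the arrow nodes are matched but never merged, and the visited set of node pairs stops the parallel descent when the same pair comes around again.

#include <set>
#include <utility>

struct TNode {
    TNode* link = nullptr;   // binding for a type variable, else nullptr
    TNode* dom = nullptr;    // non-null iff this is an arrow node
    TNode* cod = nullptr;
};

TNode* deref(TNode* n) { while (n->link) n = n->link; return n; }

bool unifyPairs(TNode* x, TNode* y, std::set<std::pair<TNode*, TNode*>>& visited) {
    x = deref(x);
    y = deref(y);
    if (x == y) return true;
    if (!visited.insert({x, y}).second) return true;   // same pair again: cycle
    if (!x->dom) { x->link = y; return true; }          // bind variable x
    if (!y->dom) { y->link = x; return true; }          // bind variable y
    // Both arrows: descend into the sub-terms but leave the two arrow nodes
    // distinct, which is exactly the step the fix above replaces by merging.
    return unifyPairs(x->dom, y->dom, visited)
        && unifyPairs(x->cod, y->cod, visited);
}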

The 'facts' are put into a disjoint set, so type unification is the 'unify' operation on the types in the disjoint set, and a fact is discharged by the 'find' operation on the disjoint set. So it sounds a lot like your description.

In any case, whether this is bisimulation or not is not important. What is important is that unification termination is easier to detect now, because we unify X and Y for arrow types, so instead of having to put the product (X, Y) in the visited set we only have to put X or Y (whichever is chosen as the canonical representative of that equivalence class).

Edit: In fact, I don't think it needs the 'done' set at all. As we record the fact that X := Y or Y := X in the fact database, I think simply checking deref(X) != deref(Y) is enough to catch any cycle.

I stand corrected.

I stand corrected.

It's on GitHub

The source, including the latest fixes, is on GitHub if you want to see for yourself (don't take my word for it): https://github.com/keean/Compositional-Typing-Inference. It's all implemented using the visitor pattern, which I like because it keeps each algorithm separate, with its own state in a visitor sub-class. The only thing not implemented as a visitor is the parser, which uses my parser combinator library.

Timing Type Inference

So, returning to the timing question, my implementation takes 0.8s to infer the type of the long randomly generated lambda term above, compared to OCaml, which takes 0.3s. Of course I am doing compositional typing, which has typings for open terms, but it's not really optimised at the moment. I will do some more tests to see if this is a linear 3x speed deficit; if it is, then it's significant, but not really a serious problem.

One of the main reasons my algorithm is slower is that it uses fresh instantiations of typings at every stage, so that the derivation can be saved in the AST. This is because we don't want the unifications to affect the types further down the tree when combining subtrees compositionally. If I use something like de Bruijn numbers for the type variables in typings, I won't have to instantiate with fresh variables, which will save a lot of type-graph node mallocs.
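
As a rough illustration of the idea (an assumed representation, not the current code): if the type variables inside a typing are numbered locally, 0..nvars-1, then combining two typings only needs an index offset on one side, rather than instantiating every variable with a freshly allocated graph node, and the typing stored in the AST is never disturbed.

#include <cstdint>
#include <vector>

struct TypeNode {
    enum Kind { Var, Arrow } kind;
    uint32_t var = 0;          // local variable index when kind == Var
    int dom = -1, cod = -1;    // indices into the typing's node table when Arrow
};

struct Typing {
    std::vector<TypeNode> nodes;  // flat node table owned by this typing
    uint32_t nvars = 0;           // number of distinct local type variables
};

// Append rhs into lhs, shifting rhs's variable and node indices; the rhs
// typing saved in the AST is left untouched, and no per-variable fresh
// allocation is needed before unifying within the combined typing.
Typing combine(Typing lhs, const Typing& rhs) {
    const uint32_t voff = lhs.nvars;
    const int noff = static_cast<int>(lhs.nodes.size());
    for (TypeNode t : rhs.nodes) {
        if (t.kind == TypeNode::Var) t.var += voff;
        else { t.dom += noff; t.cod += noff; }
        lhs.nodes.push_back(t);
    }
    lhs.nvars += rhs.nvars;
    return lhs;
}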

Probably fast enough

OCaml abandoned HM-style inference ages ago in favour of a more performant algorithm I don't know much about. But if it works, it is probably fast enough. A factor of two on "ridiculously" large terms isn't bad.

I scanned your source code but the "semantic gap" between your tree unification algorithm and the source code is at the moment too big for me to make much sense out of it. And I don't know how well it will generalize to a specific algorithm for a more elaborate type system.

Generalising HM

Re-reading this paper http://webdoc.sub.gwdg.de/ebook/serien/ah/UU-CS/2002-031.pdf was very helpful in understanding the bug in unification for my type inference algorithm.

You can see the syntax-directed rules are very similar, and the 'monomorphic context' very closely resembles the (bottom-up) typing constraints.

The one interesting difference is that I don't need to annotate the let 'generalisation' constraint with the set of monomorphic (free) variables: the order in which the constraints are discharged is determined by the bottom-up composition, and deferring the use of any monomorphic constraints relating to free variables in the let-generalised term automatically serves the purpose of the monomorphic set on the generalisation constraint.

So the type system and proof of soundness in this paper apply to closed terms under my algorithm, provided you accept that the restrictions on let generalisation are achieved structurally.

What you get that the above doesn't have is principal typings, typings for open terms, and compositional inference that allows complete type derivations to be written down. It was hard to understand what was going on in OCaml's type inference when you simply could not type many of the intermediate steps (the open terms).

An open question: CHRs seem to make a good basis for understanding any type inference algorithm with quantified types, and this is clearly just at the boundary of intersection types (allowing multiple constraints on the same type variable), so how could CHRs be extended to provide a similar framework for intersection type systems? That would seem to need some combination of CHRs and expansion variables; has any work been done on this?

I roughly implemented the same algorithm

I roughly implemented the same algorithm and you can see the (sometimes annotated) sources here.

But the compiler was too slow. I never got to debugging the typing algorithm; I deemed the bottom-up construction of type constraints too slow (at least in the manner I did it), and I wanted a system with more elaboration for interfaces and user-defined cyclic types.

I decided the route to a performant compiler would be too long, as the language lacks the primitives to push it in a direction with bearable speed, and I stopped working on it.

I'll look at your algorithm again.

I am thinking of a C-based LLVM or GCC front end to it

That's what I would like to do. Lots of work though.