Djinn, a theorem prover in Haskell, for Haskell.

Lennart Augustsson announced Djinn on the Haskell mailing list recently.

He included this demonstration:

I've written a small program that takes a (Haskell) type
and gives you back a function of that type if one exists.
It's kind of fun, so I thought I'd share it.

It's probably best explained with a sample session.

   calvin% djinn
   Welcome to Djinn version 2005-12-11.
   Type :h to get help.
# Djinn is interactive if not given any arguments.
# Let's see if it can find the identity function.
   Djinn> f ? a->a
   f :: a -> a
   f x1 = x1
# Yes, that was easy.  Let's try some tuple munging.
   Djinn> sel ? ((a,b),(c,d)) -> (b,c)
   sel :: ((a, b), (c, d)) -> (b, c)
   sel ((_, v5), (v6, _)) = (v5, v6)

# We can ask for the impossible, but then we get what we
# deserve.
   Djinn> cast ? a->b
   -- cast cannot be realized.

# OK, let's be bold and try some functions that are tricky to write:
# return, bind, and callCC in the continuation monad
   Djinn> type C a = (a -> r) -> r
   Djinn> returnC ? a -> C a
   returnC :: a -> C a
   returnC x1 x2 = x2 x1

   Djinn> bindC ? C a -> (a -> C b) -> C b
   bindC :: C a -> (a -> C b) -> C b
   bindC x1 x2 x3 = x1 (\ c15 -> x2 c15 (\ c17 -> x3 c17))

   Djinn> callCC ? ((a -> C b) -> C a) -> C a
   callCC :: ((a -> C b) -> C a) -> C a
   callCC x1 x2 = x1 (\ c15 _ -> x2 c15) (\ c11 -> x2 c11)
# Well, poor Djinn has a sweaty brow after deducing the code
# for callCC so we had better quit.
   Djinn> :q
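The terms Djinn derives in the session above coincide (up to eta-reduction) with the standard continuation-monad operations. Here is a sketch with the session's `type C a = (a -> r) -> r` rewritten with an explicit result type `r` so it can be tested at a concrete instantiation; `safeDiv` is my own usage example, not from the session.

```haskell
type C r a = (a -> r) -> r

returnC :: a -> C r a
returnC x k = k x                 -- Djinn's  returnC x1 x2 = x2 x1

bindC :: C r a -> (a -> C r b) -> C r b
bindC m f k = m (\a -> f a k)     -- run m, feed its result on to f

callCC :: ((a -> C r b) -> C r a) -> C r a
callCC f k = f (\a _ -> k a) k    -- the escape continuation discards its
                                  -- own continuation and jumps back to k

-- Early exit with callCC:
safeDiv :: Int -> Int -> C r (Maybe Int)
safeDiv x y = callCC (\exit ->
  if y == 0 then exit Nothing else returnC (Just (x `div` y)))
```

Running a computation means supplying a final continuation, e.g. `safeDiv 10 2 id`.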

To play with Djinn, do a darcs get, or get the tarball. Then just type make. (You need a Haskell 98 implementation and some libraries.) Then start djinn.

For the curious, Djinn uses a decision procedure for intuitionistic
propositional calculus due to Roy Dyckhoff. It's a variation of
Gentzen's LJ system. This means that (in theory) Djinn will always
find a function if one exists, and if one doesn't exist Djinn will
terminate telling you so.
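Dyckhoff's calculus (often called LJT or G4ip) can be sketched in a few dozen lines. The toy below is illustrative, not Djinn's actual code: it only decides provability of propositional formulas and does not extract a program from the proof. The characteristic move is splitting the left-implication rule by the shape of the implication's antecedent; since every rule replaces a formula with strictly smaller ones, no contraction or loop-checking is needed and the search always terminates.

```haskell
data Prop = Var String
          | Prop :-> Prop
          | Prop :/\ Prop
          | Prop :\/ Prop
          | Falsum
          deriving (Eq, Show)
infixr 5 :->

-- provable ctx g decides the sequent  ctx |- g.
provable :: [Prop] -> Prop -> Bool
provable ctx (a :-> b) = provable (a : ctx) b               -- right ->
provable ctx (a :/\ b) = provable ctx a && provable ctx b   -- right /\
provable ctx g =
     Falsum `elem` ctx                                      -- ex falso
  || g `elem` ctx                                           -- axiom
  || (case g of a :\/ b -> provable ctx a || provable ctx b -- right \/
                _       -> False)
  || any (uncurry left) (picks ctx)
  where
    -- every way of selecting one hypothesis out of the context
    picks xs = [ (xs !! i, take i xs ++ drop (i + 1) xs)
               | i <- [0 .. length xs - 1] ]
    -- left rules; the four implication cases are Dyckhoff's split
    left h rest = case h of
      a :/\ b         -> provable (a : b : rest) g
      a :\/ b         -> provable (a : rest) g && provable (b : rest) g
      Var p :-> b     -> Var p `elem` rest && provable (b : rest) g
      (c :-> d) :-> b -> provable ((d :-> b) : rest) (c :-> d)
                         && provable (b : rest) g
      (c :/\ d) :-> b -> provable ((c :-> (d :-> b)) : rest) g
      (c :\/ d) :-> b -> provable ((c :-> b) : (d :-> b) : rest) g
      Falsum :-> _    -> provable rest g
      _               -> False
```

For instance, `provable [] (Var "a" :-> Var "a")` succeeds while Peirce's law, which is classically but not intuitionistically valid, is rejected.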

This is the very first version, so expect bugs. :)

Don Stewart wrote a lambdabot plugin for Djinn a few hours later.

15:39:01  @djinn a -> b -> a
15:39:02  x :: a -> b -> a
15:39:02  x x1 _ = x1
15:39:11  @djinn (a -> b -> c) -> ((a,b) -> c)
15:39:11  x :: (a -> b -> c) -> (a, b) -> c
15:39:11  x x1 (v3, v4) = x1 v3 v4
15:39:27  @djinn (a -> b) -> (c -> b) -> Either a c -> b
15:39:27  x :: (a -> b) -> (c -> b) -> Either a c -> b
15:39:27  x x1 x2 x3 = case x3 of
15:39:27       Left l4 -> x1 l4
15:39:27       Right r5 -> x2 r5
15:40:06  @djinn a -> [a] -> [a]
15:40:07  x :: a -> [a] -> [a]
15:40:07  x _ x2 = x2
15:40:08  @help djinn
15:40:09   @djinn 
15:40:09  Generates Haskell code from a type.

Djinn has proven to be much fun on #haskell.


Proof programming?

I've vaguely heard of proof programming, as typified by Efficient Proof Programming, but I don't know if anyone uses such a thing, or whether it is useful.

Specifically, I'd like to produce a proof of an algorithm with a proof assistant, and then generate source code from that. So far I've only found NuPRL; is there anything else?

I do think this approach could be useful, if generating a proof doesn't take too long, and if the generated programs are relatively efficient.

Coq and a bit of a coincidence

There is some support to extract programs from Coq proofs. Also, the title of one of the papers I mentioned in another thread is "Extracting Evaluators from Proofs of Weak Head Normalization".

Very cool!

Very cool!

takes off his ignorant-fanboi-hat

But seriously, folks. I'd love to know how Djinn works, but I'm a newcomer to PL Theory and still far from getting to grips with all this type theory stuff.

So is there anybody out there who thinks they can explain Djinn to me? (Lots of hand-waving, please! I still need trainer-wheels.)

Many thanks.

Waving hands

A type in Haskell can be seen as a proposition in logic (this is called the Curry-Howard isomorphism). What Djinn does is prove or disprove the proposition. If it manages to prove the proposition, it extracts a program from the proof.
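The correspondence can be seen directly in Haskell. These examples are mine, not from Djinn: each connective of propositional logic matches a type former, and writing a program of a type is exactly writing a proof of the proposition.

```haskell
-- Curry-Howard in miniature:
--   a -> b       corresponds to  "a implies b"
--   (a, b)       to              "a and b"
--   Either a b   to              "a or b"

-- A proof of "a and b implies a":
projL :: (a, b) -> a
projL (x, _) = x

-- Modus ponens: from "a implies b" and "a", conclude "b":
modusPonens :: (a -> b) -> a -> b
modusPonens f x = f x
```

Conversely, "a implies b" for unrelated a and b has no proof, which is exactly why the session above reports that `cast ? a -> b` cannot be realized.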

Thank you for your time, augu

Thank you for your time, augustss. I think I understand you. Is the following a reasonable way to paraphrase your explanation?

Djinn passes the type to a Hindley-Milner--style type-unifier which, by Curry-Howard, is also a proposition tester. The tester returns a proof of the proposition (if such a proof exists). Djinn then 'maps' this proof into an expression.


P.S. Why did you call it 'Djinn'?

Not quite

There's no Hindley-Milner involved.
The proposition corresponding to the type is passed to a theorem prover, which uses a terminating variant of Gentzen's LJ calculus.

IDE Integration?

Could this be used interactively in an IDE session to create boilerplate for referenced-yet-unspecified functions?

But it does not build!

It takes more than a simple make.

Does anyone know what to do and where to get the necessary libraries?

I'd recommend emailing haskel

I'd recommend emailing haskell-cafe or dropping by the (now 200-strong) #haskell channel.

Missing libraries

Perhaps you were unlucky and caught Djinn in an incomplete state? (I forgot a couple of files.) Try a 'darcs pull'.

Could this be adapted to find isomorphisms...

...rather than just any function of the given type? For example this paper constructs an isomorphism (in a sense defined in the paper) between trees and 7-tuples of trees (not a trivial task). Could Djinn be adapted so that it too could automatically construct this isomorphism?

It looks deep. I doubt it co

It looks deep.
I doubt it could be automated easily. Besides, Djinn doesn't do recursive types.

The main result isn't that deep

It's basically about proving that if X=X^2+1 (ie. if a Tree is either a leaf or a pair of trees) then X^7=X using some simple axioms about types (in particular no subtraction or division). It's hard work to find the proof manually so it'd be nice to automate it. I'm going to read your code and think about it.
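The setting can be written down in Haskell directly. This sketch only shows the base equation, not the seven-tuple result: the type equation X = X² + 1 says a tree is either a pair of trees or a leaf, and roll/unroll below witness that one-step isomorphism. The X⁷ = X isomorphism of the paper is built by composing rewriting steps of this kind, and finding that composition (which Djinn, lacking recursive types, cannot do) is the hard part.

```haskell
data Tree = Leaf | Node Tree Tree deriving (Eq, Show)

-- X -> X^2 + 1
unroll :: Tree -> Either (Tree, Tree) ()
unroll (Node l r) = Left (l, r)
unroll Leaf       = Right ()

-- X^2 + 1 -> X
roll :: Either (Tree, Tree) () -> Tree
roll (Left (l, r)) = Node l r
roll (Right ())    = Leaf
```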

Can it require that all arguments are used?

I'm not familiar with the properties of the LJ calculus that guarantee it will terminate, so maybe someone who is can answer this question: It seems like Djinn tends to ignore its arguments, if it can. If the algorithm were changed to require the use of all inputs, would you have to sacrifice termination?

For example:

@djinn a -> [a] -> [a]
15:40:07  x :: a -> [a] -> [a]
15:40:07  x _ x2 = x2

This is an uninteresting solution. If you required that the solution make use of x1, would the tool still be guaranteed to terminate?


Djinn actually tries to pick a solution which uses as many variables as possible.
Your problem is that Djinn has no knowledge about lists, so it can't cons.

In general, I think that if you require it to use as many variables as possible it would not terminate, because some questions have an infinite number of solutions.
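The non-termination point, concretely: the type a -> [a] -> [a] has a distinct inhabitant for every number of times the first argument is used, so a search required to maximize variable use has no best answer to stop at. (These example definitions are mine.)

```haskell
f0, f1, f2 :: a -> [a] -> [a]
f0 _ xs = xs            -- Djinn's answer: the element is ignored
f1 x xs = x : xs        -- uses it once (this is (:), the cons the poster wanted)
f2 x xs = x : x : xs    -- uses it twice; and so on, without end
```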

Linear Types

One way to force Djinn into using certain arguments would be to extend the type system to allow for linear types, corresponding to linear logic. But I don't know anything about proof search in linear logic. Though I guess it's going to be a bit more complicated.

Relevant Types

If one just wants to ensure that arguments are used (as opposed to used exactly once), then some type system corresponding to Relevant Logic could be more relevant.

(Relevant and Linear logic are siblings in the family of Substructural logics)

I think even more than that w

I think even more than that would be needed. One would need a typeful way of guaranteeing that every argument can influence the resulting value along some execution path. So instead of substructural types, you'd need some sort of information flow typing.

Multiple Programs

Is there a way to ask Djinn to, say, generate 10 programs (or just to do so endlessly) with a given type? I wonder how long it would take to "discover" factorial ;~}

Systematic search for lambda expressions

Maybe you are interested in the work by Susumu Katayama previously discussed, though its basis does not seem to be proof-theoretic. The restrictions on types may be different too.

Oleg's in-Haskell version

Oleg reminded us that he had built a version of a term-finder that worked in-place.


For those who have looked into Category Theory, you've definitely come across the Yoneda Lemma, a crucial but simple result. One way to arrive at it and its proof is to view Hom as (→) and naturality as parametricity: the "free theorems" are the naturality squares. From this perspective the goal is to show a natural isomorphism (for the dual problem; otherwise we need to define a ContravariantFunctor typeclass with cfmap :: (b → a) → f a → f b, note the reversal)

   f    :: Functor f => ((b → a) → f a) → f b
   f^-1 :: Functor f => f b → (b → a) → f a

that satisfy f . f^-1 = id and f^-1 . f = id. (*)

I believe that the implementation of these functions is completely determined by the types, and if Djinn understood type classes it could automatically create the implementations. If not, it should still be able to handle f. Finding the implementation and proving that it is an isomorphism is a pretty simple exercise. It is then easy to show that you are only using things available and true for any category. If you want to enforce this a bit more, you can replace (b → a) with Arrow arr => arr b a, though that's still not quite a general "arrow". (You'll need the fact, not enforced by Haskell, that a functor f satisfies f id = id and f (g . h) = f g . f h.)
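A sketch of the pair described above. The names fwd/bwd are mine, and the quantifier on a is made explicit (RankNTypes), which the comment's signatures leave implicit; parametricity is what makes the two mutually inverse, so a finite test can only spot-check that fact.

```haskell
{-# LANGUAGE RankNTypes #-}

fwd :: Functor f => (forall a. (b -> a) -> f a) -> f b
fwd g = g id             -- instantiate the natural transformation at b

bwd :: Functor f => f b -> (b -> a) -> f a
bwd fb k = fmap k fb     -- mapping is the only thing a Functor permits
```

The implementation really is forced by the types: given only a Functor constraint, `g id` and `fmap k fb` are essentially the only well-typed candidates.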

* <sup> not allowed?