SPARKAda

A SPARK program has a precise meaning which is unaffected by the choice of Ada compiler and can never be erroneous.

From the examples in the chapter, I thought it looked surprisingly simple to use - comparable to adding contracts in DbC, for example. I guess the analysis requires a little more effort?

This has been mentioned only in passing by Ehud so I hope it's worth a post of its own. And I'm amused by the idea that Ada is a sloppy, ill-defined language... ;o)

I'm not impressed

"All constraints (such as the size of arrays) should be static."
"some of the things that were NOT included were the ability to use side effects in functions, multitasking, and dynamic memory allocation."
This is ridiculously limited for convenient programming of non-trivial tasks.

Another issue: I guess "can never be erroneous" means that it conforms to a formal specification. How does one know that the specification is correct with respect to the intent?

Yet another issue: how does it ensure that the stack used for function calls will fit in memory?

Re: I'm not impressed

> This is ridiculously limited for convenient programming
> of non-trivial tasks.

You may think so; but, a number of real-life, critical projects of significant size have been successfully written in the language. Perhaps you are being confused by terminology? For example, the absence of function side effects doesn't prevent the use of subprograms that modify their parameters (or global data); however, these subprograms are procedures, not functions. In SPARK, procedures are used to change the world and functions are used to inspect the world. Procedures and functions are commonly confused because C doesn't make this distinction.
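
A minimal sketch of the distinction, in classic SPARK annotation style (the package, state, and names are invented for illustration):

   package Counting
   --# own Count;
   is
      -- A function may only inspect state, and must declare
      -- everything it reads:
      function Current return Natural;
      --# global in Count;

      -- A procedure may change state, and must declare both
      -- its effects and their information flow:
      procedure Increment;
      --# global in out Count;
      --# derives Count from Count;
   end Counting;

The analysis tool then checks each body against its annotations, so an undeclared side effect in Current simply won't get past the Examiner.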

> Another issue: I guess "can never be erroneous" means
> that it conforms to a formal specification.

Erroneous doesn't mean that; it has a specific meaning w.r.t. the Ada language. Erroneous essentially means "breaks the rules of the language in a way that the compiler can't detect and warn you about". For example, a program that gave different answers depending on the order of evaluation of subexpressions chosen by the compiler would be erroneous (and rather dangerous). SPARK programs are never erroneous in that precise technical sense: e.g. compiler implementation freedoms cannot change the meaning of a SPARK program (one of the reasons function side effects are banned), and all violations of SPARK rules are statically detectable.
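
As a concrete sketch of the kind of thing SPARK rules out (a hypothetical Ada fragment; F and G are invented):

   G : Integer := 0;

   function F return Integer is
   begin
      G := G + 1;      -- side effect in a function: legal Ada,
      return G;        -- but banned in SPARK
   end F;

   procedure Demo is
      X : Integer;
   begin
      X := F + 2 * F;
      -- One call returns 1 and the other returns 2, but Ada
      -- leaves the operand evaluation order to the compiler,
      -- so X may be 1 + 2*2 = 5 or 2 + 2*1 = 4. Remove the
      -- side effect and the question cannot arise.
   end Demo;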

Clearly SPARK cannot ensure that every program written in it will do exactly what you wanted; that would require mind reading, not language design! What it does do is make it possible to show, to a very high degree of confidence, that the program has certain specific properties. The designer's job is to choose the right properties.

> Yet another issue: how does it ensure that the stack
> used for function calls will fit in memory?

In the absence of recursion (prohibited in SPARK), the maximum stack depth required is statically determinable prior to execution. If there is enough room for it to begin with, then there will be enough forever (because there is no dynamic memory allocation).
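
A toy illustration of why this works (the frame sizes are invented; real figures are compiler- and target-specific):

   procedure C is begin null; end C;    -- frame: say 32 bytes
   procedure B is begin C;    end B;    -- frame: say 48 bytes
   procedure A is begin B; C; end A;    -- frame: say 64 bytes

   -- No recursion means the call graph is acyclic, so the
   -- worst-case stack is just the costliest path through it:
   --   A -> B -> C : 64 + 48 + 32 = 144 bytes
   --   A -> C      : 64 + 32      =  96 bytes
   -- giving a static bound of 144 bytes for any call to A.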

Terminology

> Perhaps you are being confused by terminology? For example, the absence of function side effects doesn't prevent the use of subprograms that modify their parameters (or global data); however, these subprograms are procedures, not functions. In SPARK, procedures are used to change the world and functions are used to inspect the world. Procedures and functions are commonly confused because C doesn't make this distinction.

Count me as one of those confused by this terminology, but not because of C. Thinking in the functional language context, I assumed that eliminating "the ability to use side effects in functions" meant an entirely side-effect free language, forgetting that Ada makes that quaint procedure/function distinction that was so popular in the pre-theoretic imperative languages of the '80s. ;o)

I think it would be more accurate to say "procedures and functions are commonly confused [in Ada and/or SPARK] because this distinction is not standard terminology across languages". The details and implications of this distinction tend to vary between the languages which make it, and these days, most languages don't make the distinction in this way at all. Functional languages in particular don't usually make it. In imperative languages, it's become more common to make related distinctions using qualifiers, such as const in C++ or final in Java.

Also, the terminology is at odds with that of PL and computing theory, where a distinction is commonly made between functions as abstract mathematical entities, and the procedures which implement those functions. "Procedure" is used here in a similar sense to the term Turing used, "effective procedure". (Church similarly distinguished between lambda calculus formulae and the mathematical functions which they "lambda-define".)

Scheme language texts tend to use the term "procedure" for this reason, mainly reserving the term function for its mathematical sense, although most other functional languages seem to be more cavalier about it and commonly use "function" as the term for procedural entities in their language, which of course is common usage amongst programmers anyway.

Anyway, SPARK does indeed sound more interesting now that the terminology problem has been cleared up. :)

it (the array limitation) is

it (the array limitation) is no more limiting than fortran 77, as far as i remember.

as to the rest - i don't know, but if it's obvious to you how to do better i guess you could make a lot of money...

correcting myself a bit: wasn

correcting myself a bit: wasn't there a paper recently (from oleg, i guess) that did static checking of array sizes?

Eliminating Array Bound Checking

Maybe you're thinking of Eliminating Array Bound Checking through Non-dependent types posted to the Haskell mailing list? There was also a LtU discussion.

My take

I understand why people use SPARK, but I don't like the restrictions, obviously. To add to what you mentioned, I think SPARK doesn't handle Ada's OOP features (but I am not sure, I'll have to check).

Personally, I think SPARK removes all the interesting features of Ada... But when you want static analysis, including things like total memory consumption (including max stack depth), it seems like you must give something up.

Notice that the SPARK approach isn't just about what's theoretically provable. They want to ensure you write code that's easy to understand and maintain, so they even restrict the use of features that could in fact be statically analyzed.

Some info about SPARK

I thought I'd offer some up-to-date information about SPARK, based on the comments and questions so far:

1) Applications. SPARK is principally designed for use in hard real-time, embedded systems. Most applications are also safety- or security-critical. In that context, some of the language restrictions might make more sense: typically, very conservative subsets of any language are used to meet, for example, regulatory requirements.

2) SPARK is amenable to the static analysis of worst-case stack usage and timing. This is compiler- and target-dependent, but we've done it many times.

3) OO. SPARK supports a subset of the Ada95 "tagged types" OO facility. We support single inheritance of types and methods, extension of types, and overriding of methods. SPARK _doesn't_ support polymorphism or dynamic dispatch ("Class-Wide Operations" in Ada terminology). Some OO enthusiasts find this unacceptable, but we've found it to be a surprisingly useful subset! We _do_ support the verification of class/subclass compatibility via generation of the appropriate verification conditions for the Liskov Substitution Principle.
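
Roughly the shape of the supported subset, as an illustrative sketch (the Sensor types are invented):

   type Sensor is tagged record
      Id : Natural;
   end record;

   function Reading (S : Sensor) return Integer;

   -- Extension and overriding are in the subset...
   type Calibrated_Sensor is new Sensor with record
      Offset : Integer;
   end record;

   function Reading (S : Calibrated_Sensor) return Integer;

   -- ...but class-wide types (Sensor'Class), and with them
   -- dynamic dispatch, are not: every call to Reading is
   -- statically bound, and verification conditions check that
   -- Calibrated_Sensor's Reading respects any contract stated
   -- for Sensor's.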

4) More recently, we put a subset of tasking back in. This embodies the Ada95 "Ravenscar Profile", which is useful for hard real-time. It's very light and predictable. We're working on generics right now...
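
For flavour, a minimal Ravenscar-style cyclic task (a sketch: Do_Work is a hypothetical procedure, and the Profile pragma spelling shown is the later standardised one; earlier compilers used equivalent Restrictions pragmas):

   pragma Profile (Ravenscar);

   with Ada.Real_Time; use Ada.Real_Time;

   package body Cyclic_Example is

      task Cyclic;

      task body Cyclic is
         Period : constant Time_Span := Milliseconds (10);
         Next   : Time := Clock;
      begin
         loop
            delay until Next;       -- the only delay form the
            Do_Work;                -- profile permits; no task
            Next := Next + Period;  -- entries, no rendezvous,
         end loop;                  -- no abort
      end Cyclic;

   end Cyclic_Example;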

5) "Erroneous" is an Ada term that means "Your program
has stepped beyond the bounds of the language definition." Essentially - all bets are off! The Ada
LRM defines a fair few of these - the most common being
reading an uninitialised variable and an unchecked buffer overflow. SPARK eliminates _all_ of these cases
entirely statically.
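
For example (an invented fragment): the following is legal Ada, and the read of T is erroneous at run time, but the SPARK Examiner's data-flow analysis rejects it before the program ever runs:

   procedure Bad (X : out Integer) is
      T : Integer;
   begin
      X := T;   -- T is never given a value: legal Ada,
                -- erroneous at run time, rejected statically
                -- by SPARK's flow analysis
   end Bad;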

Hope that helps,
Rod Chapman, Praxis High Integrity Systems
(Disclaimer: yes - I am one of the designers of SPARK)

SPARK _doesn't_ support polym

> SPARK _doesn't_ support polymorphism or dynamic-dispatch ("Class-Wide Operations" in Ada terminology.)

I should have made clear that this is what I had in mind. I wrote about it not so much as criticism, though personally I find this regrettable.

> we put a subset of tasking back in. This embodies the Ada95 "Ravenscar Profile".

I really should post more about Ravenscar...

And let us know when generics are in. I am a generics fan...

SPARK and polymorphism

If you, or anyone else, could tell us what the rules ought to be, then we'd be happy to implement them! (The main conflict is how to _statically_ analyse a _dynamic_ dispatch - clearly something of a dilemma...)

Many readers will now think "Aha, but a dynamic dispatch is just like a case statement...", but that's not actually the case from an analysis point of view - the crucial thing is _when_ the analysis can be performed. The "treat it like a case statement" approach only works if you have the linkable closure of a complete program (aside: which tool in C++ or Ada builds the dispatch table? The linker!). This is a disaster from the point of view of constructive static analysis.
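
To make the analogy, and its flaw, concrete (reusing the hypothetical Sensor types from the earlier sketch; Record_Reading is also invented):

   -- The dispatching call...
   procedure Log (S : in Sensor'Class) is
   begin
      Record_Reading (Reading (S));   -- dispatches on S's tag
   end Log;

   -- ...is morally a case over the concrete types:
   --
   --    case <tag of S> is
   --       when Sensor            => Reading (Sensor (S))
   --       when Calibrated_Sensor => Reading (Calibrated_Sensor (S))
   --    end case;
   --
   -- but the full set of alternatives is only known once the
   -- whole program is linked, so a module cannot be verified
   -- on its own, constructively, as it is written.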

Eiffel, of course, avoids all this by verifying the contract dynamically _after_ the dispatch has occurred...a neat trick, but a little too late for the safety critical community to stomach... :-)
- Rod Chapman, SPARK Team.

Dynamic Contract Checking

You are, of course, correct. In my Ada-Europe paper about DbC I discussed the issue and explained why I think the language should only mandate dynamic checking (unless suppressed by a pragma), while static analysis is performed by supporting tools (e.g., SPARK), since we still don't know how best to do it.

As regards class-wide (c-w) routines, it might be possible to prove a forall-x theorem (i.e., regardless of the actual type, the routine works OK). The performance issues of dynamic dispatching are something else altogether, which I haven't studied.

If I am not mistaken, the consensus when I took the SPARK tutorial was that c-w types weren't really of interest to SPARK users (i.e., mission-critical, hard real-time software).

Other high assurance languages?

I'm very intrigued by the whole concept, and I'd like to see how much the limitations on recursion, etc. get in your way. But there doesn't seem to be a freely available SPARK compiler. Can anyone point me in the direction of another high assurance language?

Hume

Hume, a functional programming language, attempts static guarantees of resource usage. It may not be what you're looking for, but the papers cover some of the issues involved in this area.

Compiling and Analyzing SPARK

A common misconception. To compile SPARK, you simply use a standard Ada compiler - you've heard of GCC, right? (The Ada front-end for GCC is amazing, and a work of art from a compiler-construction point of view...)

To _analyze_ and verify SPARK, you need the SPARK toolset. This is available with the SPARK Textbook, or is free to academic faculty. If you want the supported, professional version of the toolset, then it costs money...
- Rod Chapman

there was at least one discus

there was at least one discussion at the old site, but i can't find it in chris's list.