The DSL, MDA, UML thing again...

Simon Johnston (IBM/Rational) on Domain Specific Languages and refinement:

My position is that the creation of domain specific languages that do not seamlessly support the ability to transform information along the refinement scale is not helpful to us. So, for example, a component designer that is a stand-alone tool, unconnected to the class designer that provides the next logical level of refinement (classes being used to construct components), is a pothole in the road from concept to actual implementation. Now, this is not, as I have said, to indicate that domain specific languages are bad, just that many of us in this industry love to create new languages, be they graphical, textual or conceptual. We have to beware of the tendency to build these disjoint languages that force the user to keep stopping and jumping across another gap.

I am not sure I want to go back to the argument between IBM and Microsoft about this stuff, but I think the notion of refinement is important from a linguistic point of view (e.g., embedding, language evolution, type systems, reasoning, etc.).

But can refinement of the sort discussed here work in practice, or does it depend on language-design-savvy architects? Maybe the difference between IBM and Microsoft is that the IBM approach assumes all modelling knowledge will come pre-packaged, designed by modelling professionals and embedded in tools, whereas the Microsoft approach assumes more design expertise from users?

Feel free to correct me, I am really unsure where all this is heading anyway...

Specialization?

Was wondering whether DSLs are meant to promote specialization in the software process?

(see related article on Separating job functions).

The problem with software, to paraphrase Hawking, is that "It's software all the way down".

paraphrase Hawking?

What's he got to do with it? The anecdote isn't about him, or is there more than one?

Just gisting

The original quote is about it being Turtles all the way down. For some reason that reminds me of the difficulty of separating functions within software development - no matter where you draw the lines, it's still software on both sides of the division.

Of course it's about turtles, it's just not from Hawking

Eddington's the guy, IIRC. Maybe Russell (but I think that's the one with the punch line "Oh alright, I thought you said 100 million years"). Definitely not Hawking.

Hawking popularized it

"A Brief History of Time" opens with the turtle story.

Darn it...

Ok, right. Now I see it.

But to make things even more frustrating, Hawking just writes "a well-known scientist (some say it was Bertrand Russell)". No wonder I wasn't sure who it was.

Russell or Feynman

I want to remember it was Feynman, perhaps recounted in his "The Pleasure of Finding Things Out".

However, the intarweb is siding with the Russell hypothesis.

Similar problem

no matter where you draw the lines, it's still software on both sides of the division.

This reminds me of a coworker who likes to pull my chain saying that "it's all syntax." If you formally define semantics, you are simply using syntax again etc. I am sure everyone here understands what we mean when we say semantics aren't the same as syntax, but try arguing with someone who is smart and tries to resist the distinction. It's quite frustrating :-)

Syntax vs. semantics

This reminds me of a coworker who likes to pull my chain saying that "it's all syntax."... try arguing with someone who is smart and tries to resist the distinction. It's quite frustrating :-)

If he can argue that, he could probably equally convincingly argue that "it's all semantics." A little cleverness is a dangerous thing! :)

I think most researchers understand that the barrier between syntax and semantics is porous. Like the rationals, an object typically admits many ways of factoring itself into syntactic and semantic parts, including ways in which one or the other is trivial. But (and this is admittedly vague) I think you can only find factorizations where the semantic part is trivial if the object in question is "computable". A way to attack the "it's all syntax" argument is to point out the chicken-and-egg problem between, for example, set theory and logic; the same argument works for "it's all semantics." (We don't know that set theory is consistent, for instance.)

In programming languages, there is a very strong tendency to factor a language so the syntactic portion is context-free, both for historical reasons and because we have decent tools for parsing context-free grammars. I think this fact confuses a lot of programmers into thinking the syntax-semantics barrier is immovable.

In the calculus I'm working on now, my terms have a grammar which is not context-free, and I think it's quite nice that way, but I am considering refactoring it so that the context-sensitive conditions turn out as typing rules, just because people find context-sensitive grammars "weird".
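
As a toy illustration of that kind of refactoring (the actual calculus isn't shown here, so everything in this sketch - the language, the names, the condition - is invented for illustration, in Python for convenience): the context-free layer accepts any sequence of declarations and uses, and the context-sensitive condition is enforced by a separate pass, the way a typing rule would be.

    # Toy language: each line is "decl x" or "use x". The context-free
    # layer happily parses anything of that shape; the context-sensitive
    # condition - every variable used must already be declared - is
    # checked afterwards, like a typing rule.

    def parse(src):
        # context-free: just split each line into (kind, name)
        return [tuple(line.split()) for line in src.strip().splitlines()]

    def check(program):
        # the condition that would otherwise make the grammar context-sensitive
        declared = set()
        for kind, name in program:
            if kind == "decl":
                declared.add(name)
            elif kind == "use" and name not in declared:
                return False  # rejected by the "typing rule", not the parser
        return True

    print(check(parse("decl x\nuse x")))   # True
    print(check(parse("use y\ndecl y")))   # False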

Semantics vs. Syntax

I read that semantics is the meaning while syntax is the representation.

To communicate meaning though, you have to represent it. So to specify the semantics, you have to write something which would have a syntax.

I'm guessing a simple example of this is prefix/postfix/mixfix syntax. Underneath the parameter ordering is a function with some semantics. For example, integer addition is integer addition whether you're on a RPN HP calculator, in Scheme, or doing the usual infix representation on paper (excluding space/representation limitations).
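
A rough sketch of that point (the function names and token shapes here are made up for illustration): three surface syntaxes, one underlying operation. Each evaluator's only job is to recover the same operator/operand structure; the semantics lives entirely in the one shared function.

    def add(a, b):
        return a + b                  # the shared semantics

    def eval_prefix(tokens):          # ["+", "1", "2"], Scheme-style
        op, a, b = tokens
        return add(int(a), int(b))

    def eval_postfix(tokens):         # ["1", "2", "+"], RPN-calculator style
        a, b, op = tokens
        return add(int(a), int(b))

    def eval_infix(tokens):           # ["1", "+", "2"], pencil-and-paper style
        a, op, b = tokens
        return add(int(a), int(b))

    assert eval_prefix(["+", "1", "2"]) == \
           eval_postfix(["1", "2", "+"]) == \
           eval_infix(["1", "+", "2"]) == 3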

Overloaded Semantics for 'Semantics'

Last week or so I castigated myself for not clearly distinguishing between semantics in the usual PLT sense and what I called "intentional PL semantics", that is, the meaning of the program from the point of view of the programmer/user. Now I don't feel so bad, as I've since noticed that this is quite common. ;-)

For example, integer addition is integer addition whether you're on a RPN HP calculator, in Scheme, or doing the usual infix representation on paper

I think this is a nice example of the distinction I want to draw. The intentional semantics of all of these are the same, though the PLT semantics may not be.

The reason is that the latter type of semantics must have well-defined formal properties to be considered adequate, and formal details matter.
Some of the different cases you mention might in fact have different formal properties.

For example, the various computer or calculator implementations are actually going to do addition modulo some value (depending on the width of the integer storage). This value will vary from case to case, and therefore they are not equivalent formal systems. On paper we normally don't assume any roll-over, which makes it formally different again.
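
A quick sketch of that difference (the widths are chosen arbitrarily for illustration):

    # Fixed-width machine addition is addition modulo 2**bits; machines
    # with different widths disagree, and unbounded "paper" addition
    # never rolls over at all - three different formal systems.

    def add_mod(a, b, bits):
        return (a + b) % (2 ** bits)

    print(add_mod(200, 100, 8))    # 44  - 8-bit machine rolls over at 256
    print(add_mod(200, 100, 16))   # 300 - 16-bit machine doesn't (yet)
    print(200 + 100)               # 300 - paper arithmetic, no roll-over ever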

Part of the reason that it is hard to distinguish between formal syntax and formal semantics is that they both must be computational in nature, so they are the "same type of thing" in a sense.

Intentional semantics (and general semantics) has no such limitation (that we know of yet), and finding ways to map it onto more formal semantics is 99% of the challenge of software development.

Slight mis-reading

Last week or so I castigated myself for not clearly distinguishing between
semantics in the usual PLT sense and what I called "intentional PL semantics" ...

Good lord, man!! Intellectual honesty is important and all, but I think you're being way too hard on yourself!

Oh, wait ...

You said castigated?

Uhhh ..... never mind ...

: )

[With apologies to Gilda Radner, as Emily Litella, may she rest in peace.]

Uh, why not?


If thou didst put this soure cold habit on
To castigate thy pride, 'twere well.

Can of worms

I'm guessing a simple example of this is prefix/postfix/mixfix syntax. Underneath the parameter ordering is a function with some semantics. For example, integer addition is integer addition whether you're on a RPN HP calculator, in Scheme, or doing the usual infix representation on paper (excluding space/representation limitations).

Nice try. :-) This is a good example of exactly what's being discussed above. Lambda calculus provides a simple example of a case where integer addition (along with anything else that's computable) can be implemented as nothing but a series of (arguably) purely syntactic transformations.
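
For the curious, here is a sketch of that lambda calculus idea using Church numerals (written as Python lambdas purely for convenience - the real thing is rewriting of terms): numbers are encoded as functions, and addition is nothing but splicing applications together, with no built-in arithmetic involved until we choose to decode the result.

    # Church numerals: the number n is "apply f n times". Addition just
    # stacks two such towers of applications - a purely structural
    # operation on terms.

    zero = lambda f: lambda x: x
    succ = lambda n: lambda f: lambda x: f(n(f)(x))
    plus = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

    def church(n):                 # translate INTO the encoding
        c = zero
        for _ in range(n):
            c = succ(c)
        return c

    def to_int(c):                 # assign a meaning only at the end
        return c(lambda k: k + 1)(0)

    print(to_int(plus(church(2))(church(3))))   # 5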

If you're not familiar with this, just imagine how the first cavemen might have done addition: using a "syntax" involving a number of stones. To add one to a number, represented by a pile of stones, you perform the syntactic transformation of adding a syntactic token (a stone) to the pile. You can perform any addition this way, if you have enough time, and enough stones. For such cavemen, the actual act of addition would have been entirely syntactic. Assigning a meaning to the resulting pile of stones introduces semantics, but that's only after the answer has been arrived at, and only because it involves translating between systems.
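
A throwaway sketch of that stone arithmetic, with characters standing in for stones:

    # Numbers as piles of stones, addition as the purely syntactic act
    # of pushing two piles together. No numeric meaning is involved
    # until we decide to count the result.

    def pile(n):
        return "o" * n             # translate into the stone "syntax"

    def add_piles(p, q):
        return p + q               # the entirely syntactic addition

    result = add_piles(pile(3), pile(4))
    print(result)                  # ooooooo - the answer, as syntax
    print(len(result))             # 7 - meaning assigned only afterwards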

What about what?

I can't tell if you're demonstrating that I'm wrong. You've just done two implementations of the natural numbers. How is it that you know they are the same? They share the same semantics!

Have you ever written a program with perfect syntax and bad semantics? Lots of people have. That's why their programs crash.

If you're pulling the article in, then it sounds very much like people are making a syntax and have no way to map to another syntax while keeping the semantics of programs. It seems that most programmers are so tied up in syntax and representation (rocks vs. numerals) that they aren't thinking about the meaning of what they are writing.

Still overloaded

I can't tell if you're demonstrating that I'm wrong. You've just done two implementations of the natural numbers. How is it that you know they are the same? They share the same semantics!

As Marc Hamann pointed out, the word "semantics" is being used in two different senses here. What Marc called the "usual PLT sense" refers to an aspect of a language's definition that is (usually) necessary in order to determine the result that any given program will produce. However, as Frank pointed out, there are typically many ways of factoring something into syntactic and semantic parts. The point of the examples I gave was to show cases where evaluation operations are entirely syntactic, which illustrates an aspect of the point that Ehud was referring to. Incidentally, it also demonstrates that the PL semantics of integer addition may be different between different systems.

The sort of semantics you seem to be referring to, which Marc called "intentional PL semantics", is another matter. That's what I referred to as "translating between systems". Re-reading your original post, I see you mentioned "communicating meaning", and you seem to be focusing on that communication aspect, as opposed to the way in which a result is arrived at. However, communicating meaning can't be done without first dealing with the "internal" PL theory semantics of the languages in question.

To relate this to your prefix/postfix/mixfix example, the "function with some semantics" that lies beneath the syntax first has a semantics in the PL theory sense. This semantics may be different for each of the different systems that implement integer addition, if the systems work in different ways. If the semantics were the same for each of them, they wouldn't really be different systems. To demonstrate that two different systems implement the "same" operation, you'd have to prove the necessary correspondence between their respective semantics.

In your example, you may have intended to focus on a case where the PL semantics were in fact identical in each case, and only the syntax differed. But the fact that any given semantics can support multiple syntaxes doesn't tell us much about the syntactic/semantic divide that Ehud referred to.

If you're pulling the article in, then it sounds very much like people are making a syntax and have no way to map to another syntax while keeping the semantics of programs. It seems that most programmers are so tied up in syntax and representation (rocks vs. numerals) that they aren't thinking about the meaning of what they are writing.

Again, you're not talking about PL semantics here, which makes things sound a lot easier than they really are. You'd be absolutely right if all languages were simply syntactic layers over some universal language with a single common semantics. Alas, that's not the case. The problem is that languages have (PL) semantics which are non-trivial to map to other languages with different semantics.

A couple more links

TheServerSide has an opinion piece, Domain Specific Languages picking up steam?, which points to an interview with Sergey Dmitriev.