The Next Move in Programming (Livschitz interview)
started 2/13/2004; 4:22:35 PM - last post 2/23/2004; 6:37:34 AM
andrew cooke - The Next Move in Programming (Livschitz interview)
2/13/2004; 4:22:35 PM (reads: 10083, responses: 40)
The Next Move in Programming (Livschitz interview)
I was going to post this under fun, because the Slashdot comments suggested it would have some readers here in fits. But there are some interesting things in there - she talks about explicit state in programs, for example, and, near the start, seems to be suggesting that an Erlang-like approach could be useful. Her observations about programmers being average people are worth repeating, too.
Finally, some of the /. comments mentioned functional programming. Is my memory deceiving me, or is that something that wouldn't have happened 5 years ago?
Posted to general by andrew cooke on 2/13/04; 4:26:57 PM
John Skaller - Re: The Next Move in Programming (Livschitz interview)
2/13/2004; 10:44:42 PM (reads: 1135, responses: 1)
She should have stuck with mathematics, I think:
"The Java language, of course, personifies the modern general-purpose programming language with first-class systemic safety qualities. It's a huge improvement over its predecessor, C++"
-- hard to see how a language which uses upcasting instead of having generics is an 'improvement' over C++.
Sergey Goldgaber - Re: The Next Move in Programming (Livschitz interview)
2/13/2004; 10:58:54 PM (reads: 1129, responses: 0)
Well, Java has the superior "sandbag architecture". :)
She also seems to confuse strong with static typing. And claims "object-oriented programming allowed developers to create industrial software that is far more complex than what functional programming allowed".
However, I hear she is good at chess.
James Hague - Re: The Next Move in Programming (Livschitz interview)
2/14/2004; 10:09:04 AM (reads: 1027, responses: 0)
Perl, Ruby, and especially Python have made functional programming less foreign to many programmers. I used to think the coolest thing about Hope (now there's a forgotten language!) was the native and intuitive syntax for lists and tuples. At the time I was a young 'un who only knew BASIC, Pascal, and 6502 assembly language, and Hope blew my mind. Now data structures like that are ubiquitous in modern languages.
Patrick Logan - Re: The Next Move in Programming (Livschitz interview)
2/14/2004; 10:26:06 AM (reads: 1021, responses: 3)
Consider a few common concepts that people universally use to understand and describe all systems -- concepts that do not fit the object mold. The "before/after" paradigm, as well that of "cause/effect," and the notion of the "state of the system" are amongst the most vivid examples... The sequence of the routine itself -- what comes before what under what conditions based on what causality -- simply has no meaningful representation in OO, because OO has no concept of sequencing, or state, or cause.
She makes a good point here, and this applies not just to OOP. We build the concepts of time, sequence, and causation into applications from scratch when they are required by the customer. We have essentially no support for these at the language level.
Andris Birkmanis - Re: The Next Move in Programming (Livschitz interview)
2/14/2004; 11:28:10 AM (reads: 1010, responses: 2)
Patrick: We build the concepts of time, sequence, and causation into applications from scratch when they are required by the customer. We have essentially no support for these at the language level.
I tend to think that having metaphysical concepts like this built into the language is not a RightThing (tm).
It is like saying there is just one philosophy, just one culture, just one way to see the world. How is that better than viewing everything as an object?
Disclaimer: having them built-in is perfectly ok for DSLs, of course. General-purpose PLs, though, are called that for a reason ;-)
andrew cooke - Re: The Next Move in Programming (Livschitz interview)
2/14/2004; 11:45:33 AM (reads: 1007, responses: 1)
general purpose programming languages should be capable of time travel? sweet! :o)
[on edit - i am being too flippant and brief. what i mean is that those ideas are already in our languages, whether we like it or not. what we need to do is see them more clearly. perhaps.]
Andris Birkmanis - Re: The Next Move in Programming (Livschitz interview)
2/14/2004; 12:07:52 PM (reads: 1004, responses: 0)
Andrew: what i mean is that those ideas are already in our languages
It depends on what metalevel is in question. Whether you mean time and causation as applied to execution of your code or to the problem domain. If you are programming a CRM for a time travel company, your domain requires time travel :-)
On a more serious note - I've seen too many naive models of time, some of them even assuming a complete ordering of events... While this is acceptable for some domains, I prefer not to endorse any particular model as the only right one.
Patrick Logan - Re: The Next Move in Programming (Livschitz interview)
2/14/2004; 1:55:56 PM (reads: 962, responses: 8)
what i mean is that those ideas are already in our languages, whether we like it or not. what we need to do is see them more clearly. perhaps.
Please illustrate.
Patrick Logan - Re: The Next Move in Programming (Livschitz interview)
2/14/2004; 1:58:22 PM (reads: 960, responses: 0)
While this is acceptable for some domains, I prefer not to endorse any particular model as the only right one.
Yes, and so we need to shine a light on these models and get the better ones into a place where they're easily applied.
Patrick Logan - Re: The Next Move in Programming (Livschitz interview)
2/14/2004; 1:59:51 PM (reads: 959, responses: 0)
having them built-in is perfectly ok for DSLs
I have no problem with this. There are some pretty broad domains where these ideas can be applied.
andrew cooke - Re: The Next Move in Programming (Livschitz interview)
2/14/2004; 3:41:11 PM (reads: 947, responses: 7)
please illustrate
very few languages let you reverse computations (i know it can be done in some debuggers).
Patrick Logan - Re: The Next Move in Programming (Livschitz interview)
2/14/2004; 4:50:02 PM (reads: 927, responses: 1)
very few languages let you reverse computations
The way I interpret the original statement is that we need a way to represent time and causation in our applications per se, not just when debugging them.
E.g. some notation to represent a state machine and the behaviors associated with each state, but also the *history* of the state machine, the behaviors associated with that history, and the elapsed time across events. With current general-purpose languages we tend to write these things from scratch.
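Roughly the kind of thing we hand-roll today -- a rough, hypothetical Haskell sketch, just to make the idea concrete (the names are mine, not from any library):

    -- a machine that remembers not just its current state, but when and
    -- how it got there
    type Time = Double

    data Machine s e = Machine
      { current :: s
      , history :: [(Time, e, s)]   -- timestamped events and the states they led to
      }

    step :: (s -> e -> s) -> Time -> e -> Machine s e -> Machine s e
    step transition now ev m =
      let s' = transition (current m) ev
      in Machine s' ((now, ev, s') : history m)

    -- a "behavior associated with the history": time elapsed since the last event
    sinceLastEvent :: Time -> Machine s e -> Maybe Time
    sinceLastEvent now m = case history m of
      ((t, _, _) : _) -> Just (now - t)
      []              -> Nothing

The history and the timing are just ordinary values the language knows nothing special about, which is exactly why we end up rebuilding this scaffolding in every application.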
Alex Peake - Re: The Next Move in Programming (Livschitz interview)
2/14/2004; 6:41:47 PM (reads: 917, responses: 1)
very few languages let you reverse computations
Is that not the essence of functional languages - all original forms (can be) still accessible?
Andris Birkmanis - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 12:22:16 AM (reads: 874, responses: 0)
Is that not the essence of functional languages - all original forms (can be) still accessible?
I think so, modulo GC and other "cheats" allowed to the implementation as long as the application is not able to observe them (except through a metaprotocol, like checking the amount of free memory).
Andris Birkmanis - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 12:43:59 AM (reads: 869, responses: 0)
very few languages let you reverse computations (i know it can be done in some debuggers).
Debuggers are part of an implementation of the language. IMHO you can build a reversing debugger for pure computation in any language (by pure here I mean not having effects outside of the language runtime, which we assume is controlled by the debugger).
But reversing a computation from within the computation itself... Looks like a metaproblem to me ;-)
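To illustrate the pure case (a trivial Haskell sketch of my own, assuming the step function has no effects): the whole run is just a value, so "stepping backwards" is just looking earlier into that value.

    -- the entire run of a pure machine is an ordinary (lazy) list of states
    run :: (s -> s) -> s -> [s]
    run step s0 = iterate step s0

    -- "reverse debugging" is then just indexing one position earlier
    stateAt :: (s -> s) -> s -> Int -> s
    stateAt step s0 t = run step s0 !! t
    -- stepping back from time t is stateAt step s0 (t - 1)

Once effects enter the picture the debugger has to record or replay them, and it stops being this easy.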
Dan Shappir - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 1:08:41 AM (reads: 858, responses: 0)
It seems to me that having transaction (ACID) support built into the language is a simple manifestation of the "time notion" you are describing. Transactions make a very clear distinction between past (committed), present (ongoing - not yet committed), and future (not yet begun). Obviously, computations that are a part of an active transaction can be rolled back before they are committed.
Also, the idea behind some application servers can be viewed as a desire to simplify the notion of time. They make it possible to code a server application as if it were designed to service only a single client, and ignore the complexities of resource management required to service multiple concurrent clients. Thus, in effect, each client's code receives its own independent "time slice".
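As a toy illustration of rollback as a language-level notion (a hypothetical Haskell sketch of my own, not any existing library): a transaction is a computation over a candidate state, and aborting simply throws the candidate away, leaving the committed past untouched.

    -- a transaction either produces a result and a new state, or fails
    newtype Tx s a = Tx { runTx :: s -> Maybe (a, s) }

    put :: s -> Tx s ()
    put s' = Tx (\_ -> Just ((), s'))

    abort :: Tx s a
    abort = Tx (const Nothing)

    -- sequencing: if any step aborts, the whole transaction aborts
    andThen :: Tx s a -> (a -> Tx s b) -> Tx s b
    andThen (Tx f) k = Tx (\s -> case f s of
      Nothing      -> Nothing
      Just (x, s') -> runTx (k x) s')

    -- commit: on success the candidate state becomes the new "past";
    -- on abort the old state survives intact, i.e. rollback for free
    commit :: s -> Tx s a -> (s, Maybe a)
    commit s (Tx f) = case f s of
      Just (x, s') -> (s', Just x)
      Nothing      -> (s, Nothing)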
Andris Birkmanis - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 1:47:50 AM (reads: 849, responses: 0)
Here is the statement I see as most problematic:
[the] programming constructs should ... more closely simulate and resemble the real world
If we fix the vision of the world in the constructs of the language itself, we are not free to change this vision to match the domain at hand. I cannot stress enough how dangerous it is to fix the vision of the world to be used by everyone. Yes, I know, every language does that in a way, but most of them only remotely, so a programmer almost always has a chance to free his mind. You can do OOP in Haskell or FP in C++, but it is much tougher to go against a language that explicitly imposes its design of the world on you.
The viable alternative is providing a convenient base for defining DSLs (with their corresponding views of the world). The main problem will be interoperability, of course. But if the domains have different views of the world - why would they need to interoperate? :-)
andrew cooke - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 2:18:00 AM (reads: 853, responses: 5)
actually, i guess continuations are a kind of time travel in a program (of the kind i was thinking about).
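(a small haskell sketch of the kind of thing i mean, using the standard Cont/callCC machinery - only the example names safeDiv/abort are mine:)

    import Control.Monad (when)
    import Control.Monad.Cont (Cont, callCC, runCont)

    -- callCC captures "the rest of the program from here"; invoking the
    -- captured continuation jumps straight to that point, skipping the rest
    safeDiv :: Int -> Int -> Cont r (Maybe Int)
    safeDiv x y = callCC (\abort -> do
      when (y == 0) (abort Nothing)       -- jump out early
      return (Just (x `div` y)))

    -- runCont (safeDiv 10 0) id  ==>  Nothing

this only jumps forwards (escaping); storing the captured continuation and re-entering it later is where the real "time travel" weirdness starts.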
Andris Birkmanis - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 2:50:10 AM (reads: 847, responses: 4)
actually, i guess continuations are a kind of time travel in a program (of the kind i was thinking about).
Meaning that manipulating continuations may cause not so intuitive causal chains :-)
IMHO the intuition that is broken is not the layman's one, but that of a seasoned programmer. If you teach children in school that continuations are just the opposite of values, they will (probably) develop the needed intuition. After all, the well-established concepts of values, functions, etc. are not so intuitive until you learn about them.
What I am trying to say in this thread is that many PL and SE problems are in fact psychological or philosophical, not technological.
andrew cooke - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 3:19:28 AM (reads: 848, responses: 3)
i'm not sure where you draw the line between technological and philosophical/psychological (is it hardware v software?).
there seem to be two different ideas here that i am confusing.
first there's mutable state. this seems to be related to the thermodynamic approach to the flow of time (increase in entropy of a system). this is what alex was addressing (and includes io and garbage collection).
then there's the way in which functions seem to be like trap doors in programs - you can fall through them but not return. this seems to be related to causality and entailment.
[on edit - and it's confusing because monads seem to use the second to deal with the first, while continuations are useful because, i think, they reverse the second but not the first]
there was a paper posted here way back that annoyed me because it seemed to be talking a load of rubbish about physics that had nothing to do with computing (it seemed like the computing equivalent of derrida referring to quantum mechanics). i think it was related to this. i don't know if that means that i am talking rubbish now, or if i was talking rubbish then.
(i am trying to understand the thesis. it is a lot easier than crolard's subtractive logic paper, but still hard. i am working slowly through the introduction.)
andrew cooke - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 3:51:52 AM (reads: 840, responses: 1)
Andris Birkmanis - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 4:49:47 AM (reads: 833, responses: 0)
When I was reading these papers, I also found this one interesting:
When Can we Call a System Self-organizing?
It suggests how much is in the eye of the beholder.
(not to invalidate any of the statements expressed in this thread)
Andris Birkmanis - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 4:57:55 AM (reads: 828, responses: 0)
i'm not sure where you draw the line between technological and philopsophical/psychological (is it hardware v software?).
No, the difference between hardware and software is not related to that. I tend to think of technology as the capability to solve problems within some (view of the) world, and of philosophy as establishing that (view of the) world in people's minds.
For example, the problem of accurately replicating the human body is a technological one, but whether the result would be a human or just a pile of cells is a philosophical one.
andrew cooke - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 4:58:11 AM (reads: 815, responses: 1)
sorry, one more post - why is a co-product not a product?
if the dual of a category has the same objects, but with the arrows reversed, why do we suddenly start talking about A+B instead of A*B? Do they not represent the same object?
Andris Birkmanis - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 5:14:38 AM (reads: 811, responses: 0)
why is a co-product not a product?
Being a product is not a property of an object, it's a relation with respect to some category. So if a construction is a product in C, it is a coproduct in Cop.
Arrow-heads in C are arrow-tails in Cop, but this does not imply the heads and tails in C are the same, right?
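A concrete (if rough) illustration in Haskell, treating types-and-functions as the category; the helper names pairing/copairing are mine:

    -- product: projections fst, snd out of (a, b), plus a unique map in
    pairing :: (c -> a) -> (c -> b) -> (c -> (a, b))
    pairing f g x = (f x, g x)

    -- coproduct: injections Left, Right into Either a b, plus a unique map out
    copairing :: (a -> c) -> (b -> c) -> (Either a b -> c)
    copairing = either

Reversing the arrows swaps "maps in" for "maps out", so the same construction plays the product role in one category and the coproduct role in its opposite.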
andrew cooke - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 5:31:14 AM (reads: 811, responses: 0)
in reply to explanation about (co-) products
ok, but then i don't see how you avoid true=false.
i understand the terminal/initial object to be the empty product/co-product. in a deductive system that's identified with true and in a co-deductive system with false.
ah. no, ok, i get it. in the co-deductive system the objects "are interpreted as negated formulae". so you get an extra negation. i need to be more organised in separating interpretation from category, i think.
thanks.
in reply to paper about entropy:
in physics you have a clear reference - you're talking about quantum mechanical states. if you choose your states willy-nilly you get the wrong results (wrong in that your experiments contradict your theory). the problem is that people define entropy on non-physical systems without being clear about how they're carrying across all the rest of physics.
Patrick Logan - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 9:39:40 AM (reads: 765, responses: 2)
Aside: I *knew* I should have started this discussion about time in another forum. Y'all are too geeky. 8^)
Whether you mean time and causation as applied to execution of your code or to the problem domain.
The problem domain, the problem domain!
Ehud Lamm - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 9:54:55 AM (reads: 764, responses: 0)
The problem domain, the problem domain!
But that's the problem, isn't it? Different domains call for different notions of time.
Is this obvious, or would you prefer I give some concrete examples?
And hey, we are not too geeky. Not by a long shot...
Mario Blažević - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 11:28:40 AM (reads: 746, responses: 1)
It seems to me that having transaction (ACID) support built into the language is a simple manifestation of the "time notion" you are describing. Transactions make a very clear distinction between past (committed), present (ongoing - not yet committed), and future (not yet begun). Obviously, computations that are a part of an active transaction can be rolled back before they are committed.
The only language I know that fits the ACID requirements is Oz. It allows forking "computation spaces" (i.e. transactions) which can be merged (i.e. committed) or discarded (i.e. rolled back). I don't know just how significant this language feature would be in everyday programming tasks. Today it's considered absolutely necessary for any serious DBMS to have ACID, but one reason for that is that all data in a database is "global" in a way. At the PL level, you fight against failed computations not by "rolling back time", but by localising the effect of fallible computations so they don't affect your global data.
So on a more philosophical level, you could say that PLs let you control space instead of time.
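To put the "space instead of time" point in Haskell-ish terms (just my sketch, not how Oz actually implements computation spaces):

    -- run the fallible part against its own view of the data; on success we
    -- adopt the result, on failure the shared data was never touched, so
    -- there is nothing to roll back
    attempt :: s -> (s -> Maybe s) -> s
    attempt shared fallible = case fallible shared of
      Just merged -> merged    -- "commit" the speculative space
      Nothing     -> shared    -- "discard" it; the past never changed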
andrew cooke - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 1:31:04 PM (reads: 719, responses: 0)
part of the framework that ejbs provide is transactions (i know it's not in the language, and it's a mess - doesn't allow nested transactions at the moment, i believe - but it's an important feature in j2ee servers).
[sorry - ejb = enterprise java beans]
andrew cooke - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 1:58:47 PM (reads: 715, responses: 0)
The problem domain, the problem domain!
i was thinking that if time flow within languages was more explicit, it might be easier to fit them to the problem domain, but i'm not sure why or how.
evidence against that would include reactive systems, in which (as far as i know, not having read much about them or used them) a dsl (typically in haskell) is used to express/trigger/evaluate functions defined wrt time (paul hudak and conal elliott are two authors that spring to mind). there's a related syntax that makes things nicer (arrow notation?) that, i think, is appearing in the next ghc release (iirc it allows you to handle the time variable implicitly), but it doesn't really "leverage" the flow of time within the program. you could just as easily have time running backwards, or in a loop, i would have thought.
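(the classic model, roughly, from memory - a tiny haskell sketch with made-up example names:)

    -- a behaviour is just a value that varies with time
    type Time = Double
    type Behavior a = Time -> a

    time :: Behavior Time
    time = id

    -- lift an ordinary function to work pointwise over behaviours
    lift1 :: (a -> b) -> Behavior a -> Behavior b
    lift1 f b = f . b

    wobble :: Behavior Double
    wobble = lift1 sin time

    -- sampling; nothing here privileges forward time - you can sample at t,
    -- at t - 1, or in a loop, which is the point above
    sample :: Behavior a -> Time -> a
    sample b t = b t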
Stuart Allie - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 4:56:20 PM (reads: 676, responses: 1)
Patrick: We build the concepts of time, sequence, and causation into applications from scratch when they are required by the customer. We have essentially no support for these at the language level.
When I read the comments in the article to which Patrick refers, I thought of things like concurrency in Erlang, tasks in Ada (I've finally started looking at Ada in detail, Ehud, it's all your fault :), the :before and :after methods in CLOS, and all the various little DSLs that implement things like state machines. Personally, I'd love to see language-level support for all these kinds of things.
Recently, I've been working on a large-ish simulation of a power generation and distribution network, and doing it in a standard imperative language consists of vast amounts of code of the form "if we are in this state, and it's not the beginning of the day, and this power station hasn't broken down, then we do this thing. And then, if this happens, ...." If the language supported (time) ordering of events, and a nice way to define state machines, this would be a lot easier.
I guess I'm just going to have to design a DSL for this project :)
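Just to make it concrete, here's the shape I keep wishing for (a hypothetical Haskell sketch; the states, events and rules are invented, not from the real simulation):

    data StationState = Running | BrokenDown | Offline deriving (Eq, Show)
    data Event        = StartOfDay | Breakdown | Repaired deriving (Eq, Show)

    -- the "big if" rewritten as a table of transitions:
    -- (state we are in, event we saw) -> (new state, action to perform)
    transitions :: [((StationState, Event), (StationState, String))]
    transitions =
      [ ((Running,    Breakdown),  (BrokenDown, "dispatch repair crew"))
      , ((BrokenDown, Repaired),   (Running,    "bring station back online"))
      , ((Running,    StartOfDay), (Running,    "reset daily counters"))
      ]

    step :: StationState -> Event -> (StationState, [String])
    step s e = case lookup (s, e) transitions of
      Just (s', act) -> (s', [act])
      Nothing        -> (s, [])            -- unknown combination: stay put

    -- run a day's worth of events, keeping the whole history around
    run :: StationState -> [Event] -> [(StationState, [String])]
    run s0 = scanl (\(s, _) ev -> step s ev) (s0, [])

Still written from scratch, of course, which is exactly Patrick's complaint.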
Alex Peake - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 5:10:07 PM (reads: 671, responses: 0)
very few languages let you reverse computations
The way I interpret the original statement is that we need a way to represent time and causation in our applications per se, not just when debugging them
Using ML as a concrete example, and coding carefully to preserve that which we wish to preserve -- by not using mutable variables such as references and arrays -- we gain reversible computations (unlimited undo) for (almost) free (well, there is a memory issue). By keeping references to prior bindings, we can step back in time to the beginning.
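For instance (a minimal sketch, in Haskell rather than ML, with names I have just made up): because nothing is overwritten, "undo" is simply holding on to the old versions.

    data History a = History { current :: a, past :: [a] }

    -- every update remembers the binding it replaces
    update :: (a -> a) -> History a -> History a
    update f (History cur ps) = History (f cur) (cur : ps)

    -- undo steps back to a prior binding (the memory issue: we keep them all)
    undo :: History a -> History a
    undo h@(History _ [])     = h
    undo (History _ (p : ps)) = History p ps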
Isaac Gouy - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 5:29:39 PM (reads: 670, responses: 0)
simulation of a power generation and distribution network... in a standard imperative language
There's got to be a better way!
As usual it depends what you're actually trying to do - in this case, the purpose of the simulation. Depending on that, it seems like you should be able to find a more declarative approach.
Have a look at the references to CLAIRE.
Stuart Allie - Re: The Next Move in Programming (Livschitz interview)
2/15/2004; 6:09:46 PM (reads: 667, responses: 1)
(Me:) simulation of a power generation and distribution network... in a standard imperative language
(Isaac:) There's got to be a better way!
I know. I've just "inherited" this project - it's currently 20,000 lines of pascal! Fortunately, one of the things I'm doing is looking at options for redeveloping the system. Since the users are mainly engineers, the end result has to be something that is programmable and extendible by non-programming specialists.
As usual it depends what you're actually trying to do - in this case, the purpose of the simulation. Depending on that, it seems like you should be able to find a more declarative approach.
I am thinking hard about how to go to a more declarative way of doing things. The hard part is doing this while retaining enough flexibility in the system for the end-users to be able to make significant changes to the way the simulation works.
Have a look at the references to CLAIRE.
Thanks for the tip, I'm looking at their web pages now.
If anybody reading this has any suggestions or links regarding designing a DSL for this kind of thing, they'd be much appreciated.
andrew cooke - Re: The Next Move in Programming (Livschitz interview)
2/16/2004; 3:16:09 AM (reads: 588, responses: 0)
Stuart Allie - Re: The Next Move in Programming (Livschitz interview)
2/16/2004; 2:29:20 PM (reads: 485, responses: 0)
Andrew,
Thanks for that. I came across frp a while back but had completely forgotten about it. This is most interesting and useful stuff.
Luis Castro - Re: The Next Move in Programming (Livschitz interview)
2/17/2004; 7:28:58 PM (reads: 282, responses: 0)
Maybe some introspection capabilities to analyze the current stack?
John Skaller - Re: The Next Move in Programming (Livschitz interview)
2/21/2004; 10:31:31 AM (reads: 138, responses: 0)
Andrew asked:
"if the dual of a category has the same objects, but with the arrows reversed, why do we suddenly start talking about A+B instead of A*B? Do they not represent the same object?"
Categories do not have objects. Not really. They're a temporary artifice whose sole purpose is to simplify the axioms, and they're immediately identified with the corresponding identities, abstracting them away.
A product of two objects A and B is not an object C, it's a pair of projection functions. In the dual category, the corresponding arrows are the coproduct, and clearly aren't "the same" since they go in the opposite direction :-)
Frank Atanassow - Re: The Next Move in Programming (Livschitz interview)
2/23/2004; 6:37:34 AM (reads: 116, responses: 0)
John: Categories do not have objects. Not really. They're a temporary artifice whose sole purpose is to simplify the axioms, and they're immediately identified with the corresponding identities, abstracting them away.
This is not a useful way of looking at things.
In most algebras, one has many choices as to which operators should be primitive, and which should be derived. Each such choice is called a presentation. In classical logic, for example, one can take disjunction and negation as primitive, and define conjunction and implication in terms of them; or one can take conjunction, implication and false as primitive, and define negation and disjunction in terms of those.
Does that mean that `there is no such thing as implication, not really'? Or that `there is no such thing as negation, not really'? Certainly not. It just means that there are multiple ways of presenting what amounts to the same thing.
In a category, the objects and identity arrows are in a one-to-one correspondence. It is true that you can give a `one-sorted' presentation of categories by saying that the domain of an arrow is an identity arrow, and similarly for codomain. That is one presentation. In the standard presentation one says there is an identity arrow for every object, and an object for every identity arrow. That is another presentation. Presumably (though I've never seen such a thing) you can also give a third presentation which does not mention any identity arrows at all, in which composition is defined so that one can compose an arrow with an object as well as another arrow.
To say that one of these is more `real' than the other is not useful, since they all have the same model: a category. Of course, one presentation may be more useful than another for certain applications...
A product of two objects A and B is not an object C, it's a pair of projection functions.
A product of two objects is a pair of projection arrows, plus a tupling function (on certain hom sets).
Back to andrew's question:
andrew: if the dual of a category has the same objects, but with the arrows reversed, why do we suddenly start talking about A+B instead of A*B? Do they not represent the same object?
Yes, but in addition to what others said above, consider that the category may have both products and coproducts; then using the same notation in the opposite category would lead to confusion. Furthermore, neither one of a category and its opposite is more fundamental than the other; that symmetry makes it impossible to `normalize' and pick one dual over its counterpart, unless you make some arbitrary choices elsewhere.
in the co-deductive system the objects "are interpreted as negated formulae". so you get an extra negation.
You have to be very careful about this kind of reasoning. Categorical duality can clearly be interpreted as a kind of negation, but it is probably not the kind of negation you are used to.
For one thing, in classical logic the dual of a formula is another formula of the same sort; but if you want categorical duality (as is) to model negation in a category regarded as a deductive system, then you have to accept that the dual of a formula (object of C) is not a formula, but rather a `co-formula' (object of Cop). If C is cartesian, then, say, you cannot form the conjunction of a formula with a co-formula, or anything like that. To do that, you need a (contravariant) endofunctor on C so that the opposite of a formula is another formula, and not a co-formula, and there have to exist a number of isomorphisms in C satisfying various conditions.