Why Are ALL Programming Languages So Low Level?

Memory allocation and deallocation, threads, locking, execution control (such as loops, function calls, conditionals), etc.: these are machine-specific features. Why do we not have a high-level language that goes beyond these things? ALL current languages are low level, even if the programming community tries to say otherwise. Handling hardware is low level.

Why do languages even touch execution, which is a technique used to get around current hardware limitations? What will happen when processors can execute seemingly infinite instructions at the exact same time? Execution control will no longer exist. More to the point, mixing execution control and data manipulation in the same statement is a hard-coded, custom solution. You cannot reuse components that have been welded together in this fashion. Why do languages still use this archaic form of programming that can never lead to reuse?

There is a solution, but I'm just curious why the lack of will to move forward. Haven't you ever thought it strange that you can only have unidirectional control statements within functions? Why the skew? Have you never noticed that sending messages between computers doesn't work well in an RPC fashion, because the other machine is not required to respond? So don't you think that perhaps there might be something wrong with the entire concept of functions and procedures in programming languages? Is not the difficulty of having multiple return values a sign that something is seriously wrong?

So my main question is: Why are ALL programming languages so low level and more generally obsolete for general purpose use when all the signs are impossible to avoid? Why the refusal from language designers to move forward? Can they really not see the solution?


Why are ALL such criticisms so ill-founded?

Please show me one part of Haskell outside the IO monad that has anything to do with how the hardware works... likewise for logic languages.

Function calls aren't machine-specific, btw. They'd been in mathematical use for a long, long time before anyone built any hardware.

Haskell

I was just thinking about Haskell. There's no manual memory (de)allocation, concurrency's expressed through composable memory transactions, there are no loops beyond what you build out of recursive functions, the compiler handles evaluation order, types are expressed as sum-of-product rather than bunches of bits, and you can't update arbitrary memory locations. Seems like that answers most of the OP's criticisms.
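
For concreteness, here is a toy sketch of my own (nothing special): a sum-of-products type and a recursive function standing in where a lower-level language would use bit layouts and loops:

    data Shape = Circle Double          -- a sum of products, not a bunch of bits
               | Rect Double Double

    area :: Shape -> Double
    area (Circle r) = pi * r * r
    area (Rect w h) = w * h

    -- No loop construct in sight: recursion (or a fold) plays that role.
    totalArea :: [Shape] -> Double
    totalArea []     = 0
    totalArea (s:ss) = area s + totalArea ss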

All?

I don't think you can make such sweeping generalizations. Many languages don't have to deal with memory allocations or locks. Many languages can have functions with multiple return values just fine. Are you just talking about the mainstream languages like C++ and C#?

Yes, 100%

Can you really all not see it? I thought it was obvious, but that no one wanted to talk about it for some unknown reason. In this topic, I'm asking what that reason is. To bring this reason why we're not moving forward into the limelight. But is this a consensus? Does everyone really not see what I'm talking about? If that's the case, then the situation is far, far worse than I thought. This would be a truly horrific state of affairs if true.

Well, can you explain

Well, can you explain exactly what you mean? To me it seems functional, constraint or logic programming languages will solve your problems.

What languages do you know?

control freaks

I find many programmers to be control freaks.

Definition. A control freak is a programmer who freaks out if he/she does not know the control flow of a program.

Example. Some programmers comment on lazy languages (or even OOP languages) with "what is the control flow? this is so confusing". I am sure everyone here has encountered a fair number of these.

Popular programming languages will stay low-level as long as a large body of programmers remain as control freaks.

Fundamental problem

I think that this is a fundamental problem and unlikely to go away. IOW, I think that all competent programmers are control freaks in some sense. I think that the "control freaks" you are referring to just don't understand the control flow of some language. Space and time consumption follow from control, as does whether evaluation terminates or not. This applies to all programming languages (whether imperative, lazy/strict functional, logic, or ...).

If you don't understand the control flow of your program and are unable to control it, you are doomed. One essential part of understanding control flow is that you need to understand what parts of a program's control flow you can safely ignore and what parts you need to control, and to what degree.

42

There is a solution, but I'm just curious why the lack of will to move forward.

Perhaps if you told us what solution you have in mind we'd be in a better position to tell you why it hasn't been adopted, or point you to places you may not know about where it has been adopted.

I have no idea what you're talking about

I'm a mathematician who has no interest in hardware, nor low level programming. A lot of the things you rail against, however, are nothing to do with low level programming. Functions, for example, are derived directly from mathematics and date back to Leibniz or earlier. That a function has a single return value is a matter of mathematical purity, and has little or nothing to do with "low level programming" or hardware.

From the mathematical view a function is a binary relation over a domain and codomain. Thus it relates an element in the domain with an element in the codomain. In this sense the question is not "why does a function only return one value?", but rather "why does a function accept more than one argument?". The answer is that it doesn't. This is made explicit in, for example, type signatures of functions in Haskell (where functions only ever take a single argument and higher order functions are used to create "functions with multiple arguments"). Alternatively you can view functions that take multiple arguments as functions that take a single tuple as argument (which, from a mathematical perspective, is exactly what they are).

It is not, then, an issue of functions taking multiple arguments, nor returning multiple values, but a question of what the domain and codomain of the function are. You want a function that returns ranges of values? Choose a codomain of sets, or tuples.
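
To spell that out in Haskell syntax (my illustration): "multiple arguments" are currying or tupling, and "multiple results" are just a codomain of pairs:

    add :: Int -> Int -> Int             -- really Int -> (Int -> Int): curried
    add x y = x + y

    addT :: (Int, Int) -> Int            -- or: one tuple-valued argument
    addT (x, y) = x + y

    divMod' :: Int -> Int -> (Int, Int)  -- "two results" = one pair
    divMod' n d = (n `div` d, n `mod` d)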

There are plenty of languages willing to take an approach that has nothing to do with hardware. Many of the functional languages have a very pure mathematical foundation - try looking at Haskell or Clean. Then there are algebraic languages: nothing but math there - try looking at OPAL, or Maude, or the equally math-oriented Charity. Then there are constraint and logic programming languages like Oz, or Prolog. If you can explain why none of those programming languages provide anything like what you're looking for then I might have a better idea of what it is you're trying to get at - though it would be more helpful if you could, instead, give some explicit examples of what it is you want to see rather than me working by process of elimination.

Mathematical purity?

That a function has a single return value is a matter of mathematical purity

What is this purity you're talking about? Isn't this just a conscious design choice to make proofs simpler?

When I listen to programmers

When I listen to programmers talking about "purity" I'm tempted to recommend them a psychoanalyst.

What is this purity you're

What is this purity you're talking about? Isn't this just a conscious design choice to make proofs simpler?

By purity I mean exactly that: an effort to keep the mathematics clean, and thus simple (and able to be reasoned about and proved). Is there something else I was supposed to have had in mind?

No

I'm happy with your answer. I guess the general tone of this post made me jump at the use of a word which, in others' mouths, could imply something different.

And then there's Q

Off topic

I know this is off topic, but since I don't understand the topic, I was curious as to what others here thought about Q. I have the impression that it is somewhat like Haskell (some of the syntax) meets Lisp (the special forms, quotes, and splicing). Any comments?

Q is Interesting

I've played around with Q (read the documentation and wrote a couple of toy programs) and found that it is a nice functional language that will be more familiar to Unix/C programmers, due to its syntax and vocabulary, than Haskell or many of the other functional languages talked about on this site. It has extensive and interesting metaprogramming features, and is well suited to audio, image and video work with its wide range of ported media libraries. It also has an interesting REPL that combines the language with a bit of Unix shell, making it again more familiar to those that are tied to Unix.

There is a Gentoo ebuild I put together for Q available at the Gentoo Bugzilla for those that are interested in trying it out.

There is no Turing-like really high-level programming language.

I suspect Mr. Vorlath wants to say that there is no programming language that is really high-level and does not require you to have an IQ of 180 in order to be able to understand/use it.

I sympathise.

Mainstream programming language vendors are not so much interested in state-of-the-art PLs; they are only interested in the $$$ that contracts or lock-ins bring.

On the other hand, academia has produced excellent programming languages - but you have to be a mathematician to handle them. Of course these excellent programming languages can never become mainstream... programming is a commodity, not a science, in the majority of cases.

What I'm looking for

I'm not asking for people to provide something that I'm looking for. I'm looking for the reason why people don't see what I'm talking about. If you don't see it, then ignorance is a perfectly acceptable reason. There's no shame in that. However, there seems to be a whole lot of defensiveness about functional languages and I'm not sure why.

I'm sure FP is great for what you're doing. I'm not arguing that. But the function is an execution model. Just because it's math doesn't make it high level. Recursion is an all too apparent effect of this. It doesn't matter if it maps onto an existing hardware model or not. What I'm saying is that it IS an execution model. The only high level concept that I've seen is mapping transformations over a data set. Too bad this spits in the face of FP.

There are other ways of doing things. I've mentioned quite a few obvious indications of where things are less than optimal. Fixing these things is not rocket science. Implicit concurrency, reuse, implicit pipelining/parallelisation, portability (without a VM), variable levels of abstraction within the same environment from hardware to high level, etc. are all ridiculously easy to do (for a language and OS designer). I'm reluctant to say that I'm the only one who can fix these things. I want to know WHY no one has fixed it yet. It's been possible to do so for at least 40 years. I'm not a language designer, yet *I* can fix these things. No, there have to be people who know what's going on and aren't talking. Why?

The only high level concept

The only high level concept that I've seen is mapping transformations over a data set. Too bad this spits in the face of FP.

Mapping transformations over a data set spits in the face of FP? I think you must be seriously confused.

By the way, what would you use to express a transformation other than a function?
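
For the record, mapping a transformation over a data set is about the most idiomatic thing FP does, and the transformation is itself a function - a one-line Haskell example:

    squares :: [Int] -> [Int]
    squares = map (^ 2)   -- squares [1,2,3] == [1,4,9]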

I'm not a language designer, yet *I* can fix these things.

OK. Great. Go do it.

so

this:
"I'm not asking for people to provide something that I'm looking for. I'm looking for the reason why people don't see what I'm talking about. If you don't see it, then ignorance is a perfectly acceptable reason. There's no shame in that."

would imply that the ignorance of those receiving your communication is a perfectly acceptable reason for not seeing what you're talking about. Evidently poor communication on your part is not a perfectly acceptable reason for not seeing what you're talking about.

It is natural to assume that people do not understand you because of deficiencies in their understanding, rather than deficiencies in your communication. It is somewhat rude to state this assumption however.

Good point. My suggestion to

Good point.

My suggestion to LtU members is to keep in mind that new users (and in this case someone who signed up less than 24 hours ago) shouldn't be able to draw us into flame wars.

Defending and offending

Is my question really that difficult to understand? If so, then I certainly did not mean to be rude. Some people have commented on why they don't agree with the basis of my argument and that's great. It explains their point of view and also explains why certain things are the way they are. At the same time, certain other comments show that they completely miss what I'm getting at even though they know better. So that's my basis for saying what I said. I've backed it up. It wasn't meant to be rude.

Furthermore, certain people here are trying to sidetrack the discussion with the "solution" or asking what "problem" I'm trying to fix. Why the obsession with irrelevant details?

Personally, I find the nature of many of the responses here rather astonishing when asking a straightforward question. It's like talking about a technique or programming style is somehow equivalent to a personal attack. I find this behaviour very alarming.

In your original post you

In your original post you commented on things like memory allocation and locks, which many languages do not have to deal with. This led me to doubt you really had much knowledge of languages outside of Java, etc.

I can see what you are saying, but it would help me understand more if you explained why Haskell, for example, is "so low level". You don't deal with memory, you don't deal with locks, you can return multiple values via tuples, and you don't specify the order in which things happen.

The solution surely isn't irrelevant. If current languages are too low level, I want to know how to solve this problem.

The solution is irrelevant

The solution is irrelevant, as your solution may differ and still be correct. I was only trying to say that one such solution does exist, without having to go into specifics. However, I would be happy to talk about ideas and concepts on the subject if anyone is interested. But even before getting to that, we have to answer whether there already exist languages that don't have an execution model (or other low level concepts). Even if others think I'm trying to be prophetic, 6/6/6 has already passed and I'm genuinely interested in previous work on the subject. I can't be the first to ask, and Niels' comment below demonstrates this fact. Nor can I assume that it hasn't been asked (and possibly solved) on many occasions in the previous 40 years. *I* haven't heard about it and would like to know why there isn't anything available on the subject.

This brings us to your suggestion of Haskell. I'm not a Haskell programmer, but it does have quite a few execution statements. Most notably, it also has bindings to data items, which creates coupling inside your function and thus prevents or hinders modification of the execution model. In most cases, it is identical to imperative languages where you list a sequence of commands that must be executed in order. I may be wrong in some of these conclusions, but my point is that there is an all too clear execution model in Haskell.

We can now discuss one of two things. Do you reject the idea that Haskell has an execution model? Conversely, we can accept it and eventually move towards discussing alternatives to execution models, barring any other suggestions. Actually, we may even discuss whether you believe an execution model is low level, and whether you believe an alternative exists or is even necessary. Any of these, I would be interested in discussing.

Infuriating

I was only trying to say that one such solution does exist without having to go into specifics.

If you have a solution, then spell it out. Otherwise you are just playing games with people. It's rather audacious to state the whole industry is doing the wrong thing and claim to know the solution, but then not talk about the solution. How can anybody take you seriously?

Consider the author of Recursive Make Considered Harmful. Here was a case where everybody was doing the wrong thing, and he beautifully explained why. He didn't do so in a condescending manner, or play 20 questions, or try to investigate philosophical questions.

If you really aren't a troll, then stop jerking us around and propose a solution.

You appear to be complaining

You appear to be complaining about the do notation, as is used with monads - a structure that's deliberately supposed to have a notion of sequencing and was introduced into the language precisely because it was discovered that some code needs this notion. In most programs, monadic code is in the minority - in the exceptions, this is usually because that's the nature of the program in question!

I fail to see how pattern-matching and other binding constructs make a language low-level. Naming things is an abstraction mechanism with a long history, and the only even vaguely good argument I've ever seen against it is that it makes refactoring harder work.
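
To make the division concrete, a trivial example of my own: the pure part of a Haskell program carries no execution order at all, and do-notation buys ordering only where the program genuinely needs it:

    import Data.Char (toUpper)

    shout :: String -> String   -- pure: no notion of "when" or "in what order"
    shout s = map toUpper s ++ "!"

    main :: IO ()               -- monadic: sequencing is the whole point here
    main = do
      line <- getLine           -- this must happen before...
      putStrLn (shout line)     -- ...this, and do-notation says exactly that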

Is my question really that

Is my question really that difficult to understand?

Yes, it is. What do you mean by "high-level"? Could you give an example of what you're trying to do?

So far, you have left us without a clue of

  • what you want (hint: examples would be nice)
  • what you're complaining about (hint: examples would be nice)
  • what you've looked at
  • what you know.

Or are you just trying to start a flamewar? I'm starting to believe your post is actually a joke.

1. I want to know previous

1. I want to know previous work on language design techniques that don't involve low level concepts such as execution models, threading, memory management and more specifically why there isn't more out there on the subject. I can't give examples of something that doesn't exist.

2. The only thing I may be complaining about is why language researchers haven't solved this problem already (if they in fact haven't).

3. Everything I've looked at doesn't apply. That's why I'm posting here.

4. With all due respect, there isn't enough time for that. Anything specific?

Why in the world would I ever want to start a flamewar? And what would lead you to believe my post is a joke? Is this how all new members are treated? I'm really confused by the reaction of people here.

1. I want to know previous

1. I want to know previous work on language design techniques that don't involve low level concepts such as execution models, threading, memory management and more specifically why there isn't more out there on the subject. I can't give examples of something that doesn't exist.

Well, if you claim that you can solve it, I expect you to have a solution in mind, hence examples of that solution. But perhaps I misunderstood what you wrote.

Let's forget for one second that most programming languages designed since 1980 don't require manual memory management. Let's also forget that both threads and control flow operators are actually quite natural ways of thinking about many problems - not all - regardless of execution models.

I have a few languages in mind, which do not seem to involve execution models, threading or memory management. Please tell me if you share my opinion, or if I misunderstood your question:

  • Prolog
  • Datalog
  • SQL
  • ObjectVision.

2. The only thing I may be complaining about is why language researchers haven't solved this problem already (if they in fact haven't).

Until we understand exactly what problem you have in mind, it will be difficult for us to answer.

3. Everything I've looked at doesn't apply. That's why I'm posting here.

And what would that "everything" be? We have no clue whether you know about Smalltalk, Oz, Bigraphs (yes, it's not a language yet, but it might become one some day), the syntax-less game development toolkits around, etc.

4. With all due respect, there isn't enough time for that. Anything specific?

Fair enough :)

And what would lead you to believe my post is a joke?

Because, so far, you've failed to communicate. With a similar thread on Usenet or Slashdot, you would have been flamed to Hell seven times already.

Prolog execution model

I'd say that Prolog does not apply, because you cannot write any serious Prolog program without considering the specifics of its execution model, i.e. how search is performed.

In the midst of writing SQL...

...you also have to be cognizant of performance when constructing the schema and queries in SQL.

As you are likely aware, the answer given by the constraint programming community is to separate the logic aspect of a program from the search aspect - allowing a declarative model for the domain and constraints, while providing flexibility in the search for performance.

CP declarative? No

Of course, that is just wishful thinking. One could even argue that constraint programming is the worst of all. Whether a constraint formulation of some problem is practical very much depends on what underlying propagators it employs, how strongly they propagate, etc. Finding the right constraints for a task can make the difference between a highly efficient and a totally useless solution. And in fact, finding a working one for a given problem requires a lot of experience and is totally implementation-dependent, i.e. it very much constitutes an art. Good solutions often are worth a publication. ;-)

Well, that's the idea anyhow...

...in fairness to CP, it attempts to tackle hard combinatorial problems, so it is to be expected that efficiency problems will crop up no matter how you slice and dice the solutions.

Why?

Those who know do not say. Those who say do not know.

The rest of us are using Intercal.

Some higher level programming languages

Here are a few examples:

HTML - The browser is the VM, pay particular attention to the table-layout algorithm.
XForms
Postscript
Makefiles - Make, Jam, Ant
Hibernate/J2EE schema mapping files
Prolog
SQL
Spreadsheets - there are several text-based formats like SYLK
Google's Map/Reduce
Grid computing languages, PVM
Concurrent C
Game scripting engines - concurrency, AI
Simula, MMORPGs

Java tried implicit concurrency. I don't see it being mentioned very often. Presumably people using Java are predominantly writing CRUD applications.

DLINQ tries to turn C# into a CRUD DSL.

As to why things don't change:

Vista's most recent reset had its managed code rewritten in C++. The kernel is not ready for automatic GC.

There is a large body of C code to interface with: windowing, sockets, database drivers. No big project starts from scratch. One cannot interface with old code without knowing the memory and execution models. Most languages are designed without regard to FFIs. (C# and VB are the exceptions I know of; they deliberately try to play well with C DLLs.)

O Lord

Help Thou Thy loved ones to acquire knowledge and the sciences and arts, and to unravel the secrets that are treasured up in the inmost reality of all created beings.

Troll?

This is either a troll or not appropriate for LtU. Read our archive, and you will find plenty of answers. We prefer to discuss specific, well defined, issues.

I'm making an assertion that

I'm making an assertion that execution control is low level. Understanding why this is the ONLY method used in programming languages can serve as an indication of where the problem in implementing alternatives lies. Can you define what you mean by "specific, well defined, issues" when speaking about execution control or abstractions? I think I'm bringing up a very "real" issue that deals with programming languages. Is there no room for programming language concepts and discussion at this forum? If not, I will gladly leave, but calling me a troll is unnecessary.

I do not want to talk about the "solution" because I want to discuss this particular issue without it being sidetracked. In the field of computing, people tend to go on tangents about irrelevant details. If people want to discuss alternate language implementations, then fine, but let's do it in another topic.

I've also read virtually every single letter on this site several times. I didn't post here without doing an appropriate amount of research. Thank you.

Pretty sure its a troll..but

You are going to have to tell us what exact issues you have, because you are being very abstract and unclear.

The only way this discussion can continue is if you give us your solution, explain it in very clear terms, or explain why Oz or Prolog don't solve your problems.

Sigh! Never mind. I think

Sigh! Never mind. I think this answers my question why we're not moving forward quite clearly.

Perhaps because you wouldn't

Perhaps because you wouldn't know a bunch of programmers who "get it" if they pointed you to an exact example of doing so?

Well, I designed this PL in my head....

...that solves all the known (and some unknown) problems in the universe. And if you don't know what the syntax and the semantics of my PL are, then I guess you also have an answer to your question.

Obtuseness is not a virtue. Eschew obfuscation.

Duh

execution control is low level. Understanding why this is the ONLY method used

Because it's the way C and by extension C++, C# and Java work, and because whenever some other concept shows up (say, functional programming), people complain that it is not mainstream (read: unlike C).

Seeing that you used the latter argument yourself and reading the other comments, I'm ashamed that I fell for the troll and will shut up now. Anyway, if you have an answer to your unasked question, post it (and get shot down for what is most probably a severely flawed idea).

Cut some slack

Instead of calling someone a troll I would like to cut some slack here. Though non-specific and making claims without giving proper directions, I do find the point this contributor makes interesting. In fact it relates to the point I tried to make in Let's make a programming language!, which remained unnoticed.

I do find it interesting to see if we can abstract away from any execution model. Is there a way in which we as humans can describe a certain problem in such an informative way that a computer is capable of creating a program out of the information given? Is it possible to set up a dialog between a computer and a human in order to retrieve the necessary information to obtain an appropriate program? I like the idea of starting from the human (high-level) perspective and seeing if it is possible to descend to the mathematically formal perspective that makes execution possible. So please, let's discuss this subject; I'm all for it.

From an avid lurker

As a long time denizen of Usenet, I can say with high confidence that the individual is a crank.

1 part martyr, 1 part voice in the wilderness, he is espousing that there is an incredible Truth only he knows. He has already said that he is not looking for education or discussion, he just wants to know why everyone else is not Enlightened.

Why not start a separate thread for the discussion you want to have? It sounds interesting.

This is exactly what I

Niels: This is exactly what I wanted to discuss. (Apologies for the numerous comments and for any offence caused to others.)

I think you're leaning

I think you're leaning towards something like Inform 7, is that correct (now, that is kind of a DSL, but still a PL)?

I was about to suggest this, but according to the OP *ALL* existing languages are too low level (and Inform 7 exists), and he has "read virtually every single letter on this site several times", so he is obviously aware of Inform 7 but it is still not what he's looking for. Besides, Inform 7 still has stuff like conditionals.

Before we go on, is that closer to what you want than Haskell or Oz?

Domain-specific knowledge

That's an interesting question.

HyperCard attempted to allow users to write in plain English what they wanted their computer to do. To a large degree, it succeeded, in its domain.

SQL attempts to allow users to write in weird English what information they're looking for. Although I assume it could still be improved, this looks like a success to me, in its domain. More importantly, perhaps, I'm pretty confident SQL queries can essentially be generated from purely visual tools, by interaction with the computer, without writing one line of syntax. Of course, you still need to understand what tables and views are about.

I'm currently working on a similarly domain-specific language (with a semantically sound basis) for manipulating real-time sensors. Hopefully, this language (Csar) will also be a syntax-free language. Of course, you'll still need to understand what sensors and probabilities are about.

Many tools permit coding webpages or user interfaces without one line of code. Of course, if you want your webpage to be correct, you still need to understand boxes and a little bit of typography.

The problem, I feel, is when you start to enter general-purpose languages. Because you need to explain in detail a whole bunch of things to the computer. Because we assume that, if a programmer uses a general-purpose programming language, it's because he knows the subject better than the language designer does. I have the nagging feeling that there's no way we can avoid this. Do you think there's one?

I disagree with the premise

For most programmers, PLs are a form of communication between man and machine. The premise put forward is that the communication spends much too much time explaining to a machine how to go about doing things it should be smart enough to figure out itself. Sure, we'd like a higher level language that expresses the solution in terms of a language that is closer to our own form of thought. But ultimately it still has to be mapped across.

Although I sympathize with your phrasing of the question - and not the OPs 419 marketing strategy for PLs - I think that it begs the fundamental question of what kinds of problems you are attempting to solve. I think generalizations can be interesting, but they ultimately won't be that meaningful. And the undertone of all such general discussions which don't have a point of solving a specific problem is that they can drift to an elitist viewpoint that all programmers are complete gits.

Things change the world

As an engineer/programmer I often try to explain to people that "things" really do change the world. It is debatable whether mere "ideas" or "concepts" really change anything.

Call me when a thing helps you enlist an army...

This is really off-topic, but I must make a point here.

Things can sure help change the world, but even the most wonderful tool or most horrible weapon will lie still gathering dust unless there's an idea of what to use it for and why. It's true that an idea without an action means less than an idea with an action. Ideas without application are indeed impotent, but things without ideas are equally impotent. Even discounting that the thing started as an idea, it depends on an idea for its use once it exists.

Things are only useful by letting us project force more broadly, focus force more finely, deflect a force from us, prevent application of force unless necessary, encode ideas about proper use of force in them (a glass cutter vs. a diamond chip, a saw vs. a hammer or ax) or convey ideas from one person to another.

That last one is largely what programming languages are for -- to convey ideas about a topic from ourselves to ourselves, and from us to the computer. The computer, a thing, is only as useful as the ideas it lets us convey to one another, the machinery it controls (precise application of force), or the ideas it lets us play with (simulations, design, drafting, video games, word processing) rather than expressing those ideas in the physical world.

If mental effort saves us physical effort or makes us safer, then a thing is useful. That thing, therefore, requires ideas by definition. Unless by things changing the world you mean allowing large objects to randomly fall on us. Since you are an engineer, I'm sure that's exactly the opposite of what you intend things to do.

"Thing-ness"

My thinking about "things" probably comes from reading Martin Heidegger a long time ago. There is a famous essay, "What is a Thing". And I don't think this is really off topic, because "thing-ness" has a lot to do with language, since languages describe, well, "things". But we don't need to limit thing-ness to physical artifacts. Tangible social artifacts, institutions, religions, in so far as they "operate" or function to some end, also have thing-ness. The city or a town has been referred to as a thing or system. There are many examples. The category of things is really different from the category of ideas, although they certainly overlap. A constitution is no longer simply an idea. You can't debate it or kick it around, because it can put you in jail.

In that case...

Yeah, "systems" do change the world. Judaism, Christianity, Islam, Buddhism, democracy, republicanism, capitalism, socialism, Multics, Unix, assembly languages, Fortran, Lisp, C, Perl, SQL, OpenGL, wxWindows, DirectX, the Boy Scouts, The John Birch Society, the MPAA, the RIAA, Harvard University, MIT. They're all "systems" in one way or another. Most or all of them have changed the world we live in in some way (to different extents, obviously). They could certainly be considered "things" according to the definition you give, but the connotation of "things" in your previous post wasn't as clear to me.

Clearly, what is a "thing" or "system" and what's not has a lot to do with types, data collections, and computability in general. In this particular context, it makes sense that thinking about "things" changing the world has a lot to do with higher-level programming languages. Moving from code being the central entity and data being passed around to actor-based, dataflow, or object-oriented programming requires a lot of thought about "things". That includes how to represent them, how to make the computer deal with those representations, and how to operate on those representations.

So you're quite right. Now that I recognize your angle, I realize it could easily be within the topic of the thread. Also, most of the "ideas" I mentioned previously would fall as "things" by Heidegger's definition.

popper's objective knowledge

Some folks might think philosophical subthreads are off topic. But they can touch on underpinnings of how programming languages are significant in human terms, especially in any context where computing systems strongly affect the way people think, or how they conduct affairs. With only one exception, I usually ignore what philosophers say as irrelevant to computing.

Karl Popper is one of the few philosophers credited with having much sensible to say about the theory of science and the interaction of subjective and objective phenomena. Some folks in computing circles might find parts of Popper's views on the "third world" interesting when evaluating the meaning of how computing systems embody systems of knowledge in the world outside a human mind.

(I first became interested in the interaction between thought, systems, and "things" around 1980, when I thought it might be possible to invent a new language -- possibly mediated by computing systems -- that made people less crazy, or more intelligent, whichever was more practical. It didn't take me long to run across Karl Popper's Objective Knowledge [1972], which exposed me to more than I could digest at once. Since then I haven't seen much as applicable.)

Wikipedia has an article on Popperian cosmology, which might be a good place to start following links on the topic of Popper's notions about "third world" structures and reified knowledge. However, it might not do much more for you than provide a context for situating old arguments in a new setting that's more amenable to analysis.

You might say Popper addressed the question, "Are ideas real?" Then he pointed at things we create in the environment that encode our ideas and affect the world.

I really didn't mean to say Hank was the one off-topic

I thought all along that Hank's original post was at least marginally topical, and in the context of his follow-up it seems even more topical.

When I said the topic was straying from the thread at hand, I meant that my response was the part straying. To Hank's credit, he explained to me not only the context of his previous statement, but also helped me view my own statement in terms of the thread.

Personally, I believe that programming and philosophy can't really be successfully separated at all. Perhaps they can superficially, but fundamentally any software developer who doesn't ask some questions about what things are and how they are related is bound to implement something less useful than otherwise possible -- and will probably take longer to do it.

Programming language design can really support the relationships between related chunks of data. Whether it is as simple as arrays/lists/trees holding like values or as complex as every real-world object being modeled separately as an object in a program, the data inherent to an entity and those relating one entity to another are really what software is all about. How a programming language enables us to represent that data and what transformations each language makes easy or difficult are some of the biggest and most pressing issues I see in language design.

In fact, it's my understanding from Vorlath's posts that he really wants a language in which he can state what attributes an entity has and map relationships from one entity to another, then get a result from the computer. The idea is fascinating, but aside from some 4GL products, some layout and markup DSLs, and similarly constrained environments I'm not sure how to relay the intended start and stop states using so little information.

Ideas

Chris: The big problem that I'm hearing about is that, with multi-core processors and even with processor arrays, writing software for these machines is rather difficult. How do you map an application to multiple processors? If you have an execution model, it becomes almost impossible to adapt your application to an alternate execution environment because it's been locked in already. I made the unfortunate mistake of mentioning that there was a solution. I don't know what the BEST solution is. I just know that there *is* a solution. I came here in the hopes of discussing techniques and ideas. But like I said, I am not a language designer, so I'm baffled that there isn't anything out there on this and I want to know why.

It would take too long to explain the whole thing, and I don't have all the answers when it comes to implementation details, as each execution environment is different anyhow. But since I'm forced to jump the gun, I'll try to give a rough draft of what I'm thinking.

I think a good place to start is your comment that "PLs are a form of communication between man and machine". You also mention that we spend "...time explaining to a machine how to go about doing things". Perhaps that is the problem. Why are we explaining how to go about doing things? Instead of telling the machine every step and operation to execute, should we not be describing how we want our data to interact and let the compiler or OS decide how and where this is done?

I don't want to get into details now, but I'll leave you with this. What would happen if you modified a pure functional language in the following manner? (I'm using an alternate method, just to show that there are multiple ways that could lead to a solution.)

1. The body of the language cannot have side-effects, cannot produce values from external sources (other than arguments) and cannot send data to external sources (other than the return value). All it can do is manipulate data given to it. So no loops, or any execution control, or simulated looping via recursion either. Conditionals are OK.

2. Add two new constructs to the language: Data Generators and Sinks. These pump out data or absorb data automatically. You do not call them. They are set in motion on their own. For example, terminal input will be pumped automatically. You don't actually write a function to fetch the input from the keyboard. The values will automatically insert themselves as parameters, and the data itself will trigger the "execution" of the appropriate function and automatically keep going on its own after returning, to whatever data sink it's supposed to go to.

3. The Sinks, Generators and topmost functions can be linked together unidirectionally in a free-flowing graph. You can also split and merge data so that they may go to multiple Sinks at once. They should also be able to go to generators to provide further selection of how the data is to be generated. Generators can be anything from number generators to files to sockets to GUI, etc. Sinks are the same, but reversed (write). Note that events now fit this system very well, in the form of user-activated generators in the case of GUIs.

Just to clarify... topmost functions, generators and sinks do not return. That's how they differ. They are chained linearly, not recursively. They are constantly active at all times unless there is no data at that particular juncture, and will reactivate on their own when data becomes available. If that last sentence doesn't give you the jiggies, I don't know what will.
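
To make the shape of this concrete, here is a minimal single-threaded sketch of one possible reading of points 1-3, in Haskell. The names (Generator, Sink, run) and the wiring loop are mine - a stand-in for the free-flowing graph described above, not a full design:

    import Data.Char (toUpper)
    import System.IO (isEOF)

    type Generator a = IO (Maybe a)   -- pumps out data on its own schedule
    type Sink a      = a -> IO ()     -- absorbs data; never returns a value

    -- The pure body: no side effects, no I/O, just data in, data out.
    transform :: String -> String
    transform = map toUpper

    -- The wiring layer links generator -> pure function -> sink.
    -- All of the "execution control" lives here, none of it in transform.
    run :: Generator a -> (a -> b) -> Sink b -> IO ()
    run gen f sink = loop
      where
        loop = do
          mx <- gen
          case mx of
            Nothing -> pure ()            -- generator exhausted
            Just x  -> sink (f x) >> loop -- arriving data triggers the function

    main :: IO ()
    main = run stdinLines transform putStrLn
      where
        stdinLines = do
          eof <- isEOF
          if eof then pure Nothing else Just <$> getLine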

Can anyone tell me what advantages this would produce? Mind you, this is only a start, using pure FP. Other much more powerful things can be done with that starting point. Concurrent feedback loops (state without side-effects), distributed programming and zero execution control, as a few examples.

All this is irrelevant though. It's the ideas behind them and how the whole is put together that matter - the reasons behind the above changes. Strange that the solution is more important than the concepts. That's just completely backwards to me.

Now I wait to get burned.

So no loops or any execution

So no loops or any execution control or simulated looping via recursion either.

If you look at it, you'll realize that you're actually simulating looping with sinks and generators. Which is probably good, because otherwise your language wouldn't be Turing-complete.

Can anyone tell me what advantages this would produce?

Concurrent ML. Or Acute if you want a distributed version. Or Pict. Or anything based on the pi-calculus. I guess that's also something you can do in Mozart, Chalk, JoCaml, Links...

I've programmed in that style. It's quite user-friendly when it comes to concurrency and, I assume, user interfaces. It's also quite user-unfriendly when it comes to actually returning results. It's not higher-level than Haskell, it's just based on a different execution model.

In fact, it sounds a lot

In fact, it sounds a lot like the idea of allowing general recursion in an equivalent of an IO monad but not in pure functions. Which rather reminds me of Epigram for hopefully obvious reasons...

It's a fair question...

...but I don't like the mysticism implied by not naming a solution - whether it be best or one of many. PL design is about tradeoffs, and though we could come up with a PL that met your criteria, you'd likely find that there's a whole class of problems that become harder to solve working within those constraints.

It's no accident that many who have posed similar questions of abstractions (McCarthy, Landin, Kowalski) have put out their proposals to answer the question (Lisp, FP, Prolog). Coming up with the questions is the easy part. The hard part is defining the problem to be solved, and then coming up with a solution.

Dataflow languages

It seems to me that what you are looking for is a dataflow-oriented programming model. For some reason, this programming model doesn't seem to have taken off very well; I suspect that it's because most programmers find it fairly unintuitive, and because it's difficult to express using linear syntax.

Take a look at StreamIt. It's a dataflow language for streaming applications, and it is sort of going in the direction you've outlined (although the base language is still procedural).

All that said, I have to admit that I don't understand your objection to recursion. Recursion is a precise way to state the meaning of a program; the fact that it has a straightforward operational interpretation is a convenient bonus. It's entirely possible to take a recursive function and transform it so it's not recursive, or has a different recursive structure, etc. For example, GHC's list fusion optimizations do this. While recursion suggests an immediately obvious operational mechanism, you aren't necessarily tied to it.
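
A toy example of my own of the kind of rewrite GHC's fusion rules perform - the recursive definition states the meaning, and the compiler is free to pick another shape:

    doubleThenInc :: [Int] -> [Int]
    doubleThenInc = map (+ 1) . map (* 2)   -- as written: two traversals...

    doubleThenInc' :: [Int] -> [Int]
    doubleThenInc' = map ((+ 1) . (* 2))    -- ...fused into a single pass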

Dataflow

What you have described is, roughly speaking, declarative concurrency. Concurrent languages such as occam, Erlang, and Oz could easily be used to write programs that conform to the model you have described (this presentation by Peter Welch includes some simple examples of this kind of thing in occam; van Roy and Haridi's Concepts, Techniques and Models of Computer Programming contains several examples in Oz). The difference is that those languages also allow other kinds of process networks to be described as well.

The paradigm you have described has been used in the past, and is being used today. But it can become cumbersome in some problem domains, and insufficient in others, which is why most concurrency-oriented languages also provide additional constructs. For example, it's sometimes desirable to allow a process to select from a range of possible input streams, which a purely functional data transformer can't do. Similarly, while you can represent state as a concurrent feedback loop, it's a pain to have to do that every time you want to use state - it's easier to define a higher-level abstraction that encapsulates some local state such that the resulting process is observationally identical to a function+state feedback loop. That's effectively what Erlang does with its "process = tail-recursive function" model.
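
A hedged sketch of that last idea, rendered in Haskell rather than Erlang or occam (the names are mine): the process's state is simply the argument it passes to itself, giving state without any mutable cell:

    -- A counter process over a stream of ticks: each input yields an
    -- output, and the new state is fed back in as the recursive argument.
    counter :: Int -> [()] -> [Int]
    counter _ []        = []
    counter n (_:ticks) = n' : counter n' ticks
      where n' = n + 1

    -- take 3 (counter 0 (repeat ())) == [1,2,3]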

You may find it worthwhile to pick up a copy of Concepts, Techniques and Models of Computer Programming. It describes several different models of concurrent computation in far more detail than we can possibly get into here, and discusses the pros and cons of each one.

To me this sounds a lot like

To me this sounds a lot like emulating hardware. The data flow with functions can be done with pipes and functions. But I thought I heard the word feedback. This is definitely hardware; you can simulate it, but it must be synchronized. In the engineering world, digital signal processing is sometimes used instead of a general purpose computer. If one is familiar with the technique, it is easy to go back and forth.

Yeah, dataflow..

Your description also reminds me of dataflow-like languages which, I guess, are good at processing streams of input to output.

A major drawback of these kinds of languages, I presume, is that they are just not very good at representing complex algorithms on complex structured data, and I do not see how you would get rid of complex structured data in a large number of cases.

Furthermore, in any language which is Turing-equivalent, it is possible to state problems which are, well, just hard to solve (even in a mathematical sense), except by clever coding (as in finding the right mathematical abstractions). Are you postulating a GPL in which it is very easy to, say, solve any mathematical problem which can be stated as a program?

Is there a way in which we

Is there a way in which we as humans can describe a certain problem in such an informative way that a computer is capable of creating a program out of the information given? Is it possible to set up a dialog between a computer and a human in order to retrieve the necessary information to obtain an appropriate program?

Is there a way to describe problems in a precise manner that doesn't ultimately end up looking like programming? Design is not a mechanical process. It involves making decisions. Those decisions need to be either encapsulated as assumptions within the code generator (in which case the applicability of the generator is limited), or made by those writing the specification (at which point it looks like either "programming" or "configuration", depending on when the decisions are made).

That said, the answer to your question is "yes". Such things already exist for restricted problem domains. For example, the SPIN model-checker takes a protocol description and property to be checked expressed in Promela, and synthesizes a custom protocol analyzer in C. In a different direction, there are plenty of pieces of software out there that let one configure the system (live) by loading or unloading different plugins. Is not the act of configuration a "dialog" between computer and human? Why generate custom code (which stops being flexible) when you can simply perform runtime configuration that lets the dialog continue forever? It's a short step from there to REPLs :-)

I think that it begs the

I think that it begs the fundamental question of what kinds of problems you are attempting to solve.

I agree with you on that one. That is why I suggested, in the Let's make a programming language! thread, collecting the language constructs programmers would like to use in order to express certain problems. General purpose programming languages allow the creation of infinitely many programs, while the set of problem domains humans address using general purpose programming languages is finite, and maybe not that big after all. Maybe there is an approach to map human-centric utterances concerning those "real world" problem domains to mathematically formal constructs that allow for execution. I am interested in whether such an approach is feasible and whether it has been tried.

Not a new question

The classic Can Programming Be Liberated from the von Neumann Style? presented the question in a bit more systematic fashion. Or one can go back even further, to the '50s, with McCarthy's attempt at symbolic computing in designing Lisp. Neither of these approaches (FP or Lisp) may have sufficiently solved this age-old problem, but they are a good starting point for understanding how to improve the process of programming and the art of designing a PL which accomplishes a stated goal.

[Edit Note: The other historical treatise on the subject would be Landin's The Next 700 Programming Languages.]

Maybe there is an approach

"Maybe there is an approach to map human-centric utterances concerning those "real world" problem domains to mathematically formal constructs that allow for execution."

It is easier than most people think. Basically it comes down to knowledge representation and logic programming. Design problems can often be solved this way. Transformer design is an example I like to use. Knowledge representation allows you to represent ordinary description much as it is in a book, for instance.

The main problem, as I see it (and many people disagree about this), is with the available tools. I have my own "pet" languages. Unfortunately I spend more time on the languages than on the problems that originally got me interested in this. Interest in this area, I think, has been waning since the "AI debacle" of the late 1980s. Apart from amateurs like myself I don't see much interest. A lot has been cleared up in the last 15 years, and there is a lot of applicable new research. Frankly I think the field is under-appreciated.

Algorithm = Logic + Control

Perhaps it would be useful to focus on particular approaches, and why you think they are not appropriate. A paper which has been mentioned in passing on LtU before, but doesn't seem to have had much discussion (at least, I can't find any in the archives), is Robert Kowalski's Algorithm = Logic + Control from 1979 (ACM, free pdf here). To quote from its abstract:

An algorithm can be regarded as consisting of a logic component [...] and a control component [...]. The logic component determines the meaning of the algorithm whereas the control component only affects its efficiency. [...] We argue that computer programs would be more often correct and more easily improved and modified if their logic and control aspects were identified and separated in the program text.

This seems to be close to what you are arguing for. More recently, Functional Relational Programming offered a good discussion of complexity caused by state and control issues. Perhaps a more fruitful discussion could be had by considering these previous approaches (and the languages they've inspired, e.g. Prolog, Oz, etc) and how they fail to live up to your notion of a "high-level" language?
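
As a very small gloss on Kowalski's separation, in Haskell terms (my example, not from the paper): the two definitions below share the same logic component - "the sum of the list" - and differ only in their control component, i.e. the shape of the computation:

    sumR, sumL :: [Int] -> Int
    sumR = foldr (+) 0   -- builds the additions right-to-left
    sumL = foldl (+) 0   -- accumulates left-to-right: control, not meaning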

Was looking for that paper not long ago...

....but never could find a copy that didn't require ACM membership. Thanks for the link!!!

Limited Resources

When we're all running around with digital watches that pack 800 Exaflops in their slowest component (being, after all, just cheap digital watches), then, as you say, execution models won't matter so much.

"Using a Ω(n^n) algorithm on the Oxford English Dictionary? No problem, just specify it."

In the interim, execution models matter, and thus extant languages will be designed with them in mind.

If you wish to proffer a solution wherein you write the code without regard to execution, and the compiler/interpreter/whatever derives an efficient execution model for it, I'll happily beta-test it for you. :0)

I agree that execution

I agree that execution models matter. However, I think that the execution should be the responsibility of the OS, and perhaps the compiler, and not the developer. What can have a better understanding of the hardware and execution platform than the OS?

I have a few ideas on the solution that I am still working on. I've found valuable information here that I can use despite the secondary noise. If you are sincere about being interested in testing such an environment, I will contact you or anyone else interested in the future when it becomes available.

Also interested

I would also be interested in learning about your approach and trying it out.

I am also miffed by the clumsiness of the mainstream programming tools, and am trying to identify an alternative. (Unfortunately, my lack of expertise in language design isn't helping me.)

Fair enough

Whenever you feel like explaining your solution, I'd be interested in reading it.

Thanks

I can't do it on LtU as this will be my last post. But keep an eye out in a few months. I'm sure it will pop up.

I'll leave with a story. About 20ish years ago, my older brother introduced me to programming. Sure, it was just BASIC and assembler, but then I started looking at other things out there like Fortran, APL and others (and yes, eventually even FP and other styles). My brother, having gone in another direction, wanted to get back into it not long ago. He asked me where the new stuff was. I said that's all there is. He does not understand why we still write software the EXACT same way we did 20 years ago. Sure, there are some cool tools and such, but nothing actually new. And it seems responses to this go in two directions: denial, or attack the messenger. But discussing the issue seems to be taboo because it seems to invoke a sense of personal attack. Maybe that's the PC culture we live in, where we can't mention issues anymore. I don't know.

On this forum, I see people who think that functional languages are somehow different. Yes, they are more mathematically inclined and may be more interesting as far as proofs and research are concerned, but they are not that different. They do have an execution model just like any other language. I tried to explain it, but this has fallen on deaf ears.

So my point is that even if I were to attempt to explain my ideas, they contradict accepted notions, especially about functional languages. So it would serve no purpose but to make people believe I came here to start a flamewar, since I'm obviously wrong. But at least the negative comments here have given me confirmation that there is a real denial going on. This by itself has answered a great many questions. So at least some good came of it.

Thanks to everyone who gave actual documentation and comments on the topic at hand. They have helped. And apologies if I was too disruptive. To the rest, well... perhaps looking outside the boat from time to time would help a little.

Just for fun...

Just to confirm your complex, I'll drag out an old program of mine that uses perhaps the simplest PL ever created. :-)

>+++++++[     <       +     +       +     +       +
++            >-     ]<     +[     ->     [-     ]>
++            ++     ++     [<     ++     +++    ++
>-            ]<.>+++++     ++     +[     <+ +   ++
+++++>-]<     +.-------     --     --     .+  +  ++
       ++     ++     ++     ++     +.     --   - --
       --     .[     -]     >++++++[<     ++    +++
++>-]<.[-     ]+     ++     +++++++++     +.     <]

Just a suggestion

Next time, try to sound less like a troll. Whenever this blog's regulars ask a question, don't throw it back in their faces while playing the martyr. Rather, try to answer. Whenever people don't understand what you write, don't invoke denial, close-mindedness or anything of the sort, especially when what you write is so unclear.

Just my two cents. Good luck with what you're trying to do.

Hard to do

Unless your selections are trivial, like hinting that a given container will be accessed in more of a linear fashion than a random one, what you're describing is easily a PhD thesis's level of effort. Make sure when you explain it to me that you use small words! :o)

[Oops, this should be under the comment just above it -- anyone know how to move it?]

Vorlath's blog

http://www.my.opera.com/Vorlath/blog/

Well, at least the fact that you have a blog whose sole purpose is to talk about this very issue means that you're not a troll, or maybe a very good one. :)

I think you should look at COSA

See http://www.rebelscience.org/Cosas/Reliability.htm

I think you would find this to be highly relevant. From what I can tell, the COSA guy wants software which works pretty much in the manner you propose. He sees it as being an entire operating system -- even, eventually, a new CPU architecture, one which eschews the von Neumann architecture for an inherently parallel one.

I only skimmed his site. Hard to tell if he's got anything concrete, or if it's just theoretical. Anyway, from what you've said in this thread, I think you would find COSA to be interesting.

Signal flow computing is of

Signal flow computing is of course not new. Before about 1960, most computing was done with wired-up components. After 1960, transistorized general purpose computers quickly became the implementation method of choice. At the same time, the signal flow method moved in the direction of digital signal processing, and was used where high throughput was needed. Laying out a computation as signal flow is actually a language in its own right, because one can then emulate or simulate the signal flow on a general purpose computer.

Vorlath's blog,

Since commenting seems to be broken on Vorlath's blog, I'll post my conclusion to the thread here:

What do you use to specify to the machine what it's supposed to do? Isn't that a language? Probably English? German? Even if it's telepathic, the machine of your dreams has to detect patterns in your neural signals, which to me is structure analysis (in some contexts also called parsing).

Statement one: Semantics is expressed through (not necessarily strict) structure, adhering to conventions.

And next, if you express conditionals in some order-independent fashion (with some clever time-independent interpretation of conditional patterns such as ?: etc.), at some point in time (!) there comes the need to build upon previously computed results. And there, sequential order enters the scheme.

Statement two: Sequential order matters in computation. Not in all cases, but in quite a few essential ones.

Unless you manage to disprove my two statements, I'm sorry to say that all your statements so far remain nebulous and unconvincing, and that the stuff invented previously by more or less mainstream researchers somehow already contains the rare glimpses of concrete ideas. There is nothing new under the sun.

Here is a Magic Wand...

...you are welcome to use it.

It will do absolutely anything.

Anything. Blow up the world, solve world poverty, it can do all and everything.

I mean that.

The problem is to specify, to the Magic Wand, what you want, sufficiently precisely that it does _exactly_ what you want, and _only_ what you want.

The Magic Wand is called a Computer.