Have scripting languages peaked?

Well, since language popularity seems to be overly emphasized in any discussion of PLs, I thought the following article, Have scripting languages peaked?, would be of interest.

A recent study from Evans Data Corp... The study, published on Thursday, found that the number of people using PHP for development in Europe, the Middle East and Africa fell by over 25% last year, and that the number of developers who would not evaluate PHP for future products grew by almost 40%. Similarly, Perl and Python also saw significant drop-offs in usage and planned usage.
I think of Perl, PHP, Python and Ruby as mainstream programming languages, but it would seem that many consider them to be Strange or Exotic.

Wouldn't put a lot of faith in these numbers

C# is probably picking up some PHP users, granted. But these surveys are usually funded by some vendor with an axe to grind.

I'll believe it when it's peer reviewed. ;-)

Random sample of one

Got approached out of the blue to do some PHP work last night, so I'd have to say they've still got big momentum. (Now if someone would just approach me to program in ML, I'd be a happy camper).


Now if someone would just approach me to program in ML, I'd be a happy camper

Amen to that.

I am working on a fun little project for a doctor in Haskell. When I first met him he said that he was looking for a C++ programmer, but I quickly convinced him otherwise.

ML vs Scripting Language for prototyping

I do a lot of prototyping and "proof of principle coding" of message processing systems. My tool of choice has always been Tcl (which has great networking and event handling support). Since this was "just" prototyping, it wasn't too hard to convince my employer to let me use something other than C++ (our mainstay language). I had to justify the language choice (the prototypes feed into system design, so they have to be understandable by other developers), but Tcl is somewhat familiar to people here.

Now, how would I justify using something unfamiliar like ML? Well, a recent proof of principle needed to demonstrate handling several hundred concurrent connections from client apps. The scripting languages I tried (Tcl and Perl, using event dispatching techniques to simulate concurrency) simply couldn't scale enough to make the prototype feasible. How about Java or C++? OS-level threading would have been a nightmare for the massive number of concurrent connections. However, Concurrent ML easily supported the scenario I was prototyping.

So I was able to use ML for several months on my job, because otherwise the prototype wouldn't have been completed on time.

Now, of course, there will be the interesting task of casting a design fed by a COP/FP approach into C++...

OK, I'll Bite...

...why not put the Concurrent ML into production, perhaps after some training in Concurrent ML for the rest of the team? Unlike Tcl, it's compiled to native code, right?

This is actually one of my key arguments for natively-compiled interactive languages: it dramatically shortens the distance from "proof of concept" to "robust, production quality."

I agree

I agree... if it isn't already, get your code up to production quality, and then convince your boss you can either deploy production-quality code now, or spend a year re-implementing it in C++.

ML programs have a tendency, on average, to be a bit more robust than C++ programs. Not to mention that they are automatically portable unless you specifically do something that makes them non-portable, such as calling C libraries.

(I must admit, the SML 97 language definition is one decided advantage that SML has over Haskell.)

I must admit, the SML 97 lang

I must admit, the SML 97 language definition is one decided advantage that SML has over Haskell.

How so? Especially given that "mere" unextended SML seems not to be as popular as using an extended version.


The SML definition is more rigorous than Haskell's, and in my experience SML is a more stable language, with greater consistency across implementations and versions thereof.

I remember porting some code one of my advisors wanted to use in class from an older version of Hugs to a newer one. I don't recall exactly what it was, but I remember the problem was that the code used end-of-line comments immediately followed by an asterisk... which exposed a lexical inconsistency between the versions.

Not to mention that there are more implementations of SML.

ML (FP) vs "real world"

My project (~24 C++/Java developers) is a very large mission-critical, soft-realtime, multi-year effort. I've made progress integrating Tcl into the (highly C++-based) system with great success (it was a prototype for a message checkpointing server that performed so well we never rewrote it in C++). Running over 1 year w/ no downtime so far... Any replacement for the Tcl app must best it (the app is network bound, so increased code performance isn't as important as robustness and reliability).

The Concurrent ML work was done while prototyping the next generation of the system and will also serve as a model (or perhaps an informal specification?) for a core part of the new system. It will be interesting to see how long the estimated schedule will be for the C++ equivalent (but that's likely at least 6 months away).
And since the ML apps will be fully functioning, they will be used for comparison testing throughout development (the C++ apps must equal or better them).

I'm quietly (shhhh...) building up grassroots support for using ML (got one developer hooked). I am responsible for the overall system architecture/design of the next generation of the system (which is why I spend most of my time prototyping stuff rather than writing production code), so I am introducing stuff like SPIN (for model checking the whole system) and ML for prototyping new components.

So, while I doubt that the project will be deploying much ML in the future, it may be an integral part of its specification and a model for its performance.

Random testing

Might I suggest that for any ML app that does have to be rewritten in C++, you use the ML code for random testing of the C++ application - identify functions or operations in the C++ version that correspond to ML code, and flag as a bug any difference in operation between ML and C++?
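That harness is easy to sketch. A minimal version in Ruby, where the hypothetical `reference_impl` and `new_impl` stand in for the ML oracle and the C++ rewrite:

```ruby
# Differential (random) testing: treat one implementation as the oracle
# and flag any input on which the two disagree.

# Hypothetical stand-ins for the ML oracle and the C++ rewrite.
def reference_impl(xs)
  xs.sort
end

def new_impl(xs)
  xs.sort_by { |x| x }
end

# Generate random inputs and collect every disagreement as a bug report.
def differential_test(trials: 1_000, seed: 42)
  rng = Random.new(seed)   # fixed seed so failures are reproducible
  failures = []
  trials.times do
    input = Array.new(rng.rand(0..20)) { rng.rand(-100..100) }
    failures << input unless reference_impl(input) == new_impl(input)
  end
  failures
end
```

Here the two implementations agree, so `differential_test` returns an empty list; in the ML-vs-C++ setting each entry would be a concrete input to file as a bug.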

reminds me

Didn't Paul Graham make a comment regarding dynamic languages to the effect that they are always being used for prototyping, but the final implementation in a mainstream language hardly ever seems to work as well as the dynamic prototype?

Not to flame ML or overly praise C++, but...

... C++ has numerous alternatives to multiprocessing/multithreading for handling simultaneous I/O connections. (Assuming, of course, one is willing to venture beyond the standard library and into OS-specific libraries; OTOH if you are discussing multithreading you have done this already).

I/O multiplexing (select or WaitForMultipleObjects) is one way to do it.

Asynchronous I/O is another.

Coroutines ("Lightweight processes" on many Unix systems, "Fibers" on Windows) are a third. (Just because C/C++ doesn't have coroutines in the standard hasn't kept OS suppliers from adding 'em to the language.)

After all, oodles of highly-concurrent, I/O bound middleware have been coded in C++ (including lots of high performance apps like RDBMS's, webservers, and such); many instances of these things scale well beyond what one might expect with a one-thread-per-connection implementation.
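The multiplexing idea is language-neutral; a minimal sketch of the select approach, written here in Ruby (this thread's other recurring language) rather than C++ for brevity, with pipes standing in for network connections:

```ruby
# I/O multiplexing: one thread watches many descriptors with select,
# instead of dedicating a thread (or process) to each connection.
r1, w1 = IO.pipe   # pipes stand in for client connections
r2, w2 = IO.pipe

w2.write("ping")   # only the second "connection" has data pending

# Block (up to 1s) until at least one descriptor is readable.
ready, = IO.select([r1, r2], nil, nil, 1)

# Service only the descriptors that are actually ready.
msgs = ready.map { |io| io.read_nonblock(1024) }
# r1 is skipped entirely because it has nothing to say
```

A real server would loop, adding and removing descriptors as clients connect and disconnect; the per-connection cost is a table entry, not a thread.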

Selection Criteria

OK, so given that we have Concurrent ML and a program written in it, and C++ and library X for concurrency, what's the motivation for rewriting? Note that "because no one else knows Concurrent ML" is a reasonable response in a production setting; I just wonder whether there wouldn't be equal or greater benefit to Concurrent ML training for the team.

ML in production?

I'd love to see the team using ML. But I'm taking baby steps. It's quite an accomplishment to see the team using Tcl (hey, Cisco routers use it internally, so it must be okay ;-).

TiVo uses Tcl, too

TiVo uses Tcl for its settop boxes.

If you have an existing working system...

...which meets your requirements: by all means, keep on using it. I might be tempted to choose a language other than the usual industrial suspects for such a project (though I'd probably look at Erlang over Concurrent ML for the sort of project you described).

Porting code from ML to C++ does seem kind of retrograde, after all... OTOH, such a port might be ordered if ML training is deemed more expensive than undertaking such a port.

My post was only intended to point out that the implication that C++ is limited in its ability to do high-performance concurrent I/O is not true. A thread-per-connection approach would certainly suck bigtime; however, C++ isn't limited to such an architecture.

Neither is Java

A thread-per-connection approach would certainly suck bigtime; however C++ isn't limited to such an architecture.

Just for the record, as some ancestor post mentioned both C++ and Java: Java is not limited to that either (not since the introduction of java.nio, most importantly non-blocking input/output). The interesting point is that most of the popular Java "frameworks" didn't jump to nio immediately. Was it a fear of problems in implementations of nio (not completely unfounded)? Was it an informed preference for a stable framework codebase over potential performance/scalability benefits?


"C# is probably picking up some PHP users"

Err... I would guess Ruby is taking PHP's users by storm, given the Ruby on Rails web framework and the fact that it's a much better-designed and incredibly modular language. Note that it doesn't appear anywhere in the study: they just cite PHP, Perl and Python... where's the R in LAMP?

C# and ASP.NET are probably picking up JSP developers or something, not people who know it better...

The paradox of operator overloading

It both enhances and denies readability. It enhances readability when the overloaded operators are those commonly used in the domain. Having to use the names add, subtract, multiply, divide and prefix notation (in an infix language) for bignums or complex numbers is just awful. Having operators, overloaded or not, for string concatenation and pattern matching is sweet. Letting += mean "add to a collection" is pretty good once you get used to it. Random use of random operators for random classes is awful.
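A small Ruby illustration of the "good" end of that spectrum (the `Vec` class is a made-up example):

```ruby
# Overloading + where it reads naturally in the domain (vector addition),
# rather than forcing add/prefix notation.
class Vec
  attr_reader :x, :y
  def initialize(x, y)
    @x, @y = x, y
  end
  def +(other)
    Vec.new(x + other.x, y + other.y)
  end
  def ==(other)
    x == other.x && y == other.y
  end
end

v = Vec.new(1, 2) + Vec.new(3, 4)   # infix beats Vec.add(Vec.new(1, 2), Vec.new(3, 4))

# And << as "add to a collection": odd at first, readable once familiar.
log = []
log << "connected" << "authenticated"
```

The operators pay off because + and << already mean these things to readers in these domains; the same symbols on a random class would be noise.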

The poor language designer just can't win.

Ed: Did the above get posted to the wrong thread???

...seems like it belongs on the recent operator overloading thread.



Had something similar happen the other day

Thought it was just my mistake. But perhaps the weblog software has a problem somewhere. Either that or it's two cases of PEBKAC. :-)

Yes, I fat-fingered it.

I had several different tabs open on different LtU Forum pages, and entered my comment in the wrong one.

Where are they going?

I wanted to question this study, but a quick google seemed to show that EDC is reputable, and not in M$'s pocket. But what people are considering instead of PHP is a big question. It would be exciting if Ruby could be the next big thing, but I don't know anything about it (in terms of the spread of its adoption).
Anyone care to enlighten me?

Let's hope the Next Thing is better than Ruby

Over the last year I've been doing a lot of Ruby as I convert an inherited website from PHP. It was about a year and a half ago that I decided to leave Java for something better, and I ended up with Ruby, though I wouldn't be surprised if I would have been just as happy with Python. I picked without any real knowledge of the two; Ruby just felt a little more "OO" to me at the time.

But Ruby, for all of its good things, has an astounding number of faults. The entire language itself, and much of the standard library, is written in C, not Ruby, which doesn't say a lot for Ruby's power. The type checking is dynamic only, and the nature of the language is such that there's really not much hope of changing that. There are a lot of obscure syntactical and semantic tricks to the language; it doesn't look as if a lot of thought was put into the language design, especially in terms of keeping it orthogonal. And there are too many different ways to say the same thing, which makes the language harder to learn and leads to wildly varying programming styles.
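For instance, here are four equivalent ways to square the elements of an array in Ruby, all of which show up in real code:

```ruby
xs = [1, 2, 3]

a = xs.map     { |x| x * x }   # the usual spelling
b = xs.collect { |x| x * x }   # collect is an alias for map (Smalltalk heritage)
c = xs.map do |x|              # do..end blocks, interchangeable with {}
  x * x
end
d = []
xs.each { |x| d << x * x }     # or spell the loop out yourself
```

All four produce [1, 4, 9]; a codebase that mixes them freely forces readers to recognize every spelling.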

Ruby's got some good stuff in it, particularly in terms of support for building domain-specific languages within the language, but I think we can do a lot better than this.

Just speed, I think.

The entire language itself, and much of the standard library, is written in C, not Ruby, which doesn't say a lot for Ruby's power.

I think it only says something about Ruby's speed as compared with C, if you're talking about having a core Ruby which the rest of Ruby is built on. If you're talking about writing a Ruby compiler in Ruby, well, Ruby only compiles to bytecode anyway, right?

Will your comment still be relevant when (if?) Parrot is done and Ruby is implemented on it?

(I don't know a whole lot about Ruby, having never used it in anger.)

It appears to me that there a

It appears to me that there are various functions in Ruby that cannot be implemented in Ruby itself. Object#instance_eval would be an example of one.
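For readers who haven't met it: `instance_eval` evaluates a block with `self` rebound to the receiver, so the block can reach otherwise-private state, a capability that's hard to see how to express in unprivileged Ruby (the `Account` class below is a made-up example):

```ruby
class Account
  def initialize
    @balance = 100   # no accessor defined; normally invisible from outside
  end
end

acct = Account.new
# instance_eval rebinds self to acct, so its instance variables are in scope:
secret = acct.instance_eval { @balance }
```

After this, `secret` holds 100 even though `Account` never exposed its balance.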

Dog food

Turtles all the way down, from Avi Bryant, discusses some of the features that can't be accomplished in Ruby because of the reliance on non-Ruby libraries.

There's more than one way to do it

Ruby was influenced by Perl, so it's not surprising that it supports varying programming styles.

say again?!

"The entire language itself, and much of the standard library, is written in C"

Ruby, the _language_, written in C??!

Anyway, that's the point of scripting languages: to serve as fast and flexible glue for natively compiled software. That way, the speed bottleneck is almost eliminated, since most of the operations and libraries you use in such languages are just high-level wrappers for primitive C functions.
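Ruby's standard library even ships a generic dynamic-linking interface (Fiddle) for exactly this kind of glue. A sketch calling libc's `strlen`, assuming a Unix-like system where `dlopen(nil)` exposes the symbols already loaded in the process:

```ruby
require "fiddle"

libc = Fiddle.dlopen(nil)   # handle to symbols already loaded in this process

# Wrap the raw C function: size_t strlen(const char *s)
strlen = Fiddle::Function.new(
  libc["strlen"],
  [Fiddle::TYPE_VOIDP],     # argument: char *
  Fiddle::TYPE_SIZE_T       # return: size_t
)

n = strlen.call("scripting")   # a primitive C function under a thin Ruby wrapper
```

Here `n` comes back as 9, the C-level string length, with the scripting layer doing nothing but marshalling.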

"The type checking is dynamic only"

Well, that's why Ruby is a great representative of dynamically/latently typed languages as opposed to statically typed languages like C++. If you've been roaming around for some time, you may have already noticed that both camps have a lot of fans...

Besides, it is exactly this very feature that makes Ruby (and Python and others) so incredibly flexible and allows so many sweet non-orthodox programming "tricks"...

"There are a lot of obscure syntactical and semantic tricks to the language; it doesn't look as if a lot of thought was put into the language design, especially in terms of keeping it orthogonal. And there are too many different ways to say the same thing, which makes the language harder to learn and leads to wildly varying programming styles."

It comes from its Perl heritage, though as Ruby becomes more popular and gets discovered by javaheads, Matz (the main designer) is intending to remove some such convenient syntactic sugar. I think it's a shame that in order to make a language popular, it's gotta pay lip service to small-minded people...

"I think we can do a lot better than this"

well, just spill the beans then, genius...

"I think we can do a lot better than this"

My guess is that he's thinking of Smalltalk.

except he doesn't seem happy

except he doesn't seem happy with the dynamic typing...

Then maybe he should not have

Then maybe he should not have used a "scripting" language, as pretty much all of them are dynamically typed. Anyway, everyone knows that real men write programmes in strongly-typed languages like C, rather than wasting time with scripts.

"Anyway, everyone knows that

"Anyway, everyone knows that real men write programmes in strongly-typed languages like C"

said the man who loves Scheme.

and btw, Ruby and Python _are_ strongly typed, just dynamically, though...
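Concretely, "strong but dynamic" means no declarations are required, yet type mismatches are detected and raised at run time rather than silently coerced. In Ruby:

```ruby
# Dynamic: no type declarations anywhere; x can be anything that supports +.
def double(x)
  x + x
end

double(21)     # works on Integers
double("ab")   # and on Strings, via the same code

# Strong: mixing incompatible types raises TypeError instead of coercing.
caught = false
begin
  1 + "1"
rescue TypeError
  caught = true   # Ruby refuses; a weakly typed language would coerce or worse
end
```

The check happens late, but it does happen, which is the whole distinction being argued over in this subthread.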

and btw...

and btw, C isn't strongly typed.

Ah, but after a 53-hour debug...

C might be typed strongly. ;)

and btw, Ruby and Python _are

and btw, Ruby and Python _are_ strongly typed, just dynamically, though...

Except that your definition of strongly typed probably varies greatly from other people's. If we take a poll, we'll find that many folks have subtly different and frequently incompatible definitions of "strong", "static", "weak", "dynamic", "latent", "inductive", "favorite buzzword" typing. As a result, people argue in circles about these concepts and never seem to get anywhere, failing to realize that they're using the same words for different things.

type safety

And that's why we talk about type safety - not strong/weak typing.

So what is strong typing? Programming Languages: Application and Interpretation p205

well, just spill the beans th

well, just spill the beans then, genius...

That's pretty confrontational. Why don't you relax, have a cup of tea, and tone it down a bit?

I don't know yet what language is going to be better than Ruby for my purposes, though I'm actively investigating that right now. I think that spending some time doing a serious, reasonably large project in Ruby that is similar to one I'd done previously in Java has been a valuable experience; I now have a much better understanding of what a "scripting" language is and what its limitations are.

Ruby gets compared with Smalltalk quite a lot; it's interesting to look at the differences. The good side of the "scripting" languages is that they're much better integrated with the operating system they run on; the traditional Smalltalks are another world. (Gnu Smalltalk may fix some of this; I've not yet investigated this, though.) The bad side is that, unlike Smalltalk, which in most cases is self-hosting on a very small kernel, the scripting languages tend to be heavily dependent on C code to do some of the heavy lifting that they're incapable of doing themselves.

But I don't see these two things as inherently mutually incompatible, as opposed to accidents of implementation. Is it bad to ask for both?

Another area is the dynamic typing. Many folks give the strong impression, if they don't say so outright, that dynamic type checking is not compatible with static type checking, and you just can't have both in the same language. I don't buy this proposition at all. I think a really powerful language is going to do type checking as early as possible, doing it at compile time when it can, and at run time when it has to. (In fact, languages such as Java already do this.) And I see no reason you can't have the ability to declare type constraints where useful while still not forcing the programmer to declare a type constraint for everything (something that Java has not achieved). And, to go further, I see no reason why you couldn't have type inference on top of that.
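Ruby can't check anything at compile time, but the "declare constraints where useful" half of that wish is easy to emulate at run time. A hypothetical `typed_attr` helper, as a sketch of opt-in checking:

```ruby
# Opt-in type constraints: a made-up helper that checks a declared type
# at assignment time, while other attributes stay unconstrained.
module TypedAttr
  def typed_attr(name, type)
    attr_reader name
    define_method("#{name}=") do |value|
      raise TypeError, "#{name} must be a #{type}" unless value.is_a?(type)
      instance_variable_set("@#{name}", value)
    end
  end
end

class Point
  extend TypedAttr
  typed_attr :x, Integer   # constrained where it's useful...
  attr_accessor :label     # ...unconstrained where it isn't
end

pt = Point.new
pt.x = 3           # fine
pt.label = :start  # anything goes here
# pt.x = "three" would raise TypeError at the assignment site
```

The checks still fire only at run time, of course; pushing them earlier is exactly what the gradual/optional typing work referenced below this comment is about.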

sounds familiar

I think a really powerful language is going to do type checking as early as possible, doing it at compile time when it can, and at run-time when it has to

As previously discussed on LtU?
Static Typing Where Possible, Dynamic Typing When Needed

Or just decade old Dylan's optional type declarations?

languages such as Java already do this
Sounds familiar; previously on LtU: "By definition, it's what dynamically type checked languages don't try to do - static type checking.

Every time I see this dynamically checked language versus statically checked language dichotomy, I have to remind myself that the statically checked language may also use dynamic type checks."

C buys you lots of libraries

The huge advantage of having a system written in C is that, if the language/C interface is done reasonably well, you potentially get access to all kinds of libraries that export a C API. Living in "another world" is of little practical benefit for most users of scripting languages.

And what, you can't get acces

And what, you can't get access to things written in C without writing your language itself in C, and without making your language too weak to be self-hosting?

I think this is another example of a false dichotomy: you're trading off something you don't need to give away in order to get linkage to C libraries.


Why do you care if the language is self-hosting or not? I assume Ruby could be self-hosting, but it would just be rather slow. The point of scripting languages (as described in Ousterhout's original paper) was not to replace C or other fast compiled language implementations. Rather, the observation was that many programs written in compiled languages end up looking quite a lot like an interpreter. It would be nice if you could use the same language for both—if the parser and evaluator for your language was part of the standard library, for instance—but I don't know of any language where this is the case.

What about lisp?

Nearly every Common Lisp implementation is compiled but also exposes eval. It will also expose compilation. So there is at least one counterexample.


Yes, I should have thought of Lisp. I wouldn't characterise it as a counter-example, though. I was arguing that this is a good thing.

Not every language is as flexible as Lisp, though. Perhaps the real point here is that scripting languages tend to be implemented in languages like C and Java not so the scripting languages can get access to vast libraries, but rather because C and Java would be so crap as interpreted languages that people create scripting languages on top of them to make up for it. (Assuming you agree that many applications end up looking like interpreters.)

I'm not really sure anymore where I'm going with this argument, except to say that it would be interesting to see a comparison between a language like Lisp and a combo of e.g. Tcl+C for whole application development. (Rather than comparing Lisp to just Tcl or just C).

Yes, but...

Sure you can get access... perhaps it's more of a cultural thing, though. People happily go to work adding interfaces to all kinds of C libraries from their favorite scripting languages. On the other hand, I've perceived a reluctance toward this approach in languages like Lisp or Java that have a strong culture of wanting everything in X, all the way down.

Perhaps it's an instance of 'worse is better'... while the Lisp guy is developing a beautiful image manipulation system that feels very Lispy and performs well, the Tcl guy is busy using the Tcl interface to ImageMagick.

Are foreign function calling systems used heavily out there? It's something I've never really cared for, for some reason... I like the dual language approach - writing C in C, instead of trying to write C in Tcl/Python/whatever.

Holy Grail

It's probably worth reiterating that good Common Lisps do this: infer as many types as possible, allow annotation when it's either necessary to resolve ambiguity (very rare) or when it helps make performance objectives clear (still rare, but not quite as much), and when all else fails, punt to totally generic implementations (and print a warning that a potential optimization failed).

Really, the program I think most of us on Lambda are interested in is one in which we have an expressive inferred type system but without erasure. We'd like lots of safety but we'd like not to give up reflection or even "I know what I'm talking about even if the compiler doesn't". It's an open area of research where these lines get drawn, and that, in the end, is why we're here.

A VM for Ruby

You're right on some points.

Ruby has no virtual machine, and this influences the design of the language: without a clear model/bytecode to target, you can easily end up adding unorthogonal features when updating the interpreter.

I met Ruby community people this week at OSCON, as well as "Matz" (Ruby's creator), and they're really nice folks. I'll soon release the Neko virtual machine and after that start working on running Ruby on top of it. If it succeeds, this might end up being the main Ruby implementation, and might then fix some of the language's semantic problems.

This won't change the fact that the language is dynamically typed, which some people like and some don't.

Numbers Do Not Include U.S.A., India, China, etc.

As clearly stated, the survey does not include the North America and Asia-Pacific regions, which dominate software development.

In judging trends world-wide, consider this graph of PHP usage from Netcraft.