What's up with Lua?

Is Lua the next big thing (see here for one example)? What's up with that?

No, but...

it's solidly in the low twenties of the TIOBE list, implying that it's about as easy to get a job programming Lua as a job programming Prolog or Ada. Not bad for something most people know from World of Warcraft macroing.

Probably Never Going to Blow Up

Lua just sits fairly low on the radar and keeps steadily growing in its niche.
That it's actually aimed at that niche, though, means it will most likely never be the next big thing. For example, it doesn't do regex, because a proper regex implementation would take too much space, so it has its own, non-standard pattern-matching engine. The small number of data types probably wouldn't sit too well with programmers at large. Little things like that tend to slow adoption.
People tend to see it as an extension language, not a competitor to, say, Ruby or Python.
Personally I really like Lua; the prototype approach to OO and the transparent merging of arrays and hash tables appeals to me (see the small sketch below). I actually use it for most of my day-to-day hacks, even if I would prefer a curly-bracket syntax.
But for most people its focus on embeddability means some features are left out that would make it a compelling alternative to "bigger" languages, and of course there is the library chicken-and-egg problem.
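
A quick illustrative sketch of those two points, in plain standard Lua (nothing embedding-specific here):

    -- one table works as both an array and a record
    local point = { 10, 20, color = "red" }
    print(point[1], point[2], point.color)    --> 10   20   red

    -- minimal prototype-style OO: delegate lookups via __index
    local Account = {}
    Account.__index = Account

    function Account.new(balance)
      return setmetatable({ balance = balance }, Account)
    end

    function Account:deposit(n)
      self.balance = self.balance + n
    end

    local a = Account.new(100)
    a:deposit(50)
    print(a.balance)                          --> 150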

Lua and pattern matching

it doesn't do regex, because a proper regex implementation would take too much space

I'm not sure who defines what a "proper" regex is ;) but Lua's patterns are adequate for most tasks that I've encountered. They include non-standard but useful features such as matching balanced delimiters and capturing character positions, and the pattern-related functions integrate well with Lua's iterator syntax, making many tasks smoother in Lua than even in Perl.
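
For instance, here is a rough sketch of a few of those features using only the standard string library:

    local s = "f(1, g(2, 3), 4)"

    -- %b() matches a balanced pair of parentheses
    print(s:match("%b()"))                    --> (1, g(2, 3), 4)

    -- an empty capture () captures the current string position
    print(("hello world"):match("()world()")) --> 7    12

    -- gmatch returns an iterator, so patterns plug straight into 'for'
    for word in ("one two three"):gmatch("%a+") do
      print(word)
    end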

And if Lua's patterns aren't up to the job, you can move up to Parsing Expression Grammars.
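
A tiny LPeg sketch, just to show the flavor (this assumes the lpeg module is installed, e.g. via LuaRocks):

    local lpeg = require("lpeg")
    local P, R, C = lpeg.P, lpeg.R, lpeg.C

    local digit  = R("09")
    local number = C(digit^1)                 -- capture one or more digits
    local list   = number * (P(",") * number)^0
    print(list:match("10,20,30"))             --> 10   20   30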

Pattern Matching

Well, that's the thing: they put in a fairly powerful pattern-matching engine, and from what I've read on the mailing lists it's actually capable of a few things most regex engines aren't, and it's all of 400 lines of code.
BUT it doesn't follow true regex syntax (okay, "true" isn't right; common (POSIX/PCRE) syntax), and it also can't do everything that regex can.
So it's a subtly new syntax you need to learn, one that, if you're already used to regex, will probably end up annoying you.
Now, that's not to say you can't do regex in Lua. LuaForge has a regex library called lrexlib, I think, that's complete and uses formal regex syntax, but it is a separate library (like LPeg).
It's kind of like the whole whitespace issue in Python: it's really pretty much a non-issue, but it's just different enough to annoy people who expect things to be like what they're used to.
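
To make the "subtly new syntax" point concrete, a couple of rough examples of where Lua patterns diverge from PCRE habits:

    -- the escape character is % rather than \, and the classes differ a bit
    print(("order #1234"):match("%d+"))   --> 1234   (PCRE would be \d+)
    print(("foo_bar99"):match("%w+"))     --> foo    (Lua's %w has no underscore, unlike \w)
    -- and there is no alternation: something like "cat|dog" needs two matches, or LPeg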

Lua Deserves More Mad PLT Props!

Lua is quite a nice little language suitable for easy extension and even easier embedding.

This is a very timely topic for me, as I am currently weighing the pros and cons of using Python versus Lua as an extension engine in a product for work. Lua would pretty much be a slam-dunk, except that the rest of the team knows Python already and had never heard of Lua before I mentioned it (well, except for one guy who is a World of Warcraft addict.)

In many ways, Lua is like the theoretical "Scheme with ALGOL syntax" that is supposedly all the world has been waiting for before it would throw wide its arms and embrace functional programming. Perhaps being of industrial (rather than academic) origin has hampered its recognition as a programming language of sufficient interest to researchers. If so, this is a shame, because Lua has all of the following (a brief sketch of several of them appears after the list):

  • First-class functions
  • Anonymous Functions
  • Proper tail call elimination
  • Closures and Lexical Scoping
  • Coroutines
  • Prototype-based Object System
  • Tiny Kernel with Easy Extension
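
A compressed sketch of a few of these, in plain Lua (nothing here is specific to any particular embedding):

    -- closures over anonymous functions
    local function counter()
      local n = 0
      return function() n = n + 1; return n end
    end
    local next_id = counter()
    print(next_id())                 --> 1
    print(next_id())                 --> 2

    -- proper tail calls: deep recursion in constant stack space
    local function countdown(n)
      if n == 0 then return "done" end
      return countdown(n - 1)        -- a tail call, so no stack growth
    end
    print(countdown(1000000))        --> done

    -- coroutines
    local gen = coroutine.wrap(function()
      for i = 1, 3 do coroutine.yield(i) end
    end)
    print(gen())                     --> 1
    print(gen())                     --> 2
    print(gen())                     --> 3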

The main thing creating my cognitive dissonance is that most of my potential users are likely to be familiar with Python and are not aware of Lua. I am also impressed with Python's many existing libraries for handling matrix mathematics and graphics. No such infrastructure has been developed for Lua (but please tell me if I am wrong!)

Perhaps being of industrial

Perhaps being of industrial (rather than academic) origin ...
Huh?

Lua is designed and implemented by a team at PUC-Rio, the Pontifical Catholic University of Rio de Janeiro in Brazil. Lua was born and raised at Tecgraf, the Computer Graphics Technology Group of PUC-Rio, and is now housed at Lablua. Both Tecgraf and Lablua are laboratories of the Department of Computer Science.

"academic"

Well, I think it's fair to say that "academic" in the PLT sense means either Friedman/Felleisen/Scheme, Wadler/Peyton-Jones/ML/Haskell, or the equivalent for SML or (maybe) O'Caml. PUC-Rio just doesn't have the breadth of publication and indoctrination necessary to become known through normal academic channels.

Oz?

I think you're implying a far more cliquish PLT academia than really exists. Where does Mozart/Oz fit into your categorization? Is it academic or industrial? How about Scala? Nomadic Pict?

Survey Says: Academic?

In most of these cases, my impression (perhaps an incorrect one) is that these tools have been created to explore different aspects of programming language theory, but were not originally created to attack industrial problems.

In many cases, the 'real world' side of things seems to be a post hoc effort; tools in search of problems. The fact that some of these languages happen to be used for industrial work often seems an incidental byproduct of the original research effort. Certainly no one would object to their language being used for industrial efforts, but I think a lot of the work required to turn an academic toolbox into something suitable for industrial use is uninteresting (and, more importantly, unfunded!) and so tends not to get done unless a suitable cadre of followers eventually chips in and fixes the threading, or memory allocation, or edge conditions.

I think we are so lucky to have so many great platforms available for exploration and building. My biggest problem is the wealth of choices available!

A little less circular than that

From The First Report on Scheme Revisited:

The principal themes of this series [of languages] were complex data structures with automatic pattern matching, and complex control structures with automatic backtracking. These languages were specifically envisioned as tools to support explorations into theorem proving, linguistics, and artificial intelligence, in much the same way that Fortran was intended to support numerical computation or COBOL to support business applications.

These weren't necessarily industrial problems (although the AI boom was an attempt to change that), but they weren't just aspects of programming language theory, either. Also, Sussman was always interested in applications in engineering and physics, and wanted a language that was suitable for expressing problems in those areas. SICM demonstrates this.

Similarly, ML was originally developed to support theorem proving. Haskell was developed to solve the problem of replacing Miranda. ;)

Precisely

In most of these cases, my impression (perhaps an incorrect one) is that these tools have been created to explore different aspects of programming language theory, but were not originally created to attack industrial problems.

That's my impression as well. But that was pretty much my point: I consider all of the languages I listed to be academic in origin (and in most cases still largely academic in usage), and yet as far as I know none of them fall either into the Friedman/Felleisen/Scheme or Wadler/Peyton-Jones/ML/Haskell camps that sean alluded to.

Point?

Well, I think it's fair to say that "academic" in the PLT sense means either [...]

This makes no sense to me.

Regardless of what you might have in mind, it certainly doesn't make PUC-Rio "industrial". As Brent's post points out, the PL theory roots of Lua are pretty obvious, much more so than for any non-academic language, with a few possible exceptions such as JavaScript.

Lua seems to have succeeded very nicely at exploiting PL theory in a particular niche. It's not only the PL theory that's led to Lua's success, of course -- good engineering and even marketing (in the sense of targeting a niche, at least) is also a big part of it -- but the theory certainly seems to have helped.

rephrasing

Let me (hopefully) clarify what I meant. "Academic" and "industrial" aren't exhaustive; "obscure" is another category. For better or worse, languages not connected to a few big PLT programs are obscure, and therefore have little impact on the course of research. It's the same in any field. Maybe the OP meant "industrial application" rather than "industrial origin."

I assume that by "PL theory roots" you mean "first-class functions and coroutines," in which case a number of languages have such roots. Lua seems nice from a brief glance, but it's a niche language, and IMHO will likely remain so.

the PL theory roots of Lua

the PL theory roots of Lua are pretty obvious

Hence my interest, of course ;-)

Facts, not flames

Lua 1.0 was developed as a data-description language for Petrobras, the Brazilian oil company. It's hard to be any more "industrial" than that.

PL theoretical features were added gradually as the language evolved -- prototypish objects in 2.0, anonymous functions in 3.1, proper lexical scoping and coroutines in 5.0... At this point, Lua is (like JavaScript) a nice fusion of Scheme and Self. But unlike Scheme or Self, it's never been a "research language" -- the focus all along has been on providing a practical, embeddable language engine (a la Tcl).

Highly recommended is the authors' HOPL paper, The Evolution of Lua.

Incidentally, one of the reasons (IMHO) that Lua is such an elegant language today is that it has discarded an awful lot of cruft along the way. With each major version, a number of features were thrown out to make way for the "new way of doing things". This was possible because Lua is typically embedded (so if new Lua breaks your code, just don't upgrade it for that app), but also (again, IMHO) because Lua isn't mainstream, and the smaller userbase is more tolerant of breaking changes. Conversely, JavaScript is the mess it is today because it went big so early, and has to support all of the mistakes made in the early days.

Also incidentally, one of the defining features of Lua (all along) that hasn't been mentioned in this thread is extensible semantics. In Lua, reading from or writing to a table can mean anything you want it to mean. So, lots of fancy features (inheritance, laziness, autocreation of subtables, functions with private state, memoizing, currying) can be implemented by the user in just a few lines, instead of having to be built into the language core.
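
For example, memoization is just a metatable away. This is only a rough sketch, but it is the general shape of all of the tricks listed above:

    -- memoize a function behind ordinary table reads via __index
    local function memoize(f)
      return setmetatable({}, {
        __index = function(t, k)
          local v = f(k)
          rawset(t, k, v)            -- cache, so f runs once per key
          return v
        end,
      })
    end

    local squares = memoize(function(n) return n * n end)
    print(squares[12])               --> 144  (computed on first read)
    print(squares[12])               --> 144  (served from the cache)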

Hybrid

Lua 1.0 was developed as a data-description language for Petrobras, the Brazilian oil company. It's hard to be any more "industrial" than that.

I stand by my claim that PUC-Rio is not "industrial". :) Lua is an unusual case, but it's the kind of case I hope we'll see more of.

PL theoretical features were added gradually as the language evolved

Perhaps my use of the word "roots" was misleading, but what I meant was that the authors of Lua have clearly studied PL theory and applied its lessons well to the design and implementation of Lua, and that's been true for some time, not just in the most recent major versions. The references in the Lua papers to academic work on implementation of closures and continuations, for example, make it clear that it's not a coincidence that Lua has managed to get these kind of things right. I don't know of a mainstream language that has such a good story in this area, and to me it's clear that the application of theory has a lot to do with that.

I Stand by my Claim! :-)

Please understand that I meant no insult when I said that Lua was of industrial (rather than academic) origin!

While I'm not trying to call into question the acumen or education of Roberto Ierusalimschy, Luiz Henrique de Figueiredo, or Waldemar Celes, I feel that Lua was the result of an attempt to solve a real-world industrial problem rather than an effort to generate a new programming language from first principles. They were simply trying to solve a problem:

The Beginning
Our first experience at TeCGraf with a language designed in-house arose in a data-entry application. The engineers at PETROBRAS (the Brazilian oil company) needed to prepare input data files for simulators several times a day. This process was boring and error-prone because the simulation programs were legacy code that needed strictly formatted input files -- typically bare columns of numbers, with no indication of what each number meant. Of course, each number had a specific meaning, which the engineers could grab at a glance, once they saw a diagram of the particular simulation. TeCGraf was asked by PETROBRAS to create several graphical front-ends for this kind of data entry. The numbers could then be input interactively, just by clicking at the relevant parts of the diagram -- a much easier and meaningful task than editing columns of numbers. Moreover, it opened the opportunity to add data validation and also to compute derived quantities from the input data, reducing the amount of data needed from the user, and increasing the reliability of the whole process.

To simplify the development of these front-ends at TeCGraf, we decided to code them all in a uniform way, and so we designed a simple declarative language to describe each data entry task [12]
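
Purely as an illustration (every field name below is invented), the kind of declarative description that lineage points to still looks perfectly natural in today's Lua:

    -- hypothetical data-description record; the names here are made up
    well = {
      name     = "W-07",
      depth_m  = 2450,
      pressure = { unit = "kgf/cm2", value = 172.3 },
      valves   = { "V1", "V2", "V3" },
    }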

It's very interesting that the resulting language (after several revisions) now contains many elements from functional programming, as well as fully encapsulating the lambda calculus. It makes me think of some observations made by our own Dr. Van Roy; perhaps truly useful languages do converge on similar functionality.

Inevitability

The Lua designers don't claim to have reinvented functional programming features, afaik, so the convergence in this case seems to have been a deliberate choice to move in a certain direction.

That said, to paraphrase a famous saying, "those who do not understand the lambda calculus are doomed to reinvent it, poorly." The second you put first-class functions in a language -- and Lua has had those from early on -- you're pretty much doomed to implement a complete lambda calculus eventually, or else come up with excuses for why you're not going to. The first language to go through this process was Lisp, with the funarg problem. More recently, languages like Python and Java have struggled to provide restricted lambdas without introducing semantic problems or unnecessarily limiting expressivity, but it's a losing battle. Ultimately, most possible restrictions in this area are arbitrary, and cause more problems than they solve. Lua is refreshing in that it's embraced the inevitable. :)
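
A trivial sketch of what "embracing the inevitable" buys you in practice: unlike a restricted lambda, a Lua anonymous function is just a function, with a full statement body and ordinary closure over enclosing locals.

    local log = {}
    local handler = function(event)
      table.insert(log, event)       -- full statement body, closes over 'log'
      if event == "quit" then return false end
      return true
    end
    print(handler("click"))          --> true
    print(handler("quit"))           --> false
    print(#log)                      --> 2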

matrix mathematics and graphics

Someone told me:
"... maybe Brent should take a look at NumLua and LuaSDL. Or in general try a s/Py/Lua/ on some package names. ;-)"

Lua 2.5 in OCaml

Norman Ramsey implemented Lua 2.5 in OCaml. The implementation is part of the Quick C-- compiler, which uses it as an embedded language to configure the compiler. You can find the latest stand-alone release of Lua-ML here: http://www.cminusminus.org/rsync/dist/lua-ml.tar.gz.

If you are implementing in OCaml and looking for a configuration language, I recommend you check it out. If you are looking for an extended example besides Quick C--, you could take a look at my Quest tool.

Lua has actually been big for a while

But only as an embedded language. It has a definite niche as an embedded scripting language, especially for commercial video games.