LtU Forum

Fabula programming language

A few years ago I presented my general-purpose Aha! programming language here. Not long ago I decided to move into web development and took a more pragmatic approach. The result is Fabula, a new programming language for developing web applications. I also developed a Fabula interpreter and then a complete online programming system in Fabula. Those who are interested can have a look at fabwebtools.com. Any feedback is welcome.
Enjoy!

Subclass, superclass, or siblings under an abstract superclass?

The usual pattern in software development is to make the simplest thing the superclass and things with more complex behavior its subclasses.

But what happens when the values of the so-called superclass are actually a proper subset of the values of the subclass?

For example, in a neural network program there are two classes: "Neuron" and "Neural_Structure." A Neural_Structure has zero or more input points and one or more output points, one or more of which are also backpropagation feedback points. An example of a Neural_Structure would be a unit in a "pooling layer", which takes three inputs and then forward-propagates one of them - whichever has the greatest absolute value, or the most-positive, or the most-negative value.

And the immediate impulse is that Neural_Structure should be a subclass of Neuron, because it is the more complex object: to make it, you need to add methods and override a bunch of Neuron's methods.

But Neural_Structure generalizes Neuron rather than restricting it. It's the difference between an upper-bound and a lower-bound relationship on the type. A Neuron is a Neural_Structure with one input point and one output point, which is also a feedback point.

And in most PLs we don't have a mechanism that would let us say "class Neural_Structure generalizes class Neuron", add methods or abstract methods or override a bunch of Neuron's methods, and then use a Neuron anywhere a Neural_Structure is required.

So I wind up with an abstract superclass Neural_Structure where Neuron and Pooling_Node are both subclasses, and that's obviously the correct structure under classical OO paradigms. But "generalizes" rather than "extends" would have been more efficient, because Neural_Structure adds overhead and handling that Neuron doesn't need but which Neuron inherits.
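
A minimal Scala sketch of that arrangement, for concreteness (the class names follow the post; the particular method set, the single-weight Neuron, and the max-by-absolute-value pooling rule are my own simplifying assumptions, not the poster's code):

// Abstract superclass: the general notion of a structure with input
// points, output points, and backpropagation feedback.
abstract class NeuralStructure {
  def inputArity: Int
  def outputArity: Int
  // Forward-propagate input activations to the output points.
  def forward(inputs: Seq[Double]): Seq[Double]
  // Back-propagate errors arriving at the feedback (output) points.
  def feedback(errors: Seq[Double]): Seq[Double]
}

// A Neuron is the special case: one input point and one output point,
// the output point also serving as the feedback point.
class Neuron(weight: Double) extends NeuralStructure {
  val inputArity  = 1
  val outputArity = 1
  def forward(inputs: Seq[Double])  = Seq(inputs.head * weight)
  def feedback(errors: Seq[Double]) = Seq(errors.head * weight)
}

// A pooling node takes several inputs and forwards the one with the
// greatest absolute value; feedback is routed back to that input only.
class PoolingNode(val inputArity: Int) extends NeuralStructure {
  val outputArity = 1
  private var selected = 0
  def forward(inputs: Seq[Double]) = {
    selected = inputs.indices.maxBy(i => math.abs(inputs(i)))
    Seq(inputs(selected))
  }
  def feedback(errors: Seq[Double]) =
    Seq.tabulate(inputArity)(i => if (i == selected) errors.head else 0.0)
}

Note that nothing here stops a Neuron from paying for the Seq-based plumbing it doesn't need, which is exactly the inherited overhead complained about above.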

Thoughts?

Joining Forces: LVars & CvRDTs

More than a year old now, but I couldn't find it being mentioned on LtU. Apologies if this is a repeat / old news to everybody.

Joining Forces.
Although CvRDTs and LVars were developed independently, LVars ensure determinism under parallel execution by leveraging the same lattice properties that CvRDTs use to ensure eventual consistency. Therefore, a sensible next research question is: how can we take inspiration from CvRDTs to improve the LVars model, and vice versa? In this paper, we take steps toward answering that question in both directions: we consider both how to extend CvRDTs with LVar-style threshold reads and how to extend LVars with CvRDT-style inflationary updates, and we advocate for the usefulness of these extensions.
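
For intuition, here is a hypothetical Scala sketch (not the paper's API): a grow-only set treated as a join-semilattice, with a CvRDT-style inflationary update (the state only moves up the lattice via join) and an LVar-style threshold read (the read answers only once the state has passed a given threshold, which is what keeps reads deterministic under parallel or merged updates):

// Hypothetical sketch, not the paper's API.
final case class GSet[A](elems: Set[A]) {
  // Least upper bound of two replica states; all updates are inflationary.
  def join(other: GSet[A]): GSet[A] = GSet(elems union other.elems)

  // Threshold read: reveals only that the state is at or above the
  // threshold element. A real LVar read would block until then; here we
  // simply return None when the threshold has not yet been reached.
  def thresholdRead(threshold: A): Option[Unit] =
    if (elems.contains(threshold)) Some(()) else None
}

object GSetDemo extends App {
  val r1 = GSet(Set("a", "b"))
  val r2 = GSet(Set("b", "c"))
  println(r1.join(r2).thresholdRead("c")) // Some(()): merged state is above the threshold
  println(r1.thresholdRead("c"))          // None: this replica is still below it
}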

PL vs. PX

I'm beginning to wonder if I'm in the wrong field. Many seem to be fixated on the language itself rather than the resulting programming experience (e.g. "does this code look good?" vs. "how was this code written?"). The general sentiment of the field is clearly language-focused; e.g. take this post from pl-enthusiast:

From this vantage point, PL researchers tend to focus on developing general abstractions, or building blocks, for solving problems, or classes of problems. PL research also considers software behavior in a rigorous and general way, e.g., to prove that (classes of) programs enjoy properties we want, and/or eschew properties we don’t. ...

The ethos of PL research is to not just find solutions to important problems, but to find the best expression of those solutions, typically in the form of a kind of language, language extension, library, program analysis, or transformation.

So a focus on abstractions in the abstract, which is completely reasonable. But does it really represent programming? Not really; PL doesn't seem to be about programming. It has applications to programming, but...

I’ve picked the three examples in the above discussion for a reason: They are an approach to solving general problems using PL-minded techniques in combination with techniques from other communities, like machine learning inference algorithms or cryptography.

So... PL is not about programming; rather, it is a specific kind of theory field oriented around abstraction, one with applications to many other activities as well. In that case, my disillusionment with PL is just a matter of misguided expectations.

But that raises the question: what is a good academic home for talking about programming experiences, where PL is just a minor component of those activities? HCI? SE? Neither of those feels right.

After over two years and 1700 commits, the Nu Game Engine (the world's first practical pure functional game engine) reaches v1.0

The link - Nu Game Engine Release v1.0.0.0

The spiel -

Over two years and 1700 commits in the making, the world's first practical, pure functional game engine, the Nu Game Engine, releases v1.0.0.0!

This release offers a greater guarantee of API stability than could be offered before.

I am hoping that the Nu Game Engine will usher us toward an era of sustainable game development, because developers' lives matter, too! And if a game is torture to develop, its play experience isn't going to reach its potential.

Now that the days of being able to statically lay out the memory for an entire game are gone, it's time for developers to consider an alternative lifestyle where dynamism is a tool to be leveraged rather than eschewed.

Nu proves the efficacy and efficiency of game development with a pure functional API. Nu proves that hardcore optimizations like data-oriented physics engines, mutable spatial trees, and other computer-sympathetic data structures can be used transparently underneath a purely functional API.
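
The general technique being claimed here is observational purity: mutation is used internally for speed but never escapes the API boundary. A hypothetical Scala sketch of the idea (Nu itself is written in F#, and its actual API differs):

import scala.collection.mutable.ArrayBuffer

final case class Entity(id: Int, x: Double, y: Double)
final case class World(tick: Long, entities: Vector[Entity])

object Engine {
  // Callers can treat this as a pure function World => World: same input,
  // same output. The mutable buffer is purely a local optimization.
  def step(world: World): World = {
    val buf = new ArrayBuffer[Entity](world.entities.size)
    var i = 0
    while (i < world.entities.size) {
      val e = world.entities(i)
      buf += e.copy(x = e.x + 1.0) // placeholder "physics"
      i += 1
    }
    World(world.tick + 1, buf.toVector)
  }
}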

Further, Nu proves that a declarative programming style is also viable and sufficiently efficient for modern game development: Iterative Functional Reactive Programming with the Nu Game Engine.

Whether or not people adopt Nu, I hope to at least offer it as proof that the prejudice against dynamism and functional programming in games at a high level is obsolete. And for those who don't believe, it's time to break out your profilers!

It's time for game developers to start having as much fun as we did so many years ago before we got run over by unbounded complexity!

@Moderator: Feel free to post to front page if deemed appropriate.

meta: September "propose a post" post (proposal)

I propose that once per month (or so), someone remember to create a "meta: <MONTH> "propose a post" post (proposal)".

There is no need for more than one "propose a post" topic per month (or so).

Within the comments of those monthly "propose a post" posts, people should follow the rules of the game.

The (voluntary) "rules" of the game:

(1) Top-level comments propose "(new topic)" LtU posts that have not yet been made.

(2) 2nd-level comments speak solely to the virtue of creating the proposed topic.

(3) No 3rd-level comments, and be brief, please.

(Just a proposal.)

Implementing "Elements of Programming" in Actor Script

In an effort to understand what ActorScript is like to program with, and because I think Stepanov's "Elements of Programming" represents the best 'core' for any generic programming library, I want to try to translate the Elements into ActorScript. As there is no environment for actually testing the code, I thought I would post it here for comments from those who can tell whether it's correct or not. The idea is that each top-level post should be a block of Elements code in ActorScript, open for comments. I will revise the top-level code according to the corrections and suggestions in the comments. I don't plan on using Unicode, but I will try to get the ASCII representations correct too.

The most obsolete infrastructure money could buy - my worst job ever

A funny article by Juho Snellman about really existing legacy software engineering and PLT.

For example on my first day I found that X was running what was supposedly [the] largest VAXcluster remaining in the world, for doing their production builds. Yes, dozens of VAXen running VMS, working as a cross-compile farm, producing x86 code. You might wonder a bit about the viability of the VAX as computing platform in the year 2005. Especially for something as cpu-bound as compiling. But don't worry, one of my new coworkers had as their current task evaluating whether this should be migrated to VMS/Alpha or to VMS/VAX running under a VAX emulator on x86-64!

Why did this company need to maintain a specific C compiler anyway? Well, they had their own ingenious in-house programming language that you could think of as an imperative Erlang with a Pascal-like syntax that was compiled to C source. I have no real data on how much code was written in that language, but it'd have to be tens of millions of lines at a minimum.

The result of compiling this C code would then be run on an ingenious in-house operating system that was written in, IIRC, the late 80s. This operating system used the 386's segment registers to implement multitasking and message passing. For this, they needed a compiler with much more support for segment registers than normal. Now, you might wonder about the wisdom of relying on segment registers heavily in the year 2005. After all, use of segment registers had been getting slower and slower with every generation of CPUs, and in x86-64 segmentation support was essentially removed. But don't worry, there was a project underway to migrate all of this code to run on Solaris instead.

F* (FStar) reworked and released as v0.9.0

I sure hope the aggregate effect of F*, ATS, Rust, and such is to more quickly bring about even better static checking.

v0.9.0 (released before ICFP 2015). Countless improvements, including an interactive mode, a new extraction mechanism to OCaml and F#, the ghost effect, a lot of new examples (micro-F* formalization, Wysteria, etc.), hyper-heaps, build-config support, and quite a bit of cleanup of the code base.

F* (pronounced F star) is an ML-like functional programming language aimed at program verification. Its type system is based on a core that resembles System Fω (hence the name), but is extended with dependent types, refined monadic effects, refinement types, and higher kinds. Together, these features allow expressing precise and compact specifications for programs, including functional correctness properties. The F* type-checker aims to prove that programs meet their specifications using a combination of SMT solving and manual proofs. Programs written in F* can be translated to OCaml or F# for execution.

The latest version of F* is written entirely in F*, and bootstraps in OCaml and F#. It is open source and under active development on GitHub. A detailed description of this new F* version is available in a recent draft. You can learn more about F* by following the online tutorial.

verified ML

Am I dreaming to hope that more verification is a good thing, and is on the way?

CakeML: a project that aims to make proof assistants into trustworthy and practical program development platforms. At the heart of our approach is a new dialect of ML, which we call CakeML. CakeML is designed to be both easy to program in and easy to reason about formally in proof assistants for higher-order logic.

I'm not saying nobody else is doing or has done cool verified stuff (cf. verified PreScheme et al.); I am just posting this because I'm excited that this project is apparently alive and kicking. I just want to be able to "sudo apt-get install" it some day and start programming away, feeling like there's a bit more of a safety blanket there.
