Functional Object-Oriented Programming

I've been experimenting, in Ruby, with something that we may loosely call 'functional' object-oriented programming. I write objects with non-mutating methods -- basically, these objects are stored 'requests' of a certain type. Various values can be derived from the request on demand. If I ever decide to implement caching, well, that wouldn't be functional but it's hidden from the user.
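A minimal Ruby sketch of the kind of object described (all names here are made up for illustration): everything is fixed at construction, every method derives a value on demand, and "updates" build fresh objects instead of mutating.

```ruby
# A hypothetical immutable "request" object: state is fixed at
# construction, and every public method is a derived value.
class PriceQuote
  def initialize(unit_price, quantity)
    @unit_price = unit_price
    @quantity   = quantity
    freeze                      # forbid any later mutation outright
  end

  # Derived values: computed on demand, nothing below mutates the object.
  def subtotal
    @unit_price * @quantity
  end

  def tax
    subtotal / 10               # a made-up 10% tax, integer arithmetic
  end

  def total
    subtotal + tax
  end

  # "Updates" return a fresh object instead of mutating this one.
  def with_quantity(quantity)
    PriceQuote.new(@unit_price, quantity)
  end
end

q = PriceQuote.new(100, 3)
q.total                  # => 330
q.with_quantity(5).total # => 550
```

Calling `freeze` in the constructor makes accidental mutation raise at runtime, which is one cheap way to keep such objects honest in Ruby.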

Whenever I need actual state in my application, I make sure to split it out from the classes. Instead, I pass stateful information from method call to method call until the result is complete -- saving myself the hassle of state-tracking and so forth.

I'm sure there is some efficiency penalty for this approach, though it makes my life much easier (and more reproducible!). There is a similarity between what I am doing and 'continuation passing style', though I can't quite put my finger on it.

What do you all think of this style of "functional object-oriented programming"? There are quite a few approaches to functional object-oriented programming out there already, like FOOPS and Object-Gofer; but these proposals are about faking object-oriented semantics -- a stateful object -- with monads.


store-passing style

This style is known as "store-passing style" or "world-passing style."

You're right to notice a connection to continuation-passing style. In CPS, you pass a representation of the program continuation as an *explicit* argument, rather than using the implementation language's native control mechanisms (that is, you don't use the implementation language's stack and only perform tail calls, i.e. jumps). In store-passing style, you don't use the implementation language's store (a.k.a. heap) so you explicitly pass a representation of the store around as an argument.
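A minimal Ruby sketch of store-passing style, under the assumption that the "store" is just an immutable hash threaded through each call (the account/bank names are purely illustrative):

```ruby
# Store-passing style: instead of mutating shared state, each operation
# receives the current store and returns [result, updated_store].
def deposit(store, account, amount)
  new_balance = store.fetch(account, 0) + amount
  [new_balance, store.merge(account => new_balance)]
end

def withdraw(store, account, amount)
  balance = store.fetch(account, 0)
  raise "insufficient funds" if balance < amount
  [balance - amount, store.merge(account => balance - amount)]
end

store = {}                                   # the initial, empty store
balance, store = deposit(store, :alice, 100)
balance, store = withdraw(store, :alice, 30)
balance  # => 70
```

`Hash#merge` returns a new hash rather than mutating the old one, so every intermediate store stays intact -- which is exactly what makes runs reproducible (and old stores trivially inspectable).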

The fact that you're doing this in an OO language is pretty much orthogonal. The choice of FP versus OO representation of data relates to what's known as the expression problem (c.f. relevant papers on the expression problem). But, at some level, this choice doesn't have much to do with whether you choose to use mutation or not.

(Of course, in practice, most popular OO languages -- and in particular their standard libraries -- so heavily favor mutation that you pay significant practical costs in avoiding mutation.)

afterthought

Incidentally, think of Erlang as a functional object-oriented programming language and see where that gedankenexperiment gets you.

Old LtU Department

We used to have an Object Functional Department on LtU classic that might be of interest along these lines.

As far as caching requiring mutable state, it should be mentioned that there are variables that fall between the categories of non-mutable and mutable - specifically dataflow variables (aka promises/futures, or write-once variables). A cache value doesn't need a state variable so much as a logic-type variable that is written once and never changed from that point forward.

New department

There's also a new version of that department.

As far as cache requiring

As far as cache requiring mutable state, it should be mentioned that there are variables that fall between the category of non-mutable and mutable - specifically dataflow variables (aka promises/futures or write-once variables)

Promises never struck me as a new "mutability" category, so much as a scheduling decision. Are there properties of promises other than synchronizing execution that warrant declaring a new category?

In some languages

you can "inspect" to see a promise to see if has been fulfilled without blocking (or alternatively, do a non-blocking wait or a wait with a timeout).

Both of these operations violate referential transparency, as the promise can be observed in different states at different times. As always, see CTM for an excellent discussion.

Another consideration...

...is that it's possible that the code may try to fulfill the promise more than once - generating a runtime exception.
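A deliberately minimal, single-threaded Ruby sketch of such a write-once variable (a real dataflow variable would also block readers until fulfillment; the synchronization is omitted here to keep the two properties under discussion visible):

```ruby
# A toy write-once ("dataflow") variable: fulfillable exactly once.
class WriteOnce
  def initialize
    @set = false
  end

  # A second fulfillment attempt raises, as noted above.
  def fulfill(value)
    raise "promise already fulfilled" if @set
    @value = value
    @set = true
    self
  end

  # Non-blocking inspection: the operation that lets a program observe
  # the same promise in two different states at two different times.
  def fulfilled?
    @set
  end

  def value
    raise "not yet fulfilled" unless @set
    @value
  end
end

v = WriteOnce.new
v.fulfilled?  # => false
v.fulfill(42)
v.value       # => 42
begin
  v.fulfill(43)           # second fulfillment
rescue RuntimeError => e
  e.message               # => "promise already fulfilled"
end
```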

What others are there?

Maybe this is covered in a thread somewhere, but what functional/oop languages are out there? These are the ones I know of:

F#
Nemerle
SML.net
MLj
scala

These all target either the CLR or the JVM. Then there's OCaml. How do these compare to one another? Opinions?

edit: not quick enough...

O'Caml

But you mentioned that already.

Some might include C++ in the mix... :)

Some might include...

Some might include C++ in the mix... :)

...and they surely have no clue about FP.

Some even argue that they have no clue about OOP either... :)

Eh...

be careful, Andreas. :-0

Care is the right keyword :)

Sure, I am aware of Boost & Co. But do you really consider it feasible to write, say, a combinator library with it? And have an average mortal use it without producing nice piles of memory bugs or space leaks?

Average mortals

have no clue what a "combinator" is.

And this includes, sadly, many graduates from CS programs, especially those who stop at the baccalaureate level.

it's been feasible for more than a decade

c++ combinator libraries, and the corresponding technique of expression templates, have been developed for more than 15 years.

the c++ report magazine first published about this in 1995.

Sure, that is clever template hackery

Sure, that is clever template hackery. AFAIK, it also is superseded by more recent work you find in Boost.

But until it integrates with language and library without requiring you to first define a reified version of every single operator and function of interest *and* you can freely pass expression objects up and down without risking memory bugs, I don't see it qualifying as functional programming.

Also note that the examples in the paper do not yet make what the FP crowd usually calls a combinator library. Combinators are higher-order abstractions that would correspond to expression objects evaluating to expression objects. Has anything like that been used successfully?
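As a point of comparison for what "expression objects evaluating to expression objects" looks like when the language cooperates, here is a toy parser-combinator sketch in Ruby (not C++; all names are illustrative):

```ruby
# A toy parser combinator library: each combinator takes parsers and
# returns another parser, so combinations can themselves be combined.
# A parser wraps a function from input to [value, rest_of_input] or nil.
Parser = Struct.new(:run) do
  # Sequencing: parse with self, then feed the remainder to `other`.
  def >>(other)
    Parser.new(lambda do |input|
      if (first = run.call(input))
        value1, rest1 = first
        if (second = other.run.call(rest1))
          value2, rest2 = second
          [[value1, value2], rest2]
        end
      end
    end)
  end

  # Alternation: try self; on failure (nil), fall back to `other`.
  def |(other)
    Parser.new(->(input) { run.call(input) || other.run.call(input) })
  end
end

# A primitive parser matching a single literal character.
def char(c)
  Parser.new(->(input) { [c, input[1..-1]] if input.start_with?(c) })
end

ab = char("a") >> char("b")              # a parser built from parsers
ab.run.call("abc")                       # => [["a", "b"], "c"]
(char("x") | char("a")).run.call("abc")  # => ["a", "bc"]
```

The point is that `>>` and `|` are themselves higher-order: their results are first-class parsers that can be stored, passed around, and combined again, with the language's closures doing the memory management.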

More importantly, the ability to encode certain FP *idioms* does not turn C++ into a functional *language*. The latter was the claim I was refuting.

*and* you can freely pass

*and* you can freely pass expression objects up and down without risking memory bugs, I don't see it qualifying as functional programming.

Does manual memory management really preclude classifying a language as functional? Or is it the safety of the memory management that's the clincher? Personally, the argument that FP idioms do not make a language functional does not follow, since if FP is the *primary* idiom of the language, even while allowing other idioms, it certainly strikes me as functional (even with manual memory management).

Invoking Turing equivalence

Does manual memory management really preclude classifying a language as functional? Or is it the safety of the memory management that's the clincher?

Both (and how can you separate the two anyway?). It is a matter of practicality. AFAICT, manually managing the memory of plethoras of small closures that you push around and compose quickly becomes infeasible. In practice the circumstances will hence preclude widespread use of respective idioms. As some evidence, we certainly do not see this used much and in sophisticated ways in C++, despite the existence of some fancy libraries.

Personally, the argument that FP idioms do not make a language functional does not follow, since if FP is the *primary* idiom of the language, even while allowing other idioms, it certainly strikes me as functional (even with manual memory management).

Not sure I understand your line of reasoning here. If it is the *primary* idiom, sure. But if it requires heavy encoding and is not supported "naturally"? This is the case with FP in C++ or Java - and vice versa I would not call, say, SML object-oriented, even though it is straightforward to mimic objects by records of closures.

A language can meaningfully be called {functional,oo,whatever} if it *actively encourages* use of {functional,oo,whatever} idioms - by its syntax, semantics, library. Everything else is moot by Turing equivalence arguments.

Both (and how can you

Both (and how can you separate the two anyway?).

malloc is manual but unsafe; region-based memory management is manual and safe.

AFAICT, manually managing the memory of plethoras of small closures that you push around and compose quickly becomes infeasible.

Right, which would probably lead you to regions, or some similar aggregation technique.

If it is the *primary* idiom, sure.

Yes, that was my point. But then, consider that in a multi-idiom language the "primary idiom" is often merely what the designers use to build the standard library.

A language can meaningfully be called {functional,oo,whatever} if it *actively encourages* use of {functional,oo,whatever} idioms - by its syntax, semantics, library.

I would say that's maybe a bit too strict; I see a functional language as one that permits functional idioms, and whose syntax and semantics do not interfere with such expression. So C++ and Java are not strictly functional, because although they allow some functional expression, that expression is awkward and unnatural. The definition does include something like JavaScript, which doesn't necessarily *encourage* functional idioms, since most of its core and standard library is object-based.

Regions manual?

region-based memory management is manual and safe.

I must be missing something. In what sense is it manual?

I see a functional language as one that permits functional idioms

So all we disagree about probably is where to draw the line on the scale between "encourages" and "permits". For me, "permits" sounds too weak, because somebody might argue that assembly language certainly "permits" both FP and OO idioms - they'll just look very awkward.

But then, consider that in a multi-idiom language the "primary idiom" is often merely what the designers use to build the standard library.

I think you call a language multi-paradigm precisely when it encourages several paradigms, more or less equally strong. (The word "primary" was yours, not mine.)

I must be missing something.

I must be missing something. In what sense is it manual?

Hmm, I didn't expect contention over this. Without region inference, I had thought that allocation and deallocation of regions was fundamentally manual. See, for instance, Typed Memory Management in a Calculus of Capabilities. However, I see hints in the above paper that regions as originally proposed are scoped, so perhaps I'm misremembering. Indeed, Cyclone seems to use a runtime to manage region lifetimes.

For me, "permits" sounds too weak, because somebody might argue that assembly language certainly "permits" both FP and OO idioms - they'll just look very awkward.

It does "permit" it, but I would definitely consider assembly as violating the second condition of my "functional" classification: syntax and semantics do not interfere with functional expression. Perhaps I should amend that last part to a "straightforward function expression", or "natural functional expression" to make that more clear.

you originally wrote: But do

you originally wrote:

But do you really consider it feasible to write, say, a combinator library with it?

i was just stating that it is both feasible and has been done in practice for a very long time.

Combinators are higher-order abstractions that would correspond to expression objects evaluating to expression objects. Has anything like that been used successfully?

the previously mentioned spirit parser library, or the scientific computing library blitz++, are examples of "successful" c++ combinator libraries.

More importantly, the ability to encode certain FP *idioms* does not turn C++ into a functional *language*. The latter was the claim I was refuting.

c++ is a multi-paradigm language by design, and definitely not one with emphasis on functional.

As you asked for opinions: I

As you asked for opinions: I didn't like OCaml's syntax (though F#'s seems much better), and the tutorial I've read about OCaml made me think of it as a mostly functional language with object sugar here and there (though that may be a bias induced by the tutorial).

Scala seems to be a very interesting mix of functional and OO style.

I don't know the other languages, so I cannot comment.

I would say...

...that the tutorial is indeed biased. O'Caml's object system includes classes, inclusion polymorphism (inheritance), parametric polymorphism (generic or "template" classes), multiple inheritance, mutable object fields, method overriding, etc. About the only thing that C++/Java/C# programmers coming to O'Caml might expect to find that isn't there is downcasting: the O'Caml developers are quite clear that the visitor-pattern-and-downcast pattern is a type-unsound answer to not having variant types in the language, and since O'Caml has variant types, they eschew downcasts.

Also, thanks to proper typing of binary methods and open self types, O'Caml doesn't suffer from the Expression Problem. Along with Scala and Sather, it's one of only three statically-typed languages that I know of with this property.

Multiple too! Cool.

[...] inclusion polymorphism (inheritance), parametric polymorphism (generic or "template" classes), multiple inheritance, [...]

I didn't even realize that OCaml supported *multiple* inheritance, and I've been using it on and off for a few years now. That's super! Learn something every day... :-)

Previous threads.

A previous thread, Type inference in object oriented languages, might give additional insights.