LtU Forum

Singleton classes really that bad?

There was some recent discussion on here that mostly claimed that singleton classes, assuming they had state, were bad (I'm not quite sure what a stateless singleton would be other than a different naming convention for global functions). I'm unconvinced =)

Stateful singletons still offer one advantage over straight global variables -- they have encapsulation. Granted, it's not any better than having a global object, but it does at least seem a little safer in that you can control access to it, cutting down a lot of the arguments against global variables in general. I'm not sure what nightmare scenarios people are thinking of when they say singletons can change unpredictably.

Can someone clarify?

I'm thinking of a class having C++ code something like this. For the sake of argument we'll say this is a design situation where we know we'll only need one of this class.

#include <list>

struct ctl_node;                        // element type, defined elsewhere

class DataList
{
public:
    typedef std::list<ctl_node*> ListType;
    typedef ListType::iterator   ListIter;

    static void AddToList(ctl_node& node);
    static void Display();
    static ListIter Begin();
    static ListIter End();
protected:
    DataList();                         // clients cannot create their own instance
    ~DataList();
    static DataList& Instance();

    ListType nodeList;
};

DataList& DataList::Instance()
{
    static DataList theList;            // constructed on first use
    return theList;
}

void DataList::AddToList(ctl_node& node)
{
    DataList& theList = Instance();

    theList.nodeList.push_back(&node);
}

That's enough to see how the functions would get a reference to the one instance.

Thoughts?

Effect Systems?

In this post, Andreas Rossberg alludes to the idea of an "effect system," which annotates programs with information about what impure operations various components perform. He also says that there has been "quite some research on [effect systems] in the past 15 years." However, I've been unable to find anything interesting with Google. Could someone please point me to some good research on effect systems?

Classic CS Texts

After reading the latest "Classic Papers" discussion, I thought I'd make a short list of important and enjoyable papers. Have a look and tell me if I missed some important ones.

link

Stroustrup talking about C++0x

Found this link on OSNews and thought I might post it here. It is an interview with Stroustrup in which he talks about C++0x, the next revision of C++, expected in 2009. It is geared towards making C++ a better language for systems programming and library building, and towards making C++ easier to teach and learn.

Dataflow languages and hardware - current status and directions

I have been interested in dataflow languages and hardware for almost three weeks now, but have found very little information about those topics. The most interesting find was the WaveScalar dataflow processor, mainly because of the recency of the work.

It seems (judging from Google's index) that dataflow programming is somewhat out of vogue.

Anyway, does anyone have information about dataflow language implementations and hardware support for that computing paradigm?

And why does anything dataflow-based seem to be out of the mainstream?

The new old or The "Return" to Concurrency

While developing a fairly complex pipeline of operations for a content management system, I found myself resorting to the old Unix way of doing things: I need to process a large set of data (emails), so I set up a pipeline of coprocesses, with messages between each process referring to some chunk of email on disk:

     cp1 | cp2 | cp3 | cp4 | cp5 .. cp12

While this may seem trivial to most people here, I was struck by how profound this classic (20-30 yr old) approach is. Yes, I know that Unix (shell) pipes are limited because they are only unidirectional, but if I followed the status quo these days the implementation would have been a monolithic OO app (with cp 1-12 being objects passing messages to each other) or perhaps something more FP (with cp 1-12 being a chain of pure function calls).

Instead, here we have a truly concurrent solution that will take advantage of multiple CPUs, message passing, and has strict encapsulation -- all in a language neutral architecture.

This came about as an experiment relating to using a severely restricted language (in this case AWK) to implement a fairly complex application. Working under Unix with minimal tools is yielding ways of thinking I haven't considered since my hardcore Unix days in the 80s.

While this may sound like just a simple workflow problem, for my app there is some conditional variability in play, where some processing may need to be excluded from the workflow. But that too can be handled by traditional Unix piping: if a process has no work to do on certain data (or is instructed by the previous process not to touch it), the data is simply passed along, untouched, to the next process.

Nothing mind boggling here, but it did strike me as interesting from a monolithic super language vs small language in a unix environment perspective.

Favourite Use of Embedded Interpreters?

With the silly season in full swing, I thought it would be nice to throw the forums open for members to indulge a bit of light whimsy. For a long time, my favourite programming conceits have been embedded language interpreters. These tools evoke in me the mystery and wonder of bootstrapping, and the seemingly magical ability to overcome the limitations of a computer's hardware instruction set with new facilities implemented in software. And from these devices rise many wonderful things: domain specific languages, the often elegant implementation of these tools on severely constrained platforms, and the pragmatic efficiency tradeoffs required to do so.

Let's hear of your favourites. To kick things off, here are two of mine.

1. The Apollo Guidance Computer

This extract is from 'E-2052 AGC4 Basic Training Manual' (http://www.ibiblio.org/apollo/NARA-SW/E-2052.pdf)

The Apollo Guidance Computer was designed with the idea that its weight, size and power supply were costly items. Mission requirements warrant a hardware compromise of a word length with a minimum of 15 bits and an instruction repertoire of 33 instructions with which to work. The result, therefore, is a small, fairly simple machine with limited abilities. While the AGC hardware provided for manipulation of single- and double-precision quantities, frequent need arose to handle multi-precision quantities, trigonometric operations, vector and matrix operations, and extensive scalar operations. Thus, to fulfill the system requirements planned for the lunar missions within the constraints of hardware limitations, it is necessary to employ software to expand the capabilities of the AGC.

[Some discussion on subroutine calling overhead elided.]

Thus to solve the memory wastage problem caused by frequent use of the calling sequences, it is expedient to create an entirely special mnemonic language in which each mnemonic corresponds to a subroutine. Since, in many cases, the new mnemonic instructions require no addresses, we design a packed instruction format which stores two seven-bit operation codes in one word of memory and any required address constants in the two following words.

2. The Apple II SWEET16

Steve Wozniak writes about the development of the Apple II in BYTE magazine, May 1977 (http://oldcomputers.net/byteappleII.html)

The Story of Sweet Sixteen

While writing Apple BASIC, I ran into the problem of manipulating the 16 bit pointer data and its arithmetic in an 8 bit machine.

My solution to this problem of handling 16 bit data, notably pointers, with an 8 bit microprocessor was to implement a non-existent 16 bit processor in software, interpreter fashion, which I refer to as SWEET16.

SWEET16 contains sixteen internal 16 bit registers, actually the first 32 bytes in main memory, labelled R0 through R15. R0 is defined as the accumulator, R15 as the program counter, and R14 as a status register. R13 stores the result of all COMPARE operations for branch testing. The user accesses SWEET16 with a subroutine call to hexadecimal address F689. Bytes stored after the subroutine call are thereafter interpreted and executed by SWEET16. One of SWEET16's commands returns the user back to 6502 mode, even restoring the original register contents.

Implemented in only 300 bytes of code, SWEET16 has a very simple instruction set tailored to operations such as memory moves and stack manipulation. Most op codes are only one byte long, but since she runs approximately ten times slower than equivalent 6502 code, SWEET16 should be employed only when code is at a premium or execution speed is not. As an example of her usefulness, I have estimated that about 1 K bytes could be weeded out of my 5 K byte Apple-II BASIC interpreter with no observable performance degradation by selectively applying SWEET16.

I'd love to hear of other gems, particularly those that are embedded to expand platform capabilities, and not so much standalone virtual machines or interpreters.

Realistic Functional Programming in the Java Platform

Greetings,

I need to produce software to run in the Java Virtual Machine, but I have no restriction on which language to use, so I'm not stuck with Java the language.

I thought at first about using Scala or Nice: both support functional programming along with OO and other paradigms, and integrate well with the Java API.

So I'm left to decide which one to use. From a quick overview, Scala seems to have more language "features", but lacks serious editor support (I didn't see any Emacs mode, and the Eclipse plugin is not in a good state right now). Nice has an Emacs mode, but the latest version of the language distribution is somewhat old, which made me wonder whether it's still being actively developed. As I've seen many people around LtU report experiences with these two languages, I'd like some comments about the two of them, comparing them if possible. That would help me to decide.

And I'm willing to consider other options too. My main requirements are: 1) multi-paradigm language with good support for functional programming; 2) good editor support (emacs or eclipse); 3) generates code for the JVM; 4) integrates well with Java, being able to both use Java classes and APIs and generate classes that can be used by Java code. As a bonus, having lexer/parser generating tools would be great. Haskell and OCaml are two of my current favorite languages. I like Scheme too but tend to be a statically-typed guy.

Thanks for any help.

[EDIT: added a fourth requirement that I forgot to mention]

Return of the Global Variables?

This is bugging the heck out of me, so I wondered if anybody else had complained about this before, and I found somebody.

For the last two hours, under the heading of object-oriented, encapsulated code without global variables, I have been reading spaghetti code. Every class method I look at involves a few private instance variables whose lifetimes are as long as the object itself. Ok, the scope is limited to the class methods only, but for a class of a certain size, how is this any different from global variables?

PS: No, they are not static member fields. In general, I have nothing against singletons.

Joel Spolsky views on CS education

As CS ed. is being debated, Joel Spolsky is out with a new article warning about the perils of an all-Java training.

Tasty pieces:

The recruiters-who-use-grep, by the way, are ridiculed here, and for good reason. I have never met anyone who can do Scheme, Haskell, and C pointers who can't pick up Java in two days, and create better Java code than people with five years of experience in Java, but try explaining that to the average HR drone.
CS is proofs (recursion), algorithms (recursion), languages (lambda calculus), operating systems (pointers), compilers (lambda calculus) -- and so the bottom line is that a JavaSchool that won't teach C and won't teach Scheme is not really teaching computer science, either.