The hits and misses of Microsoft

As Bill Gates finally bows out of Microsoft to pursue his charity interests, we look at some of the hits and misses of the software company he founded.

This is the premise of a recent BBC story. Alas, the story does not talk about (and certainly doesn't focus on) programming languages, something that was part of Microsoft practically from day one (i.e., the Altair BASIC). So how would you write the story from the perspective of programming languages (their impact on the success of the company, on the quality of its products, and on the industry as a whole)? No trolls, please!

I'll be (somewhat) nice to MS

even though I'm long on record as thinking they are, or have been in the past, a somewhat unethical monopoly that peddles shoddy software.

I'll skip over their roots as a language house (in which they were largely porting existing languages to the 8-bit micros of the time), and look at their first real success as an in-house PL--Visual Basic. While those of us in PLT like to cast stones at VB--there are many parts of its design to dislike--it actually was quite good at what it did (permitting novice programmers to quickly put together simple business applications).

MS's implementation of Windows in C/C++ was a major step in those languages becoming important in industry (MS-DOS was generally implemented in x86 assembly language, and in the 80s the primary HLL on PCs was Pascal, not C/C++).

MS has also long understood that tooling, support, and infrastructure are very important. While COM and its successor technologies (up to and including .Net) aren't terribly novel, MS integrated 'em better than prior attempts to do so. MS's position astride the OS certainly helped--the necessary plumbing came with the OS and was already available. COM was an advantage that MS had over the UNIX world for a long time, and the UNIX world still doesn't have a good equivalent for .Net (excluding clones of it such as Mono).

It will be interesting to see if and how F# goes "mainstream".

On the other side of the coin--MS's attempts to hinder the adoption of cross-platform PLs such as Java and (to a lesser extent) Javascript by promulgating intentionally incompatible versions were truly reprehensible. MS's numerous extensions and additions to C/C++ are rather annoying to those of us who have tried writing cross-platform code--though here I think MS was merely trying to add missing functionality rather than pee in the well.

So MS has had a lot of practical effect--simply by being the dominant software vendor. Much of their work in PL implementations has been quite good. Theoretically important? MS doesn't wear its research on its sleeve; OTOH, lots of bright fellows work at MS Research (SPJ and others), so I can't really discount them either.

.Net equivalent on Unix? Java.

Could you explain to me what the difference is between .Net and the Java platform?
I don't get the difference (.Net was so poorly explained by Microsoft that this doesn't help, of course).

[ My opinion about Microsoft:
- they have abused their monopoly every way they could
- they skimped on software development, shipping low-reliability, low-security software, even though this brought them *billions* in benefits
- their motto is 'the computer knows things better than you', which I despise
- they're great at marketing ]

Not all that great...

which is why my original claim doesn't appear to be a significant deficiency in the Unix world. (Or of long-lost stepchildren, such as the Mac these days...)

The main differences I see between .Net/CLI and the Java world are:

* greater platform independence for the latter
* The IR for the latter (JVM bytecode) is much more strongly optimized as a target for the Java language, whereas the CLI is more effectively language-neutral (though both present issues for certain types of languages). Things which aren't Java, or which don't look like Java, are a bit harder to translate to Java bytecode.
* The CLI does a few things "better" than the JVM--the two stacks' methods of implementing generics, for instance, have been a source of considerable flamage.

But like I said--not much. I'm still waiting for a high-level language ecosystem which is both effectively vendor/platform-neutral *and* language-neutral. I think the JVM will get there first, but it's not quite there yet.

Microsoft and languages

Just to start things off, here are some points that are industry "truisms" (i.e. I don't take credit for any of these ideas).

Because of its size and influence, Microsoft can single-handedly invent a programming language and get a large community of users, without any need to go through any standardization process or co-operate with the rest of the industry. Visual Basic is a major example; so is C#. There are many software developers out there who look entirely to Microsoft for guidance about what tools to use in order to accomplish their goals.

Microsoft's adventure with Java was very interesting. For those who don't remember, Microsoft implemented their own Java. They added a feature to allow you to call out to various Microsoft libraries, and I think they also added language enhancements. Sun, and much of the rest of the industry, saw this as an attempt to get Java programmers locked into Microsoft, rather than writing "write-once, run-everywhere" portable code. The Microsoft engineers, in my opinion, probably thought they were just doing good things for the users; what their "corporate masters" had in mind is harder to say. Eventually Sun forced Microsoft to withdraw their Java. Microsoft responded by creating their own language that was (in my opinion, anyway) clearly based on the idea of Java, namely C#.

The existence of C# has been good for the users of Java, since it motivated Sun to match C#'s new features and innovations.

Note on speed

Just to add a bit of context to Daniel's comment above.

Netscape's Java bytecode interpreter was terribly slow. At that point Java was being looked at to offer basic interactive functionality, like menus that adjusted to context, or problems slightly more complicated than CGIs--essentially an applet language, not an application language. Microsoft, by having the Java in Internet Explorer use Explorer/Windows features directly, allowed Java applets to execute many times faster than they did under Netscape's Java. It was also more feature-rich. What it wasn't was platform independent.

So the real question was: did an applet / web language need to be platform independent? Sun answered yes, and Microsoft answered no. And platform independence turned out to be key for enterprise software....

I'm not sure what would have happened if Microsoft Java/J++ had been allowed to grow. I think the web would have moved in the direction Flash is taking it today, ten years earlier. Java would never have gone enterprise, and thus the software community never would have developed this strong desire for binary platform independence, where we have Java apps that almost work across systems.

I doubt it

The reason the whole AJAX thing works today is a) pipes are fatter, and b) computers are faster and have more memory. Javascript is no faster than Java, other than the JVM startup time issue--and is frequently slower. Likewise, some of the Flash animations you see today would not have been possible on stock hardware a decade ago--and even if the computer were up to snuff, the 56k dialup connection that provided Internet access (assuming a home machine) certainly was not.

AJAX and Flash

Scott --

Where did I say anything about AJAX? AJAX, while a competitor to Flash, is very different in how it executes. AJAX would not have been possible 10 years ago. Flash would have been (not at the same graphics quality as today, but 10 years ago expectations were lower).

As for dialup users: back when Java started, you got a message about a download and waited if you wanted the content. The apps saved locally, and it was a one-time cost.

Haskell

No discussion about Microsoft and languages on LtU would be complete without mentioning Microsoft Research, especially its funding of Haskell researchers. Definitely a hit. See also F#.

MS Integration

The integration mentioned above cannot be overstated. For the practical programmer, it is important, when building sophisticated software, to be able to focus on the 'novel' bit while pulling in all the other 'sophistication' with a framework or tools. Java's rise is probably mainly due to the framework. It makes a big difference not to have to hunt around for libraries to do this and that, to have them handed to you on a platter without having to evaluate the trade-offs of multiple competing (and possibly partial) implementations of some functionality that you do not have the time to write yourself.

Being the size of Microsoft does make it easier to create such programming environments, even if the quality tends to be tainted by the necessary compromise that comes with size.

But, if there be little other contribution, MS clearly shows the importance of `integration'.

Java did the integration earlier, though. But GNU should be recognized too, as a kind of framework for C (lots of available libraries and tools). What you tend to get with Microsoft, though, is removal of the 'burden of choice', which can be a big plus (for 90% of your software you won't care which x-library you use).

Other contributions of MS are 'second system' effects, like .NET's multi-language integration (which was terrible with CORBA), or SQL integration into the language with LINQ rather than as another preprocessing layer, etc. Innovators often do not get a chance to do a 'second system'.

GW-BASIC

I think we have Microsoft to credit/blame for BASIC becoming the lingua franca of the 80s and for many of BASIC's ideas surviving to today. They pushed BASIC hard on the CP/M machines and onto the PC. Along with Apple BASIC, this meant everyone knew BASIC.

To this day we still have BASIC's FOR/NEXT and WHILE/UNTIL as the primary loop structures, rather than recursion. And because of this, loose procedural code is still the dominant paradigm.

With QBasic they taught everyone how to mix loose Pascal structure into BASIC, and languages like C ended up winning against strict procedural languages like Modula-2. With Visual Basic they taught a whole new generation how to have a sorta object-oriented language without buying into the whole paradigm. Basically, mixed-structure languages like C++ and Perl owe the success of their ideology to people having learned BASIC as a first language.

I wouldn't blame BASIC for that...

The BASIC dialects of the 80s encouraged goto-infested spaghetti code. C and its descendants have iterative loop controls, a legacy they most certainly do not inherit from BASIC, but from Algol, PL/I, etc.

BASIC was rather late to the structured programming party.

BASIC timeline

Scott --

Again, read what I wrote. This wasn't about the evolution of languages but the evolution of programmers. As for your dates, I'd disagree as well. The FOR loop and GOSUB (subroutines) were part of the CP/M BASICs from the 70s, and by '85 you had QuickBASIC with full structure. Subroutines via GOSUB were part of the CP/M BASICs going all the way back to Tiny BASIC ('75). Microsoft adopted them in '77 with PET BASIC, and thus the first popular IBM PC BASIC, BASICA (later GW-BASIC), shipped with them in 1981.

There was never a period in personal computing history where the major control structure was the GOTO. I'm not going to say people didn't use GOTOs in the 70s and 80s; lines like
IF a THEN GOTO bcd ELSE GOTO efg
were common, but often the code being entered was commented and the line numbers provided structure:

11000 REM Start Screen Printing
....
11150 REM End Screen Printing
12000 REM Start Data Clean
....
12380 REM End Data Clean

Except for variable passing via global variables, how much different is that really from the kinds of loose code you saw in the C of the late 80s and early 90s? And those C programs had lots of global variables as well.

My memory from that era is hazy....

...and my 80s programming experience was as a youth.

My comments apply to BASIC only here; obviously industrial programmers using a Wirth language, Ada, C, Algol, PL/1, or any of the modern procedural programming languages during the late 70s or early 80s were doing structured programming--the structured programming wars of the early 70s were for the most part over, and GOTO had lost.

But the 8-bit micro scene--on machines like the Atari 400/800, the Commodore machines (PETs, VIC-20s, and C64/128s), the Apple II and derivatives, and other home computers from vendors like TI, Amstrad, and whoever else--generally featured BASIC as the main high-level language for introductory programming. (Serious programming on such machines was generally done in assembly language, as the BASIC interpreters of the time were all too slow for quality production code.) And the BASIC dialects of the time generally had a flat, non-reentrant variable namespace. They did have subroutines via GOSUB, but these were hard to use re-entrantly, as there was no stack frame containing local variables. A variable declared in a subroutine was still available outside, and nested subroutine calls could clobber the "locals" of their parent invocations.

Compare this to the main procedural languages, all of which had re-entrant subroutines.

Virtually all of the BASIC code I encountered (and much of what I wrote, not having any professional or academic instruction at the time) was littered with GOTOs. Again, I'm speaking of the home computer scene, not industrial or academic practice.

Gosubs

Scott --

Oh, I was talking about those days. Pick up one of the early-80s computer magazines and you'll see the structures I was talking about.

As for the flat namespace, that's why long variable names were so useful.
If you are talking about the 70s, well, let's take an example: Apple BASIC from '76 (manual); there you only had one digit. But with the next version, on the Apple ][, you had unlimited length.

You would do something like this for variables:
500 REM A-E reserved for code blocks:

note that the 51 suffix is reserved because of the 5100 block

5100 REM START array graphics routine
....
5171 FOR I51 = 5 TO 35 STEP 2
5172 LET C = C51
5173 GOSUB 8000
5174 LET A51 = A
5175 ....
5178 NEXT
....

That's what compilers often do with name mangling anyway; you just had to do it by hand. Don't get me wrong; I'm not saying things are better today. But when you talk about the PET, you are talking about a computer with 4K of RAM.

That coding style...

predates BASIC by nearly a decade. (Think FORTRAN.)

At any rate, one can certainly do structured programming in BASIC; my first paying programming job (as an intern) was writing accounting software in BASIC, actually--not Visual Basic, but old-style BASIC of the sort above. (A legacy system, naturally; this was over fifteen years ago.) And being the conscientious modern programmers that we were, we did use a procedural style in any new code we wrote, much as you do above.

But... there was always the exceptional case. The goto-from-nowhere which would throw a wrench in the works and violate the guarantees that a pure structured style gives you. The occasionally lazy programmer who ignored the rules in order to get something done by a deadline. The silent dependency on the particulars of a loop by some code in another module that, in any rational decomposition of the functionality, had no business existing. The clever "guru" trying to shave a few bytes here and there.

Indeed--methinks one of the first examples of greenspunning^H^H^H^H^H^H^H^H^H^H^H^H^Hdesign patterns is the by-hand implementation of structured programming on top of early varieties of BASIC, FORTRAN, and COBOL--all of which depended on GOTO as a primary means of flow control. (As has been noted by Paul Graham, Peter Norvig, and other critics of patterns, patterns are often an indicator of a missing language abstraction.) And compared to the coding style you illustrate so well, where the poor programmer is essentially having to add ad-hoc de Bruijn indices by hand, even the crude two-layer scoping of C is a godsend. :)

(Any chance that Drupal can be modified so the <strike> tag works? It looks a lot nicer than ^H^H^H^H^H^...)

Design Patterns

I agree with the comment regarding design patterns. Anyway, it seems like we are on the same page at this point regarding BASIC. And yes, you are absolutely correct about small violations destroying the mechanism. That's why I like Haskell: remove the mechanism for impurity and you vastly increase code quality.

Of course, some programmers are still fond of "goto"

Nowadays, they just use the more politically correct term "continuation".

(ducking)

:)

You *better* duck!

Lucky you ducked, because I'm afraid I can't resist responding to such a technically inaccurate joke.

Continuations are more like an inverse COME FROM (cf. INTERCAL).

GOTO allows you to jump to any suitably reachable, statically labelled location. For practical reasons, languages that support continuations don't allow you to construct a continuation out of thin air and use it to jump to such arbitrary locations, even if you've statically labelled the target location somehow.

Instead, you need a dynamic context (a "call stack") that's appropriate for the location you want to jump to. Languages with continuations require that this context be created by actually executing to the point you're going to want to jump to, and then saving the dynamic context at that point for later use. So you can't jump anywhere you haven't already been to and planned to jump back to by dynamically saving the context.

Because of this requirement, which ensures the integrity of the environment, continuations are to unrestricted GOTO what garbage-collected memory management is to manual pointer-based memory allocation; i.e. the former is safe, the latter is unsafe.
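To make the "you must already have been there" requirement concrete, here is a minimal sketch using Haskell's Cont monad (assuming the mtl library; firstMatch is an invented example, not something from this thread): the escape continuation can only be invoked because execution actually reached the callCC and saved its dynamic context.

import Control.Monad (when)
import Control.Monad.Cont (Cont, callCC, runCont)

-- Find the first element satisfying p, escaping early through a saved
-- continuation. The "jump target" (the return point of callCC) exists only
-- because we executed up to it and captured its context there.
firstMatch :: (a -> Bool) -> [a] -> Maybe a
firstMatch p xs = runCont (callCC search) id
  where
    search exit = do
      mapM_ (\x -> when (p x) (exit (Just x))) xs   -- invoking exit jumps back
      return Nothing                                -- reached only if no match

-- e.g. firstMatch even [1,3,4,5] ==> Just 4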

Back to INTERCAL: if one imagines that INTERCAL allowed you to save the context needed to implement the target half of a COME FROM, then when you're at the point at which you're going to want to jump to, you'd save that half of the COME FROM context. Later on, somewhere else in the code, you'd invoke the saved half-COME FROM, thus jumping to the location where the COME FROM information was saved.

Put like that, it's hard to see why anyone would find continuations confusing!

Quack!

Whoops, wrong duck. :)

A nicer way to put it is that continuations are smarter labels; and languages with continuations don't syntactically distinguish between an ordinary function call and a transfer to a continuation; whereas all of the other languages mentioned in this post do.

Ignoring assembly language, the languages I am familiar with that support (static) GOTO do ensure that the target of the goto has a proper environment, either by restricting the set of statements which may be goto targets, or by restricting the proliferation of environments.

Early dialects of BASIC and FORTRAN had only a single, global environment which was always valid no matter where you went. (In BASIC, this environment would be constructed piecemeal, and if you referenced a variable before it was first DIM'd, you'd typically get a runtime error; but there was no lexical scoping. Statements made about BASIC, of course, need to be made carefully, given the proliferation of dialects out there, especially in the early years.)

IIRC, FORTRAN I lacked procedures completely; FORTRAN II had them, but a procedure's local variables were still part of the global environment, just not visible outside. My memory of this might be hazy, though--'twas well before I was born. :)

C/C++ only allow intra-procedure jumps, either within the same static scope or to an enclosing scope within the same procedure. In C++, where variable declarations need not be placed before statements within a block, goto may not cross a declaration. The other means of non-local flow control in C/C++ (longjmp and exceptions) are limited to invoking enclosing contexts (in this case, dynamic rather than lexical). Pascal permits gotos only to the current scope or any lexically enclosing scope, a restriction which (coupled with the ban on returning functions as values) also ensures environmental integrity.

Continuations, in a sense, are another way of making sure that a branch target has a valid environment, thus increasing the number of places to which a nonlocal transfer of control may be made--rather than limiting nonlocal jumps to places where simple static analysis of the code can prove this condition to be true. (As an aside--are there any languages out there, in particular ones with significant bodies of code, which permit jumping into a scope where a valid environment doesn't exist, or isn't manufactured upon entry?)

Of course, many of the arguments (though not all of 'em) made against GOTO statements during the structured programming wars apply to continuations as well--and that's disregarding the issue of environmental integrity. Both of 'em can interfere with high-level stack discipline, and can make it more difficult to reason about a program's behavior. Of course, sometimes you WANT to eschew stack discipline, which is when nonlocal, nonstructured control transfers are useful. But still--while the "continuations are gotos" canard is technically inaccurate, as you so elegantly point out (which is why I knew to duck...), the existence of non-local transfers in a program makes it harder to understand--especially when those transfers occur in a fashion which bypasses normal abstraction boundaries.

SML/NJ

A nicer way to put it is that continuations are smarter labels; and languages with continuations don't syntactically distinguish between an ordinary function call and a transfer to a continuation; whereas all of the other languages mentioned in this post do.

This is actually untrue. SML/NJ supports continuations but uses a different syntax for invoking a continuation. You can, if you like, wrap all the continuation primitives up in a functional interface.

Call/cc breaks abstraction

General call/cc also breaks abstraction safety, because it enables you to duplicate arbitrary control flow. It is the control equivalent of a generic clone (deep copy) operator on data structures.

Goto in C/C++

PS.

Scott Johnson: C/C++ only allow intra-procedure jumps, either within the same static scope, or to an enclosing scope within the same procedure.

I don't think this is correct. C/C++ allow you to jump arbitrarily between blocks of the same function (modulo the restriction in C++ that you may not bypass initialization). This is still safe, however.

...Duck... Goose!

A nicer way to put it is that continuations are smarter labels;

With the caveat that you can't jump to a location unless you've already visited it and dynamically prepared for a later jump.

(I should also emphasize that we're discussing the explicit use of first-class continuations, as opposed to the general concept of continuations, which is more broadly useful than our current discussion might seem to imply.)

and languages with continuations don't syntactically distinguish between an ordinary function call and a transfer to a continuation;

Derek already pointed out that SML/NJ does distinguish. I'll add that the way continuations are expressed as procedures in Scheme can be considered a kind of pun. At the semantic level, procedures are invoked with a continuation argument, whereas continuations aren't (or, if you decide to try to push the equivalence down to the semantic level, then continuations have to ignore and discard the continuation they're invoked with).

IOW, although the runtime representation of a procedure has an environment, that environment doesn't include the procedure's specific continuation--otherwise calling a procedure from more than one place in a program would be a problem. A continuation, OTOH, *is* what a procedure needs in order to know where to pass control when it's done. So in some ways, conflating the two is really confusing.
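A small CPS sketch in Haskell may help make that distinction concrete (the names K, addCPS, and throwTo are invented for illustration, not anything from this thread): a procedure is handed the continuation it must eventually call, whereas invoking a captured continuation discards whatever "current" continuation it is given and resumes the saved one.

type K r a = a -> r                      -- a continuation expecting an a

-- An ordinary procedure, in CPS: it receives its continuation k and must
-- eventually pass its result to it.
addCPS :: Int -> Int -> K r Int -> r
addCPS x y k = k (x + y)

-- "Calling" a previously captured continuation kSaved: the continuation of
-- the call site (kCurrent) is simply ignored; control resumes at kSaved.
throwTo :: K r a -> a -> K r b -> r
throwTo kSaved v _kCurrent = kSaved v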

(As an aside--are there any languages out there, in particular ones with significant bodies of code, which permit jumping into a scope where a valid environment doesn't exist, or isn't manufactured upon entry?)

Assemblers and machine languages--the granddaddy of the GOTO. Of course, what a "valid environment" means in that context is up to the program, so in a strict sense the question doesn't apply.

the existence of non-local transfers in a program makes it harder to understand--especially when those transfers occur in a fashion which bypass normal abstraction boundaries.

That depends on how they're used. Many of the classic uses of continuations (first-class or implicit) make it easier to reason about programs: threads, exceptions, etc.

As an aside...

what is the reason that Scheme only manufactures continuations as part of call/cc--an operation which requires a function that receives the continuation as an argument?

Given that the argument of call/cc is oftentimes just a lambda that saves the continuation away in a local variable--why not just have a "cc" special form that simply generates and returns the continuation, with which the programmer can do what s/he wishes?

LtU variable binder

Because lambda is the ultimate variable binder, at least in Scheme.

The goal is to bind a variable to a continuation so that it can be referenced later. Special binding forms in Scheme are traditionally just syntactic sugar for uses of LAMBDA.

Call/cc is definitely intended as a primitive whose use is supposed to be encapsulated in procedures and/or macros, which is one reason its official name (up until R6RS) was call-with-current-continuation.

letcc

It is possible to define a letcc binding construct, and use it in place of call/cc, of course.

Lambda, the Ultimate GOTO

Given that the argument of call/cc is oftentimes just a lambda that saves the continuation away in a local variable--why not just have a "cc" special form that simply generates and returns the continuation, with which the programmer can do what s/he wishes?

That's an easy question. If call/cc returned the current continuation to the caller, to what location would that continuation return a value when it was invoked? (A useful location, that is...)

And yes, real programmers still use GOTOs. They just go by the politically correct term: tail calls.
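For instance, a rough sketch of the "tail calls are GOTOs" view, in Haskell for concreteness (stateA and stateB are invented names): two mutually tail-recursive functions behave like two labelled blocks, and each tail call is a jump that also carries the loop state along with it.

-- A tiny state machine written with tail calls only; a decent compiler turns
-- each call below into a jump, so stateA and stateB act like labels.
stateA, stateB :: Int -> Int -> Int
stateA acc 0 = acc
stateA acc n = stateB (acc + 1) (n - 1)   -- "GOTO stateB"
stateB acc 0 = acc
stateB acc n = stateA (acc + 2) (n - 1)   -- "GOTO stateA"

-- e.g. stateA 0 4 ==> 6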

The BASIC dialects of the

The BASIC dialects of the 80s encouraged goto-infested spaghetti code.

Probably more like the BASICs from the 70s, rather than the 80s. The BBC released their computer along with its version of BASIC in 1981. As well as the old-style GOSUB and GOTO (and computed variants), it also supported named procedures and functions with local variables and recursion. In particular, the call stack grew downwards from just below the bottom of the video memory. The highest byte of the video memory was at a fixed place. This meant that you could not change video mode from within a function or procedure.

When the BBC Master was released in 1986, GOTO and GOSUB took another hit because the Master's built in text editor did not keep track of line numbers.

Recursion vs some Iteration constructs

If I use my own historical development as a guide, I would argue that the success of iteration constructs in procedural languages over doing everything with recursion is mostly a result of (a) the fact that this way of thinking is much closer to machine instruction sets (JUMP and JUMP on condition), and (b) that it must be conceptually simpler, since algorithms were probably always described (before computers) in those terms. (It is difficult to visualise an algorithm for long division to be written in terms of recursion, and I was not taught it in those terms in school). While a Lisp system would certainly have been possible on the earliest microcomputers (around 1980), I do not recall these being widely available or known. All that you had was machine code or BASIC. Perhaps the realisation that JUMP and tail recursion are similar arrived too late to have sufficient impact.

Many early languages

did not support re-entrant functions; early dialects of FORTRAN being one example. Some languages popular on the 8-bit micros of the late 70s and early 80s (for example, ACTION!, a PL/1-ish block-structured procedural language for the Atari 8-bit) also didn't have re-entrant functions, no doubt due to the limited stack available on the 6502 architecture (256 bytes). ACTION! didn't support recursion anyway, as the procedure call graph was constrained to be a DAG by means of a one-pass compiler with no support for forward declarations... but you get the picture.

Manual garbage collection

Scott mentioned one problem below; the other is memory. Think of a simple recursive routine like:

sum2 x = _sum2 0 x
  where
    _sum2 a (b:bs) = _sum2 (a+b) bs
    _sum2 a []     = a

Now do it with manual garbage collection--you can't, at least not reasonably. So you naturally want to use a loop and keep a running total, which is what the compiler does anyway. So far, not too bad, but...

sum3 x = _sum3 0 x
  where
    _sum3 a bz@(b:bs) = if b < 10
                          then _sum3 (a+b) bs
                          else (a, bz)
    _sum3 a []        = (a, [])

requires better memory handling. Otherwise the programmer needs to handle these two situations in two totally different ways from a code perspective.

That's why in C, for example, it would make sense in both cases to just move the array pointer, so that the two cases look similar, the way they do in the Haskell above.

Recursion and history

[iteration] must be conceptually simpler [than recursion], since algorithms were probably always described (before computers) in those [iterative] terms

Nope. Recursion has a deeper and longer history than computers. See all the work on set theory, for instance. Closely related is the inductive form of proofs.

There is perhaps some sense in which iteration is conceptually simpler, but I have a hard time seeing it. I can casually rattle off a functional, recursive quicksort in many languages, but I really have to think VERY hard to get an iterative, in-place version right.
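For what it's worth, the recursive version really is a couple of lines in, say, Haskell (not in-place and not efficient, just a sketch):

-- naive recursive quicksort: partition around the head, recurse on both sides
qsort :: Ord a => [a] -> [a]
qsort []     = []
qsort (p:xs) = qsort [x | x <- xs, x < p] ++ [p] ++ qsort [x | x <- xs, x >= p]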

It is difficult to visualise an algorithm for long division to be written in terms of recursion, and I was not taught it in those terms in school

Hmmm... my recollection of long division is of a recursive algorithm. Are we talking about the same thing? To do long division you find a digit, multiply, find the remainder, then do a (recursive!) long division on the remainder. The base case depends on integer vs decimal math. An accumulator for digits lets you make it tail recursive.
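A sketch of that recursive schoolbook algorithm, in Haskell for concreteness (longDiv is an invented name; it produces integer quotient digits only, assuming a non-negative dividend and positive divisor):

import Data.Char (digitToInt)

-- Schoolbook long division: bring down one digit of the dividend, find one
-- quotient digit, then recurse on the remainder. The accumulator of digits
-- found so far makes the helper tail recursive.
longDiv :: Integer -> Integer -> [Integer]
longDiv n d = go (map (toInteger . digitToInt) (show n)) 0 []
  where
    go []     _ acc = reverse acc          -- base case: dividend exhausted
    go (x:xs) r acc =
      let r' = 10 * r + x                  -- bring down the next digit
          q  = r' `div` d                  -- next quotient digit
      in  go xs (r' - q * d) (q : acc)     -- recurse on the remainder

-- e.g. longDiv 1234 7 ==> [0,1,7,6]   (i.e. 176, remainder discarded)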

While a Lisp system would certainly have been possible on the earliest microcomputers (around 1980), I do not recall these being widely available or known

The very earliest microcomputers barely had enough memory or horsepower to let you sneeze. But shortly thereafter, microcomputers got plenty big enough to deal with C and Pascal. They handled recursion just fine. The issue on those small boxen was that they didn't have the memory or horsepower to deal with garbage collection for any decently sized program. Without garbage collection, you don't do Lisp or Scheme or ML or any of the other fine functional languages that existed at the time.

See Small-C.

Products

So is there a time line somewhere of MS's language products?