Open thread: RIP Steve Jobs

Steve Jobs (1955 - 2011) had a profound influence on the computing world. As others discuss his many contributions and accomplishments, I think it is appropriate that we discuss how these affected programming, and consequently programming languages. Bringing to life some of the ideas of the Mother of All Demos, Jobs had a hand in making event loops standard programming fare, and he was there when Apple and NeXT pushed languages such as Objective-C and Dylan, along with various software frameworks, and decided to cease supporting others. Some of these were more successful than others, and I am sure members have views on their technical merits. This thread is for discussing Jobs -- from the perspective of programming languages and technologies.

Update:

Eric Schmidt on Jobs and OOP

Stephen Wolfram on Jobs and Mathematica

The iPhone mandate decision

My favorite SJ quotes below.

My favorite SJ quotes below.

"Your time is limited, so don't waste it living someone else's life. Don't be trapped by dogma -- which is living with the results of other people's thinking. Don't let the noise of others' opinions drown out your own inner voice."

"Here's to the crazy ones, the misfits, the rebels, the troublemakers, the round pegs in the square holes... the ones who see things differently -- they're not fond of rules... You can quote them, disagree with them, glorify or vilify them, but the only thing you can't do is ignore them because they change things... they push the human race forward, and while some may see them as the crazy ones, we see genius, because the ones who are crazy enough to think that they can change the world, are the ones who do."

And the connection...

And the connection...

He has been hugely

He has been hugely inspirational to technology in general; he was not a technical genius, but he understood something fundamental about innovation and its connection to human nature and society. None of that is specific to PL, but we can definitely learn from it and harness it.

This has been a very emotional morning for me.

Well, I think he WAS a technical genius....

I see the comment, "he was not a technical genius but...."

I only met him once, but I knew a lot of his contemporaries: Bob Noyce, Gordon Moore, Steve Wozniak, Allen Baum, a bunch of others.

Jobs was by all indications as deeply immersed in technology as nearly any of these others I ever met. Maybe he didn't know submicron lithography as well as some of his employees, maybe he didn't know injection molding as well as some others, maybe he didn't know Objective-C as well as some others, but he surely knew a lot about a lot.

I would certainly class Jobs as one of the geniuses I have met or worked with. And I was lucky enough to work with or meet a bunch of them. I was more on the physics, chips, and materials side than on the Comp. Sci. side, even though CS is more my current interest. But I have a pretty fair idea of who the really, really bright people are. And Steve Jobs was right up there with Feynman, Moore, etc. In a different way, as they all are, of course.

--Tim May

A genius need not be a

A genius need not be a technical one.

The lesson of Jobs/Apple may

The lesson of Jobs/Apple may be ambivalent. You counter inertia and network effects by producing your own lock-ins. This is not for everyone, and I do think it is harmful with respect to programming. Apple was surely not innovative here and did not set the standard; it was extremely cautious and restrictive, and Jobs was no "developers, developers, ..." madman. The idea that restrictions on programming tools are already some sort of quality control is a flawed one and an example of managerial thinking. However, if there is any element of the "Apple way" I expect to be imitated in cargo-cultish fashion by other companies, beyond the heroism of the genius leader, it is this.

One also has to be careful when a control freak openly admires "the crazy ones, the misfits, the rebels, the troublemakers, the round pegs in the square holes..." and there is no possible subject in his universe that fills these roles other than himself. Of course Jobs is also a good argument for capitalism, for the self-determined entrepreneur who wants to realize his vision and becomes super rich and successful and all that. I suspect it won't help much anymore. It has become too much of an exception, and I guess the majority of young guys don't buy the whole idolatry around the Apple cult and its dead leader, despite loving their Macs, but I may be wrong.

You counter inertia with

You counter inertia with products people love, but you make the compromises necessary to get those great experiences. One of the compromises that most of us hate is the locked-down platform, but it has done wonders to protect against viruses and reduce piracy (so developers can get paid and make those apps consumers love so much). Yes, I hate that the most interesting platform out there is closed to us, but I totally understand why they did that. Rather than condemn Apple, why not think of solutions that can make all of us happy?

A person appears crazy if they hold a world view different from the norm; they aren't constrained by the same inhibitions and limitations. Often crazy is a negative... consider flat earthers, creationists, tea partiers. But sometimes that other guy is crazy in a good way: they believe in something great and work to achieve it, something that most don't believe in.

There was never an Apple cult; people buy Macs and iPhones because they are the best you can get these days. If something comes out tomorrow from Sony or Samsung that is better, they will switch quickly; brand loyalty is weak in tech. The Apple cult myth was invented by lazy people trying to make excuses for their inferior solutions... that consumers are just irrational sheep. As a researcher, never use irrationality to explain trends; you will almost always be wrong.

There was never an Apple

There was never an Apple cult; people buy Macs and iPhones because they are the best you can get these days.

One author of the online version of the German news magazine Der Spiegel calls Jobs the iGod, and another seriously considers him the greatest philosopher of our time. It's outright bizarre that this happens to a magazine that was once committed to enlightenment and critical thinking and infamous for its sarcasm. I do not have to look for "irrational sheep" among consumers and their will to gadget, since I find them among obsequious "journalists" and other opinion leaders. Neither I nor paid Apple enemies (surely there are no other kind) invented them.

Journalists write

Journalists write sensational stories to attract eyeballs. Nothing new about that; it doesn't say much about society or real people.

I'm not really sure why Kay

I'm not really sure why Kay has such negative feelings about Steve Jobs, but I have a feeling that the same phenomenon applies to some posts here.

Well...

...someone banning new programming languages from their platform just might tend to create negative impressions of them among people interested in programming languages. It's a crazy theory, I know. ;)

Live in denial like I do. :)

Live in denial like I do. :) Seriously tho, they can't stop you from developing for the platform, just selling your apps through their store. Given how unprofitable that endeavor is for about 95% of all iPhone developers, they might be doing you a favor. And, it won't work.

In the natural order of

In the natural order of priorities, the mainstream media reports on auctions for Steve's black turtleneck ($175) for people who want to look like him, whereas the death of Dennis Ritchie makes it into some online tech magazines.

People will tend to be

People will tend to be interested in (or mourn) someone's death if they have a connection with them. This is not disrespect; actually, at that point it is hardly important to those who are really impacted (e.g., family) whether one is or is not the top news story.

After I go, I couldn't care less who cares.

overstated

There really aren't that many restrictions on programming tools.

The developer SDK is $99. That eliminates the big restriction about installing software. Use whatever tools you want and create provisioning files for yourself.

For people who like the iPhone/iPad and don't like Apple's restrictions, the Enterprise SDK is $299. With it, you set up the servers and provide support services to those devices yourself.

Yes, the Apple environment is more locked in. But it is far less restrictive than what is true for feature phones via most carriers.

I cannot build a programming

I cannot build a programming environment for programming ON the iPad. That makes me sad.

On the iPad

Let me start by saying that in practice I don't know any people who use their BlackBerry or Android phones to compile software. I think the issue is more theoretical than practical. My car's braking system has all sorts of computer chips and I can't modify that software either. Apple provides a less restrictive environment for their tools than most other consumer devices.

iOS devices are not self contained. Part of the security protocol is that the operating system is not self modifying. Ultimately that's how malware and viruses are stopped dead in their tracks. The ability for code to create and run other code is a huge negative for the vast majority of end users, so having this turned off by default is probably a good move. But Apple makes it possible and quite easy to avoid if you are someone who is going to benefit from the ability to code.

For example, assuming you are involved with a university, you could use the university SDK, which would be free for you. From that you could easily create and load a programming environment for yourself on the iPhone/iPad, an interpreter. And you could distribute that to a small number of people you know, even some who aren't themselves involved in a university. The only thing Apple won't allow is for you to distribute it widely.

Taking it even further, using the University SDK you could create a compiler and linker (again not widely distributed), on devices managed by Apple. What you wouldn't be able to do is run the code from the compile / link cycle. And that's because if they let you actually run the code, then you have something that is structurally no different than a virus.

And again what I said above still applies. If you don't like Apple's policies, you can just point at different support servers which have different policies and then do whatever you want. Apple makes the software to set up these support servers cheaply available ($299). There are corporations that are self supporting using the Enterprise SDK.

Absolutely it is the case that iOS is a more restrictive computing environment than people are used to for general computers. Apple is trying to create an environment where people get the advantages of managed (like my car's braking system) and the advantages of unmanaged (like Windows). They want to avoid the disadvantages of managed (little software) and the disadvantages of unmanaged (computers that don't work right from the end user's perspective). The restrictions are complex, but I don't see the advantage in overstating the case regarding restrictions.

My car's braking system has

My car's braking system has all sorts of computer chips and I can't modify that software either.

Your brakes aren't general purpose computing devices. Tablets and phones are so ubiquitous they present a compelling educational platform. Intentionally crippling that platform so the only subject you can't teach is programming is not a good thing.

Educational programming on iPad

Not my intention to fan the flames, but you can certainly use PythonMath, Lua Console or Javascript-1 on an iPad for educational purposes. They are all programming environments for iOS.

As far as I can tell,

I think you're reading that too literally

They're interested in outlawing an interpretive layer in ordinary apps, ostensibly because native code makes for a faster, lower power, better looking experience. If your app is an interpreter, I doubt they're going to ban it for interpreting code.

in practice

They have banned interpreters, for example the Commodore 64 emulator -- and this was not a license problem; the license was fully legit. That wasn't an experience issue; they didn't like the introduction of an entirely new licensing model.

Gambit REPL (an iPhone Scheme) is a clear-cut example of an interpreter sold as an interpreter that they are allowing, but they are keeping their heads down. On the other hand, the iPhone calculator I use (ND1) has 3 interpreters in it. I'd bet you dimes to dollars, though, that if people started releasing content for the iPhone in RPL and recommended ND1 as the interpretive layer, ND1 would get pulled from the application store despite the fact that its primary purpose is a calculator, not a programming language.

Basically my point above was that Scratch was in clear violation of the SDK license, and what happened to Scratch had nothing to do with deeper or more sinister ideological intentions. I think we should discuss the actual policies that exist, not make up ones that don't; and do so in a charitable way.

Collateral damage. It

Collateral damage. It doesn't matter what their intentions were, the results still exist. Yes, they didn't mean it.

teaching programming

I can't think of anything that would stop you from teaching programming.

There is Gambit Scheme, there is miSoft's basic, gscript basic runs inside Safari... I could probably find 100 languages without much trouble today. Squeak is making its way through the approvals process.

Moreover if schools want to provide iPads then they should be using the Enterprise SDK in which case Apple's restrictions don't apply at all. They can install whatever software they want.

____

But really the issue as I see it is "general purpose computing devices". I think this is the core argument. I see the iPad/iPod/iPhone as secondary devices designed for households that already have a computer (and generally more than one). They don't even come with a keyboard.

Given that Apple (to the best of my knowledge) has
a) never sold them in this capacity
b) up until recently made computer ownership mandatory to even get past the setup screen on them
c) doesn't provide support for them as primary devices
d) continues to ship mini versions of their own software (Numbers, Keynote, Pages, Logic...) for these devices

What evidence do you have that they are intended as general purpose computing devices?

It doesn't matter what

It doesn't matter what Apple's intent is with regards to the device. The fact is that it is a general purpose computing device. This cuts to the heart of a common conflict between companies that seek to dictate what their users can do with devices they purchase and thus own, vs. users who don't want to be so restricted. Now this conflict has touched on programming languages, making it relevant to LtU.

And sure, you can find any number of examples that passed inspection, but Apple can easily ban those examples at any future date if they feel like it and cite the ToS. This is analogous to governments passing legislation that violates civil liberties, and then saying, "oh, but we'll only violate civil liberties to catch bad guys". Who are the bad guys? Whoever the government/Apple says they are. It's a bad idea, no matter how you cut it.

It does matter

-- The fact is that it is a general purpose computing device.

You keep asserting this and I've asked for some evidence. What makes this no-keyboard, designed-to-be-externally-managed, locked-down-operating-system... device general purpose?

As a user there are no restrictions on what you do with your iPad/iPhone/iPod. Get the developer SDK and load whatever you want on it. The rhetoric you are using is simply false; the level of restrictions you claim is in place simply is not.

I agree Apple has passed draconian rules, and sometimes fails to enforce them, thus creating a situation where all software they distribute is at their whim. Absolutely Apple can refuse to distribute any piece of software. But Apple has made alternatives available. In particular, you keep ignoring the fact that Apple sells the Enterprise SDK. I'm not kidding here: you can set up your own servers and enforce or not enforce whatever rules you want for you and your friends. If you don't like Apple's rules, make your own.

To use your analogy... An Apple-managed iPad is like living in a dictatorial city in a free country. I understand why people may not like a dictatorship, even one with a rather wide open door policy, in theory. But in practice the alternative, for most end users, has been a miserable anarchy. Anarchy is government by bandits. If you don't like Apple city there are dozens of other cities.

The reason that Apple has such a strong hold is that it doesn't allow exactly what happened with Scratch.

The claim is that Scratch is going to be a major software transport and delivery platform for millions of children to exchange software.
a) MIT labs didn't sponsor or oversee the effort.
b) No one is doing any kind of quality control on Scratch applications
c) Essentially none of these applications complied with Apple's HIG.
d) There was no sort of quality control.

.... If Scratch really should be the programming choice for hundreds of thousands or millions of children, then there should be things like a full-time (probably 16x5 at least) support staff, a telephone number so that when some 8-year-old calls Apple support with a problem they have someone to connect to. There need to be people who are active in Darwin making sure that this application is going to be fully functional from OS version to OS version, so that when Apple releases an OS patch they don't get calls from 8-year-olds mad that their Scratch apps broke.

If Scratch wants to be a programming choice for dozens of children then don't use the app store and just distribute directly to them.

You keep asserting this and

You keep asserting this and I've asked for some evidence. What makes this no-keyboard, designed-to-be-externally-managed, locked-down-operating-system... device general purpose?

I disagree. Many people will never connect it to their computer, and use it purely as a standalone device.

The reason it's a general purpose device should be obvious: there is no technical way for Apple to enforce its policy. That's why it resorts to manual review, and why so many policy violations squeak through. How is that not a general purpose device?

Finally, there's an invalid assumption in your objection that any general purpose computing device must resemble our current general purpose computing devices. Just because the iPad does not have a keyboard and lacks some of the features found on desktops does not make it any less general purpose.

the level of restrictions you are claiming are in place simply are not.

I agree that the technical barriers are easily circumvented. It's a question of legality and attitude towards customers. Sure you can bypass the restrictions and void your warranty, but why should you have to? I have no opinion on Scratch specifically; it's just the general attitude I find problematic.

When you have schools giving their students iPads, it seems like a bad idea to have a blanket policy that programming on your iPad is forbidden. And for all you know, programming on the iPad could lead to a revolution in visual programming languages, and the only thing holding that back is this draconian policy.

Furthermore, educational software is not so lucrative that its developers can afford the Enterprise SDK and the cost of managing their own servers. Why should 'free' be penalized? There are applications that don't care about Apple's opinion of quality control, so why should they and the people who want to use that software be penalized because they don't share Apple's vision?

There are many legitimate objections to Apple's policy. I understand the reasons behind the policy, but it's also the reason I switched to Android.

Apple policy

When you have schools giving their students iPads, it seems like a bad idea to have a blanket policy that programming on your iPad is forbidden.

Read the article you linked to. The school is handling the OS upgrades, i.e. these are managed iPads on the Enterprise SDK. The school sets whatever blanket policies it wants.

The reason it's a general purpose device should be obvious: there is no technical way for Apple to enforce its policy.

Actually, there are many ways. Don't forget many of those systems are on carrier contracts, and the carriers enforce manufacturer policies all the time. You don't do what you are supposed to and the device stops being able to call and/or get data. Further, up until a year ago you couldn't get the device out of startup mode without a desktop computer.

At this point that desktop management system is being optionally replaced by Apple servers, but it is still absolutely obligatory from the device's perspective. So yes, they can enforce the policy technically quite easily. And moreover, for all practical purposes, they still do enforce this policy.

Finally, there's an invalid assumption in your objection that any general purpose computing device must resemble our current general purpose computing devices. Just because the iPad does not have a keyboard and lacks some of the features found on desktops does not make it any less general purpose.

Actually that is precisely what makes it less general purpose. Being less general purpose by definition means being less suitable for a wide range of activities. These devices are designed from the bottom up to be excellent at a narrow range of activities. That is in fact precisely why Apple was successful with tablets after everyone else had a decade of failure. Apple did not make their tablet general purpose. The iPad is a portable display mechanism, not a full-featured computer.

The iPod is a music player with some video and gaming features.
The iPhone is a cell phone that runs some software.

It's a question of legality and attitude towards customers. Sure you can bypass the restrictions and void your warranty, but why should you have to?

Throughout this thread I've talked about perfectly legal Apple supported ways of "bypassing" those restrictions. Though honestly it is more like following Apple's instructions on how to properly use their device. You don't need to jailbreak and void your warranty to do anything.

Why should 'free' be penalized?

Free isn't penalized; there is plenty of excellent free software on iOS. Poorly supported and not carefully considered is penalized. Let's say that Squeak were used by just 1% of 3rd graders, roughly 100k children per year. And let's say the average kid used it for 20 of the 180+ days of school, and on any given day 1% of those kids had a problem. That's being very generous, and I'm still at about 110 support calls every school day. Who is fielding them?

What Apple will not tolerate is an answer of "no one". It is ridiculous to talk about Squeak as being vital to education and not be able to answer these sorts of basic questions about customer support. When Apple releases a developer preview, who is absolutely, unequivocally committed to making sure that when the patch goes live those 100k children's Squeak doesn't suddenly break?

Apple is stepping in to protect their customers from exactly the kinds of irresponsible behavior that would lead someone to release Squeak to schools without those sorts of mechanisms in place. There is a term for people who can deal with unsupported applications, have sufficient judgement, and make the appropriate modifications to get them to work on their system: "developers". And Apple allows developers to install whatever they want. And they allow the developer to agree to personally support whomever they want and have that person install their apps. What Apple doesn't allow is some developer releasing his application widely without support.

There are applications that don't care about Apple's opinion of quality control, so why should they and the people who want to use that software be penalized because they don't share Apple's vision?

If the end user doesn't share Apple's vision, why use their management service? Apple allows you to opt out, exercise the option. The fact is though that overwhelmingly end users like the Apple management service and that's what creates the argument.

As far as applications go, that's precisely what Apple does want. For far too long application developers have had the upper hand, being able to dictate to individual users who individually don't have the power to change application developer company behavior. Apple plays the role of a regulatory body. Application developers that don't share Apple's vision are free to go sell to Android or Windows customers. What they aren't free to do is ignore Apple's guidance while widely distributing their application to Apple's customers.

Apple, not the end user, makes the judgements. Apple decides that people who want high-end programmable calculators can probably self-support on an interpreter and thus allows ND1 to offer 3 of them, while 8-year-olds can't, and thus Squeak gets kicked to the curb until they think it through a little better. I think Dan Ingalls is a great guy, but that doesn't mean he has a support and quality infrastructure in place. And Apple is going to try and help him get one in place before he does move the software.

There are many legitimate objections to Apple's policy.

I agree. OTOH most of the people who complain about it, AFAICT, don't actually know what Apple's policies are.

This is getting a little

This is getting a little off-topic, so this will be my last reply.

Actually, there are many ways [to technically enforce policy]

There is no way to technically enforce the policies we have been discussing, except manually. I don't know why you're now discussing other policies.

Actually that is precisely what makes it less general purpose. Being less general purpose by definition means being less suitable for a wide range of activities.

By the same argument, desktop computers are less general purpose because they don't provide touch interfaces. The types of input/output a device has does not characterize whether it is general purpose. General purposeness is characterized by the types of computations that can be run by the user. Apple is trying to force the iOS devices to be special purpose via manually enforced policy, but they are not inherently special purpose devices.

These devices are designed from the bottom up to be excellent at a narrow range of activities. That is in fact precisely why Apple was successful with tablets after everyone else had a decade of failure. Apple did not make their tablet general purpose.

Being excellent at a narrow range of activities does not preclude it from being a general purpose device, it's just a general purpose device with excellent defaults and a good HMI. iOS is not special. It's just the Mac OS X kernel with a new UI. There's nothing special purpose about the device except Apple's policy of how someone ought to use it.

Free isn't penalized there is plenty of excellent free software on iOS.

You seem to have gone off on another tangent. This whole thread is about software that exposes a programming language to the user. Free software as a whole is really not penalized, but the subset of free software that would expose a programming language is penalized, because you have high costs and management overhead that make such software prohibitive. Such software is often shared by enthusiasts and power users who are not themselves iOS developers.

They have no recourse but to jailbreak their phones or pay prohibitive fees after already purchasing an expensive phone, just to use their phone the way they want.

Who is fielding them [support calls]? What Apple will not tolerate is an answer of "no one".

Sorry, that's plain ridiculous. You'll find thousands of apps that are not well supported. Apple doesn't do follow-up on quality of support (aside from limited refunds when apps don't work).

Their policy on exposing programming languages has nothing to do with support. If someone had a problem with the free app, they would just uninstall it. But I (almost) agree with you on one point: Apple is sacrificing power users at the altar of the average user. What people in this thread have been trying to tell you is that there are ways of achieving this without being totalitarian, which is unfortunately the direction Apple went.

Apple plays the role of a regulatory body.

Regulatory bodies have strict rules about transparency, discrimination, appeals, and so on which they must follow to the letter. How transparent is the submission and appeal process for apps? Utterly opaque, and policies are enforced on a whim. Apple is nothing like a regulatory body.

There is no need to try and canonize Apple. They've done good for the average user, but there is no question they have needlessly sacrificed a certain class of power user.

I agree this is offtopic for

I agree this is offtopic for this thread. If anyone wants to discuss it further, we can do it back in the appropriate thread.

This man's death...

What a loss!
As for me, I admire the man more than the company. He was a visionary: he could see the future of computing. He had enormous charisma and inspired many with his ideas and his life story. He was caring and wanted to work for the good, to have a positive influence on the world. He loved what he was doing.

His shortcomings, his tempers and excessive need for control, I did not experience them firsthand when I worked at Apple. But I believe they are more benign than the 'peculiarities' you can see in many of the top positions in this world. In the world of technology we like it when things are open, the next iteration is predictable, and contributions are welcome. But how much does it matter if one company does not play by these rules, even if it is an influential and very successful one? As long as you and I are there, doing our own bit, not that much. I don't fear Apple. It is just one among many.

On balance, Steve Jobs gave much more than he took away.

May he rest in peace; I wish his wisdom may stay with us a little longer.

How about programming

How about programming languages? Did he have any direct influence on what Apple did in this domain?

OOP

Like many a great man, he had a fondness for OOP:

Objects are like people. They’re living, breathing things that have knowledge inside them about how to do things and have memory inside them so they can remember things. And rather than interacting with them at a very low level, you interact with them at a very high level of abstraction, like we’re doing right here.

Eric Schmidt on Steve Jobs

Eric Schmidt on Steve Jobs

He was so passionate about object-oriented programming. He had this extraordinary depth. I have a PhD in this area, and he was so charismatic he could convince me of things I didn’t actually believe.

I should tell you this story. We’re in a meeting at NeXT, before Steve went back to Apple. I’ve got my chief scientist. After the meeting, we leave and try to unravel the argument to figure out where Steve was wrong—because he was obviously wrong. And we couldn’t do it. We’re standing in the parking lot. He sees us from his office, and he comes back out to argue with us some more. It was over a technical issue involving Objective C, a computer language. Why he would care about this was beyond me. I’ve never seen that kind of passion.

Quite likely...

Alan Kay was one of *his* inspirations, and I would think the Smalltalk-like extensions to C that are "Objective-C" are likely a reflection of that. (Only guessing though.) Also, given how deeply he got involved with what goes into a Mac, it also seems likely to me that HyperCard/HyperTalk (both names automatically case corrected themselves as I type this on my iPad!) had his approval, and their "plain English like" nature lives on in AppleScript.

I'm not sure whether he was even aware of Quartz Composer - the visual data flow programming environment that you can write screen savers for macosx in, among other fun things - but that is one of the less talked about "languages". Given its abstraction abilities, called "macro patches", it does qualify as a language in my book. QC compositions can be played in QuickTime Player and within Safari on the Mac, and I doubt that degree of penetration would slip past him unnoticed.

... and, if absolutely nothing else, he might've had something to do with the name "Dylan" :)

(apologies for this totally unresearched and speculative post)

Research: Objective C was

Research:

  • Objective C was done in the early 80s by Brad Cox and was licensed later by NeXT.
  • HyperCard started in 1985 and was released in 1987.
  • Dylan started in the early 90s and was killed in 95.
  • Applescript came out in 1993. It hasn't evolved much in this century but continues to be maintained (and they put a nice Automator UI on top of it).
  • Quartz Composer was basically brought over with Latour (pixel shop) but hasn't really gotten much attention (it's been eclipsed by vvvv).
  • Steve Jobs left Apple the first time in 1985.
  • Steve Jobs returns to Apple in 1997.

Safe to say Steve wasn't much of a PL guy.

Apple sponsorship of technology

Apple pays people to work on LLVM and clang, which is certainly important back-end technology for PL.

Apple supports some back-end

Apple supports some back-end technology, makes some improvements to ObjC to make it competitive, and they've probably built a nice JavaScript engine for their platform (though maybe not as nice as Google's). It's all very practical stuff; it's all very much within the extreme focus of what they want to achieve.

Nothing wrong with that, works great for them.

Is using, say, ObjC, just

Is using, say, ObjC, just path dependence?

Probably, but then so is

Probably, but then so is most technology. The nice thing about the 60s, 70s, and 80s was that there were many great choices, so you could risk playing with something new.

I think it is reasonable to

I think it is reasonable to include the PLs that were licensed and supported by him, because they end up making the original thing bigger than it might've otherwise been. About the only platforms that Objective-C, and therefore any semblance of Smalltalk, is huge on (economically speaking) are MacOSX and iOS.

... I suppose Postscript is another in the "supported" category (Display Postscript in NeXT?), though we'd hardly consider it a language today since nobody programs directly in it. You use PS every time you hit "print to PDF" though.

Thanks. I really should have

Thanks. I really should have listed Postscript. What can I say, my brain is not as sharp as it used to be... (And I should say that you are not a true geek if you never programmed in Postscript.)

Point taken, the genius of

Point taken, the genius of Steve Jobs has always been appropriating the right technology at the right time in the right way.

I'm sure Javascript has more Smalltalk in it than ObjC. JS is bigger than ObjC at this point (economically speaking).

wikipedia has a detailed

Wikipedia has a detailed history of HyperCard, and says that Jobs was the one who decided to pull the plug (which makes sense given when this happened).

Ironically...

It appears Apple's golden days in PL occurred mostly between the time Steve left in '85 and his return in '97. HyperCard, Dylan, NewtonScript, AppleScript, SK8, Squeak, Fabrik, etc... all were done in that time frame. Apple used to do a lot of stuff with PL and now... incremental improvements to Objective-C. After Steve came back, they became incredibly focused on their consumer products, and had little to do with developing better PL technology. They even shook off Java for what is essentially a monoculture of ObjC and C++.

According to Wikipedia, NeXT licensed Objective-C from a company created in the early 80s by Brad Cox.

That was my impression too.

That was my impression too. Which is why I also asked about indirect influences.

HTML

I'm not sure if you want to consider it a programming language, but Nexus (the father of Mosaic) was invented on a NeXT, so HTML.

I'd say, though, that object-oriented programming is his big influence. They are still today an Objective-C shop. The paradigm for GUI programming and its ties with object orientation I think you can trace to Steve Jobs. So some influence on Windows C++ use and thus Java and C#.

Computing as a human experience

In terms of programming tools, if only Jobs had a native fondness and curiosity for the human experience of computing as it relates to programming, to programmers, to the degree that he did for device-centric personal computing, then we'd be living in an age of much less complexity for programmers.

As it turns out, his native genius was consumed with and focused on how humans interact with computing devices, beginning with the device itself and the software that makes it useful for personal computing, for people, users.

His was the art of making computing a human and personal experience.

Personal computing has lost a driving and creative force. Fortunately, his vision will continue to propel personal computing forward. Very few people have this level of impact on such a large scale. Steve Jobs is dead, but his vision lives on.

C

We need a Steve Jobs for

We need a Steve Jobs for developers (or more if they exist already) to push us to the next level (incidentally, if I find that person I would be willing to work for him for free). That's why I find him incredibly inspirational even if he didn't have any direct influence on our field.

Who comes closest to this

Who comes closest to this ideal?

Mostly we are more Wozniak

Mostly we are more Wozniak than Jobs.

But if you look at some of the less academic programming language designers out there, they have the right qualities: they focus on the experience and not the tech, have philosophies oriented toward people, and are very strong about getting what they think is right. Think Matz and Guido, among others. We sometimes look down at these languages because they lack the advanced tech features that other (less popular) languages have, but we miss the point: the languages are successful because of the human-focused experiences they provide.

Elegance

I figured these are the people you had in mind. What about Kay? Papert (LOGO)? But all of these great folks are "one hit wonders." Jobs had a string of design successes that took over the world.

I don't really agree with the claim that we are more Wozniaks (not that I see that as anything but a compliment...) Language design typically involves a lot of attention to elegance. That's certainly a "human factor." Of course the humans involved are technical, and the goal is to provide precise and expressive semantics. Given that, I raise you Iverson, with "notation as a tool for thought."

Wozniak sees beauty in

Wozniak sees beauty in technology; that is his passion; he is a geek's geek. Some people see beauty in math. Some people see beauty in human experiences. Then there are many kinds of elegance, not just the human-factors kind. So yes, PLDs care about elegance, but no, it's not always (or even often) the human-factors elegance. Elegance in any of these forms is useful (e.g., semantic elegance), but only indirectly to the human experience (great semantics aid great human experiences, but are not necessary or sufficient).

Here is where we diverge.

Here is where we diverge.

Probably. I realize my

Probably. I realize my opinion about this is controversial, but it is hard to deny the effect of "worse is better." Nice discussion though.

Oh, I think your view is the

Oh, I think your view is the more common one! I just think programming languages are primarily for programmers, and I find that more often than not we are human (contrary to public perception!).

Case in point: I think my response to the verbosity and repetitiveness of Java is totally a human factors issue (I am pretty sure compilers don't mind). The fact that the semantics make it hard for humans to reason is also a human factors issue. The fact that the semantics are not as powerful as they could be is also more of a human issue, since it makes the human task of programming harder and less pleasant. The inelegant typography and unnecessary bold fonts in Deitel and Deitel make Java even worse for students, again for totally human reasons.

Sometimes you don't

Sometimes you don't understand the problem well enough to do great semantics, but need to create a solution. The solution can actually be successful (if not optimal) with poor semantics, and gives you experience to understand the problem and solution better, and to improve. If you try to do the "right thing" at the very beginning, you might wind up with nothing at all. Lots of things in Java look bad only in retrospect.

Sure. But I don't think this

Sure. But I don't think this implies that elegance and a design sense do not play a role. I don't buy the notion that language design is primarily about formal semantics; I think it rarely is. Having a nose for good syntax and expressive constructs surely plays a role. Don't let the denials by designers fool you; they tell more about their peer group and super-ego than about what is actually going on!

We are talking about

We are talking about semantic elegance right? It does play a role, just not a core role. It is definitely useful but if you don't have it, the game isn't over. I'm not taking either extreme position of "always required" nor "never useful," just "sometimes useful."

I am actually thinking about

I am actually thinking about elegance on all levels, including syntactic.

Then yes, a successful

Then yes, a successful language requires lots of elegance in many forms. Not just semantic or syntactic elegance: they need elegance in their libraries, tooling, UI, even philosophy. Greatness will lead to a great programmer experience, but feel free to skimp in any area if necessary (lots of compromises; embrace Gabriel's worse is better).

I know this isn't a satisfying answer, it doesn't provide a very clear roadmap to being great. I would argue that the clear roadmap doesn't really exist.

And again we converge...

And again we converge... Agreed on all points.

Sadly I feel there is little

Sadly I feel there is little correlation between the elegance of a language and its success. If anything, it seems that language success is more closely akin to the success of a fashion fad (e.g. bell-bottom jeans) than anything else. All it takes is for the general mindset that a language is "cool" to set in, add a viral multiplier, and watch the epidemic spread.

Now I did say "little" correlation, because there is some. If a language is "cool" because it has elegant solutions to practical problems, that might be one stimulus. But the other stimuli are far more powerful.

If 75% of the engineers in your office are using language X because it's the new hotness, what are the odds you will be?

What gives me hope that programming languages do eventually make progress on some elegance criteria (as opposed to fashion, which wanders aimlessly from V-necks to turtlenecks and back arbitrarily in some detached false reality of unmeasurable taste) is a general "rising of the tide", an increasing level of expectations. E.g. nowadays a statically typed programming language must at least be sound (i.e. the type system actually makes some guarantees about runtime behavior). To offer an unsound static type system would be basically suicide today. And nowadays well-known techniques for describing the type system formally and proving its soundness exist. Thus the "bar" is much higher for any new offering, successful or otherwise. Even if we were to pessimistically assume a completely random selection for the most successful language of an era, the raising of the bar ensures the winner will at least be incrementally better than what was acceptable in the previous era.

So how do we raise the bar? I say, we relentlessly keep designing new languages until we see unacceptable failure patterns (and clear success patterns). When the community sees the obvious failure patterns and refuses to accept languages that have them, progress occurs.

Soundness?

To offer a unsound static type system would be basically suicide today.

Funny you should say that, aren't you at Google, too? ;)

Seriously, recent mainstream languages are still littered with null pointer exceptions, unsound subtyping rules, NoSuchMethodErrors. Java is approaching 20, but I cannot see much progress having happened since then. Sure, they don't segfault anymore, but when I hear soundness I expect more.

Hi Andreas. Java is hardly a

Hi Andreas. Java is hardly the best static type system you could find, but at least they tried. Better examples are ML and its offspring. Haskell. C#. These languages are designed by the A team of PL, which really raised the bar. The fact that the B team who designed Java is even trying to imitate the A team is a sign of progress, a big leap ahead from the languages designed before formal type systems.

P.S. A static type system guarantees that a well-typed program does not go "wrong". We define a type system and a semantics in the form of evaluation rules. If the semantics includes evaluation rules like dynamic exceptions, there is no contradiction; there is no violation of safety. If NullPointerException, ArrayIndexOutOfBoundsException, ArrayStoreException, NoSuchMethodException are part of the defined semantics, then I wouldn't consider such a type system "unsound"; it just doesn't statically guarantee freedom from every undesirable program behavior--no static type system can, after all.
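
To make that concrete, here is a minimal illustrative sketch (the class name and details are made up for illustration, not taken from the post above): a program that Java's type checker accepts without complaint, yet which throws a NullPointerException when run. Because that exception is one of the defined outcomes in the evaluation rules, the program does not "go wrong" in the formal soundness sense.

    // Well-typed by Java's static type system, yet it throws a
    // NullPointerException at runtime. The exception is part of the
    // language's defined semantics, so type soundness is not violated;
    // the type system simply does not rule this behavior out statically.
    public class SoundButUnhelpful {
        static String greeting() {
            return null; // legal: null inhabits every reference type
        }

        public static void main(String[] args) {
            String s = greeting();            // static type: String
            System.out.println(s.length());   // defined outcome: NullPointerException
        }
    }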

B-team?

I don't think you can write off Java as a "B-team" language. Relative to that, almost every modern mainstream language would be a D-team language.

C#, having the advantage of hindsight, is a little better than Java, but only slightly. ML and Haskell I didn't count, because (unfortunately) they are not mainstream. Both of them are also older than Java, so could I imply that we are regressing? No, I think "academic" languages have to be considered separately -- obviously, language researchers are far ahead and care more about these things.

Re soundness: sure, that is the purely technical definition. But you can hide just about any crap behind that. Or even untyped languages. A language like Java is memory-safe, but still not type-safe in any interesting way. Bob Harper once argued that even C is sound wrt its own crappy semantics. I don't quite agree with that, but it shows that if interpreted in a too narrow technical sense, soundness becomes rather meaningless. If that's all you meant then okay, but I'm afraid I have higher expectations when I hear somebody dropping the word.

Lazy thinking

Again this is just lazy thinking: a language is popular for reasons you don't understand, so you call it fashion or a fad. That people are willing to accept languages that aren't perfect isn't weird at all, as there is no perfect language. On the other hand, the reason one's new language isn't gaining traction is due to staid thinking or irresistible momentum. This is also lazy thinking.

We raise the bar by not deluding ourselves with our own bias.

Hi Sean. Your post was

Hi Sean. Your post was unusually trollish. I'm not sure why you went there.

Apparently I am engaging in "lazy" thinking for recognizing a more complex situation than I or anyone else understands and offering only a few high order bits. The truth is that the popularity--and tenacity--of some languages is mostly a mystery. Why isn't Latin the only spoken language in Europe, or why isn't there a single best haircut? After all, Latin is a far more regular language than almost any spoken today (even the ones it gave rise to) and had the power of the Roman Empire behind it. As for haircuts, go back 20 years and try to explain MC Hammer. Seriously. You think fads in fashion are rooted in scientifically measurable properties of style and that their change over time is somehow raising the bar--whatever that means?

Or maybe you think that fads in fashion and hairstyles and the seemingly random wandering of style have nothing to do with we engineers and scientists, who are immune to such human factors as what is "cool" and what is "in"? If so, then I think you might want to reconsider your own bias.

I know, it's scary to consider that you actually have no idea why things actually become popular. Sure, there are some low order bits like how nice it is at making jam or how it looks next to your microwave, but the fact is that the problem is actually too complex, and humans are too fickle and unpredictable to put into a computational matrix which will predict the formula for your language's success. For to accept such is to live in terror that your ideas will be discarded without your control. To realize that you have only 10% influence over how well your language will do. How frightening!

Externalism

To connect this in another way to the topic at hand, we see in this thread various examples of how contingencies beyond the quality of a language per se affect its adoption and survival. We tend to all too easily neglect such factors when we discuss the history of programming languages.

I see your point and

I see your point and apologize for my trollishness.

There are reasons behind every single point you make that go beyond fashion. Geographic isolation explains away Latin; different hair textures and cultures explain away the best haircut. Perhaps they are not scientifically measurable, but they are explainable. I don't think everything needs scientific measurement to be explained, and as anyone here knows I'm a big proponent of design.

Actually, there are plenty of ideas on why things become popular, but the analysis is not always amenable to the scientific method; sometimes it's more about design and design thinking, something that is very hard for us engineers to accept.

In Sean's defense, I believe

In Sean's defense, I believe he was the one who first pointed me to this paper: An empirical study of programming language trends. I've actually been thinking (and reading) about this issue -- and both sociologists and economists (and agricultural scientists, linguists, marketers, ...) actually have fairly good ideas of how and why things become popular. In particular, check out Rogers's "Diffusion of Innovations", one of the best books I've read, essentially a guided literature survey of a dominant perspective on the field.

If you're asking for an

If you're asking for an individual, I'd vote for Guido van Rossum.

Also a bow to the Haskell folks for an amazing combination of power and elegance of expression in language design - possibly the best to date.

Whenever I talk to someone

Whenever I talk to someone like Simon Peyton Jones, I'm absolutely floored by their brilliance and insight. However, their brilliance is in an area that is only loosely related to human factors.

The problem with Haskell is that its designers are very focused on mathematical elegance rather than human concerns. Actually, this is intentional and not really a problem to them; they are incredibly focused and successful in their goals. Haskell wasn't designed to be an everyman's language, and should not be judged as such.

closest

I'm going to give the unpopular answer and go with Bill Gates and the popularization of BASIC. While clearly Apple Basic did a lot, Bill Gates and GW Basic had a much larger influence. He kept it up through Q-Basic and then Visual Basic, which was the language that brought easy drag-and-drop programming to millions.

The move away from Basic as a scripting language for Windows has created a situation where millions don't have an easy way to get into programming. There is no children's / beginner's language that is essentially universal.

Stephen Wolfram

Stephen Wolfram recounts trying to talk to Jobs about computer languages. Didn't seem to fare well, especially before a date.

I used to see Steve Jobs with some regularity in those days. One time I went to see him in NeXT’s swank new offices in Redwood City. I particularly wanted to talk to him about Mathematica as a computer language. He always preferred user interfaces to languages, but he was trying to be helpful. The conversation was going on, but he said he couldn’t go to dinner, and actually he was quite distracted, because he was going out on a date that evening

The story seems plausible

The story seems plausible enough, but ask yourself, "how would I treat some tart who fancies himself the next Isaac Newton?"

He always seemed just another salesman CEO to me

To start a tech company, you need a techie (Woz) and a salesman (Jobs). Techies don't care about titles, so they get called anything. Salesmen do, so they get called "President" or "CEO". (The first company I worked for had this structure.)

I think this interview with Woz from Founders at Work pretty much nails it, more from what Woz doesn't say than what he does.

As for event loops, I hate inverted control with a passion.

As for event loops, I hate

As for event loops, I hate inverted control with a passion.

I hear you, brother. I think programming changed dramatically because of this, and not for the better. I think it made programming harder for hobbyists, made things like VB (originally) so inviting, and made applications look mysterious to newbie programmers. (Of course, then the web came and event-driven programming became unavoidable for a while.) I also think that, at the same time and for related reasons, the machines became less accessible, in the sense of bare-metal programming, and OSes and APIs became too prominent in our code.

In defense of event loops

I don't know what's bothering you about event loops. They're a natural style for many of the apps written in the 80s and 90s. A user clicks a button, something happens. Are there even any alternatives to event loops for these classical GUI apps?

Processing is a simple

Processing is a simple illusion that exposes only the body of an event loop to programmers. Designers and artists love it just because they don't see the loop or main, they don't have to define their own abstractions (like lambdas), and so on.
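
Roughly, the idea looks like this (a hypothetical sketch with made-up names, not Processing's actual API): the framework owns main and the loop, and the programmer supplies only the body it calls on each iteration.

    // Hypothetical sketch of a Processing-style framework: the user writes
    // only setup() and draw(); the framework owns main() and the loop.
    interface Sketch {
        void setup();           // run once, before the loop starts
        void draw(long frame);  // the loop body the user actually sees
    }

    class SketchRunner {
        static void run(Sketch s, int frames) {
            s.setup();
            for (long f = 0; f < frames; f++) {
                s.draw(f);  // a real framework would also poll input events and repaint here
            }
        }

        public static void main(String[] args) {
            run(new Sketch() {
                public void setup()          { System.out.println("ready"); }
                public void draw(long frame) { System.out.println("frame " + frame); }
            }, 3);
        }
    }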

Of course there are

You might as well ask if there is any alternative to event loops to read bytes from disk files or the network. Of course there are; the OS (or the C library) is a coroutine of your program, and you invoke it when your code needs input.
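
For what it's worth, here is a hedged sketch of that contrast (illustrative code, made-up names): in direct style your code calls the I/O layer and blocks when it wants input, while under inverted control a loop owns the program and pushes data into a handler you registered.

    import java.util.Scanner;
    import java.util.function.Consumer;

    class ControlStyles {
        // Direct style: the program drives; input is pulled on demand,
        // much like invoking a coroutine that yields the next line.
        static void directStyle(Scanner in) {
            System.out.print("name? ");
            String name = in.nextLine();        // we call and wait
            System.out.println("hello " + name);
        }

        // Inverted control: a (toy) event loop drives and pushes each line
        // to a handler we registered, as GUI frameworks do with events.
        static void invertedStyle(Scanner in, Consumer<String> onLine) {
            while (in.hasNextLine()) {          // the loop owns control flow
                onLine.accept(in.nextLine());   // it calls back into our code
            }
        }

        public static void main(String[] args) {
            Scanner in = new Scanner(System.in);
            directStyle(in);
            invertedStyle(in, line -> System.out.println("got: " + line));
        }
    }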

OS X, BSD, NeXTStep, Mach

(I've owned Macs since 1986, despite working for Intel at that time. Intel actually had no problem with our lab buying a Mac or two, as it was just a tool. Much as Motorola presumably bought IBM PCs and clones for specific purposes, despite the CPU. I decided not to buy a Mac when it first came out, January 1984. Three days after the intro, after playing with one at the old Computerland in Los Altos, CA, I instead put $2500 into AAPL stock. That could've bought all the Macs I've owned since then, with money left over. I was sufficiently impressed with the acquisition of NeXT and Jobs in late 1996 that in early 1997 I bought $16K worth of AAPL. I sold some at $200, bought back in at $120, sold again at $230, and have kept some all the way up to around $400. So, on average, this has yielded about 60-70 times what I put in. Thank you, Steve Jobs!)

I think the legacy for PL work was from NeXT's foundation in BSD Unix, the GNU tools, the Mach kernel, and that whole Sun-like approach. With some differences. I understand that programming tools from the NeXTStep world, like AppBuilder, were also important.

And then, when Apple was trying to decide what to replace its creaking, complex System 6, then System 7, with, they looked closely at Gassee's "Be" company and at Jobs's "NeXT." They picked NeXT.

In a sense, the Mac with OS X then became the "Sun-intosh," a Sun-like workstation "for the rest of us." Sorry for the awkward coining.

I can remember in the mid-90s a bunch of friends of mine had Suns at home, and I came pretty close to buying a Sun pizza box myself, as the creaky System 6 and System 7 releases were pretty darned crufty. But the Sun pizza boxes still required a fair amount of hand-holding. (Linux on x86 was another option, but was still in an early stage for end users as of the mid-90s. Most of my friends who had Sun boxes at home migrated to Linux. But, today, most of them use MacBook Pros most of the time.)

The new and powerful Macs with OS X brought to a consumer-purchasable machine a lot of the same tools that Sun users had, but with all of the old legacy apps through Rosetta and other code-redirecting tools. Some of the migration tools pre-dated Jobs' return, things like the 68K-to-PPC translation tools, but the whole process has been impressive. Now the tools are possibly about to hurt one of my other big investments, Intel. The tools to rewrite/recompile/etc. code from Intel CPUs to ARMs are making it possible for Apple to even consider replacing Intel CPUs with ARMs in laptops. ("Gulp!")

The success of this approach, and the popularity of Macs running OS X in research environments, made the GNU tools, and early releases of GHC, for instance, readily available on the Mac. No longer are Mac users the last to get new versions of a lot of interesting languages.

I think this is a legacy of Jobs and the NeXT purchase for programming tools that tends to get overlooked with all of the focus on the Apple consumer products. The original plan for the NeXT cube was that it would be a "scholar's workstation." That didn't quite work out. But in the past 10 years or so it certainly has.

--Tim

The Last American

Nice tribute. Phrased differently, in Onion style.

(Though I miss some things: the Mac, the iPod, the iPhone, Pixar, the first mouse-enabled PC, graphical and sound software.)

I was one of those

I was a NeXT user starting around '89 or '90 till '92. I switched to SunOS/Solaris at school and started using Linux as a secondary OS around '95-6, because it offered some of the features of Sun at a price I could afford for home. I loved my WindowMaker on Linux, and when OSX 10.1 came out I went over to Apple. So yes, you described me well.

-- One small correction. Rosetta was for the PPC -> Intel transition. It was the classic environment that was the bridge between OS9 and OSX.

The Flash decision? Its

The Flash decision? Its impact on the future of HTML5?

Java

Even better: look at how they basically buried Java in consumer software. I mean, Java for consumer software was already dead at that point, but they had the balls to bury it by not allowing it to go anywhere near iOS. I was pretty bummed at the time, but it looks like the right decision in retrospect (but then Android comes back with Dalvik, WP7 is based on Silverlight...).

Killing the floppy drive at the right time paved the way for USB and network-based storage; killing the CD drive now will do about the same. Post-97 Apple is pretty good about killing things to move the industry forward. The Flash decision is just another example of that.

They should get a medal for

They should get a medal for that.

Perhaps we should keep in

Perhaps we should keep in mind that at first, they effectively buried everything other than native Objective C. A medal for an attack on the ability of developers to choose the right tools for the job? I'm not sure that's a good idea.

This particular edict was a business decision, through and through. Apple wanted to segment the market and force developers to decide where their resources would go -- iOS or others. They succeeded.

I've never really understood anti-Flash fever. If you break down Flash into its component pieces (toolsets, compiled format, virtual machine), it's a pretty reasonable way to engineer a certain set of capabilities. HTML5 has a long way to go before it can do what Flash has been doing for years. And the power/resource issue? That's just implementation. There's nothing in the Flash standard that says that playing video should take a lot of CPU. You can argue that the implementations should have been better, earlier, but it's unfair to make the unspoken but assumed argument (as so many did) that Flash couldn't be efficient.

Since we don't have something like the JVM in the browser, what we're left with is a weakly specified hodge-podge of standards that results in a computational environment which is woefully underspecified compared with something like the JVM's execution specification. In practice, that means JavaScript-as-VM, which JavaScript was never intended to be.

This particular edict was a

This particular edict was a business decision, through and through. Apple wanted to segment the market and force developers to decide where their resources would go -- iOS or others. They succeeded.

While probably true, do we have concrete evidence for this?

Usually lock-in is achieved through frameworks and APIs. Doing it through a programming language is rather unusual (and not as effective, I'd think).

circumstantial evidence

While probably true, do we have concrete evidence [that Apple wanted to segment the market and force developers to decide where their resources would go -- iOS or others]?

Is the standard of evidence to be "preponderance", "beyond a shadow of a doubt", or "by confession"?

Apple's explanation at the time went something like this: User interface standards and standard behaviors of apps generally are very important to this platform: the user comes first. Apple had experience in the past wherein their native platform was used as a target for interpreters or translators of other languages. Reliably, applications written in those extra layers would not mesh smoothly with the native platform -- some details of the user experience did not conform with the UI standards. For the protection of the users, we don't allow that.

I hope I didn't put too many words in Apple's mouth there. That's what I recall them saying.

At the time - and still - mobile is shaping up into an incredibly segmented space for developers. Projects are frequently forced to choose their priorities, like iPhone first or Android first.

When that kind of fragmentation has occurred in the past (e.g., Windows vs. MacOS vs. GNU/Linux), markets reliably responded by building "cross-platform development toolkits". These provide some degree of "write once, run anywhere".

Much as Apple suggested: the history of cross-platform development toolkits shows that, inevitably, the applications generated could come close but rarely ever fully match the UI standards of each platform. Sure, the menus would look "Windowy" on Windows and "Macy" on MacOS but the consistency was shallow. The problem was compounded when significant changes or additions were made to the underlying platform: any apps built using the cross platform toolkits had to wait for and rely on the toolkit to "catch up" to the changes in the underlying platform.

Adobe is but one example of a company that, early on, tried to build a cross-platform compiler for iPhone. When Apple issued its edict, it was commonly believed that the timing was specifically driven by an attempt to shut down Adobe's effort:

http://www.computerworld.com/s/article/9175157/Apple_blocks_Adobe_s_iPhone_end_around_plans

Another historic source of cross-platform toolkits is the world of free software. Free software cross-platform toolkits benefit greatly from using open source development practices. Various contributors can each focus on the ports to various native platforms, with the shared cross-platform project accumulating their efforts. As a pleasant side effect, when the applications written on these toolkits are themselves free software, those applications don't depend on an API provided only by a proprietary system (i.e., don't directly use the iPhone APIs) but operate at a level of abstraction using an API that is portable even to all-free-software platforms.

Apple shut down those as well.

I would say then that we have "proof by confession" with the caveat that Apple didn't use the word "lock in" explicitly -- instead offering platitudes about protecting the user experience and anecdotes and analogies about their past experience with cross-platform applications.

Usually lock-in is achieved through frameworks and APIs. Doing it through a programming language is rather unusual (and not as effective, I'd think).

I think it's effective. It's insidious:

To work around Apple's edict a developer must do one of two things:

a) Arrange to port Objective-C to all platforms targeted by a cross-platform portability toolkit. This would be impractical.

b) Arrange to compile cross-platform apps into (among other things) Objective-C and sneak the result past Apple. This would be a breach of contract.

Apple is aware of the resulting high barriers and chooses not to relent on them. The costs they impose on third-party developers who want to write apps that run on more than just Apple platforms are significant and deliberate.

Arrange to port Objective-C

Arrange to port Objective-C to all platforms targeted by a cross-platform portability toolkit. This would be impractical.

I am not sure you need to target all platforms at once. But anyway, wouldn't it be possible to do this once, and sell to developers? Seems like a viable business model.

But anyway, wouldn't it be

But anyway, wouldn't it be possible to do this once, and sell to developers? Seems like a viable business model.

In theory, sure. In practice, people try. This seems to be the main port of Objective C to Android:

android-gcc-objc2-0

That project started in January 2010 -- three years after iPhone was announced -- and hasn't progressed much since July of 2010.

That's only the compiler and it doesn't come with any libraries to help with UI stuff. Here's an example guide:

android-objc

Note that it ends with an ambitious wish-list of what else would be needed to make this really useful.

So, to turn this into a product you could ship and support for money, there would be a lot more work beyond the compiler port. Also, you'd need folks to support the compiler port. And that's only to hit Android -- not iPhone or any other platform yet.

I find the scope of the needed work hard to estimate accurately. Would you accept a wild guess of 1-2 years work by several people to get something mature and stable enough to make a product?

If so, that would give the developers sunk costs equivalent to several hundred thousand dollars or more before they can collect a dime of revenue and before their work is of much utility to anyone. The model would have to rely on lots of sales starting upon the first release of the product. All would be lost if, in response, Apple let the effort play out a bit and then shut it down like they did to Adobe. Upon release, there would be few or no apps using the thing and it wouldn't benefit in any simple way from 3rd party libraries written to native platforms. In these days where a lot of start-ups compete for 5-figure funding, that's a tough sell.

Here's another example of hackers toying with the idea. This time the GNUStep project (a free software project inspired by NextStep):

discuss-gnustep mailing list thread

They get stuck at the same place: they can kinda sorta get the compiler going and then it gets stuck looking for GUI run-time support.

In contrast, Adobe had an existing portability platform with lots of apps and 3rd party libraries -- all they had to do was the translator to Objective C: but Apple won't let them.

Lots of third parties could have provided Java on iPhone -- an easy port and an easy port of supportive UI toolkits and lots of apps and libraries already. Again, Apple won't allow it.

Apple didn't make it impossible to build an iPhone-friendly cross-platform toolkit -- just prohibitively expensive.

Thanks for the data points.

Thanks for the data points. I am still not sure I understand why this investment is considered prohibitive. Many stranger projects get millions of dollars and are considered successes even without any revenue stream. Is it just that the expense is large and the return, in terms of programmer productivity or what have you, is considered insufficient?

This is based on a misconception

This whole line of discussion is based on a misconception. Even when Apple's programming language restriction was in force (it isn't any more), the restriction was to C, ObjC, or C++, never to ObjC alone.

not really much of a misconception

ObjC subsumes C, and as far as the details relevant to this discussion go, C++ features don't change anything.

You contradict yourself

Several people on this thread, including you, claimed that porting ObjC to other platforms would have been necessary to make code written for iOS portable, based on the mistaken claim that (back in the days of the language restriction) iOS code had to be written in ObjC. I was pointing out that, since C and C++ were always allowed and have always been portable to everything, the whole discussion of "porting ObjC" was a red herring.

(Of course the hypothetical portable app or framework would still have had to adapt to different system APIs, but that's true of any portable code and nothing to do with Apple.)

no contradiction

I don't think there's any contradiction here. I didn't understand where you were trying to go with the observation about C / C++.

The relevant Apple APIs for UI use Objective C.

You can not wrap these up in a compatibility layer with a normal C or C++ interface because the terms of service forbid it (or at least many people quote a part of the terms that seems to clearly forbid it).

Given that, you can not write a strictly C or C++ cross-platform toolkit that targets iPhone among others without violating the terms of service.
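
For what it's worth, wrapping Objective-C APIs behind plain C is technically straightforward; the barrier really is contractual. A minimal sketch of such a shim, driving an Objective-C class through the runtime (Foundation's NSString is used here only as a stand-in for the UIKit classes a real toolkit would need to wrap):

    /* Build on a Mac with: cc shim.c -lobjc -framework Foundation */
    #include <stdio.h>
    #include <objc/runtime.h>
    #include <objc/message.h>

    int main(void)
    {
        /* [NSString string] expressed as plain C calls into the ObjC runtime. */
        Class nsstring = objc_getClass("NSString");
        SEL   sel      = sel_registerName("string");
        id    empty    = ((id (*)(id, SEL))objc_msgSend)((id)nsstring, sel);

        printf("got an NSString instance at %p\n", (void *)empty);
        return 0;
    }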

Presumably the goal was to

Presumably the goal was to rescue curly braces from imminent extinction.

I love flash

I love Flash. I want to see Flash go back to being the web standard for vector drawing and get out of the video business.

The problem with Flash is not the spec; I agree there.

1) Flash is proprietary. The implementation is the product. The spec is of academic interest.

2) There are natural conflicts of interest between users and advertisers. Adobe sided with advertisers. That hurt Flash and justifiably so.

Objective-C is like Java compared to Dylan

Unfortunately the acquisition of NeXT killed off Dylan, cementing a much inferior language (Objective-C) as the lingua franca of OSX development.

Dylan never took off. Its

Dylan never took off. It's not like there were many Dylan programmers to abandon.

It never had a chance

But my point wasn't about market share. IIRC, Dylan had a pretty good IDE (especially for the 1996 time frame), and Apple was just starting to seed Dylan preview CDs to developers when NeXT came around and killed it for good.

But I was really just making a snarky comment: put Dylan up against Objective-C, and Objective-C comes off looking like Java.

Objective C predates Java by

Objective C predates Java by about a decade. I'm sure going with Objective C was the path of least resistance, considering they were going to build on NextStep. Apple wasn't that healthy back then; they had to focus.

Dylan

As of the preview CD releases, Apple Dylan:

1) Wasn't even self-hosting (i.e. was still a prototype written in Macintosh Common Lisp).
2) Was horribly bloated (by virtue of MCL lacking anything like a tree-shaker to reduce image sizes and, being a prototype in development, having maximal source/debugging information in the image).
3) Had horrible codegen characteristics (did I mention that it was a prototype written in Common Lisp?).

The Newton's OS was supposed to have been written in Dylan. Dylan was so far behind schedule and so far away from meeting its objectives that the Newton team had no choice but to develop NewtonScript on a shoestring both in terms of time and staff. The result was actually a pretty decent little actor language for a severely constrained embedded system.

None of this had anything at all to do with NeXTStep, which wasn't even a glimmer in Apple's eye at that point.

The fact that all of this went on while languages like Standard ML and OCaml already existed and did a better job of meeting Dylan's objectives than Dylan ever could significantly contributed to my—the only MCL support engineer MacDTS ever had—then-nascent disillusionment over the purported benefits of dynamically-typed languages.

Having been there when Dylan was being developed...

...I can assure you that the NeXT acquisition didn't kill off Dylan. Dylan was already dead and buried by then.

You're right, it was probably more Java than anything else

I recall some folklore about an Apple engineer backstage at the announcement of the NeXT acquisition quipping that it was the final, final nail in Dylan's coffin, since it was obvious that Objective-C would play a big part in the future of Apple's OSs.

The Wikipedia article on Dylan doesn't mention anything about that, and I'm not sure where I read it (maybe over at opendylan.org somewhere).

Functional Objects might have had a good run with their Dylan, but the Java juggernaut was in full force by then and .NET was coming around the corner.

It would have been interesting to see if Dylan could have been revived at Apple if Apple's fortunes had turned around without the NeXT acquisition.

Alternative history. Without

Alternative history. Without NeXT there would have been no Steve Jobs at Apple, and Apple would be a very different company today, if it survived at all; it would probably have been acquired by Sun, which would have been acquired by Oracle anyway.

The 90s was a great time for OO PL. It would be nice to bring that back, but we are stuck in the yet-another-old-style web language rut.

Dylan vs. Obj-C

I don't really see how you can compare Objective-C and Dylan. Dylan is a high-level language, and there are tons of high-level languages with bindings that call into the Cocoa libraries. I'd say six man-months to create a version of Dylan with full support, if anyone wanted to do it.

Objective-C is a systems programming language. It is very, very fast, and Apple has decided to focus on performance at the OS, hardware, and language level. I'm a high-level-languages guy for my own code, but I certainly like it when other people go through the misery of constructing software in low-level languages for my benefit.

The plug-in problem

It's certainly true that plug-ins (like Flash in this case - you asked...) add complexity to an already complex problem (safe, reliable, compelling, capable web application experiences running inside a modern web browser).

It's clear that Jobs was a big fan of simplicity (see his Apple University manifesto). Accordingly, and simply, if you remove problems from a set of problems then you have fewer problems to solve en route to some desired result (like a great user experience with limited security-flaw potential, no hangs, no memory leaks, no installation experience, no chronic updates to patch flaws, no poor power management, etc...).

HTML5 promises to replace the need for third party binary browser extensions to add missing capabilities that benefit human web surfers. This has yet to be proven, of course, but time, as always, will tell.

Obviously, Jobs' decision to not support binary browser extensions in iOS has impacted the adoption of and bets made on HTML5 application development going forward. This strategic shift extends beyond iOS, of course. Take "Windows 8" for a recent example. You can write client applications in HTML5 (with C++ and/or C#/VB and/or JS underneath) and IE10 in Metro mode will not run plug-ins at all.

The iPhone mandate decision.

Let's keep it in perspective though

The ban on third party programming languages and frameworks was only a temporary mandate that has long since been abandoned. It gave native apps a good head start, and thus seems to have served its purpose.

That said, though, while I'm inclined to agree with what Jobs said about the inferiority of apps built with third party frameworks (you only have to compare native Mac or Windows desktop apps with the crappy user experience invariably delivered by the likes of Java, Qt, or X11 to see that), I also agree with those who suspect that the real reason, or at least the most important one, was to keep Adobe out for a while. Not that I have any complaints about that; as a regular Web user and occasional developer, I'm happy to see the crawling horror of Flash stomped on by fair means or foul.

you only have to compare

you only have to compare native Mac or Windows desktop apps with the crappy user experience invariably delivered by the likes of Java, Qt, or X11 to see that

It's worth pondering to what extent this is a technical necessity, and to what extent it is the result of design decisions made by those providing the native platforms.

Reasons

Three reasons, I think.

First, third party GUI toolkits tend to aim for too much cross-platform uniformity; that is, they're designed so that apps made with them will look the same on all target platforms, instead of looking and feeling like those platforms' native apps. Because the people who design the toolkits don't really understand how important UX conventions are.

Second, third party toolkits don't have anywhere near as much design expertise behind them as the native GUIs (I'm talking about graphic design here, not program architecture), simply because they can't afford the kind of design teams that companies like Apple and Microsoft have.

Third, if someone uses a third party GUI to build their apps, it's probably because they're already using multiple OSs themselves, which means they're much more likely to be the kind of programming geek who has little interest in aesthetics, and thinks user experience is just unimportant shiny bells and whistles. More artistic types, who aren't interested in playing with multiple systems for their own sake but are likely to be better at GUI design, generally pick one OS and stick with it.

The ban is still not gone...

...you still can't download Scratch for iOS, because it offers children the ability to share their toys with each other. This is malign, no matter what the alleged business rationale for it is.

I agree this is sad, and

I agree this is sad, and other companies seem to be copying this policy.

BTW, TouchDevelop for Windows Phone 7 allows you to share scripts now. At least there is some hope.

scratch

"because it offers children the ability to share their toys with each other"

Come on. Do you really think that Apple is against children sharing? Apple has a no-interpreters-or-compilers policy on iOS. Scratch is clearly an interpreter. This has nothing to do with some Apple hatred of children.

If children want to share Scratch applications, the proper platform would be OSX, with a version of Scratch that runs over the developer SDK and creates iOS provisioning files in a way that is seamless to the children. Cross-compiling like that is how the Lego environment (which is aimed at kids) works.

Further, Scratch might be the kind of application that they would make an exception for, if there was educational demand. If someone created a clean version that runs fully sandboxed and thus doesn't create a door to bypass the entire security system... I think it would likely be approved.

Bombs don't hate children

Bombs don't hate children either, but it doesn't stop the bombs from killing them.

What if the kid gets an iPad and not a Macbook Air? Are they just doomed to be a content consumer forever?

raising content consumers from birth

Are they just doomed to be a content consumer forever?

That is the general idea behind all these treacherous computing technologies, yes. Computing is too dangerous for the establishment to permit hoi polloi to enjoy it without restriction. The intent is that children born today will not enjoy the full benefits of computing except as approved by various licensing authorities.

system compiler

Are you listening to yourself? The company that ships a system compiler free with every OS, bundles about two dozen high-powered interpreted languages with its standard computing product, and funds MacPorts so end users can easily download, compile, and use another hundred or so for free is opposed to unrestricted computing technologies?

The devices with the restrictions we are discussing don't even ship with keyboards or interfaces designed to support keyboards. They discourage all sorts of wide-open content creation. They are not primary computers.

I taught my daughter to program on a Mac (acs logo). It was a terrific environment for educating children. As an aside, the latest version has full AppleScript support, so now children can control their computers completely from within the Logo environment, and other applications (including remote ones) can interface deeply with the Logo environment. I haven't seen that done in any other children's language, ever, and it makes all sorts of things possible for educators.

re: system compiler

To the best of my knowledge, even all of Apple's allegedly general purpose computers now include DRM (namely the Mini DisplayPort). Apparently your role as content consumer is so important that Apple wants to make sure there are some kinds of programs you simply can not run on their machines.

Over time, Apple products have shown a trend from zero restrictive features to now having DRM features in every computer. Why shouldn't we expect this trend to continue to get worse?

Many systems that have DRM, including Apple's, allow plenty of programs to be written. They can be fun and interesting places to learn the basics of programming, as you say. Still, Apple has made clear that they intend to limit your right to program your own computers. That's a lesson of a sort, I guess!

DRM

Apparently your role as content consumer is so important that Apple wants to make sure there are some kinds of programs you simply can not run on their [OSX] machines.

Like what? Can you name any?

Over time, Apple products have shown a trend from 0 restrictive features, to now having DRM features in every computer. Why shouldn't we expect this trend to continue to get worse?

Because at the same time they have loosened other areas of DRM. For example, iTunes music used to be all DRM'd; today it is all DRM-free. They won a major battle with the record companies on that one. They have been fighting hard for 5 years against the Blu-ray DRM restrictions, which is why they haven't included a built-in Blu-ray device. And they have broken through Flash, which was a proprietary format, and have done a lot to open the web.

Apple is fairly consistently pro-freedom unless there is a good reason not to be. I think you can expect them to drift in a free direction except when they have good reason to lock things down more. So I would expect a mixed bag, not 1984.

Still, Apple has made clear that they intend to limit your right to program your own computers.

Actually no, they haven't. They have spent a ton of money making it easier for people to program their computers and given away the results for free.

content consumer

-- What if the kid gets an iPad and not a Macbook Air? Are they just doomed to be a content consumer forever?

Well yeah. That is one of the key differences between the iPad and the Macbook Air. The iPad is meant to be a secondary device not a primary device. I have a daughter who has had an iPod, essentially an iPhone without the phone part, for years. She wouldn't even think of trying to manage the device without using the iTunes interface.

iOS devices (iPad/iPhone/iPod) are not self-contained. Up until iOS 5 you couldn't get past the setup screen without having another computer manage your iOS device. It is possible today but, blech. If a kid gets an iPad and doesn't have access to another computer, my guess is his experience on the iPad is going to be bad in all sorts of ways. It will end up in the seldom-used toy pile.
_____

As for the bomb analogy: I think there is a big difference between collateral and intentional damage. There is a moral difference between the deaths in Haiti from the earthquake and deaths in the Holocaust. The claim of the GP was that Apple was deliberately preventing children from sharing via Scratch because they didn't like the idea of kids sharing. I don't see how that sort of hyperbolic language creates anything more than useless heat.

Collateral damage is the

Collateral damage is the right analogy. Apple doesn't hate kids, but their policies have indirect consequences that will create substantial damage in the future if we don't do something.

You know what, these tablets are getting pretty good; they do mostly what you want, and I can totally see a future where the iPad is mainstream and the laptop is niche. The iPad is totally NOT a secondary device, even today. It lives and breathes on its own; the only time I even plugged mine into another computer was when I purchased it at the Apple store (I haven't upgraded to iOS 5 yet). iTunes on the iPad is completely reasonable... even on the iPhone. The secondary-device notion is just a myth.

Collateral damage

First off, your iPad syncs remotely to iTunes. You don't need to plug it in for it to be managed, for most things. I can't imagine trying to do things like reorganize my contacts using the iOS interfaces, or figure out which versions of a song I have where.

Second, Apple is expanding the management interface well beyond iTunes with software like Profile Manager and iCal Server with WebDAV file sharing... The flow is still that your iPod/iPhone/iPad is managed from the desktop system.

That being said, I agree that there is the possibility of collateral damage in terms of learning to program. There are a lot of things about the iPad which are accidentally quite complex:

1) Apple is focused on performance, thus Objective-C, which is not a great first language.

2) The Cocoa framework is complex and assumes a lot of background in the documentation. Not kid friendly.

3) Apple assumes a multi-computer household for development. That's true of professional developers, but it may not be true of all their customers, particularly now that Tim Cook is moving them down-market to go after the top end of the feature-phone customers.

In terms of doing something about it....

Create more educational programming language environments that comply with Apple's guidelines. That really isn't hard. It is very easy to create an educational programming language that does not endanger the entire system. I don't think it would have been that hard to make Scratch safe enough to pass Apple's standards. It just takes someone willing to be the primary on this.
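
To make that concrete, here is a toy sketch in C (obviously not Scratch, just an illustration): a complete, if tiny, RPN-calculator interpreter that touches nothing but its own stack, so running user-entered programs through it can't endanger the host system.

    #include <ctype.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* A toy RPN calculator: type "3 4 + 2 *" and it prints 14.  It touches
     * nothing but its own stack -- no files, no network, no system calls --
     * which is the point: a teaching interpreter can be harmless by design. */
    int main(void)
    {
        double stack[64];
        int top = 0;
        char tok[32];

        while (scanf("%31s", tok) == 1) {
            if (isdigit((unsigned char)tok[0])) {
                if (top < 64)
                    stack[top++] = atof(tok);
            } else if (top >= 2) {
                double b = stack[--top], a = stack[--top], r = 0;
                switch (tok[0]) {
                case '+': r = a + b; break;
                case '-': r = a - b; break;
                case '*': r = a * b; break;
                case '/': r = (b != 0) ? a / b : 0; break;
                }
                stack[top++] = r;
            }
        }
        if (top > 0)
            printf("%g\n", stack[top - 1]);
        return 0;
    }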

And that is what is nice about Scratch: if people really want to use these devices for this (which I still don't think is a good idea), then we need languages which are constructed by drag and drop.

I have never managed my iPad

I have never managed my iPad from my desktop system. I guess some people do; I don't. All my calendar appointments come from our Exchange Server, email from the email server, music from the iTunes app, apps from the App Store app. Where does the desktop system play a role in this world where everything is in the cloud anyway? My desktop is completely irrelevant to my phone or tablet these days.

I don't think you understand Apple's guidelines very well. It used to be that there was no room for interpretation at all, nor any kind of compilation from a language other than what Apple supported directly. They relaxed this to support projects like Mono and Lua in games, but the scripts MUST come from the app: they have to be installed with the app, and they can't come from a source that wasn't vetted as part of the app. This means you can't share scripts between users, you can't create a showcase of scripts for people to download, and you can't allow users to edit scripts inside an app. Scratch is completely disallowed; there is no way to make it "safe" because its very concept is almost explicitly disallowed.

Drag and drop is not appropriate for touch-based systems, but that is another issue (it works better with a mouse). Anyway, we are going to do this no matter what... there are plenty of tablets out there that aren't iPads.

management

-- All my calendar appointments come from our Exchange Server

That is desktop management. Exchange is unifying your multiple calendars and providing a simple connection to the iPad. Your iPad is acting as a rich terminal client.

-- This means you can't share scripts between users, you can't create a showcase for scripts for people to download, you can't allow users to edit scripts inside an app.

I understand the rules. I also understand the enforcement is grey.

I don't know what to tell you. ND1 most certainly has user-to-user code sharing and in-application code editing. For example, they publish RPL solutions to the Euler problems that you can download to your phone/calculator to play with. The application itself has an upload feature to a server, or you can share by email. I haven't played with Gambit, but it has all sorts of installable libraries and a "dumping ground" for code sharing. The Basics similarly have these communities.

I know the Gambit Scheme people had to work the process to get through. They didn't get rejected and throw up their hands; they addressed Apple's complaints, and Apple compromised.

The truth is out at last