What will Apple's move to Intel mean for Alternative Language Communities?

In light of yesterday's big news from Apple, I am left wondering: what will become of the many fine language implementations we have seen emerging under OS X?

Will the transition present problems for PLT Scheme, Haskell, Frontier, J, Croquet, FScript, etc.?

Will the various VM engines and self-hosted native code generation capabilities survive with a few minor tweaks and a simple recompile, or are they predicated on the PowerPC architecture itself, forcing language designers back to the blackboard?

Finally, from a purely technical perspective vis-à-vis the chipsets in question, was this move a stroke of genius, a case of 'worse is better', a bad idea, or an overall wash?

Given that the architecture already exists..

People who have invested a lot in native compilation to PPC will not love the move. I doubt this means many people, given the platform's small market share.

Croquet, PLT Scheme and Haskell already have x86 versions. The Apple move will only motivate developers to target x86-64 with their languages.

It may mean less effort is put...

It may mean less effort is put into PPC ports, focusing instead on x86-64 ports. Apple's move is not going to remove all the PPC machines out there now, nor all the PPC machines Apple will sell in the meantime.

That means either maintaining the PPC ports to keep current users happy, or dropping them and their users. Apple hardware is an expensive investment for anyone, and given the amount of time some architectures hang about, I doubt PPC will go away anytime soon. No need to drop it just yet.

As for the decision, I was horrified at first, but after reading some of the speculated reasons for it I changed my mind. They needed a supplier with volume, scalability, and lower heat generation. IBM couldn't deliver, so they found someone who could. Apple will probably sell a lot more hardware now because they have more supply and access to more customers. Sadly, they won't sell OSX for generic X86 and will only support their custom hw. That's not a PL issue, though.

Chris C

Disbelief

Danx: Sadly, they won't sell OSX for generic X86 and will only support their custom hw.

Yes, I know Phil Schiller said this. I still don't believe it for a single moment. The news from Steve Jobs' keynote wasn't that Apple is moving to x86; it was his "offhand" comment that Mac OS X has been running on x86 without interruption for five years. The audience for that comment wasn't Mac developers at the WWDC; it was Steve Ballmer up in Redmond.

Think about it for a moment: you can download a bootable ISO with Darwin on x86 today. Mac OS X has been running continuously on x86 for five years. Mac OS X 10.4 finally introduced supported, documented, reliable APIs for kernel extensions, the layer most concerned about interacting with hardware. Now Apple is going to x86.

If Apple management is incompetent, it will fall to someone else to get Mac OS X86 running on generic Intel hardware. If Apple management is competent, they'll do it themselves. Apple management isn't incompetent.

OSX: A large factor in Apple's identity

Yes, I know Phil Schiller said this. I still don't believe it for a single moment. The news from Steve Jobs' keynote wasn't that Apple is moving to x86; it was his "offhand" comment that Mac OS X has been running on x86 without interruption for five years. The audience for that comment wasn't Mac developers at the WWDC; it was Steve Ballmer up in Redmond.

I agree, but Apple has to walk a tightrope between being a generic x86 manufacturer and a maker of custom, high-end, well-designed, stylish machines. The x86 market is ruled by companies who put out x86 machines at low build cost and moderately OK quality (Dell, etc.). Apple, on the other hand, has competed on unique hardware and quality, at high prices for those willing to pay. Now they're wading into that other pool, and the game is going to be different.

That's why I don't think we will see it any sooner than two years from now (their transition period). OSX is a big factor distinguishing them from the rest of the x86 market. Their build quality and style will be another, though maybe less so now that it's not a completely different architecture.

Hopefully Apple will open Mac OSX up to the wider market, though. I'd pay for it and leave Linux behind, the same way I left Windows. Lots of people would do the same. One good thing would be forcing M$ into competition. Linux doesn't compete well on that front (yet, but it's getting there), but OSX could make significant progress, even more so given Apple's marketing abilities.

On the PL side of things, Cocoa is raved about by my Mac friends. I've never actually used the API myself, but people seem to think it's much better than the alternatives. It'd be nice to play with it without forking out for a Mac.

Chris C

GNUStep

On the PL side of things, Cocoa is raved about by my Mac friends. I've never actually used the API myself, but people seem to think it's much better than the alternatives. It'd be nice to play with it without forking out for a Mac.

Perhaps you could look at GNUStep? I haven't used it myself, but it could probably give you an impression of what programming with Cocoa is like.

on the verge of a cliff

Lots of people would do the same.

lemmings! :)

Lots of people who care for freedom, choice and a competitive software market will not.

Confused

How would my choice to move to Mac OSX, should it be available on generic x86, go against that?

The fact is, many people are willing to switch OS (away from Windows and Linux), but Mac OSX is tied to Apple hardware, and thus to switch you need to buy the hardware. If you could buy OSX off the shelf for x86, there would be more competition in the home market.

Chris

No DRM then?

So you don't think that Apple will be using hardware DRM and "Trusted" computing to limit the OS to running on Apple-only hardware? I'd guess that would be a major selling point, since then Apple would probably be allowed to rent out movies from the iFlicks store.

Nope...

...I don't. I don't think the computer-buying market will accept what is supposed to be a "general-purpose computer" with hardware DRM built-in.

Even supposing that I'm wrong, however, I promise you that someone will find a way around it. The only such systems that, AFAIK, haven't been hacked have been standalone tamper-resistant chips such as those from Dallas Semiconductor.

Yep...

I don't think computer users even know what the f*** DRM is. They won't even notice when they buy it, except for being unable to make copies of copyrighted stuff. But then, it'll be too late and just one step away from "Big Brother"...

IA-32

It may mean less effort is put into PPC ports, focusing instead on x86-64 ports

According to the Universal Binary Programming Guidelines, the ABI is IA-32.
This is a pity, because there will probably be another transition to 64-bit.
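
A practical wrinkle this brings for code generators and for any code that touches binary data is byte order: the PowerPC is big-endian and IA-32 is little-endian. Here is a minimal C sketch of the kind of compile-time check involved; __ppc__ and __i386__ are GCC's predefined target macros, and the swap routine is hand-rolled here rather than taken from any particular Apple header.

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    /* Hand-rolled 32-bit byte swap; Mac OS X ships library routines for this,
     * but a portable version avoids assuming any particular header. */
    static uint32_t swap32(uint32_t x)
    {
        return (x >> 24) | ((x >> 8) & 0x0000ff00u)
             | ((x << 8) & 0x00ff0000u) | (x << 24);
    }

    /* Convert a big-endian on-disk word (the historical Mac convention) to
     * host order.  __ppc__ and __i386__ are GCC's predefined target macros. */
    static uint32_t big_to_host(uint32_t x)
    {
    #if defined(__ppc__)
        return x;             /* PowerPC is big-endian: nothing to do */
    #elif defined(__i386__)
        return swap32(x);     /* IA-32 is little-endian: swap */
    #else
        /* Unknown target: fall back to a runtime check. */
        const uint16_t probe = 1;
        return (*(const uint8_t *)&probe == 1) ? swap32(x) : x;
    #endif
    }

    int main(void)
    {
        const uint8_t on_disk[4] = { 0x4D, 0x41, 0x43, 0x21 };  /* "MAC!" in file (big-endian) order */
        uint32_t raw;
        memcpy(&raw, on_disk, sizeof raw);   /* raw holds the bytes exactly as stored */
        printf("host value: 0x%08X\n", (unsigned)big_to_host(raw));
        return 0;
    }

Whichever way the 64-bit question eventually resolves, the same discipline carries over unchanged.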

Bleah

Bleah. Not too surprising, though, since they intend to start using x86 chips in their low-end machines.

It will affect independent language communities, yes

All big transitions like this affect small developers. It took a shockingly long time for many developers to drop MS-DOS and move to Windows, for example. Even today you can still find language tools (like the A86 assembler) and compilers (quite a few Forths) that never made the transition. So I'd say the people working on Mops (a PPC native OOP Forth) and Macintosh Common Lisp are in a bit of a tizzy right now. Almost certainly a large number of Macintosh applications and tools, which includes some language systems, will stagnate and rot. (I was a Mac developer back during the 68K/PPC transition, so I saw this happen.)

That aside, I'm disappointed to see personal computers become essentially homogeneous in regard to CPUs. Compiler writers love the PowerPC: it's a classic RISC-style chip, the kind covered in compiler texts, whereas the x86 is more of a degenerate worst case that's targeted more out of necessity than anything else. Too bad.

Single-Platform Codegen

James Hague: Almost certainly a large number of Macintosh applications and tools, which includes some language systems, will stagnate and rot.

Yeah, I was just thinking that this is bad news if you've put a lot of effort into a single codebase with a custom codegen, as I believe OpenMCL has. It's obviously not an issue if you already have decent x86 codegen on some platforms and already run on Darwin and have some level of FFI to Carbon or Cocoa, as I believe SBCL with Thomas Burdick's callback patch does. So while SBCL currently doesn't have the level of Mac OS X calling that OpenMCL does, I suspect that will change as a consequence of Apple's decision.

Of course, for alternative languages that currently do nothing special for Mac OS X but have codegen for both PPC and x86 and a generally good FFI, e.g. O'Caml, the appropriate reaction is a shrug: as long as we know what the ABI is and the FFI can target it, it's of comparatively little import what processor Mac OS X runs on; the hard work of writing all of the necessary FFI glue for Carbon or Cocoa remains in either (or both) cases.
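
For what it's worth, here is a minimal sketch of what that FFI glue tends to look like on the C side, using OCaml's C interface as the example. The wrapped function toolbox_beep and the stub name caml_toolbox_beep are purely hypothetical stand-ins for whatever Carbon or Cocoa call one actually needs; the point is that the stub is written against the C ABI, so the same source serves a PPC or an x86 Mac OS X target.

    /* A sketch of the C half of an FFI binding, using OCaml's C interface.
     * "toolbox_beep" is hypothetical -- a stand-in for a real Carbon/Cocoa
     * call -- but the shape of the stub is the standard one. */
    #include <caml/mlvalues.h>
    #include <caml/memory.h>

    /* Pretend system call; a real binding would call into Carbon or Cocoa. */
    static int toolbox_beep(int times)
    {
        return times;   /* placeholder result */
    }

    /* The matching OCaml declaration would be:
     *   external beep : int -> int = "caml_toolbox_beep"
     */
    CAMLprim value caml_toolbox_beep(value n)
    {
        CAMLparam1(n);
        int result = toolbox_beep(Int_val(n));   /* unbox, call, re-box */
        CAMLreturn(Val_int(result));
    }

Nothing in that stub cares whether the target is PPC or IA-32; the C compiler and the FFI's calling conventions absorb the difference.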

PPC is hardly dead...

Well... Macintosh Common Lisp is still of value, and the time and effort doesn't have to be wasted. The developers just have to realize that their market has shifted from desktops to multiprocessing game consoles, cell phones, and telecom hardware.

Several games for the Playstation used C for graphics and Lisp for most everything else, Crash Bandicoot being the best-known example. This market is more demanding than the desktop, for sure, but on the other hand Sony was very late in getting decent Playstation 2 development tools out the door. This rendered the PS2's USB bus nearly pointless, as few games support USB well, if at all.

Clarification

A Lisp-like language was used for gameplay scripting in the Crash Bandicoot games. It was not used for everything except graphics.

You may be right, but...

Raising the Paradigm of Video Gaming, from the Franz site:

NaughtyDog used Lisp and Allegro CL to create many of the tools they needed to design the fluid and precise moves that the characters in their games make, as well as the rich, textured and crisp graphics. The Naughty Dog team was able to develop over 500 different types of game objects, each with uniquely crafted and tuned gameplay and visual characteristics.

Naughty Dog co-founder Andy Gavin, says the unique capabilities of Lisp enabled fast development and execution of character and object control – something that was needed to fully realize the numerous 3D creatures and devices which interact with the player in real-time (60 frames per second).

But still, the game isn't written in Lisp

They used Lisp to write their own custom language, which is a fantastic way to go. But Naughty Dog still can't be trotted out as an example of writing games in Common Lisp (which is what seems to always happen). Ditto for the old game "Abuse" being written in Lisp (it just used a custom mini-Lisp interpreter for simple scripting, a minority of the game code).

Business model

Danx: Sadly, they won't sell OSX for generic X86 and will only support their custom hw.

Paul: Yes, I know Phil Schiller said this. I still don't believe it for a single moment.

I do. I don't see how changing CPUs will change Apple's basic business model. They have always been about a closely bundled system, hardware + software.

Besides they don't have the resources for deploying on generic x86. They would have to write most of their own drivers. Not going to happen.

Business Model Shifts

John Eikenberry: I don't see how changing CPUs will change Apple's basic business model. They have always been about a closely bundled system, hardware + software.

Always, until recently, sort of: QuickTime has been available for Windows since 3.0. WebObjects runs on a variety of platforms. iTunes and the ITMS are also available for Windows. The iPod's OS was purchased, not developed by Apple.

On the hardware front, Apple has been going increasingly in the direction of stock chipsets and protocols: ethernet controllers, USB, IDE/ATA disks, etc. This is just smart as a cost-containment measure, but it has the (coincidental?) side-effect of also driving the platform to look more like any other PC out there.

John: Besides they don't have the resources for deploying on generic x86. They would have to write most of their own drivers. Not going to happen.

I must have missed something: Apple Computer, Inc. very definitely has the resources to target at least half a dozen popular Intel- or AMD-based motherboards, has been distributing Darwin for x86 for quite a while now, and as already noted has kept Mac OS X running on Intel-based systems continuously for the last five years. Those aren't Apple Intel boxes they're talking about. So where's the difficulty? And didn't Jobs also say that in the end, the heart of the Macintosh is the operating system?

Given Jobs' successful entry into the MP3 player and digital music market—that is to say, the consumer electronics market—I think it's premature to say "never" about Apple finally releasing a genuinely portable Mac OS, especially when you're seeing switcher comments like this.

I think you're overstating what Apple has been doing

They've been running OSX on at least one Intel system for the last few years.

That's a far cry from supporting all or even most Intel systems. And once Apple puts a box with an unlocked OSX on the shelf at Best Buy, that's exactly what they'll have to do. And they'll be stuck writing the drivers and doing the testing themselves, because they don't have enough market share for the vendors to do it.

Nope. Apple is only going to allow OSX to run in a tightly controlled little world. They may not make hardware any more -- I wouldn't be surprised if Intel winds up shipping populated motherboards to Apple -- but there's going to be a way to make sure that Apple can reliably identify its own systems, because Apple can't handle the hit to their reputation.

objective-c

hope it wipes out C++ for good. :)

Objective-C

From what I have seen there is more C++ than ever in OS X. Maybe it has just been the projects I have worked on but to me it looked like C++ was being used for most new code.

I haven't looked at Objective-C since the NeXT days.

  • Does it support (template) meta-programming?
  • Does it still have the massive method call overhead? Last I saw it required (multiple?) pointer dereferences and a hash table lookup.
  • Can it inline method calls? Since they are essentially virtual methods I would think inlining would be difficult.

re: Objective-C

Does it support (template) meta-programming?

It supports the Smalltalk model of message-oriented programming, giving it a fairly rich set of collections. In a dynamic programming language like Objective-C, much of what the standard templates provide is simply taken for granted. Yes, there are things you can do with templates that are quite nice (Boost FP). But there's also a lot you can do with the Smalltalk model that is not particularly easy in C++. As in all language decisions, there are trade-offs.

Does it still have the massive method call overhead? Last I saw it required (multiple?) pointer dereferences and a hash table lookup.

All Objective-C methods are dispatched dynamically. C++ methods that are virtual carry comparable indirect-call overhead. C++ methods that are not virtual (final) are faster, but have the problem that they are no longer extensible (violating the Open-Closed Principle).

Can it inline method calls? Since they are essentially virtual methods I would think inlining would be difficult.

Probably not. But if your code is worried about the overhead of an indirect call, you probably need to drop down into C in either case. Of course, on the other side of the equation, we have to ask whether C++ supports Class Categories - a very useful feature for extending classes and promoting re-use.
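
To make the cost under discussion concrete, here is a toy C model of selector-based dispatch. It is emphatically not the Objective-C runtime (the real one hashes and caches its lookups); it just illustrates why the call target is only known at run time, which is what makes inlining hard without whole-program knowledge of the receiver's class.

    #include <stdio.h>
    #include <string.h>

    /* A toy model of selector-based dispatch -- NOT the real Objective-C
     * runtime, just an illustration of the lookup a message send implies. */

    typedef struct Object Object;
    typedef void (*IMP)(Object *self);            /* method implementation */

    typedef struct {
        const char *selector;                     /* method name */
        IMP         imp;                          /* code pointer */
    } MethodEntry;

    typedef struct {
        const char  *name;
        MethodEntry *methods;
        int          method_count;
    } Class;

    struct Object {
        Class *isa;                               /* every object points at its class */
    };

    /* The "message send" step: follow the isa pointer, then search the method
     * table.  The target address is discovered at run time, so the compiler
     * cannot inline the call without knowing the receiver's dynamic class. */
    static void send_message(Object *receiver, const char *selector)
    {
        Class *cls = receiver->isa;
        for (int i = 0; i < cls->method_count; i++) {
            if (strcmp(cls->methods[i].selector, selector) == 0) {
                cls->methods[i].imp(receiver);
                return;
            }
        }
        printf("%s does not respond to -%s\n", cls->name, selector);
    }

    static void dog_speak(Object *self) { printf("woof (a %s)\n", self->isa->name); }

    int main(void)
    {
        MethodEntry dog_methods[] = { { "speak", dog_speak } };
        Class dog_class = { "Dog", dog_methods, 1 };
        Object rex = { &dog_class };

        send_message(&rex, "speak");              /* dispatched through the table */
        send_message(&rex, "fetch");              /* no such method: falls through */
        return 0;
    }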

And as for NextStep, it's a shame that it isn't more pervasive; programming user interfaces wouldn't be the mess it is with the other widespread toolkits.

Bingo

In a dynamic programming language like Objective-C

There's the difference. Dynamic programming languages are fine; but they occupy a different niche than C/C++. Expecting Objective C to displace C++ is like expecting Coke to displace peanut butter.

Much overlap though

There's an open question in anthropology about whether members from one culture/civilization can truly understand those in another culture/civilization. The anthropologists will say that it is quite difficult, but it can be done through methodical study (requiring one to overcome preconceived notions).

Well, from a PL standpoint, it's very hard to understand the relative advantages/disadvantages of languages with respect to each other. The types of problems that they excel in solving are different, and what's taken for granted in one is all but ignored in another.

It would be nice to say that languages designed from distinct perspectives do not overlap or compete with each other. The problem is that there are lots of gray areas where they do overlap. And PLs are notoriously greedy, in that users always end up using them beyond the range in which they are optimal. Given that users are familiar with the language, they are willing to live with the warts outside of that range. And their perception of programming solutions is colored by the language culture they are immersed in. On top of that, the PL greed is furthered by the fact that few languages care to play nicely with each other.

From my perspective, the original poster is comparing the languages at the wrong level. Simple checklists based on optimizations or techniques don't do justice to the job of weighing the relative merits of C++ and Objective-C. PL comparison is not something where you can tick features on or off to determine which language is best. Things like inline optimization or vtable overhead are irrelevant in a large number of languages.

Languages playing nice

Agreed!

I found one part of your comment especially interesting, because it seems to suggest there may be a technical way to reduce "anthropological" misunderstandings.

On top of that, the PL greed is furthered by the fact that few languages care to play nicely with each other.

Do you have any examples of programming languages that "play nice" with each other? I can think of some VM-based approaches, where each language assumes a common execution model, like Java, .NET, or SEAM. Maybe a few systems that allow remote calls, like SOAP and CORBA. I suspect there are a lot of other integration mechanisms that I'm not aware of...

C-style FFI

Do you have any examples of programming languages that "play nice" with each other?

I believe any PL with an FFI "plays nice" with C calling conventions, at least.

Unix Pipes also come to mind...

...the subject of another thread hereabouts
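
For completeness, here is a tiny C sketch of that process-level style of interop: the parent just reads a child command's output over a pipe, and the child could be written in any language at all. (The choice of uname -m here is incidental, though it happens to report the machine type this whole thread is about.)

    #include <stdio.h>

    /* Process-level interop: any two programs that can read and write a pipe
     * can cooperate, regardless of what language or runtime either uses. */
    int main(void)
    {
        char line[128];
        FILE *child = popen("uname -m", "r");   /* run the command, capture its stdout */
        if (child == NULL) {
            perror("popen");
            return 1;
        }
        if (fgets(line, sizeof line, child) != NULL)
            printf("machine reported by uname: %s", line);
        pclose(child);
        return 0;
    }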

sure

"but they occupy a different niche than C/C++. Expecting Objective C to displace C++ is like expecting Coke to displace peanut butter."

Sure. Nobody is suggesting that you code an operating system or real-time system in Scheme, Python or Objective-C. OTOH, I find it difficult to grasp why someone would go through the pain of implementing NextStep applications in C or C++ rather than in the aforementioned...

Scheme as a system PL

Nobody is suggesting that you code an operating system or real-time system in Scheme, Python or Objective-C.

What's wrong with Scheme? Garbage collection? What about PreScheme?

problem is

the same with Python or OCaml: if you want performance, you either rely on compilers that greatly restrict the dynamic features of the language, or you just end up coding in a more imperative style... which also means giving up most of those dynamic features.

and yes, Python still doesn't have anything like that, so you're on your own, using lots of local variables referring to occasionally deeply nested structures inside loops, or simply keeping lots of shallow data structures...

and yes, OCaml is a statically typed language, but it is functional and suffers a bit from that, and from not being lazy by default...

NeXT's IOKit

"Nobody is suggesting that you code an operating system [in] Objective-C."
Well, maybe not today, but because of Objective-C's dual nature, it was for a time considered to be a decent choice for coding an operating system. You would use its C subset directly for the performance-sensitive parts, and use Objects where performance was less critical or where they help with stabilizing interfaces. NeXT's IOKit was an example of the latter usage.

FWIW, Apple has since converted the IOKit Objective-C interfaces to use a subset of C++. See Apple's I/O Kit docs, including this section about Language Choice, which brings up a whole different issue. (How sensible is it to use very loosely defined language subsets?).

Objective-C++

You're right. You've probably seen a lot of C++ code, as they can mix that code with Objective-C in the same files, I believe, and compile it all together. I know I read about this recently, but, and I apologize for this, I can't recall at the moment where... I really need to get my bookmarks organized....

Short term vs long term

In the short term, the effects may well be minimal. It's not like all those Apples running on PowerPC chips will go away or stop working; nothing prevents the developers from continuing to develop. Assuming the PPC line is supported for the foreseeable future, there is plenty of time for the transition to occur.

On the other hand, there are plenty of languages that are dead simply because no implementations exist on modern hardware. In some cases, the languages make design assumptions about (or are targeted to) a particular architecture, and a port would be inappropriate. In other cases, there is no technical reason that the language couldn't be ported -- except that the compiler was written in assembly, or has no source, etc. Of course, in many such cases there is little interest (beyond historical interest) in making the port, so the loss is questionable.

That said, numerous interesting languages exist which aren't supported on all of the major platforms in use today.

GHC runs on Mac OS X / x86

Just thought I'd mention that Glasgow Haskell already runs on Darwin/x86, and someone at the Apple conference has even tried it on one of their Mac OS X/x86 development boxes and verified that it works.