Lambda the Ultimate

NVIDIA Open Sources Cg Compiler Technology
started 7/24/2002; 7:01:59 AM - last post 7/24/2002; 11:07:39 AM
Dan Shappir - NVIDIA Open Sources Cg Compiler Technology
7/24/2002; 7:01:59 AM (reads: 1579, responses: 6)
NVIDIA Open Sources Cg Compiler Technology

(via Slashdot)

"We've experienced enormous interest in Cg since its introduction," said Dan Vivoli, vice president of marketing at NVIDIA. "We're open sourcing this compiler code to further accelerate the transition to an era of advanced real-time effects through Cg."

As reported here before, Cg is a C-based programming language specification and implementation intended for the fast creation of special effects and real-time, cinematic-quality experiences on multiple platforms.


Posted to Software-Eng by Dan Shappir on 7/24/02; 7:04:06 AM

Ehud Lamm - Re: NVIDIA Open Sources Cg Compiler Technology
7/24/2002; 7:41:12 AM (reads: 764, responses: 0)
Is there some good reason to be interested in Cg, or is this just a comp.lang.misc sort of tidbit?

With so much going on in the PL world (and so many other things I have to get done), I need the best filtering I can get...

Thanks!

Noel Welsh - Re: NVIDIA Open Sources Cg Compiler Technology
7/24/2002; 9:57:12 AM (reads: 749, responses: 0)
It is kinda interesting to see what has been taken out of C to make the code run faster. I've only briefly looked over the language spec, but it immediately jumped out that there are no pointers. (So instead of a stack model Cg might have a register model - welcome to the '80s!)
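
To make that concrete, here is a minimal, purely illustrative sketch of what pointer-free Cg code looks like (the function name blend is made up; lerp() is from the Cg standard library). Everything is passed and returned by value, with vector types built in:

    // Illustrative only: a Cg-style routine with no pointers in sight.
    // Parameters and results are plain values; float3 is a built-in vector type.
    float3 blend(float3 a, float3 b, float t)
    {
        // lerp() is a Cg standard library routine: a + t*(b - a).
        // No address-of, no dereference, no heap.
        return lerp(a, b, t);
    }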

Paul Snively - Re: NVIDIA Open Sources Cg Compiler Technology
7/24/2002; 10:44:28 AM (reads: 742, responses: 0)
I'm gathering that there's still a fair amount of confusion over the application domain of Cg. Cg is a somewhat C-like language (this is the part that people know) whose compiler targets Graphics Processing Units (GPUs) rather than CPUs (this seems to be the confusing part). Modern GPUs such as nVidia's GeForce 3 or ATI's Radeon 7500 allow developers to write what are (confusingly, to me) called "shaders." The reason I find the terminology confusing is that these shaders can actually add quite a bit of "featurefulness" to a scene themselves. For example, a shader might add a bumpmap to an object in a scene, lending it a much more textured appearance than it would otherwise have. Similarly, a shader might add environment mapping or lens-flare-like effects.
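
To give a flavor of the kind of per-fragment effect described here, a minimal, illustrative Cg fragment program sketch (the names decal and lightDir are made up) that modulates a texture lookup by a simple diffuse lighting term:

    // Illustrative sketch of a simple per-fragment effect: sample a decal
    // texture and scale it by a diffuse term computed from an interpolated
    // normal and a light direction supplied by the application.
    float4 main(float2 texCoord : TEXCOORD0,
                float3 normal   : TEXCOORD1,
                uniform sampler2D decal,
                uniform float3 lightDir) : COLOR
    {
        float diffuse = max(dot(normalize(normal), normalize(lightDir)), 0.0);
        return tex2D(decal, texCoord) * diffuse;
    }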

This level of capability used to be the exclusive province of non-real-time rendering with tools such as Pixar's Renderman, which has its own high-level shader language. With the advent of modern GPUs, the capability now exists for certain classes of real-time rendering, but developers have had to write assembly language for the GPU in question, load that code into the GPU, and tell the GPU how to apply it to the scene. Cg, then, removes the need to do all of this in assembly language. But it is still a shader language for GPUs, so certain architectural decisions of the language, such as the lack of pointers, aren't made "to make the code run faster;" they're made because there's no shader-accessible RAM on the GPU to point to. Once you understand that your Cg code runs on the GPU and not the CPU, the architecture and constraints make perfect sense. So if you have an interest in how a modern shading language works, or how a modern desktop graphics pipeline works, by all means check out Cg.
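
For instance, the kind of vertex program that used to be hand-written GPU assembly comes out in Cg roughly like the following sketch (illustrative only; modelViewProj is an assumed application-supplied matrix):

    // Illustrative vertex program sketch: the GPU transforms each vertex into
    // clip space and passes its color through. Note the register-like semantics
    // (POSITION, COLOR) and the complete absence of pointers or heap memory.
    struct Output {
        float4 position : POSITION;
        float4 color    : COLOR;
    };

    Output main(float4 position : POSITION,
                float4 color    : COLOR,
                uniform float4x4 modelViewProj)
    {
        Output OUT;
        OUT.position = mul(modelViewProj, position);  // clip-space transform, on the GPU
        OUT.color    = color;
        return OUT;
    }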

I also think that with movies going for $9/ticket here in LA these days and a satellite dish subscription costing ~$75/month, Hollywood needs to start worrying about computers having high-quality video out to a nice large-screen HDTV and home stereo system. $2,000 for a nice tower plus high-end nVidia/ATI card and another $2,000 for a nice home entertainment system starts to look pretty attractive stacked up against the alternative: 4 x $9 = $36 in tickets plus $20 for refreshments is $56 per outing; at one movie a week for a family of four that's $224/month, plus $75/month for satellite, or $299/month, which works out to $3,588/year. For about $412 more you get a high-definition screen, a nice stereo and speakers, and a really excellent interactive entertainment rig that you can keep upgrading the GPU on, and all of it will see more use, and more varied types of uses, than your TV and movie watching can provide.

Franck Arnaud - Re: NVIDIA Open Sources Cg Compiler Technology
7/24/2002; 11:07:39 AM (reads: 734, responses: 0)
What they are making open source is the parser and syntax-tree bit; the actual code generator remains closed source, so the 'open source' story is a bit of a joke. Next week: "Microsoft open sources Windows, minesweeper.c code published!".

What I wonder more generally is whether you can do something other than graphics on a GPU, like, say, crypto, or, who knows, lambda reduction... Some people in comp.arch seem to think there are some opportunities for more general-purpose use, possibly with some minor tweaks to the existing GPU architectures.
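
One trick that comes up in such discussions is to treat a texture as a read-only array and let a fragment program map a pure function over it, one output pixel per element. A purely speculative sketch, not anything shipped by NVIDIA (the names data and index are made up):

    // Speculative sketch: use a texture as a read-only array and map a pure,
    // pointer-free function over it, producing one output pixel per element.
    float4 main(float2 index : TEXCOORD0,
                uniform sampler2D data) : COLOR
    {
        float4 x = tex2D(data, index);   // fetch an "array element"
        return x * x + 1.0;              // any pure function of the element
    }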