I'm gathering that there's still a fair amount of confusion over the application domain of Cg. Cg is a compiler for a somewhat C-like language (this is the part that people know) that targets Graphics Processing Units (GPUs) rather than CPUs (this seems to be the confusing part). Modern GPUs such as nVidia's GeForce 3 or ATI's Radeon 7500 allow developers to write what are (confusingly, to me) called "shaders." The reason I find the terminology confusing is that these shaders can actually add quite a bit of "featurefulness" to a scene themselves. For example, a shader might apply a bumpmap to an object in a scene, giving it a much more textured appearance than it would otherwise have. Similarly, a shader might add environment mapping or lens-flare-like effects.
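To make that concrete, here is a rough, illustrative sketch of what such a per-pixel effect might look like as a Cg fragment program. The parameter names, semantics, and texture binding here are my own assumptions for the example, not anything from nVidia's documentation:

    // Illustrative Cg fragment program: perturb diffuse lighting with a
    // normal map ("bump map"). All names and bindings here are made up
    // for this sketch.
    float4 main(float2 uv       : TEXCOORD0,
                float3 lightDir : TEXCOORD1,   // light vector in tangent space
                uniform sampler2D normalMap,
                uniform float4 baseColor) : COLOR
    {
        // Unpack the tangent-space normal stored in the texture ([0,1] -> [-1,1])
        float3 n = tex2D(normalMap, uv).xyz * 2.0 - 1.0;
        // Standard diffuse term: N dot L, clamped at zero
        float diffuse = max(dot(normalize(n), normalize(lightDir)), 0.0);
        return baseColor * diffuse;
    }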
This level of capability used to be the exclusive province of non-real-time rendering with tools such as Pixar's RenderMan, which has its own high-level shading language. With the advent of modern GPUs, the capability now exists for certain classes of real-time rendering, but developers have had to write assembly language for the GPU in question, load that code into the GPU, and tell the GPU how to apply it to the scene. Cg, then, removes the need to do all of this in assembly language. But it is still a shader language for GPUs, so certain architectural decisions of the language, such as the lack of pointers, aren't made "to make the code run faster"; they're made because there's no shader-accessible RAM on the GPU to point to. Once you understand that your Cg code runs on the GPU and not the CPU, the architecture and constraints make perfect sense. So if you have an interest in how a modern shading language works, or how a modern desktop graphics pipeline works, by all means check out Cg.
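As a sketch of why pointers don't fit the model, here's an illustrative Cg vertex program (again, the names and bindings are my own for the example): everything the program can read arrives through bound input semantics and uniform parameters, everything it produces leaves through output semantics, and there is simply no memory for a pointer to refer to.

    // Illustrative Cg vertex program: inputs come in through register
    // semantics (POSITION, NORMAL), constants through uniforms, and results
    // go out through output semantics. Nothing here can be "pointed at."
    void main(float4 position      : POSITION,
              float3 normal        : NORMAL,
              uniform float4x4 modelViewProj,
              uniform float3 lightDir,
              out float4 oPosition : POSITION,
              out float4 oColor    : COLOR)
    {
        // Transform the vertex into clip space
        oPosition = mul(modelViewProj, position);
        // Simple per-vertex diffuse term
        float diffuse = max(dot(normalize(normal), normalize(lightDir)), 0.0);
        oColor = float4(diffuse, diffuse, diffuse, 1.0);
    }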
I also think that with movies going for $9/ticket here in LA these days and a satellite dish subscription costing ~$75/month, Hollywood needs to start worrying about computers having high-quality video out to a nice large-screen HDTV and home stereo system. $2,000 for a nice tower plus high-end nVidia/ATI card and another $2,000 for a nice home entertainment system starts to look pretty attractive stacked up against 4 x $9 = $36 in tickets, plus $20 for refreshments, = $56 per outing; x 4 weeks (one movie a week for a family of four) = $224/month; plus $75/month for satellite = $299/month; x 12 months = $3,588/year. For $412 more you get a high-definition screen, a nice stereo and speakers, and a really excellent interactive entertainment rig that you can keep upgrading the GPU on, and all of it will see more use, and more varied kinds of use, than your TV and movie watching alone can provide.