GPUs Unplugged
By Courtney E. Howard
Issue: Volume: 29 Issue: 9 (Sept 2006)


Today, makers of central processing units (CPUs), graphics processing units (GPUs), graphics cards, and graphics-oriented workstations command as much space and attention at industry trade shows and exhibits as the major software developers serving the content creation market, if not more. The latest industry figures likewise reflect the growing market position of these tech companies.
 
“The graphics market is just exploding, growing 20 percent year-over-year in branded workstations—about twice the rate of the PC market,” notes Jeff Brown, general manager for Nvidia professional products. “Hewlett-Packard and Dell had amazing years. And we just announced that the customer base for Nvidia’s Quadro FX product grew by more than 20 percent.”
 
Dan Shapiro, senior marketing manager for workstation products at ATI, echoes this sentiment, stating, “Right now, we see year-to-year 20 percent growth in the professional market worldwide.”
 
Despite these figures, the number of graphics card manufacturers has decreased: Several months ago, 3Dlabs exited the field, leaving Nvidia, ATI, and manufacturer/supplier PNY Technologies.

Graphics Growth

A number of factors are contributing to the increased adoption of professional-level graphics cards. For starters, industry vendors and consortia continue to invest handsomely in research and development efforts, advancing the technology and tailoring hardware and software products to better meet the needs of digital content creation professionals. “A new architecture, or a brand-new version of a graphics API such as OpenGL or DirectX, comes out roughly once a year,” explains Brown. “And the workstation side is still very much tied to a once-a-year refresh rate. With graphics processors, we’re now seeing a lot of variations of products and incremental changes to product families throughout the year.”

“The pace of innovation is incredible,” recognizes Shapiro. “New manufacturing processes enable us to pack more on the chips, and new algorithms and new techniques are being integrated into the hardware. A lot of trends in the workstation market converge in graphics; it seems to be a key focal point for these technologies. For example, 64-bit computing is much more prevalent now, with Microsoft’s 64-bit operating system and the availability of 64-bit chips. That is enabling customers to put a lot more memory in their systems and, as a result, work with much larger datasets.”

More 3D than Ever

3D is pervasive in today’s society: from entertainment, including films and games, to the Web, e-mail communications, computer desktops, cell-phone screens, PDF files, and even your local hospital. And it is growing rapidly in engineering, oil and gas exploration, biomedical, defense, and myriad other markets and applications.

“Today, there’s a whole migration from 2D to 3D, and increased precision and resolution,” says Brown. “For example, Google is starting to add 3D models to Google Earth, which has a huge user base with hundreds of millions of people. Autodesk is moving users from AutoCAD in 2D to Inventor in 3D. A lot of ultrasound applications used to be 2D grayscale, for example, and now you’re seeing 3D color ultrasound with increased precision and quality.”

Growth in 3D content means more and larger models, which translates to more geometry, Brown notes. More sophisticated shaders and fill rate drive the requirement for more parallelism and more pixel pipelines. Higher quality output means higher pixel precision and higher bit depth. “It impacts every portion of the GPU architecture,” he says.

Identifying the growing demand for more visual computing power, Nvidia last month introduced the Quadro Plex 1000 visual computing system, a scalable offering—akin to a GPU farm—that can be utilized as a single VCS node or scaled in a rack space for unprecedented visual compute density (see Products, pg. 4).

Displays in Demand

The growing demand for eye-catching, realistic 3D has altered the DCC work flow, bringing about increased requirements for faster iterations and time to market. To boost productivity and speed work flow, professionals now often employ multiple displays. Yet, the use of two displays essentially doubles the amount of graphics horsepower required, given the need to drive twice the number of pixels. On the video side, the move from standard definition (SD) to high definition (HD), for example, is accompanied by four times the number of pixels to render.
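The pixel arithmetic behind these workload claims can be sketched quickly. The resolutions below are common formats of the era, chosen purely for illustration; note that the exact SD-to-HD multiple depends on which formats are compared.

```python
# Pixel counts for common display formats (illustrative figures).
def pixels(width, height):
    return width * height

sd_ntsc = pixels(720, 480)      # standard-definition NTSC: 345,600 pixels
hd_720 = pixels(1280, 720)      # 720p HD: 921,600 pixels
hd_1080 = pixels(1920, 1080)    # 1080p HD: 2,073,600 pixels

# Dual displays double the pixel load outright.
print(2 * hd_1080)              # 4147200 pixels to drive

# The SD-to-HD multiple depends on the formats compared:
print(hd_720 / sd_ntsc)         # roughly 2.7x
print(hd_1080 / sd_ntsc)        # 6.0x
```

Either way, the GPU must fill several times as many pixels per frame, which is exactly the fill-rate pressure Brown describes.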

“High-resolution displays, such as Apple’s 30-inch Cinema and Dell’s 30-inch 3007WFP, are enabling professionals to be more productive, especially on the content creation side,” says Shapiro. Graphics card vendors, in response to this growing trend, have infused their product lines with support for today’s ultra-high-resolution displays with high refresh rates. For example, ATI’s professional FireGL line provides dual-link technology and a native 10-bit engine for displaying more than one billion colors. The 10-bit display engine is being embraced by various high-tech market segments, including medical imaging, scientific visualization, and others in which the ability to see extremely fine detail is critical.
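The "more than one billion colors" figure follows directly from the per-channel bit depth; a quick sketch of the arithmetic:

```python
# Colors representable at a given bit depth per RGB channel.
def color_count(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(color_count(8))   # 16777216 -- the familiar "millions of colors"
print(color_count(10))  # 1073741824 -- just over one billion
```

Going from 8 to 10 bits per channel multiplies the palette by 64, which is what makes the subtle gradations in medical and scientific imagery distinguishable.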

Graphics card vendors, display manufacturers, PC and workstation companies, and a consortium of other graphics suppliers have been hard at work on a new standard called DisplayPort. “It’s about the size of a USB connector, but it has different pins on it,” describes Shapiro. “It will enable higher resolution support over twisted pair, deeper bit-depth support natively, much longer cable lengths, and the ability to drive multiple displays off a single cable. It’s a new technology that, over time, will be low cost and connect a desktop or a laptop to the next generation of display technology.”

The industry relationships don’t stop there; rather, graphics card manufacturers work closely with independent software vendors (ISVs) and makers of graphics workstations and computer processors to ensure the seamless integration and compatibility of components.

Software Sophistication

“It’s a whole ecosystem,” says Shapiro of professional workstations, in general. “You need to have a lot of geometry processing and all components of the workstation operating in tandem. As Intel and AMD are making advances on the CPU side and as the software companies are coming out with multi-threaded applications, we work together to remove all the bottlenecks of the system and to expand our capabilities.”

Professional DCC involves very large datasets and sophisticated software solutions with which to create and manipulate them. Software developers such as Autodesk, Maxon, and Softimage continue to expand the functionality, capabilities, and tool sets within their 3D applications. These full-featured programs, in turn, increase the necessity for a powerful, high-end graphics card to accelerate them. The availability of increased processing power enables DCC professionals to work with larger models and potentially become more productive and innovative, as they are better able to create, edit, and experiment with 3D models in a real-time, interactive interface and work flow.

Given that certain 3D DCC software vendors influence the design and development of graphics cards and associated technology, imagine the impact of the largest mainstream software application: the operating system. Microsoft is poised to introduce Windows Vista, a new version of the Windows operating system that harnesses the power of the GPU to deliver impressive graphics and performance.

Apple designed its OS X with an interface based on OpenGL. In doing so, Apple built into the operating system a wealth of 3D and graphical elements—a stark contrast to the 2D bitmaps common in older operating systems. In its development of Windows Vista, Microsoft is likewise instilling the OS with advanced graphics effects and capabilities, including transparency, 3D geometry, and imagery mapped onto 3D objects. Users who wish to take advantage of the interactive, graphically rich Windows Vista interface will require a powerful graphics card.

“Windows Vista provides an interface with all the bells and whistles, things like transparencies and scalable icons, using the GPU entirely,” Brown explains. “Even in the lowest end consumer markets, people who are going to want to take advantage of that GUI are going to need a reasonable GPU. That is one of the trends, without a doubt, that is going to drive 3D graphics and GPU use in a lot of markets.”

Moving to the GPU

Virtually all segments of the graphics industry are investing in today’s 3D graphics processing technologies. “What’s interesting is that research departments and specialized market segments are finding that CPUs are not fast enough for their applications,” admits Shapiro. “In fact, our GPUs are roughly 10 times more powerful than an Intel Pentium processor in terms of raw gigaflops.” As a result, much of the market is intent on attaining high-end 3D graphics cards to harness the power inherent in GPUs. Moreover, software developers increasingly are designing their programs to tap a workstation’s graphics processing power rather than its central processing power. That is, tasks traditionally handled by the CPU are being relegated to the realm of the GPU, which translates into greater performance for the end customer.

 

More GPU power means that digital artists can more easily create and manipulate large and complex data for scenes like this one.
Image courtesy Amilton Diesel.

Shapiro raises physics processing as an example. “This speaks to the gaming market, which has a lot of engineering applications in which collision detection, cloth simulation, or other types of physical phenomena are modeled using the same kinds of mathematical equations used to compute graphics display,” he says. In a demonstration at SIGGRAPH 2006, in fact, ATI illustrated the ability to accelerate Havok’s physics libraries directly on the GPU. In the on-screen presentation, a bowling ball collided with thousands of pins and triggered a chain reaction in real time. “It was a real-world simulation of thousands of objects interacting,” he notes. “When we ran it on the CPU, it ran at roughly three or four frames per second; when we ran it on the GPU, it was real time, over 30 fps.”
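The demo itself ran Havok’s libraries on ATI shader hardware; the sketch below is only a conceptual illustration, in plain Python, of the property that makes such simulations GPU-friendly: each body’s integration step is independent of every other body’s, so thousands of objects can be updated in parallel. All names and numbers here are illustrative, not taken from the demo.

```python
# Conceptual sketch: why rigid-body physics maps well to a GPU.
# Each body's semi-implicit Euler step reads only that body's own
# state, so the per-object updates are embarrassingly parallel --
# the pattern a GPU's many pixel pipelines can exploit.

def step(positions, velocities, dt, gravity=-9.8):
    """Advance many independent 2D bodies by one time step."""
    new_vel = [(vx, vy + gravity * dt) for vx, vy in velocities]
    new_pos = [(x + vx * dt, y + vy * dt)
               for (x, y), (vx, vy) in zip(positions, new_vel)]
    return new_pos, new_vel

# Thousands of "pins" advanced in one frame at the demo's 30 fps rate:
positions = [(float(i), 0.0) for i in range(5000)]
velocities = [(0.0, 0.0)] * 5000
positions, velocities = step(positions, velocities, 1.0 / 30.0)
```

On a CPU these 5,000 updates run one after another; on data-parallel hardware they can, in principle, run all at once, which is the gap between the 3-4 fps and 30+ fps figures Shapiro cites.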

In addition to physics engines, programmable shaders are among the many applications being powered by the GPU. “A really great example of an application using the graphics card is CATIA,” Brown mentions. “The latest version uses CG effects for shading; you can shade your models with realistic shaders. That capability has been in the DCC applications for a couple of generations now, but now there’s increased use of the GPU. The growing DI market, which includes Assimilate solutions and Autodesk Lustre, uses the GPU heavily. All these trends in the professional graphics market push virtually every unit within the GPU. It’s that ecosystem that helps to drive GPU development.”

 

Nvidia Quadro GPUs and nForce Professional MCPs are robust, high-performance workstation solutions for professionals on desktop and mobile platforms.
 

Tomorrow’s 3D Graphics Cards

Graphics card companies partner with various industry organizations, including ISVs, chip manufacturers, workstation vendors, and standards bodies, to stay abreast of the latest technology trends. Yet, at the same time, it’s clear that all R&D efforts begin with the end user. At present, for example, the companies are keeping their eyes on upcoming versions of OpenGL, DirectX, and PCI Express, the latter of which is intended to double the bandwidth between the graphics card and the system.

For Brown, the future of 3D graphics cards is easy to predict. “You can look at the future APIs—OpenGL 3.0 and DirectX 10—and from those forecast what the next GPUs will look like. Of course, they put out more performance and functionality, but what they really provide is a lot more scalability, increased image quality without a performance hit, the ability to handle more windows interactively, and more programmability for developers and ISVs so more sophisticated real-time effects can be run on the GPU. Those define a lot of the future features that you’ll see in GPUs, and, in turn, that you’ll see applications use, but it starts really with the content—the user need.”


Courtney E. Howard is a contributing editor for Computer Graphics World. She can be reached at cehoward802@aol.com.

 
 
GPU and CPU Manufacturers Partner
 
Graphics card manufacturers ATI and Nvidia work with various industry organizations, including vendors of software programs, workstations, and workstation components. At the end of July, just prior to SIGGRAPH 2006 in Boston, ATI revealed that well-known central processing unit (CPU) manufacturer AMD intends to purchase the graphics card company.
 
“We work closely with the CPU manufacturers, Intel and AMD,” says Dan Shapiro, senior marketing manager for workstation products at ATI. “Although an acquisition has been announced between ATI and AMD, ATI’s focus as a graphics provider is very much on supporting all the major CPU companies. We will continue to support both Intel and AMD.”
 
The acquisition transaction had not been finalized as of press time. Neither company has announced official plans as to how the new organization will be structured. Yet, Shapiro offers some insight into the potential acquisition agreement, noting that battles over architectures and a considerable reduction in staff are both unlikely. “These are two complementary companies merging together,” he says. “As for future product innovations, I think we’ll see a lot of crossover technologies between CPUs and GPUs, and combining them into various efficient and very powerful chipsets.” —Courtney E. Howard