At a tipping point: More professionals are reaching for 10-bit display precision.
Finally, 10-bit imaging has come of age. A long time in the making, the technology’s newfound maturity is sparking interest that extends beyond its past niches and across a range of professional applications. To what can we attribute 10-bit technology’s rise in popularity? A confluence of recent developments in both the graphics and display industries has created a tipping point, one that is seeing an improved generation of 10-bit technology not only entrench itself deeper into existing markets, but spill over into new spaces, as well.
Medical applications demand an increasing mix of 2D and 3D images, in both 10-bit grayscale and 30-bit color. Above, such imagery is being viewed on a Barco Coronis Fusion 6MP DL diagnostic display.
Offer customers a choice between 10 and 8 bits of precision, and with all else equal, they, of course, will select the higher option. And that explains the biggest stumbling block to 10-bit acceptance: The two have not historically been anywhere near equal, not when it comes to price or availability.
Today’s de-facto 8-bit standard—yielding 256 shades of grayscale, or 16.7 million shades of color (256 shades of red, green, and blue)—has proven more than adequate to address the needs of mainstream applications. The computer industry has served that majority well, building a broad set of commodity products—displays and graphics, hardware and software—all in support of 8-bit systems.
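The arithmetic behind those shade counts is straightforward. A quick sketch (plain Python, purely illustrative) of what each precision level buys:

```python
# Shades per channel at n bits of precision: 2**n
bits8 = 2**8            # 256 grayscale shades
bits10 = 2**10          # 1,024 grayscale shades: four times as many
colors24 = bits8 ** 3   # 16,777,216 colors ("16.7 million")
colors30 = bits10 ** 3  # 1,073,741,824 colors (over a billion)
print(bits8, bits10, colors24, colors30)
```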
But not so for 10-bit. Professionals with the need to push beyond 8 bits have faced a host of shortcomings in the marketplace, with options few and prices high. As a result, only a few of the most demanding, absolutely-must-have users could justify the deployment of 10-bit grayscale or 30-bit (3x10-bit) color. The medical imaging community is among this group. Delivering four times the number of displayable shades as 8-bit data, 10-bit precision improves radiologists’ speed and, more important, their accuracy in reading high-fidelity scan data from a range of technologies, including X-ray, magnetic resonance imaging (MRI), and computed tomography (CT).
Additional discrete shades can be allocated to fit a narrower luminance range, dedicating more displayable intensities to precisely that range where subtle details are concentrated, often varying by image and scan technology. The image reader sees more detail and spends less time twiddling with windowing, leveling, brightness, and contrast. The end benefit to the physician, hospital, and patient: more precise reads delivered in less time.
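The windowing and leveling described above can be sketched as a simple mapping: pick a center and width for the luminance window of interest, then spread all of the display's shades across just that range. The following is a minimal NumPy illustration; `window_level` is a hypothetical helper for this article, not an API from any particular PACS or display vendor.

```python
import numpy as np

def window_level(pixels, center, width, out_bits=10):
    """Map raw scan values (e.g., 12- or 16-bit CT data) onto a display
    range, dedicating all 2**out_bits displayable shades to the chosen
    luminance window. Values outside the window clip to full black or
    full white. Hypothetical helper, for illustration only."""
    lo, hi = center - width / 2.0, center + width / 2.0
    out_max = 2**out_bits - 1
    scaled = (np.asarray(pixels, dtype=np.float64) - lo) / (hi - lo)
    return np.clip(np.round(scaled * out_max), 0, out_max).astype(np.uint16)

# A narrow window around the values of interest spends all 1,024 shades
# exactly where the subtle detail is concentrated:
print(window_level([1000, 2000, 3000], center=2000, width=2000))
```

With a 10-bit display, `out_bits=10` yields 1,024 output shades across the window instead of 256, so finer gradations within the window survive on screen rather than collapsing into a single intensity.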
While the medical community’s been the most committed and consistent consumer of 10-bit display technology, it’s done so in spite of what the market has offered, not because of it. Medical installations adopted 10-bit technology because of the overwhelming importance of getting an accurate diagnosis quickly, and in pursuit of that goal, the community has had to accept more than a few drawbacks.
Niche markets typically attract niche solutions—often just a handful and usually proprietary. This had been the case for 10-bit imaging, where proprietary solutions led to high prices, few standards, and little in the way of interoperability. Avoiding a single-source supply often was simply not possible; to piece together a working end-to-end solution was taxing enough, let alone to attempt something heterogeneous. Being locked into a proprietary solution translated into long product cycles, yet it came with no guarantees of long-term support. And should the desire ever arise to take advantage of attractive technology advancements developing beyond the bounds of the current implementation (think of the explosion in capabilities and performance of 3D graphics hardware), well, they were simply out of reach for the 10-bit user.
Fortunately, the medical imaging community no longer has to go it alone in supporting 10-bit technology, as help is coming from other professional imaging arenas, including one that’s both well funded and highly motivated. Hollywood’s quest for ever-superior image quality is leading to the adoption of 30-bit color precision, raising awareness of the technology and stimulating the development of more capable, interoperable, and cost-effective products and technologies.
Hollywood Goes 30-bit
The film industry has surely and steadily been making the move to 100 percent digital workflows—end-to-end and at every step along the way. A linchpin in realizing that vision, 30-bit color not only delivers superior fidelity, but also has been instrumental in achieving that elusive goal of uniform color, from one artist’s desk to another, throughout the production pipeline.
Indeed, the term “film” is becoming a misnomer, as that end-to-end digital process now extends all the way to the theater, with digital movies already being shown on a new generation of digital projectors replacing stacks of film cans. Now in use at thousands of theaters worldwide, digital cinema has arrived and should gradually replace traditional film projection.
Digital mammography has been a driving force in the push to 10-bit imaging.
To meet or exceed the visual experience of 35mm or 70mm film, digital cinema proponents and adopters set a quality target of 4k (4096 pixel)-wide spatial resolution and 10-bit precision per channel. That target has been met with a range of 30-bit capable, 4k projectors now on the market, including Sony’s SXRD product line.
To a lesser degree, 30-bit color is making a mark in broadcast markets, as well. While the transmission of higher-precision video streams to consumers is still a long way away (if ever, for a variety of economic issues rather than technical limitations), 30-bit has found its way into the content creation side of the industry. Keeping the content at the highest quality for as long as possible in the production process—through blending, mixing, and overlays—is always a good thing.
Medical imaging may have been the cornerstone for 10-bit display precision, but it is the film industry’s adoption that has spurred development and expansion of a more cost-effective and interoperable 30-bit color ecosystem. Championed by high-profile studios, with support from the computer graphics and display industries, momentum for 30-bit color has been building along three fronts: the establishment of standards to improve interoperability, the dramatic improvement in the availability and pricing of 30-bit displays, and the emergence of comprehensive high-precision support in graphics hardware.
Emerging as broad-based unifying standards for the storage and transmission of 10-bit image data, OpenEXR and DisplayPort, among others, are providing the glue that allows heterogeneous, multi-vendor 10-bit solutions to take hold. In the absence of a formal 10-bit image format from the computer industry’s de-facto standard, Windows, Industrial Light & Magic’s (ILM’s) OpenEXR standard stepped in to fill the gap. Adopted from the “half” data type of Nvidia’s Cg shading language, OpenEXR’s 16-bit floating-point format provides high dynamic range in a compact format, with a 10-bit mantissa to sustain 10-bit fidelity throughout the rendering and postprocessing workflow.
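That 10-bit mantissa can be inspected directly: OpenEXR's "half" shares the IEEE 754 binary16 layout with NumPy's float16, so a few lines show the precision it carries (a small sketch, not part of the OpenEXR library itself):

```python
import numpy as np

# IEEE 754 binary16 ("half"): 1 sign bit, 5 exponent bits, and 10 stored
# mantissa bits -- the same layout OpenEXR's half data type uses.
h = np.finfo(np.float16)
print(h.nmant)  # 10 mantissa bits
print(h.eps)    # machine epsilon of 2**-10: 1,024 steps per power of two
```

With 2**10 = 1,024 evenly spaced values in every power-of-two interval, half-float intermediates can carry full 10-bit display fidelity through rendering and compositing without rounding it away.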
With 24-bit color as the de-facto standard for most display markets, video interfaces haven’t formally supported 30-bit pixel formats. Yes, vendors can and do build 30-bit display systems around DVI, yet DVI has never officially specified a 10-bit-per-channel format, forcing providers to specify their own formats.
The emergence of the Video Electronics Standards Association’s (VESA’s) DisplayPort should improve the status quo. Showing promise as a unifying video interface, one positioned not only to replace both VGA and DVI, but also to provide a cleaner path to HDMI, DisplayPort formally specifies a 10-bit standard format (as well as 12- and 16-bit formats shared with HDMI 1.3), something DVI never did. With a breadth of support across markets, DisplayPort should provide a lower-cost 10-bit alternative to SDI (and HD-SDI), an existing standard (ITU and SMPTE) confined today to high-end professional video applications.
A 24-inch, color-space calibrated, 30-bit color LCD monitor from HP, created in partnership with DreamWorks, makes color differences apparent.
LCDs, Projectors, and GPUs
Until recently, CRTs ruled the 10-bit display world. Early LCDs couldn’t match the CRT’s affordability, nor did they measure up on critical metrics, such as luminance and contrast. Yet, just as the LCD has supplanted the CRT in virtually every other arena, it has done so, too, in the most demanding professional spaces and for both 10-bit grayscale and 30-bit color applications.
LCD technology, display quality, and volume manufacturing have improved to the point where CRTs are typically no longer considered for medical imaging applications. The LCD has even dethroned the CRT in its role as the de-facto standard for broadcast-reference monitors. Barco, a longtime leader in professional display products, recently introduced the first LCD monitor to achieve Grade 1 (EBU), a set of demanding specifications for brightness, contrast, and blackness.
Similar to its rise in the consumer space, the LCD’s momentum in professional markets stems from its superior form factor and the dramatic reduction in price over time. Witness Hewlett-Packard’s DreamColor LP2480zx display, a 24-inch, “color-critical” LCD monitor designed in partnership with DreamWorks Animation. The LP2480zx not only displays 30-bit color images (10-bit per channel), but it can do so with custom presets calibrating the display to the gamuts of seven color spaces, including sRGB, ITU Rec. 709, and Adobe RGB.
With that image fidelity now available at approximately $2000, an LCD monitor like HP’s will not only allow studios to realize that color-consistent, end-to-end digital vision, but it will open a lot of eyes in spaces that haven’t historically employed 30-bit color technology. And it’s not just the advent of high-quality, 30-bit capable LCD monitors on the desktop that promises to change the playing field for higher-precision imaging. Rather, it is—perhaps even more so—the emergence of the high-resolution, 30-bit LCD projector, engineered to deliver better-than-film quality to the movie house.
Of course, the display is only half the equation; 10-bit capabilities also must be up to snuff in graphics hardware. Fortunately, advancements in GPU technology in recent years have delivered just that, starting with the adoption of top-to-bottom 32-bit floating-point pipelines. Working off the 32-bit architectural foundation, Nvidia and AMD have both extended precision—10-, 12-, and 16-bit integer, as well as floating-point formats like OpenEXR’s—through the GPU’s back-end paths and out to the display.
Early support focused on true 3x10-bit DACs and dual dual-link DVI ports, but today’s GPUs are not only capable of driving 30-bit digital display streams across DVI to support today’s platforms, but they now comply with the first truly interoperable 10-bit standard, DisplayPort. And we’re not talking a few pricey, niche models, either. Support for 30-bit color over DVI and DisplayPort is a feature now spread across the breadth of professional graphics product lines.
The appeal of recent improvements in 10-bit quality, cost, and interoperability won’t be limited to new markets, as existing ones will reap the same benefits. Consider the potential impact of DisplayPort in studios, for example. Offered a less-costly, broad-based alternative to 10-bit SDI, more customers should be able to justify further deployment of 10-bit capable systems.
And with the scope of medical imaging continuing to expand through the spread of picture archiving and communication systems (PACS), IT buyers will certainly be pleased to find their choices wider and their prices lower, opening opportunities to improve the existing infrastructure. Hospitals won’t necessarily equip twice as many reading rooms if solution costs are cut in half, but any dollars saved would go to good use: to upgrade existing reading stations with multiple high-resolution displays, thereby allowing more effective side-by-side analysis, or to extend PACS and 10-bit display into other corners of the hospital, including operating rooms.
Collaborative viewing is a natural fit for high-resolution, 30-bit displays.
Beyond hospital borders, a wider range of economical, 10-bit digital imaging solutions will mean that more will find their way into a physician’s individual or group practices, where the light box remains a common fixture.
Another opportunity opened up by recent improvements in 10-bit display technology is the exploitation of the synergy between 10-bit precision and cutting-edge GPU technology. Today, after all, considering an imaging solution with 10-bit display as the one and only goal is shortsighted.
Keep in mind that 10-bit precision (and full 30-bit color) is just one of many features afforded by today’s professional GPUs. Select Nvidia’s Quadro FX or AMD’s ATI FirePro, for example, and you not only get the full 30-bit color precision (with 10-bit grayscale, to boot), but a high-performance, full-featured, programmable 3D GPU—and everything that goes along with it. Such solutions offer capabilities that extend not only beyond 10-bit precision, but even beyond conventional, polygonal 3D rendering.
Consider simple 2D pan and zoom, seemingly trivial graphics operations in this age of elaborate 3D special effects and immersive gameplay. With scan resolutions today reaching 2kx2k (and beyond), reading medical images without fast, hardware-accelerated pan and zoom can waste a lot of valuable time in the reading room. Yet, many of the proprietary solutions historically offered to medical applications have lacked even this most basic of hardware features.
With today’s professional GPUs, high-speed 2D zoom and pan come fast and free, as trivial cases of the more complex, non-affine perspective transformations common in 3D rendering. And their onboard, programmable scaling filters process more samples with more precise filter responses, yielding high-quality, high-resolution images in real time.
While display quality is the top priority for many applications in medical imaging, a visualization solution that can handle more than the simple 2D display of 10-bit grayscale has become a must-have. IT buyers demand maximum return for dollars spent and are anxious to build a more broad-based, flexible digital imaging solution, thereby unifying support for the display and rendering of imaging data generated by a range of scan technologies in both 2D and 3D.
For example, companies serving medical imaging markets—such as visualization and analysis software provider Vital Images and Mercury Computer Systems—have exploited Nvidia Quadro FX cards, used primarily for fast 2D imaging, to serve double duty as real-time 3D volume visualizers.
Similarly, Barco Medical Imaging is utilizing AMD’s ATI FirePro cards for fast and accurate representation of raw CT data and more. With today’s advanced CT scanners producing slices as thin as 1mm, volumes can run into the thousands of high-resolution slices—far too many for a physician to navigate effectively in 2D. Add into the mix emerging scan technologies like positron emission tomography (PET) and the increasing use of 4D visualization (3D volumes varying across time), and high-performance volume visualization has moved from nicety to baseline requirement for radiology departments.
PACS, 10-bit, 3D rendering, multi-modal volume visualization, blending high-quality grayscale and color: The technologies go hand in hand with today’s increasingly integrated medical imaging solutions. By combining best-in-class 3D GPUs with 10-bit display precision and innovative software, professional graphics products seemingly can handle it all.
Effective seismic analysis, using data such as this from Halliburton Landmark Software and Services, must detect subtle gradient changes and reiterate analysis in high precision.
Medical imaging provided the foothold, and through the promotion of standards and wider market opportunities, film production has helped push technology and accessibility while driving prices down. With the advent of a wider, more mature, cost-effective base of products and technology, more application spaces are primed to take the plunge into the 10-bit precision waters.
Which ones? Well, with fewer limitations to accept, and fewer dollars required, the technology has truly become open game for any application that simply cannot compromise on display precision. Yet, a few candidates particularly stand out: oil and gas exploration, CAD, collaborative viewing, digital content creators (small to midsize), and surveillance/computer vision.
Sharing the medical community’s need for fast and accurate image analysis is the oil and gas industry. Collecting 16 bits of data per seismic sample is now the norm, providing the critical precision to allow detection and refinement of the subtle gradient changes that might signal a rich find and to more effectively filter out geologic anomalies leading to dead ends. With the interpreter’s eye a critical link in the iterative process of isolating areas of interest, refining geometry, tweaking filters, and re-visualizing, displaying 16-bit volume data with 10-bit precision represents a logical and valuable progression. In an industry where misinterpretation of image data could mean tens of millions in wasted dollars, any premium demanded by 30-bit color technology is trivial by comparison.
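A back-of-the-envelope sketch shows why the extra display bits matter here: at 8 bits, hundreds of distinct 16-bit amplitudes collapse into each on-screen shade, while 10 bits cuts that collapse by a factor of four (illustrative arithmetic only).

```python
# How many distinct 16-bit seismic amplitudes land on one display shade?
samples_16bit = 2**16                     # 65,536 distinct sample values
per_shade_8bit = samples_16bit // 2**8    # 256 amplitudes share one shade
per_shade_10bit = samples_16bit // 2**10  # 64 amplitudes share one shade
print(per_shade_8bit, per_shade_10bit)
```

Fewer amplitudes per shade means fewer subtle gradient changes flattened into invisibility before they ever reach the interpreter's eye.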
In the CAD realm, design staff shouldn’t expect to see an across-the-board transition to 30-bit color. With 24-bit color more than adequate for the purposes of engineering, 30-bit won’t have a broad impact, at least not anytime soon.
But for CAD stylers, those shaping and selling a product’s look and feel, the story is different. There’s no compromising on realism and fidelity, making 30-bit upgrades for stylers’ desks more common, while presentation rooms and auditoriums are a natural fit for the new generation of 4k, 30-bit projectors.
And demo rooms are just one example, as digital cinema projectors are finding appeal well beyond the theater. Consider wall and mega-display installations, such as reality centers, simulators, and CAVE environments, in applications like aerospace, geoscience, and academia. For such immersive display applications, one 4k projector can replace several lower-resolution projectors, eliminating the hassle and artifacts of edge blending and refresh synchronization. Complementing the 10-bit capable display with 10-bit graphics hardware then becomes a very appealing—and affordable—next step.
Big-budget Hollywood media production got the ball rolling. But 30-bit color’s appeal in DCC has not been limited by the size of the studio, but by the size of the budget. Image quality and consistency are paramount. Offer smaller and midsize DCC businesses a financially feasible 30-bit option, and they’ll make the move, as well. Graphic artists, ad agencies, videographers, small businesses, or sole proprietors—all are potential 30-bit users, and that prospect hasn’t been lost on vendors like HP. Looking beyond Hollywood borders, HP, for one, is seeing the aggressive pricing of its new 30-bit LCD monitor as a game-changer: a “courier killer,” eliminating the costly practice of producing and shuttling high-quality prints to clients, or even as a new tool for the weekend wedding photographer.
Additionally, 10-bit precision has been incorporated into the baseline feature set for most image sensors, A/D converters, and DSPs targeting surveillance, security, and other complex image analysis applications. And as in oil and gas exploration, complex image analysis with 10-bit precision will mean quicker positive identifications and fewer false ones.
While the technology is not for everyone, it is a natural step for demanding professionals. Yet, 10-bit display precision isn’t in the foreseeable future for most mainstream applications. After all, 8-bit-per-channel color is pervasive, cheap, and entrenched, and the eye’s ability to discern shades isn’t getting better. So, there’s little doubt that moving from 8-bit to 10-bit in fact offers diminishing returns.
But for applications that can’t compromise on image quality, any return is valuable, diminishing or not. With historical stumbling blocks withering away, 10-bit technology has moved beyond the radiologists’ digital light box and into higher-profile spaces, like digital content creation and visualization.
Costs are down, and choices and compatibility are up. Presented with improved technologies, engineered into a wider range of affordable, interoperable hardware, more professionals in more spaces will soon be sampling the benefits of 10-bit precision.
Alex Herrera is a senior analyst with Jon Peddie Research and author of the “JPR Workstation Report” series reference guide for navigating the markets and technologies for today’s workstations and professional graphics solutions. He can be reached at firstname.lastname@example.org.