Spotlight: Silver Edge
Issue: Volume 38, Issue 5 (Sep/Oct 2015)

CGW Selects SIGGRAPH 2015 Best of Show

In August, practitioners of computer graphics and interactive techniques once again gathered at the annual SIGGRAPH conference and exhibition to learn, discuss, and collaborate in an effort to advance the state of the industry.

What began as a small gathering of those working in the relatively new field of computer graphics and interactivity in 1974 has since grown into the largest event serving the industry. This year’s conference theme was “Xroads of Discovery,” with humans forging new technological pathways built from the intersection of their interconnected hardware and software. 

Indeed, new technology could be seen throughout the conference and exhibition hall. This year, like years past, the editors of Computer Graphics World had a difficult time narrowing down the products and technologies at the show with the potential to greatly impact the industry. So, without further ado, here are the winners of the CGW Silver Edge Awards for best in show at SIGGRAPH 2015.

Chaos Group’s V-Ray – Chaos gets high marks for its newly released V-Ray for Nuke, which introduces a new approach to lighting and compositing by enabling compositors to take advantage of V-Ray’s lighting, shading, and rendering tools within Nuke’s node-based workflow. It gives compositors the chance to adjust lighting, materials, and render elements up until final shot delivery. More exciting, though, are the advanced VR tools within V-Ray, which give those working in the emerging fields of VR and AR a much-needed boost. For instance, V-Ray 3.1 for Maya adds new stereoscopic camera types that will render 6x1 cube maps and spherical images, while the introduction of Shade Map optimizes stereo rendering.

Fabric Software’s Fabric Engine 2.0 – When I first spoke to Fabric Software a few SIGGRAPHs ago, it was unclear exactly what Fabric Engine did. The explanation is still very techie: It is a digital content creation platform with a visual programming system that lets users build complex tools and applications for games, visual effects, VR, and more. In a nutshell, it simplifies complex tasks, providing super-fast processing within existing production pipelines. A tool like this will be especially vital as developers create VR and AR applications, where content creation, editing, and reviewing directly within such experiences will be more critical than ever. 

IKinema’s Intimate – Animation can be difficult. But IKinema, a developer of real-time inverse kinematics technology, is trying to make it easier. The objective of Intimate (a code name for its natural-language animation interface, whose release is about a year and a half away) is to let users direct action via simple voice commands and written directions. The firm is converting animation libraries into a run-time rig, resulting in a seamless transition from one animation to another with simple words such as “walk, turn left, then run to the red door.” The technology, now in the prototype phase, could offer an intuitive interface for VR applications.

Nvidia’s DesignWorks – The expectation at a conference like SIGGRAPH is that Nvidia will roll out a new GPU (and it did, presenting the M5000 and M4000). This year, though, the company surprised many with a somewhat different offering: DesignWorks, a suite of software tools, technologies, and libraries that enable interactive photorealistic rendering and provide developers with easy access to physically-based rendering and physically-based materials. As a result, designers can utilize high-quality material designs and physically-based rendering on their Nvidia GPUs with little expense. DesignWorks combines rendering, materials, display technology, VR, and live video capabilities that work “behind the scenes” with the software that designers can continue to use. Among the tools/technologies included within DesignWorks are: Iray SDK, Material Definition Language, vMaterials, OptiX, and DesignWorks VR, a suite of tools for incorporating virtual reality into design software.

Thinkbox’s Deadline and Sequoia – Deadline 8, in beta, offers a number of valuable features. One of the more useful offerings is the new ability to purchase rendering by the hour via a metering license, letting users ramp up service when needed. Another impressive feature introduced recently in 7.2 is Draft, a tool that provides simple compositing functionality. Available as a stand-alone tool or integrated with Deadline, Draft lets users perform simple compositing operations on rendered frames after a render job is complete. Perhaps most impressive is Sequoia, a stand-alone application for point cloud processing and meshing, making it easy for users to work with very large production data sets.

Autodesk’s Stingray – It’s been no secret that Autodesk has been focused on the gaming industry, and various acquisitions during the past few years have furthered that effort. Yet, last year’s acquisition of Bitsquid led to perhaps the biggest news of all when Autodesk announced its new Stingray game engine. The engine supports many standard game development workflows, including, of course, those involving Autodesk software (as well as those that do not). In fact, the engine is tuned to take advantage of the capabilities in Autodesk’s products, resulting in optimal performance. Moreover, the engine uses a lightweight code base, letting users make major changes to the engine and renderer without needing source-code access.

Reallusion’s iClone Character Creator and Live Motion Capture System – Character creation is a complex process, but Reallusion has simplified this with its iClone Character Creator, a real-time 3D character design application that creates fully rigged characters on the fly that are ready for face, body, and lip-sync animation. The software enables users to build custom characters with dynamic morphs, ethnicity, aging, skin, conforming cloth, cosmetics, and fabrics. The company’s iClone Live Motion Capture System also simplifies a complex process, allowing users to import any character rig, which it then “characterizes” to iClone so the motion performance can be seen on the rig, in real time during the performance, thus reducing capture time and editing. The Live Motion Capture System is a result of a partnership between Reallusion and Noitom, makers of the Neuron mocap suit.

Honorable Mention

Faceware Technologies’ Faceware Live UE plug-in – Although Faceware introduced its Faceware Live two years ago, integrating it into Epic Games’ Unreal Engine is a move to be applauded. This will allow UE 4 developers to capture facial movements with any camera and apply them to characters in the Unreal Engine. The resulting facial animation can be used for a range of applications.

Special Recognition

Christie’s projection technology – When it comes to display technology, it’s difficult to ignore Christie and its wide array of projectors and display offerings. At this year’s show, the vendor exhibited the latest in projection mapping within the Emerging Technologies Sandbox. With auto-calibration and a patented process, Christie streamlined what is normally a complicated procedure, projection-mapping a 3D-printed apartment building in real time. Also, those walking into the Emerging Technologies area were met by a huge 3D projection-mapped skull created by artist Josh Harker and Theatrical Concepts. Presented as a 360-degree display on the 12-foot skull, the projection mapping was powered by four Christie Roadster HD20K-J projectors.

Pixar’s USD (open source) – Pixar says it intends to release its Universal Scene Description (USD) software next summer as an open-source project, which is expected to lead to increased efficiencies across workflows. USD addresses the need in the CG film and game industries for an effective way to describe, assemble, interchange, and modify highly complex virtual scenes between digital content creation tools. With USD, artists can simultaneously work on the same collection of assets in different contexts, using separate layers of data that are composited together at various production stages. USD generalizes these concepts so they are available to any DCC application.
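The layered workflow described above can be illustrated with a small, self-contained Python sketch. This is a conceptual analogy only, not the actual USD API (which had not yet shipped at press time): each department contributes a sparse layer of “opinions” about scene attributes, and composing the stack lets the strongest layer’s opinion win. All names here (compose, the /Ball attributes) are hypothetical.

```python
# Conceptual sketch of layered scene composition in the spirit of
# Pixar's USD -- illustrative only, NOT the USD API. Each layer is a
# sparse dict of attribute "opinions"; stronger layers override weaker.

def compose(layers):
    """Compose a layer stack into one scene; earlier layers are stronger."""
    scene = {}
    # Walk from weakest to strongest so stronger opinions overwrite.
    for layer in reversed(layers):
        scene.update(layer)
    return scene

# Hypothetical asset: modeling authors the base, while shading and
# animation each add overrides in their own layer, so all three
# departments can work on the same asset without touching each other.
modeling  = {"/Ball.radius": 1.0, "/Ball.color": "gray"}
shading   = {"/Ball.color": "red"}
animation = {"/Ball.xform": (0, 4, 0)}

# Strongest first: animation > shading > modeling.
shot = compose([animation, shading, modeling])
print(shot["/Ball.color"])  # prints "red": shading overrides modeling
```

The key property, which USD generalizes far beyond this toy, is that each layer stays sparse and editable on its own, and the composed result is recomputed from the stack at any production stage.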