March 2, 2008

Insight into Ubisoft and Assassin's Creed

Windham, N.H. - Computer Graphics World interviews Charles Granger, lead compositor at Ubisoft Digital Arts. Ubisoft was established in 1986 by five brothers, the Guillemot brothers, says Granger. At first, the company specialized in the distribution of games and educational software. In 1992, Ubisoft expanded its activities and started producing its own products. Today, the company has 15 development studios throughout the world. Ubisoft has created successful products such as the Tom Clancy’s Splinter Cell and Tom Clancy’s Rainbow Six franchises, the Prince of Persia series and, most recently, Assassin’s Creed. Ubisoft also produces games for external franchises, such as Peter Jackson’s King Kong and Open Season from Sony Pictures Animation. The group distributes its games across the globe. In 1997, Ubisoft opened its Montreal development facility, which currently has more than 1,700 employees.

What specific skills do you think Ubisoft possesses that made you the perfect choice to work on game cinematics?

For a while now, many people have rightly believed that Montreal is home to a lot of talented, creative digital craftspeople. Ubisoft works closely with various educational institutions and puts a lot of effort into in-house training. Ubisoft has everything it needs to carve out a place in the fantastic world of broadcast. More specifically, the employees’ creativity is the best asset for setting Ubisoft apart from the competition and achieving this goal.

What was the outline of the specific project?

It was a trailer in which we would create a myth surrounding the Assassin. We set the action in a dark and mysterious environment where Templars discuss whether the Assassin and his power are real. As they argue, the Assassin infiltrates the room and kills one of them without being noticed.

On average how many shots does Ubisoft complete for a given project?

The most recent trailers have about 30 shots each.

What software applications are used in your production pipeline? (What specific tools were used and why? How did Fusion fit into the overall production pipeline?)

We started by converting the models from the game [3ds Max and ZBrush] into Softimage XSI, which allows different departments to work in parallel and offers good animation tools. XSI also has a good render-layer separation tool. These layers are then sent to Fusion to apply specific effects and to get optimal control over each layer.
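The interview does not spell out how those layers are assembled inside Fusion, but as a generic sketch of layer-based compositing, the Python snippet below stacks premultiplied render layers back to front with the standard "over" operator; the layer names and image sizes are placeholders, not details from the production.

```python
import numpy as np

def over(fg, bg):
    """Composite a premultiplied-alpha foreground over a background.

    Both images are float arrays of shape (H, W, 4) in RGBA order.
    """
    alpha = fg[..., 3:4]
    return fg + bg * (1.0 - alpha)

def composite_layers(layers):
    """Stack render layers back to front (last entry is the frontmost)."""
    result = layers[0]
    for layer in layers[1:]:
        result = over(layer, result)
    return result

# Hypothetical layer order for a shot: environment, characters, smoke, fire.
h, w = 270, 480
environment = np.zeros((h, w, 4), dtype=np.float32)
environment[..., 3] = 1.0                      # opaque background plate
characters = np.zeros_like(environment)        # placeholder character layer
smoke = np.zeros_like(environment)             # placeholder effects layers
fire = np.zeros_like(environment)

final = composite_layers([environment, characters, smoke, fire])
print(final.shape)
```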

What factors are specific to the game cinematic industry and what makes it different from traditional feature animation?

The main difference is that a game cinematic goes into production while the game itself is still in development. As the game evolves, the storyline and the characters are bound to change, so we work on good communication channels to keep everyone synchronized. Otherwise, the factors closely resemble what we go through during the production of a traditional animated feature.

What are some of the biggest challenges you face on your projects, and how do you overcome them? How did Fusion help in this regard?

Our biggest challenge was that once all of our layers were rendered and composited, some changes were requested. The artistic director, Nicolas Cantin, worked with each compositor to finalize all of the shots. Thanks to Fusion’s flexibility, we were able to finish and deliver this heavy project.

What was the most rewarding effect that you feel you pulled off?

As we were working on each and every shot, we got so close to the images that we lost sight of the piece as a whole. The reward was definitely seeing the final result with sound effects and music. Technically speaking, I think that adding hair on the Templars’ clothing helped a lot and definitely made the characters more realistic. I believe that we succeeded in delivering a very nice project in a short period of time.

What is the minimum process for a shot in Fusion, and the maximum? (How many render passes were done on average per shot?)

The opening shot was definitely the most demanding in terms of the passes required: we are talking about roughly 20 layers, each covering the characters, the camera and an environment filled with objects, all lit by a multitude of candles. Each of those layers was composed of the following passes: diffuse, dirt, specular, reflection, rim light, Y and Z depth, shadows, smoke, fire, hair, vector blur, and a whole group of different puzzle mattes.

As for the rest of the project, the shots were pretty similar. For those, we processed a dozen layers per shot in the same way, which kept the comps simple and maintained a nice continuity between all the shots.
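The interview does not describe the exact recombination math, but as a hedged illustration of how per-layer passes like these are commonly reassembled in a comp, the sketch below adds the lighting passes together and then uses the Z-depth pass to blend in a simple haze; every name and value here is illustrative.

```python
import numpy as np

def rebuild_beauty(passes):
    """Additively recombine typical lighting passes into a beauty image.

    `passes` maps pass names to float RGB arrays of identical shape.
    The additive split (diffuse + specular + reflection + rim) is a common
    convention; the real breakdown depends on how the renderer was set up.
    """
    beauty = np.zeros_like(passes["diffuse"])
    for name in ("diffuse", "specular", "reflection", "rim"):
        beauty += passes[name]
    return beauty

def add_depth_haze(beauty, z_depth, haze_color, near, far):
    """Blend a haze colour in by normalized Z depth (simple linear fog)."""
    t = np.clip((z_depth - near) / (far - near), 0.0, 1.0)[..., None]
    return beauty * (1.0 - t) + np.asarray(haze_color) * t

h, w = 270, 480
passes = {name: np.zeros((h, w, 3), dtype=np.float32)
          for name in ("diffuse", "specular", "reflection", "rim")}
z = np.full((h, w), 500.0, dtype=np.float32)

beauty = rebuild_beauty(passes)
hazed = add_depth_haze(beauty, z, haze_color=(0.02, 0.02, 0.03), near=100.0, far=1000.0)
print(hazed.shape)
```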

What role did the 3D environment in Fusion play?

In order to increase the realism of the project, we decided to add dust floating in the air. Of course, we manipulated particles in Fusion and, since most of the shots had minimal camera movements, it was not difficult to integrate those particles. However, the intro shot had a camera move that started at the ceiling with a bird’s-eye view of the room, trucked down toward the floor, and then followed with a 360° move around the entire room. For this shot we imported the 3D camera and used it within Fusion to generate the particles. This was a ball!
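How Fusion handled the imported camera internally is not covered here; as a rough, generic illustration of why a matched 3D camera makes particles easy to integrate, the sketch below projects random dust points through a simple pinhole camera that descends from the ceiling, with all values invented for the example.

```python
import numpy as np

def project_points(points, cam_pos, focal, width, height):
    """Project 3D points through a simple pinhole camera looking down -Z.

    Returns pixel coordinates for points in front of the camera. A real
    match would use the full animated camera (rotation, film back, etc.).
    """
    rel = points - cam_pos
    rel = rel[rel[:, 2] < 0.0]
    x = focal * rel[:, 0] / -rel[:, 2] + width / 2.0
    y = focal * rel[:, 1] / -rel[:, 2] + height / 2.0
    return np.stack([x, y], axis=1)

# A few thousand random dust motes floating in a room-sized volume.
rng = np.random.default_rng(0)
dust = rng.uniform(low=(-5.0, 0.0, -10.0), high=(5.0, 4.0, -1.0), size=(3000, 3))

# Camera descending from the ceiling, loosely echoing the opening shot above.
for frame, cam_y in enumerate(np.linspace(4.0, 1.0, 5)):
    pixels = project_points(dust, cam_pos=np.array([0.0, cam_y, 0.0]),
                            focal=800.0, width=1920, height=1080)
    print(f"frame {frame}: {len(pixels)} motes in front of camera")
```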

How extensively do you make use of Fusion scripting to customize or automate your workflow?

For now we have only one script, which allows us to communicate with our render farm. Recently, we handed in a whole list of scripts that we consider essential for automating certain procedures, and our technicians are already working on creating them.
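The farm script itself is not shown in the interview; purely as a hypothetical sketch of that kind of automation, the snippet below walks a directory of comp files and hands each one to a made-up command-line submitter.

```python
import subprocess
from pathlib import Path

# Hypothetical paths and submitter command; the real farm integration
# described in the interview is not documented here.
COMP_DIR = Path("/projects/trailer/comps")
SUBMIT_CMD = "farm_submit"          # stand-in for whatever the farm expects

def submit_comps(comp_dir: Path, frame_range: str = "1-100") -> None:
    """Queue every comp file in a directory on the render farm."""
    for comp_file in sorted(comp_dir.glob("*.comp")):
        cmd = [SUBMIT_CMD, "--scene", str(comp_file), "--frames", frame_range]
        print("submitting:", " ".join(cmd))
        subprocess.run(cmd, check=True)

if __name__ == "__main__":
    submit_comps(COMP_DIR)
```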

Did you make use of the masking features and how?

We always find ourselves using masks, even on full-3D projects. Those masks help us keep our main characters visible and increase their presence; they can also be used to colour correct certain areas of our shots.
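As a minimal sketch of the idea (a matte limiting a colour correction to part of the frame, not Fusion’s actual node graph), the snippet below brightens only the masked region and blends the result back through the matte.

```python
import numpy as np

def masked_gain(image, mask, gain):
    """Apply a simple gain only where the mask is non-zero.

    `image` is float RGB, `mask` is a float matte in [0, 1]; the correction
    is blended back through the matte, so soft mask edges blend smoothly.
    """
    corrected = np.clip(image * gain, 0.0, 1.0)
    m = mask[..., None]
    return image * (1.0 - m) + corrected * m

h, w = 270, 480
plate = np.full((h, w, 3), 0.2, dtype=np.float32)
matte = np.zeros((h, w), dtype=np.float32)
matte[:, w // 2:] = 1.0            # brighten only the right half of the frame

graded = masked_gain(plate, matte, gain=1.5)
print(graded[:, 0].mean(), graded[:, -1].mean())   # left unchanged, right lifted
```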

Why did you choose Fusion to accomplish the shots?

Fusion has been our main compositing tool for the past two years. It offers great flexibility, and the compositors find the software user-friendly. It has all the tools and performance needed to achieve great projects on short deadlines.

How vital were Fusion’s roto capabilities on the project?

Roto was heavily used on this project; we were able to darken and brighten certain areas depending on which parts of a sequence we wanted to emphasize more or less. It is very flexible, and everyone finds it easy to work with.

Are there any new features in 5.2 that you feel will further help you in your pipeline?

Since we usually add motion blur in the comp, I am eager to see what the Vector Motion Blur tool has to offer. Furthermore, the addition of Python scripting will definitely give us better communication between all the different tools we use in our department.
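For context on what a vector motion blur tool generally does (this is not a description of Fusion’s implementation), the sketch below smears each pixel along a per-pixel motion vector from the renderer and averages the samples.

```python
import numpy as np

def vector_motion_blur(image, vectors, samples=8):
    """Blur by averaging samples along each pixel's 2D motion vector.

    `image` is float RGB (H, W, 3); `vectors` holds per-pixel (dx, dy) in
    pixels, typically exported by the renderer as a motion-vector pass.
    This naive gather loop is for illustration only; real tools are far
    more careful about occlusion and edges.
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    result = np.zeros_like(image)
    for i in range(samples):
        t = i / max(samples - 1, 1) - 0.5            # sample from -0.5 to +0.5
        sx = np.clip((xs + vectors[..., 0] * t).round().astype(int), 0, w - 1)
        sy = np.clip((ys + vectors[..., 1] * t).round().astype(int), 0, h - 1)
        result += image[sy, sx]
    return result / samples

h, w = 135, 240
frame = np.zeros((h, w, 3), dtype=np.float32)
frame[60:75, 100:115] = 1.0                           # a bright moving square
motion = np.zeros((h, w, 2), dtype=np.float32)
motion[..., 0] = 20.0                                 # 20 pixels of horizontal motion

blurred = vector_motion_blur(frame, motion)
print(blurred.max())
```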

How deadline driven was this project and how did Fusion comply and assist in this?

Due to the tight deadlines, we worked on the art direction of each shot up to the last minute. We also found ourselves pre-comping separate elements in order to keep the workflow manageable. Fusion is very flexible in letting us apply different methods depending on the project, and this way we were able to deliver new versions to the editing department every day.

What can we expect from you in the future?

Well, it’s obvious we are going to push the software to its limits. We are always working to get the best images out there, and it’s great to challenge Fusion to get us there.