Mold3D to Share 'Slay' Animated Content Sample Project with UE Community
September 8, 2021
With a resume that includes roles at Industrial Light & Magic and Dreamworks Animation, and credits on The Matrix trilogy, Avatar, and The Mandalorian (among many others), Edward Quintero, CEO of Mold3D Studio, has over 22 years of experience in creative leadership, animation, visual effects, and environment art.
In a bid to inspire and educate artists, Quintero and his colleagues at Mold3D Studio created Slay, a sample project that explores animation and virtual art department techniques aimed at film and TV content, to share with the Unreal Engine community. It will shortly be available for download.

The project contains everything you need to create this video trailer, which was rendered entirely in Unreal Engine 4; the content is free for you to use in your own projects, too.

Today, you can already download the hero character, Windwalker Echo, for use in Unreal Engine 4.27 or Unreal Engine 5 Early Access; you might recognize her from the Unreal Engine 5 reveal demo Lumen in the Land of Nanite, and the Valley of the Ancient project that accompanied the UE5 Early Access release.



Mold3D Studio’s real-time technology experience
Quintero formed Mold3D back in 2016 after he became interested in using real-time technology, and realized how it could transform the world of content creation. At the time, he was collaborating with Epic, working with Unreal Engine on projects such as Paragon and Robo Recall.

“Real-time technology struck a chord with me, and I thought that's where I should focus my energy,” he says. “I felt like it was the future, because I was able to visualize 3D art in real time, instead of the days and weeks that traditional rendering required.”

With some Unreal Engine experience now under his belt, Quintero joined Fox VFX Lab, where he was asked to head up their new VAD (virtual art department) and build a team. In those early days of virtual production, Quintero used Unreal Engine to create pitches for films and to visualize environments for directors, enabling them to do virtual scouting and to set up shots, color, and lighting that were then handed off to the visual effects vendor that would finish the film.

After his time at Fox VFX Lab, Quintero and his team were asked to be a part of the VAD for The Mandalorian. “It was the foundation of us starting up a studio solely devoted to the art of real-time production,” he says. “I was trying to build for what I saw that was coming—the future of visual effects. We could all feel that this was happening.” 

Shortly thereafter, Mold3D Studio was invited back to join the VAD for The Mandalorian Season 2. Around this time, the studio was also approached by Epic to work on the Unreal Engine 5 reveal demo. They put the experience gained on previous projects like The Mandalorian to good use when tasked with creating extremely complex and high-resolution 3D models to show off Nanite, UE5’s virtualized micropolygon geometry system. 

“It was exciting to get a taste of what’s coming in UE5 in the future,” says Quintero. “We’re currently using UE4, until UE5 is production-ready. There have been some great advances in the latest UE 4.27 release—especially in the realm of virtual production—but features like Nanite and Lumen are really going to change the game.”

Virtual production techniques help Mold3D Studio create Slay in a pandemic

After the UE5 demo wrapped, Quintero began talking to Epic about Slay. The proposal was to create a finished piece of final-pixel animated content in Unreal Engine. With the company already making a name for itself in environment art, the team was excited to demonstrate its expertise in story development and character design as well. With the exception of Windwalker Echo, the Slay assets, including her adversary, were all designed and created by Mold3D.

Just as Slay was getting greenlit, the pandemic hit. Quintero and his team set up a remote working environment that would enable them to work on real-time rendered animated content, as well as other projects that they had on their books.

“We quickly created a way to enable our company to work remotely on an animated short by making our pipeline virtual,” says Quintero. 

Interestingly, it was virtual production techniques that ended up making this all possible. With the mocap happening in Las Vegas, Quintero’s team in Burbank directed the actors via Zoom, while viewing the results on the characters in real time in Unreal Engine, making it easy to ensure they had the takes they wanted.

“Although we probably would have done a lot of things the same way if there was no pandemic, we were thankfully able to rely on the virtual production aspect of the filmmaking to save the day,” says Quintero.

After the main motion was captured, the team did a second session with the actor just for facial capture. For this, they used the Live Link Face iOS app.

“We were able to look at her takes with the recording that came out of the iPhone and also, on the day, we could see the camera looking at her,” says Quintero. 



Crafting the look
The team had previously modeled the assets in Maya and ZBrush, then blocked out the animation in Maya and brought it into Unreal Engine via FBX. There, they also blocked out the cameras in Sequencer, Unreal Engine’s built-in multi-track nonlinear editor. Taking advantage of the engine’s ability to render the files in real time, they brought in animation daily, starting in a very crude state, even while the models themselves were still being finalized.

“It was great to see the previs with light and color,” says Quintero. “You gained a taste of what it was going to look like right away, instead of having to wait a few months to start getting a little bit more involved. That was valuable, as it helped us visualize and finesse the look along the way.”

For look development, the team made extensive use of Unreal Engine’s materials and shaders, including vertex shaders and decals, not only to achieve a unique effect but also to help maintain real-time performance.

“It's combining tricks, techniques and processes that were learned in the years I spent working in the visual effects and animation industry, with the benefits of being able to quickly iterate and visualize the results in real time,” says Quintero.

In addition to this, they used Unreal Engine’s Landscape toolset to create the terrain, and Quixel Megascans—which are free for all use with Unreal Engine—to populate the environment. Effects, such as the glowing orb, were mostly done in Niagara, Unreal Engine’s visual effects system.

Lighting played a key role in the look of the project, with the team taking advantage of Unreal Engine’s real-time ray tracing capabilities to produce sophisticated effects. To finesse close-up lighting on the characters, they built movie-style lighting rigs in Unreal Engine, enabling them to create beauty lighting, rim lighting, key lighting, and so on.

“We had a certain look that we were going for,” he says. “I think originally, we wanted this to be very stylized, like a manga/anime kind of thing, but it went more towards trying to make it look realistic. And it ended up being a little bit of a hybrid, not super photoreal, but it has a little bit of a stylized tinge to it. The lighting was a big part of that; we worked with several lighters to get the correct look.”

Real-time lighting was one of the things the team most appreciated, as it gave them the ability to easily make changes.

“If it’s not working for the shot, you can quickly move the sun direction or quickly move the character light and kind of feel your way there, instead of having to light, render overnight, come back, check your renders and not be happy,” says Quintero. 

Faster iterations, parallel workflows, nonlinear decision-making

Overall, beyond the obvious benefit of rendering frames in fractions of a second rather than minutes or hours, the team at Mold3D Studio found that using Unreal Engine to create linear animated content offered many other advantages.

“It means that you can do multiple aspects of production at once. You can do look development earlier than you probably would in the traditional animated pipeline, in parallel to animation,” says Quintero. “You start seeing it in context, and you can make a lot of decisions based on that earlier. You still have a lot of flexibility in making decisions on composition, timing, and lighting. You can change your camera and your mind, and it won't cost you a complete redo.”

Mold3D Studio found producing Slay to be a tremendously rewarding experience. "It wouldn't have been possible without the creative collaboration and continued support from the team at Epic Games,” says Quintero. “They were there for us every step of the way. Our company grew by leaps and bounds on this project, and we look forward to many more cool productions in Unreal Engine."