Breaking Down Multi-Pipeline Trailer Production for a 2D Arcade Game

The Heroic team has walked us through the multi-pipeline production process behind the Ezra and Evil Fog trailer created for the 2D arcade game Merge Dragons.

Intro

What does it take to create a trailer?

There is a lot that goes into trailer production – storyboarding, modeling, animation, VFX, rendering, and more – and it all requires a significant investment of both time and money. That puts trailer production somewhat out of reach for small projects, yet it is not impossible. So how can you make it all happen while delivering high-quality cinematic trailers that meet your client's expectations?

Why Multi-Pipeline Production is Beneficial

Multi-pipeline production gives you the flexibility to approach each trailer in the way that suits it best. With only one pipeline at your disposal, you can't capitalize on the unique strengths of different engines; with several, you can match the toolset to the objectives of each deliverable and keep the production process feasible.

That’s the strategy that the Heroic team used to create a series of trailers (one lengthy primary trailer and four shorter ones) for Merge Dragons – a 2D mobile arcade game. Here is what you can learn from their experience to speed up trailer production while achieving stunning visual results.

Production Pipeline Breakdown

The team used two production pipelines to achieve their goals – Maya-Arnold for the primary trailer and Unreal Engine (and all of its bells and whistles) for the other four.

Pre-production

Pre-production is the process of setting the stage so the story can happen. In the case of Merge Dragons, the story unfolds through two focal characters – Ezra and Evil Fog – and the setting they are placed in.

Characters

The Merge Dragons game has a well-established lore, but players never quite get a good sense of the main characters' personalities; in fact, we never see them confronting each other in the game. Using the existing 2D game assets simply wouldn't work for the story the team wanted to tell, so they had to build these models from scratch, using the game as a reference point.

The team received T-pose concept art of the main characters and used it as a basis for ZBrush modeling. An interesting fact about Ezra, the game's main heroic dragon – initially, this character was meant to use its small legs to walk on the ground. It became clear later that this wouldn't look too convincing, so the team suggested making Ezra hover in the air, kind of like Watto from Star Wars.

Evil Fog, the game's villain – a dark, cloudy substance – was a tougher nut to crack. Since we never got a good look at this character in the game, the team had to build up its personality. How do you create a character out of something that is … fog? The team drew inspiration from characters like Clayface (DC) and Smoke (Mortal Kombat), deciding to blend Evil Fog's head and shoulders into a shapeless cloudy mass to disguise its silhouette.

Once conceptualizing was done, the team started working on modeling and rigging. They knew from the beginning that they had to hold back on high-poly modeling. One reason for this was Unreal Engine: the older engine versions were not that great at handling high-density meshes, and the team didn't want to run into any potential optimization problems.

Finding the right balance between polygon count, the number of rig controllers, and the required animation quality in both UE and Arnold took some time. After some initial experiments, the team settled on their preferred settings, including those used to export animations from Autodesk Maya to Unreal Engine in the FBX format.
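
For illustration, a baked FBX animation export from Maya could look like the sketch below, which uses the standard fbxmaya plugin commands; the node names, frame range, and file path are placeholders rather than the team's actual settings.

```python
# A minimal Maya Python sketch of a baked FBX animation export for UE.
# Node names, frame range, and file path are placeholders, not the team's settings.
import maya.cmds as cmds
import maya.mel as mel

cmds.loadPlugin("fbxmaya", quiet=True)

# Select the skeleton root and the skinned mesh to be exported (hypothetical names).
cmds.select(["Ezra_root_jnt", "Ezra_geo"], replace=True)

mel.eval("FBXResetExport")
mel.eval("FBXExportBakeComplexAnimation -v true")  # bake keys so UE receives the final motion
mel.eval("FBXExportBakeComplexStart -v 1")
mel.eval("FBXExportBakeComplexEnd -v 240")
mel.eval("FBXExportSmoothingGroups -v true")
mel.eval("FBXExportInAscii -v false")
mel.eval('FBXExport -f "D:/export/ezra_anim.fbx" -s')  # -s exports the selection only
```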

The team decided that high-poly modeling wasn't necessary for the selected art style; so-called mid-poly models were the middle ground they were looking for. That wasn't the case with textures, however, which the team intentionally made very detailed (with quite a few texture maps and shaders) in Substance 3D Painter – something you definitely want for physically-based rendering (PBR).

The same textures made in Substance 3D Painter were used across all five trailers. The team used the default export settings for Arnold but made some adjustments for Unreal Engine. That's how the two main characters were created, but the team wanted to do something special with Evil Fog.

Evil Fog is quite an intriguing villain, and the team thought it would be fitting to add some extra animation to this character. They decided to create dynamic fog effects that could later be layered on top of Evil Fog's model, and Houdini was the perfect choice for that task.

Houdini allows artists to use volumetric effects to create convincing cloud, fog, haze, and smoke simulations, while lighting and shading artists can control different properties (density, color, absorption, scattering, etc.) to achieve the desired look. Using Houdini, the team created effects that were visually aligned with the trailer's art style – convincing yet not too photorealistic.
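
For a rough idea of how such a setup starts, the sketch below wires up a basic cloud volume with Houdini's Python API; the SOP types are standard, but the network itself is an assumption, not a reconstruction of the team's actual effect.

```python
# A rough Houdini Python sketch of a simple cloud/fog network built from standard SOPs.
# This is an illustrative setup, not a reconstruction of the team's actual network;
# density, absorption, and scattering would be refined on the volume and its shader.
import hou

obj = hou.node("/obj")
fog_geo = obj.createNode("geo", "evil_fog_vfx")

base = fog_geo.createNode("sphere", "fog_source")       # base shape to fill with fog
cloud = fog_geo.createNode("cloud", "fog_volume")       # convert the shape to a cloud volume
noise = fog_geo.createNode("cloudnoise", "fog_detail")  # add billowy detail to the volume

cloud.setInput(0, base)
noise.setInput(0, cloud)

noise.setDisplayFlag(True)
noise.setRenderFlag(True)
fog_geo.layoutChildren()
```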

As much as the team loved the Houdini effects they created, bringing them into UE was problematic, as VDB files weren't supported in the engine at the time. After some consideration, the team decided to recreate the same effects with the UE toolset instead of puzzling over the best way to import 250 GB of Houdini simulations.

Environment

The world of Merge Dragons consists of biomes in the form of Avatar-style floating islands. Most of the terrain was modeled in Maya, although the team wanted to add extra complexity to the main trailer. To that end, they created additional detail assets for the Arnold renders, such as rocks, plants, trees, and grass.

Yeti was of great help in creating grass in Maya: the team used this grooming tool to generate the asset quickly. As the name suggests, the tool is commonly used to make fur and hairstyles, but the team managed to repurpose it. A great thing about Yeti is that it allows changes to be made quickly. That came in handy later when the team started making adjustments based on client feedback (the grass looked too photorealistic and fluttered too aggressively in the wind).

Exporting the grass produced in Yeti to UE would probably have been the best way forward, but the team went with a different approach due to technical limitations. Reusing some already-existing assets accelerated the UE development, complemented by the Landscape Brush.

As a video game engine, Unreal Engine has many neat level design features you can use to build fictional universes. Using the Landscape Brush, you can establish the landscape's elevations, depressions, and the transitions between them. Based on your choices, Unreal Engine can suggest the right shaders (e.g., grass for flat terrain or rock for a steep hill).

In general, you can create your assets in many ways: you could use SpeedTree, Houdini, or similar tools to produce trees, plants, flowers, and grass. Speaking of which, the team finalized the environment design by scattering grass in UE, keeping it visually consistent with the Arnold renders.
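
As a simple illustration of scripted scattering in the editor (not necessarily how the team did it; Unreal's foliage and landscape grass tools are the more common route), an Unreal Python sketch with a hypothetical asset path might look like this:

```python
# A hedged Unreal Editor Python sketch that scatters grass clumps as static mesh actors.
# The asset path is hypothetical; a production setup would more likely rely on the
# foliage or landscape grass tools rather than a script like this.
import random
import unreal

grass_mesh = unreal.EditorAssetLibrary.load_asset("/Game/Env/SM_GrassClump")

for _ in range(200):
    location = unreal.Vector(random.uniform(-5000.0, 5000.0),
                             random.uniform(-5000.0, 5000.0),
                             0.0)
    rotation = unreal.Rotator(0.0, 0.0, random.uniform(0.0, 360.0))  # random yaw
    actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
        unreal.StaticMeshActor, location, rotation)
    actor.static_mesh_component.set_static_mesh(grass_mesh)
```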

Production

At the production stage, there is a lot of back and forth transferring your assets across the pipelines. There are two main ways to do so: FBX or Alembic files. Alembic is essentially a file container that can carry geometry, simulations, camera movements, and other attributes.
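
For example, a minimal Alembic export of one shot's animated geometry and camera from Maya could look like the sketch below, with placeholder paths, node names, and frame range:

```python
# A minimal Maya Python sketch of an Alembic export for one shot.
# Paths, node names, and the frame range are placeholders.
import maya.cmds as cmds

cmds.loadPlugin("AbcExport", quiet=True)

job = (
    "-frameRange 1 240 "
    "-uvWrite -worldSpace "
    "-root |Ezra_geo_grp "        # animated character geometry (hypothetical name)
    "-root |shot_010_cam "        # shot camera (hypothetical name)
    "-file D:/export/shot_010.abc"
)
cmds.AbcExport(j=job)
```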

The team started assembling scenes shot by shot in Maya and Unreal Engine. They finalized the Maya-Arnold pipeline first and used it as a visual reference point for the UE development. Reusing the existing visual effects could have saved a lot of time, so the team attempted to import the Houdini files into UE.

Soon enough, they stumbled across some technical issues: some Alembic properties were not displayed correctly. In the end, the team settled on a solution of recreating some assets in UE and importing the rest in the FBX format.

In simple terms, the production process for the trailer involves constructing a scene, adjusting camera angles (and other elements unique to every shot), rendering, and moving on to the next shot. However, at that point the team discovered they needed to make some improvements.

The team wasn't entirely happy with Ezra's eye movements – they looked a bit dull in the rendered images. They decided to recreate the eyes as separate objects consisting of eyeballs, retinas, and pupils. After applying the relevant shaders, the team rigged them so Ezra's pupils could shrink and expand. That late addition significantly improved Ezra's facial expressiveness.
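
A minimal sketch of that kind of setup in Maya Python, with assumed node names and value range, might look like this:

```python
# A minimal Maya Python sketch of the pupil dilation setup: a custom attribute on an
# eye control drives the pupil geometry's scale. Node names and the range are assumptions.
import maya.cmds as cmds

ctrl = "Ezra_eye_L_ctrl"    # assumed eye controller
pupil = "Ezra_pupil_L_geo"  # assumed pupil mesh

cmds.addAttr(ctrl, longName="pupilSize", attributeType="double",
             minValue=0.5, maxValue=1.5, defaultValue=1.0, keyable=True)

# Connect the attribute to the pupil's scale so animators can shrink or expand it.
for axis in "XYZ":
    cmds.connectAttr(ctrl + ".pupilSize", pupil + ".scale" + axis)
```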

Lighting and shading

Lighting and shading involve setting up artistic lighting, reflections, shaders, and pre-render settings. While this process tends to go a lot quicker in UE thanks to instant visual feedback, it takes significantly more time in the Maya-Arnold pipeline. The team wanted to accelerate it, so they assigned additional artists to work on the same shots.

Despite the quick production ramp-up, that decision presented some additional challenges: each lighting and shading artist has a unique touch, so stylistic inconsistencies can creep in over time. To prevent this from happening, you need a single collaborative framework – and that's not something Maya is best known for.

To address this problem, the team first created blockouts with rough animations to ensure all transitions synced with the timeline. Then they rendered a few beauty shots to get the client's approval and used them as a reference point for the lighting and shading artists. Finally, after reviewing some playblasts, it was clear that everything worked as intended.

Unreal Engine is a different beast: several artists can easily work on the same scene simultaneously, and any changes can be previewed almost instantly (while that takes quite a bit of time in Arnold), making collaboration a bit easier. Still, the team used Perforce as a version control system to manage risks and instantly propagate changes to any shared assets.
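
For context, the sync/checkout cycle such a setup relies on can be scripted with P4Python; the sketch below uses assumed server, user, and depot paths:

```python
# A minimal P4Python sketch of the sync/checkout cycle a shared UE project relies on.
# Server address, user, and depot paths are assumptions.
from P4 import P4, P4Exception

p4 = P4()
p4.port = "ssl:perforce.example.com:1666"
p4.user = "artist01"

try:
    p4.connect()
    p4.run("sync", "//MergeDragons/Trailer/...")                 # pull the latest shared assets
    p4.run("edit", "//MergeDragons/Trailer/Maps/Shot_010.umap")  # check out a map before editing
except P4Exception as err:
    print(err)
finally:
    if p4.connected():
        p4.disconnect()
```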

Compositing

Compositing is all about setting up the accents that contribute to the overall feel of a trailer. It's also a great opportunity to fix something you overlooked or to tune up the visuals – adjusting contrast, adding blur, or layering additional effects, like the magic simulations that were rendered separately and later added on top.

At this point, the team had quite a few render passes to process: they applied depth of field and adjusted the passes to fix some lighting, glow, and shading issues. One of the most noticeable improvements was smoothing out the reflections on Ezra's skin.

Post-Production

Color correction is a logical continuation of all the previous steps. As much as you try to keep the production process consistent, it's a collaborative effort of many people, so minor color differences between shots are inevitable. The team balanced out the colors and moved on to the next stage – color grading.

To explain why color grading is necessary, it's vital to understand how rendering works in both pipelines. UE uses simplified computations and is limited in the number of render passes it can process. However, it's still capable of producing a "juicy" enough image provided your art direction is on point.

Arnold, on the other hand, has extremely sophisticated rendering capabilities and produces highly detailed images. Yet the team felt the images they ended up with were somewhat "dry," so they tuned them up in Fusion and DaVinci Resolve.

Bottom Line

Multi-pipeline trailer production has its perks and challenges: executed wisely, it can deliver a true HD cinematic experience even for relatively small projects. However, it all comes down to knowing your production process and capabilities through and through.

Your key ingredients are knowing your limitations, planning everything beforehand, and ensuring consistency. But even more important is trusting your team, giving them enough creative freedom, and allowing them to do what they do best.
