Character Production with Unreal Engine's MetaHuman & Maya's XGen

Nick Gaul shares the making of King Grayskull using MetaHuman, highlighting how the character's unique hairstyle was created to achieve realistic character dynamics.

About the King Grayskull Project

This piece was inspired by the Masters of the Universe character King Grayskull. I loved the character design and wanted to try and bring it to life. The project was a collaboration with Chay Johansson, who did all the CFX and simulation (a MetaHuman > Maya workflow, rigging the custom mesh back to the MetaHuman).

We wanted to push our knowledge of character creation in Unreal Engine further and work with animation. To start the project, I created a generic MetaHuman as the basis for the character and exported it to Maya. Using the MetaHuman mesh and rig allows us to use the large amount of data already created for these characters and build on top of it.

After most of the base asset was sorted and the proportions locked down, the character was rigged up in Maya and then used in Unreal to retarget motion to. Utilizing Unreal's IK Rig retargeting, we retargeted various motions to test and use. We then imported the idle motion back onto the character in Maya, allowing us to run the wardrobe and hair simulation.
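
For reference, here's a minimal sketch of how that FBX round trip back into Maya can be scripted, assuming the fbxmaya plug-in and a hypothetical file path; "exmerge" merges the motion onto the existing skeleton rather than adding duplicate nodes:

    import maya.cmds as cmds
    import maya.mel as mel

    cmds.loadPlugin("fbxmaya", quiet=True)   # Maya's FBX plug-in
    mel.eval('FBXImportMode -v "exmerge"')   # merge onto existing nodes
    mel.eval('FBXImport -f "D:/grayskull/anim/idle_retarget.fbx"')  # hypothetical path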

This was a combination of Ziva, nCloth, and nHair. The shawl hair was set up using an interactive XGen groom with a linear wire deformer driving it, which allowed it to follow the simulated wardrobe while keeping some freedom to move. The front braids were simulated using Ziva, with a solid geo mass as the SimGeo, and the braid curves and braid ties bound to it. The curves were then exported to Unreal.
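
As a rough illustration of a wire setup along these lines, this sketch (node names are hypothetical) wires a groom scalp mesh to a simulated curve so the groom follows the sim:

    import maya.cmds as cmds

    groom_scalp = "shawl_groomScalp_geo"   # mesh the interactive groom sits on (hypothetical)
    sim_curve = "shawl_simCurve"           # output curve from the cloth/hair sim (hypothetical)

    # dropoffDistance controls how far from the wire the deformation reaches
    result = cmds.wire(groom_scalp,
                       wire=sim_curve,
                       dropoffDistance=[(0, 25.0)],
                       name="shawl_wire")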

Facial Rig

For the facial rig, the bust of the character was brought back into Unreal and a MetaHuman face was created. This generally only gets you part of the way, as it can only match the tracked face as closely as the default MetaHuman face allows. Once you have it tracked and a MH face created, you can send it back over to Maya via Bridge and adjust it further to match the custom face geo 1-to-1.

This is done by gutting out everything you don't need and lining up the face as closely as possible to the eyes and face plane. The next step is to take the MetaHuman bust geo and your custom King Grayskull geo and, using Wrap3, create a 1-to-1 MetaHuman face that matches your custom geo exactly.

Once this is made, you can apply it to the MH geo in your scene as a delta offset. At this point, it should match 1-to-1 with your custom geo, and you can bind it and adjust the teeth, eyes, and all the other geo pieces accordingly.
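
One way to set up that offset in Maya is a simple blendShape from the Wrap3 result onto the MH face, dialed to full weight (the names here are hypothetical):

    import maya.cmds as cmds

    wrapped_face = "grayskull_wrappedFace"   # 1-to-1 result from Wrap3 (hypothetical)
    mh_face = "mh_head_lod0_mesh"            # MetaHuman face geo (hypothetical)

    # Add the wrapped mesh as a target and push it to 1 so the MH geo
    # snaps to the custom shape
    bs = cmds.blendShape(wrapped_face, mh_face, name="faceDelta_BS")[0]
    cmds.setAttr(bs + "." + wrapped_face, 1.0)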

This allows you to still use the MH face rig inside Maya, apply mocap/Live Link data to the controls, and make further tweaks to the mocap data and any other animation that is needed.

Facial Motion

For the facial motion, we used the Live Link Face app, which allowed us to take our own facial performance and drive the MetaHuman. You can technically do this on any MH by capturing within Unreal and then exporting the motion as an FBX matching the control rig; from there, it was just a matter of transferring that data straight to the control rig within Maya.
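
A minimal sketch of that transfer step, assuming the imported FBX nodes share the control names under a hypothetical "import_" prefix:

    import maya.cmds as cmds

    # Copy baked keys from the imported FBX nodes onto the matching
    # face-board controls
    for ctrl in cmds.ls("CTRL_*", type="transform"):
        src = "import_" + ctrl                 # hypothetical naming scheme
        if not cmds.objExists(src):
            continue
        if cmds.copyKey(src):                  # returns the number of curves copied
            cmds.pasteKey(ctrl, option="replaceCompletely")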

Once we had all the animation data applied to our character, and the wardrobe and hair simulated, we exported everything as USD and Alembic, ready for lighting and setup inside Unreal.
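
The caching itself can be scripted along these lines (the paths, root node, and frame range are hypothetical):

    import maya.cmds as cmds

    cmds.loadPlugin("AbcExport", quiet=True)
    cmds.loadPlugin("mayaUsdPlugin", quiet=True)

    start, end = 1, 240

    # Alembic cache of the character hierarchy
    cmds.AbcExport(j="-frameRange {0} {1} -uvWrite -worldSpace "
                     "-root |grayskull_char "
                     "-file D:/grayskull/cache/char.abc".format(start, end))

    # USD export of the scene for the lighting handoff
    cmds.mayaUSDExport(file="D:/grayskull/cache/char.usd",
                       frameRange=(start, end))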

Creating Hair

King Grayskull has a cool hairstyle consisting of dreadlocks. I've done dreadlocked hair professionally in the past, but that was for a film pipeline, and I knew that if I used the density/CV counts required for that approach, I'd most likely crash Unreal.

I came up with a way to limit the hair by focusing only on the outside shell of each dread. It would have been time-consuming to detail all the guides required for something this complex, so I decided to build a template and copy it to all the dreads as a starting point.

This is done by building the curves around a cylinder, wrapping those curves to that cylinder, and blending them to the target tube volume shaped on the head. This gave me a lot more control over the look, sped up the creation of the hair significantly, and kept the density and CV counts much lower.
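
In script form, the copy step might look like the sketch below; note the blend to the tube is approximated here with a wire deformer along each dread's center curve, and all names are hypothetical:

    import maya.cmds as cmds

    template_grp = "dread_template_guides_grp"   # guides authored around a cylinder

    def place_dread(center_curve):
        """Duplicate the template guides and bend them along one dread's center curve."""
        dup = cmds.duplicate(template_grp, name=center_curve + "_guides")[0]
        guides = cmds.listRelatives(dup, allDescendents=True,
                                    type="nurbsCurve", fullPath=True)
        cmds.wire(guides, wire=center_curve, dropoffDistance=[(0, 50.0)])
        return dup

    for crv in cmds.ls("dreadCenter_*", type="transform"):
        place_dread(crv)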

I used Maya's XGen for grooming. I find XGen very easy to use, and it also gives me the flexibility to customize deformers using expressions, which strikes a balance between procedural control and hand-tuning. Once the hair was finished, getting it into UE was as easy as exporting Alembic caches.
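
As an example of the expression-driven control, legacy XGen attributes can be set from Python with SeExpr snippets; the collection, description, and modifier names below are hypothetical:

    import xgenm as xg

    palette = "grayskull_hair_collection"   # hypothetical XGen collection
    desc = "dreads_description"             # hypothetical description

    # Drive clump magnitude with spatial noise for procedural break-up
    xg.setAttr("magnitude", "noise($P * 4.0) * 0.3", palette, desc, "Clump1")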

Using Unreal Engine really opens up so much more in your workflow. It allowed us to push our work beyond the realm of static assets to fully realized animated characters that are fully shaded and lit and can be placed into an environment very easily.

The use of Quixel Bridge assets allowed me to build the scene extremely quickly, along with custom assets such as the cauldrons and lamps, which were created in ZBrush and textured in Substance 3D Painter. With Substance 3D Painter's USD export, I could send the fully shaded and textured asset to Unreal with one click, ready to go. I am really looking forward to diving further into Unreal and pushing these workflows!

Nick Gaul, 3D Artist
