Making the Goddess Diana in ZBrush, Maya & Substance 3D Painter

Claudia Luehl shared the workflow behind the Goddess project, explained how the horns were made in ZBrush, and talked about creating hair in XGen.

Introduction

My name is Claudia and I am an international student from Germany, focusing on 3D Characters for games and cinematics. After finishing high school and finally realizing that working on games is an actual career path, I moved to Berlin to study Game Design at the Media Design Hochschule. After getting my Bachelor of Science in 2018, I started working at HandyGames Studio in Giebelstadt and had the chance to work as a Character Artist on such games as Townsmen VR and Stunt Kite Party. Even though I loved my job, I still dreamed about working on epic titles like The Witcher 3, God of War, and the Mass Effect series, games I loved playing myself and which inspired me to become an artist in the first place. But I knew that I still had to learn a lot to get there, so I decided to quit my job and enroll in the 2-year program at the Gnomon School of Visual Effects in Hollywood. I graduated in September 2021 and got my Certificate in Digital Production with a focus on modeling and texturing.

The Goddess Project

This project was done for my Demo Reel class during my last term at Gnomon. My goal was to get out of my comfort zone of real-time characters and make a more complex one, with UVs spread over multiple UDIMs and a groom done in XGen and rendered in V-Ray. I hoped to demonstrate everything I had learned in the past 2 years through modeling, texturing, rendering, and compositing. This piece is based on the beautiful artwork "Diana" by Peter Polach. It was more of a painting than a concept, but I loved everything about it! I knew it would be tough since I'd never done such a complex piece before and would have to finish everything in just 10 weeks, but I really wanted to nail it!

Before modeling anything, I collected a ton of references of animal skulls, bones, human anatomy, and skin details. Good references are key! I also made a plan of attack and a list of which elements were unique, which might be reused, and which ornaments could be created with Displacement Shaders. This gave me a good overview of the project and prevented me from wasting time in the long run.

Modeling the Body

Since I was not planning to animate the character or show it from angles other than the one in the concept, I decided it was best to split the bust and arms into separate pieces. While I was planning this, the words of my teacher echoed through my head: "If you don’t see it in the render, delete it!" And so I did.

To save time, I reused the character model I made for my Character class a few terms ago. It already had very clean topology, so I could skip retopology and save a lot of time. It even had UVs, but they were made for a game character and packed into a single UV tile. Since I knew from the very start that I wanted to make a high-resolution character with detailed skin and fine Displacement Maps, I needed a high enough texture resolution. So I created new UVs and spread the layout over multiple UDIMs. I then started sculpting the face in ZBrush, adding primary and secondary details and breaking up the symmetry of the face.

The arms were a bit trickier. I had never done hands with such long nails before and was not sure how to make them, but I guessed that they had to be supported by good topology. So instead of trying to sculpt the nails in ZBrush, I brought the separated arm into Maya and turned on Smooth Display to check how the current low-resolution model would look after being subdivided. I then edited the topology and experimented with extruding the nails, switching back and forth between the smooth and normal display, checking where to add more cuts, and inserting edges so the topology would support the shapes I wanted. Once I was happy with the result, I brought the arm into ZBrush, left the nails as they were, and started adding secondary details. I decided to add all of the smaller details, like pores and fingerprints, with Texturing XYZ Displacement Maps that I would projection-paint onto the mesh in Mari.
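
As a side note, the smooth preview toggle can also be scripted; here is a minimal Maya Python sketch (it simply acts on whatever meshes are selected) that switches between the smoothed and unsmoothed display, the same as pressing 3 and 1 in the viewport:

    import maya.cmds as cmds

    # Toggle smooth mesh preview on the selected meshes.
    # displaySmoothMesh: 0 = off (hotkey 1), 1 = cage + smooth (hotkey 2), 2 = smooth (hotkey 3)
    for shape in cmds.ls(selection=True, dag=True, type='mesh'):
        current = cmds.getAttr(shape + '.displaySmoothMesh')
        cmds.setAttr(shape + '.displaySmoothMesh', 0 if current == 2 else 2)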

I then duplicated the arm three times and started posing the arms and the bust using the Transpose Master in ZBrush, but I kept one unposed arm as a backup for texturing since all the arms shared the same UVs. Unfortunately, I was unable to check whether my model aligned with the concept while in ZBrush, since its camera never matches the render camera I set up in Maya. So I had to jump back and forth between Maya and ZBrush, comparing my current model to the concept, which I projected over the model using an image plane. If it was off, I had to either rearrange the objects or go back to ZBrush, make changes to the model, and reimport it.
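
For reference, the concept overlay can be set up with a couple of Maya Python lines; this is only a sketch, and the camera name and file path are placeholders rather than my actual scene names:

    import maya.cmds as cmds

    # Attach the concept painting as an image plane to the render camera
    # so the posed model can be lined up against it through that exact camera.
    cmds.imagePlane(camera='renderCam', fileName='sourceimages/diana_concept.jpg')

    # Lowering the image plane's Alpha Gain in the Attribute Editor makes the
    # overlay semi-transparent, which makes the comparison easier.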

Texturing the Body

I textured the bust and the arms entirely in Mari since, in my opinion, it is the best program for skin texturing and for adding all of these very subtle details. It allows projection-painting even with very heavy texture files, and the Node Graph makes it easy to organize complex scenes.

For texturing, I used high-resolution face scans and displacement tiles from Texturing XYZ, as they offer incredible Displacement Maps with secondary, tertiary, and micro surface details, each stored in a different RGB channel so they can be adjusted separately during LookDev. I projection-painted the displacement tiles onto the model in Mari and imported the resulting Displacement Map into Maya. I then created a Displacement Shader that combined the Displacement Map from ZBrush, which carried the sculpted SubDivision levels, with the one from Mari to add the finer details.
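
A combined displacement network of this kind can be sketched in Maya Python roughly like this; the file paths, node names, and the shading group name are placeholders, and the actual render setup may differ:

    import maya.cmds as cmds

    # Two displacement sources: the ZBrush map (broad sculpted forms) and the
    # Mari projection map (fine pore-level detail). Paths are placeholders.
    zbrush_file = cmds.shadingNode('file', asTexture=True, name='dispZBrush')
    cmds.setAttr(zbrush_file + '.fileTextureName',
                 'sourceimages/bust_disp_zbrush.1001.exr', type='string')

    mari_file = cmds.shadingNode('file', asTexture=True, name='dispMari')
    cmds.setAttr(mari_file + '.fileTextureName',
                 'sourceimages/bust_disp_mari.1001.exr', type='string')

    # Add the two maps together before they reach the displacement shader.
    combine = cmds.shadingNode('plusMinusAverage', asUtility=True, name='dispCombine')
    cmds.connectAttr(zbrush_file + '.outAlpha', combine + '.input1D[0]')
    cmds.connectAttr(mari_file + '.outAlpha', combine + '.input1D[1]')

    # Feed the combined value into a displacement shader and plug it into the
    # body's shading group ('bodySG' is a placeholder name).
    disp = cmds.shadingNode('displacementShader', asShader=True, name='bodyDisp')
    cmds.connectAttr(combine + '.output1D', disp + '.displacement')
    cmds.connectAttr(disp + '.displacement', 'bodySG.displacementShader', force=True)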

For the Diffuse Map, I used a face scan of a young woman from Texturing XYZ, which I projection-painted onto the mesh. I used this as my base, but I had to color-adjust it since the character is very pale. After that, I added a ton of colors and masks to create little details and skin variations. In skin texturing, everything is about subtlety. I mostly used three alphas when painting the masks, as they feel much more organic than a soft round brush.

Modeling

With the body done, I started sculpting the rest of the scene one piece after another. Since I rendered my scene in V-Ray, I was not limited by engine performance and was able to use meshes with a higher polycount than for real-time characters. I did not want to overdo it, of course, but it gave me more freedom when creating the retopologies and UVs, especially since my scene was rendered from one camera angle only.

I created most of the non-organic elements in Maya by building surfaces from curves. For example, I made the golden borders by drawing curves and adjusting them in the scene, always making sure they aligned with the overlaid concept. I then connected these curves using the Loft Tool. For the bow and antlers, I used a very similar technique: I aligned the curves so they matched the concept, then used Curve to Tube Mesh from the Bonus Tools plug-in. It let me control the width, taper, and density of each tube and match the created mesh very closely to the concept while still being able to change its course by editing the curve points. Once I was happy with the basic shapes, I deleted the history and brought them into ZBrush. I then DynaMeshed the tubes into one shape, ran ZRemesher, and it was done. All of the finer surface details, like scratches and wood carving-like marks, were created in Substance 3D Painter.
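
The loft step itself only takes a few lines if you prefer scripting it; this is just a small Maya Python sketch with placeholder curves rather than the actual border curves:

    import maya.cmds as cmds

    # Two profile curves roughly tracing the inner and outer edge of a border.
    # The point lists are arbitrary placeholders.
    inner = cmds.curve(degree=3, point=[(0, 0, 0), (2, 0.5, 0), (4, 0.2, 0), (6, 0, 0)])
    outer = cmds.curve(degree=3, point=[(0, 1, 0), (2, 1.5, 0), (4, 1.2, 0), (6, 1, 0)])

    # Loft a polygon surface between the two curves; keeping construction
    # history (ch=True) means the surface updates when the curves are edited.
    border = cmds.loft(inner, outer, ch=True, uniform=True, polygon=1)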

An exception was the horns of the buck skull. These meshes are very dense because I needed enough topology to support the grooves in their silhouettes. I had some trouble with them, but then I found an awesome tutorial by Glenn Patterson, Quick Horns ZBrush 2019, that showed me how to create a horn by duplicating a simple base mesh while still being able to control the rotation around different axes. I then DynaMeshed the shapes together, ran ZRemesher, projected the details, added some more surface noise combinations in ZBrush, created quick UVs in Maya, and it was done.

I started the animal skulls from a PolySphere base and imported my reference images into ZBrush by adding them to SpotLight. I then sculpted the skull, blocking out the rough shapes first while constantly rotating around the model, checking it from every angle, and comparing it to my references. At this stage, I kept everything quite loose and DynaMeshed my mesh a lot, not caring about topology yet.

Once I was happy with the shapes, I used ZRemesher to create a low poly version, light enough to import into Maya but still dense enough to support all the primary and secondary shapes. The last thing I wanted was chunky outlines! Since the basic anatomy of the animal skulls was very similar, I was able to reuse some of the models, which was a huge time saver. I simply had to stretch certain parts and change or add some surface details. A really great brush pack I used a lot to add cracks and noise was the Orb Brush Pack by Michael Vicente. After finishing the sculpt, I saved the current high-resolution version as a backup.

I then prepared the mesh for unwrapping by filling it with white and painting the areas that would not be visible in the final render in a different color. Using the From Polypaint button under the Polygroups section, I was then able to split the skull into polygroups. Having these polygroups is essential for the next step: I went to ZPlugin > UV Master, activated the Polygroups button, and hit Unwrap. ZBrush then automatically created the UVs, with the seams generated along the polygroup borders. If I wanted more seams, I could control their exact location by simply painting and adding more polygroups. The UVs may not have been perfect or always symmetric, but since my time was very limited, this was the fastest and most efficient workflow with clean results that I came up with. I repeated this procedure for all the mammal skulls.

I then exported the lowest and highest SubDivision level of each model for later texturing in Substance 3D Painter. When doing so, always make sure your meshes are named correctly, otherwise you will have trouble when baking. The high poly model should always end with "_high" and the low poly model with "_low". This is important because the baker can use these suffixes to match the high and low poly meshes by name.
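
A tiny Maya Python helper can enforce that suffix convention before exporting; this is only a sketch and assumes the meshes you want to rename are currently selected:

    import maya.cmds as cmds

    def add_bake_suffix(suffix):
        """Append '_low' or '_high' to every selected transform that is missing it."""
        for node in cmds.ls(selection=True, type='transform'):
            if not node.endswith(suffix):
                cmds.rename(node, node + suffix)

    # Select the low poly meshes, then run:
    add_bake_suffix('_low')
    # Select the high poly meshes, then run:
    add_bake_suffix('_high')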

Texturing

In preparation for texturing my scene, I brought all the skulls into Maya. I assigned the same material to all of them and arranged their UVs by spreading them over multiple UDIMs, giving each skull its own tile. I then brought these skulls into Substance 3D Painter 6 for texturing.
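
Shifting each skull's UVs into its own UDIM tile can also be scripted; a rough sketch, assuming the skulls already share one material and using placeholder mesh names:

    import maya.cmds as cmds

    # Placeholder names for the skull meshes; each one gets its own UDIM tile
    # (1001, 1002, 1003, ...) by shifting its UVs one unit further along U.
    skulls = ['skull_deer', 'skull_boar', 'skull_fox', 'skull_bird']

    for index, skull in enumerate(skulls):
        # relative=True offsets the existing UV layout instead of replacing it.
        cmds.polyEditUV(skull + '.map[*]', relative=True, u=index, v=0)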

One thing that really helped speed up my workflow was that this updated version of Substance 3D Painter could finally work with multiple UDIMs. It even allows baking and exporting separate UDIM tiles, which is a huge deal and saved me a lot of time and headache.

Before this was possible, I would have packed all objects of the same material into one big 2K, 4K, or even 8K texture. It works, but it can become very messy and confusing. Everything has to fit in one layout, and what happens when you have to make changes or add one more asset? You have to rearrange everything! So this time, I split the assets across smaller UDIM tiles with a lower resolution. This saved me a lot of time rearranging my UVs and kept everything very organized. And since I was now able to bake single assets, I did not have to bake all of the high poly models together, which can take a lot of time when they are very dense.

But I should mention that this method does not work for game characters, since they do not use UDIMs.

My texturing process in Substance 3D Painter is quite simple. I mostly use a lot of fill layers that I mask out by loading a Grunge Map into the mask. I then often pack such a layer into a folder to which I assign another Grunge Map as a mask for an extra level of break-up. These fill layers contain color variation and often a very slight Roughness variation, which adds a very nice surface break-up. The intensity of this effect can be controlled by changing the layer's blending mode and opacity.

For the green fabric, I created a material shader in Maya. I first thought about texturing the fabric in Substance 3D Painter, but I wanted the freedom of adjusting the light green pattern in Maya for faster results and did not want to re-export my maps every time. To create a velvet-like effect for the fabric, I used the Ramp and Sampler Info nodes so the colors and the falloff would change depending on the viewing angle of the camera. To add some more irregular, velvety surface noise, I used a photo of velvet that I edited in Photoshop to make it seamless and used it as a mask for a slightly darker color than the base. For the light, vine leaf-like pattern, I created a seamless alpha in Adobe Photoshop and used it as a mask.
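
The core of that facing-ratio setup can be reproduced with a few Maya Python lines; this is a simplified sketch that uses a standard Blinn as a stand-in for the final render shader, with placeholder colors:

    import maya.cmds as cmds

    # The facing ratio drives a ramp: 0 = surface edge-on to the camera, 1 = facing it.
    info = cmds.shadingNode('samplerInfo', asUtility=True, name='velvetInfo')
    ramp = cmds.shadingNode('ramp', asTexture=True, name='velvetRamp')
    cmds.connectAttr(info + '.facingRatio', ramp + '.vCoord')

    # Lighter green where the surface turns away from the camera (the velvet
    # sheen), darker green where it faces the camera.
    cmds.setAttr(ramp + '.colorEntryList[0].position', 0.0)
    cmds.setAttr(ramp + '.colorEntryList[0].color', 0.25, 0.45, 0.2, type='double3')
    cmds.setAttr(ramp + '.colorEntryList[1].position', 1.0)
    cmds.setAttr(ramp + '.colorEntryList[1].color', 0.05, 0.15, 0.06, type='double3')

    # Feed the ramp into the material's color slot (a Blinn here as a stand-in).
    shader = cmds.shadingNode('blinn', asShader=True, name='velvetMtl')
    cmds.connectAttr(ramp + '.outColor', shader + '.color')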

For the golden textures and ornamental details, I used a lot of Displacement Maps with repetitive or seamless patterns. Apart from skin texturing, I had never really used Displacement Maps before and had always sculpted everything. The problem with sculpting is that it's often difficult to make bigger changes once the topology has been changed. Displacement Maps gave me much more flexibility, so I will use them in future projects!

For the golden borders, I decided to use a seamless pattern so I could control the density of each border by scaling its UVs along the X-axis. For this step, functional and straight UVs were a must. To speed things up, I used a Maya plug-in that I bought from malcolm341 to quickly gridify my UVs. It costs a few bucks, but in my opinion, it's worth it. I have been using it for quite a while now, and it has saved me so much time by instantly giving me clean UVs, perfect for repetitive patterns and essential for the technique I used.
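
Once the UVs are straight, tightening or loosening the pattern on a border is just a UV scale along one axis; a minimal sketch with a placeholder mesh name:

    import maya.cmds as cmds

    # Double the horizontal tiling of the seamless pattern on one border strip
    # ('goldBorder_01' is a placeholder) by scaling its UVs along U only.
    cmds.polyEditUV('goldBorder_01.map[*]', pivotU=0.5, pivotV=0.5,
                    scaleU=2.0, scaleV=1.0)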

I created the stitching pattern in ZBrush using the free brush pack from Victor Franco on ArtStation. I set up a plane and drew the pattern on it, then looked at it from the front view and created an alpha using the Grab Alpha from Document function. I brought this pattern into Photoshop and arranged it a bit so it would tile seamlessly when the UVs of the mesh reached over the UV tile border.

For the golden ornaments of the hunting horn, I used a similar technique.

I drew some ornaments in ZBrush by using a custom-made tube brush that thins out at the ends. Just like with the stitches, I created an alpha and brought it into Photoshop. I aligned and straightened out the UVs of the golden rings, created a UV snapshot, and brought it into Photoshop, marking the UV faces where I wanted the ornaments to be. I then only had to add and scale the created alpha and align it to the UVs. For the flower carvings at the base of the hunting horn, I created a simple pattern in Photoshop. Just like with the ornament, I aligned the flowers to the UV layout of the mesh.

Making the Hair

I created the groom in Maya using XGen. It was a real challenge since I am not a groomer and had only used it on two other projects before, but thanks to my instructor Tran Ma, her constant help and feedback, and a lot of YouTube tutorials, I finally got a nice result.

Before starting the groom, I made sure to set my Maya project so that my XGen collection would be created in the correct location. I then created some hair caps to use as the base for the groom, so I would still be able to make changes to the bust without the fear of accidentally destroying the groom.
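
Setting the project can be done from the Script Editor as well; a one-line sketch with a placeholder path:

    import maya.cmds as cmds
    import maya.mel as mel

    # Point Maya at the project folder (placeholder path) so the XGen collection
    # and its maps are created inside that project's xgen directory.
    mel.eval('setProject "D:/projects/goddess_diana"')
    print(cmds.workspace(query=True, rootDirectory=True))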

I decided to create three descriptions for the main hair: the left part, the right part, and long single strands. All of them would have the same kind of modifiers, like noise and clumps, but I still decided to split them into separate parts because it would make it easier to control the many guides and quickly hide the ones I did not need.

To create the guides for the curls, I started with a low-res curve that starts at the scalp and defines the length and direction of the final curl. Next, I used the Rebuild tool to rebuild the curve with a higher CV count so it would have enough resolution to support the curled shape. After that, I applied a Curl modifier to achieve a clean and evenly curled curve. Now I could convert it into a guide.
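
In script form, the rebuild step is a single command; here is a small sketch with a placeholder curve name and an arbitrary span count:

    import maya.cmds as cmds

    # Rebuild the blocking curve ('curlGuide_01' is a placeholder) with many more
    # spans so it has enough CVs to hold the shape of the Curl modifier.
    cmds.rebuildCurve('curlGuide_01', spans=40, degree=3,
                      keepRange=0, rebuildType=0, constructionHistory=False)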

To keep the hair strands separated from each other, I gave each curl its own region and turned the region mask up to 1. I also added a Clump modifier, set it to Guide, and turned on Use Control Maps to ensure it split the hair according to my painted regions. 

After the first Clump modifier, I added a different mix of noise modifiers to break up the strands and add flyaway strands. 

I should probably mention that I never liked working with XGen since it was not really made for Maya and often crashed, but I found a tutorial by Liam Drain on YouTube that really helped me and made working in XGen so much easier, especially since I was working with so many different maps to control the hair.

Before that, I would paint my hair masks directly in Maya or paint them in Mudbox, convert them to .ptex, and then load them into my hair collection in the XGen folder of my project. However, this process was buggy and complicated, and it often caused Maya to crash.

Instead, paint your maps in Mudbox, export them as .png files, and save them in a subfolder of your sourceimages directory. Go back to Maya and click the Create Map button in your XGen Description. It is important that the model has a Lambert material assigned to it, otherwise it does not work! After you click Create Map, Maya automatically starts the 3D Paint Tool so you can paint your map, but don't paint anything. Instead, open the Hypershade and go to Textures. You will notice that a new file node was created. You can now put your painted map into this new texture slot, click the little save icon in your XGen Description, and XGen will generate the hair according to your .png maps.
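
The step of pointing the newly created file node at the painted map can also be done by script once you know the node's name; a small sketch with placeholder names:

    import maya.cmds as cmds

    # Listing the file nodes in the scene helps find the one XGen just created.
    print(cmds.ls(type='file'))

    # Point that file node ('file12' and the .png path are placeholders) at the
    # map painted in Mudbox, then hit the save icon in the XGen Description.
    cmds.setAttr('file12.fileTextureName',
                 'sourceimages/xgen_masks/density_left.png', type='string')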

Of course, everyone has their own way of working, but I wanted to share this process since it stopped Maya from crashing while still giving me the freedom to edit my maps in Mudbox or Photoshop afterward.

Rendering

I tried out a few different light rigs and came up with two I really liked. The first was a single spotlight from the front with sharp shadows. It gave the image a very clear reading and displayed all the details of each piece very nicely. The second light rig was a collection of many small V-Ray Rect Lights, with which I tried to stay close to the concept and set a more dream-like atmosphere. I decided to go with a mix of both and made two renders of the scene, one with each lighting setup.

I saved each render as an .exr with all of my different channels. Some of the most important ones were the Depth and Cryptomatte channels, as I could use them for setting up my masks. I then brought the renders into Nuke and started compositing.

I mostly did some color tweaks, adding more red reflections on the gold parts to make them match the concept. I then overlaid the two renders and masked them against each other to add highlights in the areas I wanted. I also added the background with the tree, the fog, and some glowing particles.

The turntable was created much later. I thought it would be sad to have only one render after putting so many hours into this project. I created a simple camera rig by grouping the cameras and then rotating this group around the mesh. Since this project was never meant to be displayed from any angle other than the front, I had to make some smaller changes, like moving some objects to fill gaps, but, luckily, nothing problematic. My biggest problem was the sheer size of the render! Each frame could take up to 4 hours, and I needed at least 150 frames to get a nice result. I could never have done this on my own PC, but, thankfully, I was able to use the PCs on the Gnomon campus as my personal render farm. For over a week, I was running around the whole campus, setting up my renders and hoping I might be able to get a few frames done before someone else logged me out.
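
The camera rig for the turntable is tiny; here is a sketch of the grouping and the 360-degree rotation keys, using a placeholder camera name and the 150-frame range mentioned above:

    import maya.cmds as cmds

    # Group the render camera ('renderCam' is a placeholder) so the whole group
    # can be spun around the character; the new group's pivot sits at the origin.
    rig = cmds.group('renderCam', name='turntable_rig')

    # One full turn over 150 frames.
    cmds.setKeyframe(rig, attribute='rotateY', time=1, value=0)
    cmds.setKeyframe(rig, attribute='rotateY', time=150, value=360)

    # Linear tangents keep the rotation speed constant over the whole turn.
    cmds.keyTangent(rig, attribute='rotateY', inTangentType='linear',
                    outTangentType='linear')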

Conclusion

I tried to get everything done within one term, which is 10 weeks. I planned to spend 5 weeks on modeling and the next 5 on texturing, rendering, and compositing. I finished everything on time, but I still spent another week after the term was over polishing some last details and experimenting with different compositions in Nuke.

One of my main challenges was the hair, since it took me a lot of time and effort to find a technique that worked for me. Besides that, it was difficult to keep my scene and project folders organized. This project contained so many assets, and after exporting and importing them back and forth between Maya and ZBrush, I often had multiple versions with very similar names because, in the moment, I was too busy to bother with naming conventions and keeping my folders organized. So, not surprisingly, I lost track at some point. That is something I want to work on in the future.

I also started to panic at some point as I was scared I would not be able to finish my project on time. So I sat back and wrote down a list of all the things I still had to get done. Having the list helped me get a better grasp of my process and make sure that I did not forget anything. It also gave me some comfort as I could see all the things I had already crossed out.

So my advice would be to have a plan of what you want to finish and when, and to keep your projects organized, as it will help you while you're working and orient you when you come back to the project at a later time. And if you are ever stuck at some point and don't know how to fix it, even after multiple tries, take a step back and work on something else that needs to be done. Instead of wasting the rest of the day on the one thing you can't do, focus on something else you can do. Coming back the next day with a clear head and a fresh perspective often helped me fix the issue. And if you keep struggling, ask other artists for help and feedback. Once again, I want to thank my former instructors, Tran Ma and Miguel Ortega, who have been incredible mentors and helped me grow as an artist!

Claudia Luehl, Character Artist

Interview conducted by Arti Sergeev

Comments

  • Anonymous user

    Awesome write-up. As one of the original authors of XGen at Disney, I can assure you that it was indeed "made for Maya". Its "core" lives outside Maya so that it can work inside various renderers without needing Maya, but the only UI for XGen is Maya, and it was designed there from the start.

    As for the constant crashing... That wasn't an issue at Disney! ;)
