Reanimating Ghosts in VR with Character Creator, iClone & Unreal

VR Developer and Lighting Artist Raqi Syed discussed the workflow behind Raise Ravens, an animated docu-memoir where she digitally resurrected her deceased father using Reallusion's Character Creator and iClone.

Introduction

I began studying animation really early on in my education. I was lucky enough to go to public schools in California that taught animation as part of the art curriculum. So I made my first stop-motion animated film in junior high and then made several more throughout high school. I then went to film school at the University of Southern California, where I first studied Film Theory and then did an MFA in Animation and Digital Arts.

After finishing school, I worked at Walt Disney Animation Studios and then later at Weta Digital. I’ve worked on traditional animated features like Meet the Robinsons, Bolt, and Tangled, as well as visual effects films like Avatar and The Hobbit, which I essentially consider animated films too.

I started using the Reallusion tools in 2018 while creating characters for the virtual reality experience MINIMUM MASS. Over the next few years, I began to use Reallusion’s Character Creator in my courses to teach lighting, rendering, and design ideation. Teaching lighting used to be really challenging because you need to work with high-quality characters and assets to really understand the discipline.

The main benefit for me and my students is being able to procedurally create characters that are production-ready. Character Creator’s Headshot plug-in is especially powerful because it enables the design of bespoke characters while leveraging the streamlined procedural tools that already work so well in Character Creator.

The Raise Ravens Project

RAISE RAVENS is an animated docu-memoir told in virtual reality. The participant enters virtual reality and summons the ghost of my dead father. Through encounters with artifacts and audio recordings of my family, we learn about my father’s memories and secrets. The project is about reckoning with our past in order to end intergenerational hauntings and lay our family ghosts to rest. My goal for this project is primarily to tell a truthful and interesting story. Secondarily, I want to create believable characters that propel the story forward and to whom audiences can relate.

Recreating the Face

Recreating my face in Character Creator was very simple. I began with a photograph of my face. It’s important to take a photo that is well lit with soft, diffuse lighting to eliminate dark shadows. The camera angle should be frontal and the lens should not be too long or too wide – I used an 85mm Zeiss prime lens on a RED Dragon camera to take a 4K photo. The camera itself and the brand of the lens are not what matters.

The important thing is to begin with a high-resolution photo that is not distorted so Character Creator can generate the best quality mesh. I also suggest taking reference photos from the side and 3/4 angles. Headshot is a single-click process that will only use the front view. However, the additional angles are a great reference for making adjustments to the face using Character Creator’s extensive sliders. You will want to use both tools – the auto mesh generation using Headshot and the head morphing sliders – to ensure an accurate result.

The hair was also straightforward. I used the Smart Hair system built into Character Creator. At the time I created the character I had a bob, so I was able to use separate components like a fringe, cropped ear-length hair on the sides, a back section, and a top hair accessory to get the look right. My hair was also dyed a blue-black color, which I matched using the Smart Hair Shader.

The Skin Shader

What I like about the Headshot tool is that the original photograph is used to generate the Albedo/Diffuse Map for the skin shader. If my original photo is properly lit, then I can maintain the character’s natural skin color when it is converted to a texture. At the same time, I can also go in and, using SkinGen, dynamically adjust and refine the character’s skin.

With dark skin, for example, I want to bump up the specular contribution and bring down the subsurface. During this process, I can go into Character Creator’s Content Manager and bring in a few different lighting setups. I want to make skin adjustments that work in “neutral” and specialized lighting conditions. This is really easy to do using the existing tools.
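
For intuition about why those two controls move in opposite directions, here is a deliberately simplified sketch – a toy model with made-up weights, not Reallusion’s SkinGen shader – of how specular and subsurface contributions split the light that reaches the skin:

```python
# Toy illustration only: how specular and subsurface weights trade off in a
# simplified, energy-aware skin response. This is NOT the SkinGen shader; it
# just shows why raising specular while lowering subsurface keeps the overall
# response balanced when tuning darker skin tones.

def toy_skin_response(albedo, spec_weight=0.04, sss_weight=0.5):
    """Return (diffuse, subsurface, specular) contributions for one unit of light."""
    # Specular reflects off the surface before light enters the skin,
    # so it is taken off the top of the incoming energy.
    specular = spec_weight
    entering = 1.0 - spec_weight
    # The remaining energy is split between a direct diffuse lobe and
    # subsurface scattering, both tinted by the albedo.
    subsurface = entering * sss_weight * albedo
    diffuse = entering * (1.0 - sss_weight) * albedo
    return diffuse, subsurface, specular

# A "neutral" starting point vs. the adjustment described above (illustrative values):
print(toy_skin_response(albedo=0.25, spec_weight=0.04, sss_weight=0.5))
print(toy_skin_response(albedo=0.25, spec_weight=0.08, sss_weight=0.3))
```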

Utilizing iClone for Animations

When building, testing, and lighting a character, it’s super useful to have animation cycles like an idle, a walk, or a relaxed pose. Most character pipelines keep a character in a T- or A-pose, which makes it difficult to judge what a character will actually look like in a narrative context. I was looking in particular for naturalistic motion.

Most online mocap libraries contain action-oriented poses that are dynamic but not useful for dramatic storytelling. For RAISE RAVENS I will eventually cast an actor and use motion capture to generate performance. But because I am building a prototype for an emotionally driven story, I want all my tests to convey a sense of understated performance during development.

iClone 8 comes with a variety of animation cycles that are useful in the project’s development. The same is true for the iClone Live Link plug-in for Unreal. It’s great to be able to use an iPhone to quickly capture and prototype facial performance on a character without committing to the overhead of a traditional facial capture pipeline.

Combining Reallusion Tools & Houdini

The main reason we use Character Creator in my classes is that once we create our characters, they become stand-alone assets. We can write the character out as an FBX and import that into any DCC. In my Lighting and Rendering class, I begin by teaching traditional offline rendering technologies, specifically in Houdini’s lighting context, Solaris. We need to be able to work with a high-quality asset to leverage renderers like Arnold, Karma, or RenderMan and learn about physically based rendering. We also bring our characters and their iClone motion into Houdini’s KineFX animation tools to redesign and adjust that motion procedurally.
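
The FBX hand-off itself can also be scripted. Below is a minimal hython sketch, assuming a Character Creator export at a placeholder path; the LOP node type and parameter names are assumptions that can vary between Houdini versions.

```python
# Minimal hython sketch (run in Houdini's Python shell): bring a Character Creator
# FBX export onto the Solaris stage. The file path is a placeholder, and node type
# and parameter names may differ between Houdini versions.
import hou

FBX_PATH = "/path/to/character_export.fbx"  # hypothetical Character Creator export

# Import the FBX as an Object-level subnet containing geometry, skeleton, and materials.
fbx_subnet, messages = hou.hipFile.importFBX(FBX_PATH)
print("FBX import messages:", messages)

# Pull the imported character into /stage with a SOP Import LOP so it can be
# lit and rendered in Solaris (with Karma or another Hydra delegate).
stage = hou.node("/stage")
sop_import = stage.createNode("sopimport", "cc_character")
sop_import.parm("soppath").set(fbx_subnet.path())

# A dome light gives a quick "neutral" pass for checking skin and materials.
dome = stage.createNode("domelight::2.0", "neutral_dome")  # type name may vary by version
dome.setInput(0, sop_import)
dome.setDisplayFlag(True)
```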

For RAISE RAVENS I have been using Solaris to work out style frames. Solaris is great for iterating on lighting and creating multiple setups. Once I have my character with motion, I can dig into look development and further revise the story. My development process is to loop through these phases multiple times. A simplified motion pipeline helps with this a lot.
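
To make the idea of multiple setups concrete, here is a small USD Python sketch – assuming a hypothetical character.usd export and made-up light values, not the actual RAISE RAVENS scene – that keeps two light rigs as variants on one stage. Switching the variant selection is roughly analogous to flipping between lighting setups while iterating in Solaris.

```python
# Illustrative USD Python sketch: two swappable light rigs stored as variants on
# one stage for style-frame iteration. Paths and values are placeholders.
from pxr import Usd, UsdGeom, UsdLux, Gf

stage = Usd.Stage.CreateNew("lighting_tests.usda")
world = UsdGeom.Xform.Define(stage, "/World")
stage.SetDefaultPrim(world.GetPrim())

# Reference the character asset exported earlier (hypothetical file name).
character = stage.DefinePrim("/World/Character")
character.GetReferences().AddReference("character.usd")

# A variant set lets one file carry several lighting setups.
rigs = world.GetPrim().GetVariantSets().AddVariantSet("lightRig")
for name in ("neutral", "moody"):
    rigs.AddVariant(name)

rigs.SetVariantSelection("neutral")
with rigs.GetVariantEditContext():
    dome = UsdLux.DomeLight.Define(stage, "/World/Lights/Dome")
    dome.CreateIntensityAttr(1.0)  # soft, even fill for checking skin

rigs.SetVariantSelection("moody")
with rigs.GetVariantEditContext():
    key = UsdLux.DistantLight.Define(stage, "/World/Lights/Key")
    key.CreateIntensityAttr(5.0)                  # hard directional key
    key.CreateColorAttr(Gf.Vec3f(0.9, 0.7, 0.6))  # warm, low-sun color

rigs.SetVariantSelection("neutral")
stage.GetRootLayer().Save()
```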

Conclusion

Setting up characters in a more traditional way would be an order of magnitude more expensive and complex. Character design and creation touch on nearly every skill and work area within tech art. It’s an enormous undertaking. In professional studios there are artists who specialize in one aspect of this process, like hair grooming, modeling, rigging, texturing, costume, or animation. We live in an amazing time when all of these specialties have been condensed into a single package that allows artists to move quickly through each step and focus on storytelling.

Raqi Syed, VR Developer and Lighting Artist

Interview conducted by Arti Burton
