Stef Bracke talked about the Azurean Depths project, sharing how the underwater environment was created with Unreal Engine's PCG and detailing the creation process behind caustics, lighting, and post-process stages.
Hey there! I'm Stef Bracke, a Technical Artist from Belgium. My journey into 3D art began at Howest, where I earned my degree in Digital Arts & Entertainment. I specialized in Game Graphics Production, with a focus on technical art. Except for some modding in Warcraft 3 when I was a teenager, I had no prior experience.
During my academic years, I found myself gravitating towards efficiency, always on the lookout for workflows that cut down on time, especially by automating repetitive tasks.
When Houdini made its way into our university curriculum, I was immediately hooked. It was clear that the software was extremely powerful, and I also found it intuitive to work with.
While I love working in Houdini, not every studio uses it, so I branched out. I decided to put more time and energy into learning different software packages, as well as focusing more on fundamental knowledge.
The launch of Unreal Engine's PCG was a golden opportunity to adapt Houdini-inspired workflows to a new setting. During my internship, I had a lot of time to learn and experiment with it. During that time, I also dove deeper into Blueprints and C++.
The Career of a Technical Artist
Technical Art is a bit of an umbrella term. There are many different branches, and the larger the team, the more specialized a Technical Artist becomes. Both a solid foundation in art and a technical background are crucial. At its core, it's about problem-solving. While I am just starting out myself, I am happy to pass along the knowledge and advice that was given to me.
A Technical Artist must be in perpetual learning mode, which makes the ability and willingness to learn arguably the most valuable skill for becoming a successful Technical Artist. On top of that, strong communication skills are essential. There's often a divide between what's asked for and what's truly needed, and bridging that gap falls on the Technical Artist.
While the internet brims with valuable, free resources, I can't single out a comprehensive playlist that covers everything you need to know. A recommended starting point is diving into the documentation while working. Knowing what is happening under the hood builds the kind of fundamental understanding that pays off later. Your specific interests also guide your learning path: for shaders, for example, Ben Cloward has good videos available for free. Furthermore, online GDC talks offer insights, showcasing how developers achieved a particular outcome or tackled a challenge.
My own experience as a Technical Artist has been focused on researching technologies within Unreal Engine to determine their feasibility for project implementation. This research led me to test the potential and limitations of Chaos Physics for environmental destruction, as well as Procedural Content Generation within Unreal Engine. Finally, I have documented these findings for future reference.
I first dipped my toes into Unreal Engine about two years ago. Initially, I used it for environment creation, spurred by a university module.
Looking back, it's funny how daunting it felt then, considering it's now my go-to software. I started out in 4.27, but it didn't take long before I transitioned to the 5.0 Preview, and I have been on version 5 ever since. I found it challenging to work with at first, but as I gained experience, it became easier, just like with any new software. A game engine is always a bit more complicated because it needs to perform many different tasks. Ultimately, my motivation to learn it was driven by the goal of completing a project. It's a cycle of stumbling, troubleshooting, resolving, and repeating.
The resources that helped me learn, other than university modules, were YouTube tutorials, ArtStation Learning, and Unreal Engine's documentation, as well as talking to other people and asking them questions.
After researching PCG, I was hoping to use this knowledge to create a portfolio piece. While discussing this with Tatiana Devos, we arrived at the idea of making a coral reef together. I would take care of the technical aspects, and she, as an Environment Artist, would take care of the assets and the visuals. One of the inspirations was making a natural biome that, as far as I know, hadn't been made procedurally before. Forests are very common, and I wanted something that would stand out a bit more. Another factor was that this type of environment lent itself to all sorts of possibilities and challenges: getting a convincing underwater look, shaders, and particles, to name a few. I did research into boids for fish school movement in Niagara. Not every idea ended up being implemented, as a project easily gets bloated, and there were only two of us working on it.
Unreal's PCG toolset
The Procedural Content Generation Framework (PCG) is a comprehensive toolset designed for creating your own procedural content and tools. It offers users the flexibility to craft tools and content of varied complexity, from utilities for assets, such as biome generation or buildings, to the creation of entire worlds. As PCG is still relatively new, learning resources are scarce. I'm sure this will change in the future as the plugin receives more updates and users grow more familiar with it.
I found Epic Games' Tool Programmer Adrien Logut's YouTube videos most helpful. Besides that, it was valuable to have experience with Houdini as the underlying logic is similar. It's a matter of figuring out the names of corresponding nodes and understanding the limitations.
For the workflow, I looked at the sample project by Epic Games, Electric Dreams, and how they set things up. I learned a lot from the project and custom nodes they had made. To walk you through how the network is set up: all the generation follows a spline-based workflow. There is a single spline driving the generation of both the rocks and the ocean floor. The rocks are spawned on the spline while the ground is spawned on the interior of the spline.
For the ocean floor, first, we access the data of the Spline from the actor in the world. Next, points are sampled on the interior of the spline. The bounds of the points are modified according to the size of the meshes that will be spawned; this needs a bit of fine-tuning. After that, some more point filtering is done to limit the density of the assemblies being spawned. One last step before spawning the meshes is to give each one a random rotation around the Z axis, between 0 and 360 degrees.
As for the rock structures: just like before, we are getting the data of the Spline, but this time, we sample points on the spline instead of on the interior.
Inside the same graph, we branch off for the Kelp, but we use the same spline and initial steps as for the spawning of the rocks.
Distributed at random locations are some Niagara Systems spawning bubbles. These are branched off from the ocean floor because we want to access the same data (the spline interior). A harsh density filter is applied to limit the number of systems being spawned.
After the main area is generated, there is the opportunity to hand-place some more assemblies or assets. There are also two additional PCG graphs that place structures. The most interesting one here is the large assembly that interacts with the rest of the environment, as it can spawn rock arches connected to the edge of the spline under the right conditions.
The assemblies, which are collections of instanced static meshes, were put together by hand by Tatiana. They differ in shape and size depending on where they are spawned. There is essentially a library of assemblies that get spawned at certain points in the network.
Creating the Underwater Effect
The underwater effect is created by a few different elements working together.
The Post Process Material
This is a special material (Material Domain: Post Process) that gets applied to the entire screen as a final step in rendering. A few different things are set up here in a specific order. The whole material is essentially layers of UV manipulation stacked on top of one another.
The first step is a lens distortion effect that pushes out the center pixels. It gets masked by a radial gradient so the edges of the screen aren't affected. On top of that, we layer a second effect, which can simply be added to the existing UV manipulation. This step is what causes the wavy water distortion. I pan two noise textures by multiplying Time by a 2D vector and plugging the result into their UVs. This gets masked by a spherical mask to prevent the UVs at the edges from being affected, as there is no pixel information beyond the borders of the viewport, which would cause stretching.
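The panning-and-masking step above might look something like this in a material Custom node. This is only a sketch: names like NoiseA, PanSpeedA, and DistortionStrength are illustrative inputs, not the project's actual parameters.

```hlsl
// Pan two noise textures in different directions over time.
float2 uvA = UV + Time * PanSpeedA;   // e.g. PanSpeedA = float2(0.03, 0.01)
float2 uvB = UV + Time * PanSpeedB;   // e.g. PanSpeedB = float2(-0.02, 0.015)

// Sample both noises and recenter them around zero so they push UVs both ways.
float2 offset = (Texture2DSample(NoiseA, NoiseASampler, uvA).rg - 0.5)
              + (Texture2DSample(NoiseB, NoiseBSampler, uvB).rg - 0.5);

// Spherical mask: full distortion at the screen center, fading to zero at the
// edges, where there is no pixel information beyond the border.
float2 fromCenter = UV - 0.5;
float edgeMask = 1.0 - saturate(length(fromCenter) * 2.0);

// The distorted UVs then feed the scene texture lookup.
return UV + offset * DistortionStrength * edgeMask;
```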
The final step is a radial blur effect that gets less intense closer to the center of the viewport. This is achieved by adding multiple layers of UV-adjusted Scene Textures together, each sampled with a slightly different offset. The more layers, the higher the render cost, but the better the quality. Since this process is basically doing the same thing repeatedly, I wanted to see if I could translate it into a for loop within HLSL, which would save a lot of space in my material network. It took a bit of trial and error, but eventually, I succeeded. I experimented with a few other effects, such as coloring based on Scene Depth, but I found the result too harsh, occluding too much of the environment.
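A loop of that kind could be sketched as follows in a post-process Custom node. NumSamples and BlurStrength are assumed inputs, not the article's actual parameter names; index 14 is the SceneTexture index for PostProcessInput0.

```hlsl
float3 color = 0;

// Direction from this pixel toward the screen center; its length is zero at
// the center, so the blur naturally weakens there, as described above.
float2 toCenter = float2(0.5, 0.5) - UV;

for (int i = 0; i < NumSamples; i++)
{
    // Step the sample UV slightly toward the center on each iteration.
    float2 sampleUV = UV + toCenter * BlurStrength * (float(i) / (float)NumSamples);
    color += SceneTextureLookup(sampleUV, 14, false).rgb;
}

// Average the accumulated samples; more samples cost more but look smoother.
return color / (float)NumSamples;
```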
The caustics use another Material Domain: Light Function. This type of material gets plugged into a Directional Light, and its visibility depends on the light's settings. With the Directional Light pointing straight down at a 90-degree angle, it creates the illusion of caustics being cast on the ocean floor. I made the caustics texture myself in Substance 3D Designer. Most of the effect is achieved just by panning the texture, but I added a slight offset to the red and green channels for some chromatic aberration.
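The channel-offset trick can be sketched like this; CausticsTex, PanSpeed, and AberrationOffset are hypothetical names standing in for the actual material inputs.

```hlsl
// Pan the caustics texture over time.
float2 uv = UV + Time * PanSpeed;

// Sample the red and green channels at slightly shifted UVs so the pattern
// gets a chromatic-aberration-like fringe; blue stays at the base UV.
float r = Texture2DSample(CausticsTex, CausticsTexSampler, uv + AberrationOffset).r;
float g = Texture2DSample(CausticsTex, CausticsTexSampler, uv - AberrationOffset).g;
float b = Texture2DSample(CausticsTex, CausticsTexSampler, uv).b;

// A light function's output modulates the light, so this tints and shapes
// the Directional Light into the caustic pattern.
return float3(r, g, b);
```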
I collected some components in a Blueprint to make them easily accessible and allow for drag and dropping into a new level for different lighting setups.
Exponential Height Fog
The fog serves to occlude the view in the distance. The Start Distance is kept low, while the Fog Inscattering Color is set to a fitting blue color. The fog is dense, and opacity is set to max.
A Bunch of Extras That All Add Up Together
- Slight temperature adjustment to make the overall view colder.
- Chromatic Aberration
- Dirt Mask Texture – This is a texture that you can overlay on your screen, like some splatters.
- Post Process Material
- Directional Light (Directional light with some cyan tint added. This also includes the caustics Light Function Material)
- Second Directional Light (Lower intensity, colored secondary light with adjusted Forward Shading Priority)
- Sky Light (Low-intensity skylight with a default Cubemap applied)
The Blueprint also contains numerous variables to adjust if necessary. I tried making it so that the visuals could be tweaked on the fly.
The Water Surface
The water surface is simply a large plane using a shader with multiple textures panning over each other in opposite directions to simulate moving water.
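The opposite-direction panning could be sketched like this; WaterTex and the speed vectors are illustrative, and a real setup would typically pan normal maps rather than plain color.

```hlsl
// Two copies of the same texture scrolling in opposite directions, so the
// motion never reads as a single sliding sheet.
float2 uv1 = UV + Time * float2( 0.02,  0.01);
float2 uv2 = UV + Time * float2(-0.015, -0.02);

float3 a = Texture2DSample(WaterTex, WaterTexSampler, uv1).rgb;
float3 b = Texture2DSample(WaterTex, WaterTexSampler, uv2).rgb;

// Blend the two layers evenly; the interference between them creates the
// shifting, non-repeating look of moving water.
return lerp(a, b, 0.5);
```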
The workflow for animating the fish and the foliage is different. The foliage is animated through World Position Offset, for which we used a built-in SimpleGrassWind node. The base of the foliage is masked so that part stays still.
The fish are rigged and animated in Blender. I then exported the animation to Houdini, where I baked it onto a texture. To use Vertex Animation Textures in Unreal, you need a free plugin from SideFX. After enabling the plugin, it's as simple as setting up the Material to work with VAT through the built-in material function. The way vertex animation textures work is that each row represents a vertex position, with positions stored in the RGB channels, and each column represents a frame. By reading the value at the corresponding frame/vertex position and looping over the frames, we achieve movement through World Position Offset, which is a much cheaper way of animating large numbers of meshes.
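Conceptually, the decode step looks something like the sketch below, following the layout described above (rows as vertices, columns as frames). VATex, NumFrames, PlayRate, and the bounds values are assumptions for illustration; in practice, SideFX's plugin provides a ready-made material function that handles all of this, including the exact texture layout, for you.

```hlsl
// Looping playback position in 0..1 across the texture's frame columns.
float frame = frac(Time * PlayRate);

// Per-vertex row coordinate, assumed baked into a spare UV channel by the
// export step (each vertex knows which row of the texture belongs to it).
float vertexRow = UV2.y;

// Sample the baked position for this vertex at this frame (no mips, so the
// packed data isn't filtered across rows).
float3 packed = Texture2DSampleLevel(VATex, VATexSampler,
                                     float2(frame, vertexRow), 0).rgb;

// Positions are stored normalized in RGB; remap them back into the mesh's
// animation bounds and output as World Position Offset.
float3 offset = lerp(BoundsMin, BoundsMax, packed);
return offset;
```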
It would be hard to quantify in hours, as I didn't track the time. It was worked on – on and off – over the summer.
Some of the challenges were bringing the whole project together and deciding at some point to cut out certain features. Working with GitHub as source control was also challenging at some stages.
As I had laid out a framework for PCG from the start, I had to go back and forth a bit once assets were coming in, as some adjustments had to be made.
Another personal challenge was staying on track with the main components needed to put the scene together. I kept wanting to work on technical tidbits that seemed interesting to have. Ultimately, most of these never made it into the scene, as they were left unfinished.
As a professional, I am also just starting out, but I can help advise students who are considering getting into Tech Art. Try to figure out what you enjoy doing the most and steer yourself in that direction. Each topic is a rabbit hole that goes very deep, and to distinguish yourself, it would be good to excel in one. The internet is full of free resources, so make use of it. Connect with other technical artists and talk to them, learn from each other.