Tian Qiu takes us through the process of creating ocean water splashes in Houdini and shares important advice for those who are planning to work in VFX.
I am Tian Qiu, known as Ocean, a Senior Visual Effects Artist and Houdini FX TD based in the Greater Los Angeles area. Years ago, when I was majoring in Animation and Visual Effects at the Academy of Art University in San Francisco, I quickly realized the power of Houdini and the importance of procedural techniques, so I chose them as my main study focus and career direction. In the years since my graduation, I have worked on many award-winning projects across television, film, autonomous driving, and video games, applying a procedural technique and mindset.
While at Encore VFX in Los Angeles, I was the Effects Technical Director for several DC hero shows such as The Flash (seasons 5, 6, and 7), Supergirl (seasons 4 and 5), Doom Patrol (seasons 1 and 2), Batwoman (season 1), Black Lightning (seasons 1 and 2), and Netflix's Virgin River, among others, as well as a Visual Effects Artist on Arrow (season 7). Those shows have a great DC fan base; each season drew at least a million streaming viewers and won many awards, including the Saturn Award for Best Superhero Television Series and the Leo Award for Best Visual Effects in a Drama.
Part of my work was translating DC comics concepts and references, whether well-crafted, rough, or somewhere in between, into realistic visual effects that matched the look of each show. Creating effects for the characters, doing explosions, and blowing stuff up was also a lot of fun. Over those years in production, I learned a great deal about crafting a good visual composition for VFX shots. In two years, I completed 400+ shots, which is TV speed; I can hardly imagine now that I managed it back then. Meanwhile, my leads and supervisors were all very nice people. There were no politics in my department, and everyone was willing to teach me and show me around. I really loved that culture in the working environment.
During the pandemic, I started to work remotely. I was contacted by Kevin McNamara, the CEO of Parallel Domain, a start-up creating training environments for autonomous driving. I helped him create procedural residential areas of a city so that a vehicle's AI could be trained in a 3D world. That mindset is really awesome: who would have thought of adding a procedural CG city to autonomous driving training? At the same time, it is perfectly reasonable. It is always less risky for autonomous vehicles to gather data in a virtual world than in the real world, where they could get into all kinds of accidents.
From that experience, I moved on to real-time visual effects for AAA games at Deviation Games, using Houdini and Unreal and building on my procedural skills. If you ask me whether a Houdini FX TD/Artist can work in the game industry, my answer is yes, often with great success, as in my case, though you might need up to six months of retraining.
Getting Hired in the VFX Industry
While I was still in school, I realized how difficult it was to get into the VFX industry. Gone were the days when VFX studios could not find enough VFX people and were eager to hire students and recent graduates. For a student, the portfolio has to stand out, be appealing, and even be as professional as a senior artist's. Supervisors and leads pick you out of an entire ocean of candidates, and unfortunately, graduates are also competing with seniors who have more than ten years of VFX experience. You want to be one of the best people in the industry, or at least have that ambition.
I am not the type of person who is good at networking, even though it is an important skill: referrals can get you an opportunity. So I aimed to learn advanced techniques from senior Houdini artists on Odforce, Rebelway, and other communities. I decided to really work on building up my portfolio and finding an internship. I ended up interning as an FX Artist at Digital Domain China, where I worked for three months on a TV show. And yes, gain experience wherever you can; whether in a foreign country or not, it is still a great start. After I graduated, I sent my showreel and resume to many companies, including Encore VFX. They interviewed me and brought me onto the team really quickly. That is how I officially got started.
Speaking of my favorite show, I would say it is Doom Patrol; I was basically hired at Encore for this show. Not only is it well-scripted, but it has also given me many chances to prove myself. They gave me all kinds of simulations and shots to work on, not just sparks and smoke, but complex destruction, character effects, and all sorts of procedural tools and assets. I also had some freedom in how to approach and present my effects. In my reel, you can see quite a few shots from this show. It means a lot to me in multiple ways.
Creating the Beach in Houdini
The beach simulation started as R&D for a water shot on the show Supergirl. They wanted me to create ocean waves hitting cliffs, so I started doing tests. Meanwhile, I was rebuilding my water simulation pipeline and workflow, since recent Houdini versions had shipped major water updates. The team did not have a hi-res water example, nor had it done R&D like this, so my task became making it look realistic and establishing the new workflow. The only visual reference I had was a top-down shot from the previous season of Supergirl. I took Saber Jlassi and Igor Zanic's rock splash R&D as my technical reference; my goal became creating simulations as nice as the ones they showed.
With any water simulation, the world scale comes first. A beach simulation is considered a large-scale simulation in Houdini. Before setting up the simulation, I like to define how big the simulation area is and how deep the water is, and put a T-posed character in the scene for reference. Water simulation behaves very differently depending on the scale. I also created a procedural ruler here. Even for the simple scene shown below, I spent quite some time establishing the scale.
After creating a wave tank setup from the shelf tool (which is convenient, since it sets up the wave workflow and the narrow band for us), I started to define the wave guide geometry using an Ocean Spectrum node, an Ocean Evaluate node, and a high-resolution grid. This geometry is used for deforming the wave tank source and for computing velocity, which is injected into the FLIP simulation as a VDB volume source. The geometry sits right at the water level (it is not used for collision, though).
There can be a lot of back and forth here to get the wave geometry right, judged by how it actually behaves in the FLIP simulation; the source of the FLIP fluid is vitally important. At the current scale, here is the Ocean Spectrum setup that works nicely:
For the wave tank node, the Ocean Spectrum is the first input, so the spectrum guides the wave tank; the collision geometry is the second input, used to delete extra points inside the collider. I use the Ocean Spectrum node to deform the grid, a Trail node to compute velocity, and a VDB from Polygons node to convert the result into a velocity volume.
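The velocity step is conceptually simple: the Trail node's "Compute Velocity" mode takes the finite difference of each point's position between frames. A minimal pure-Python sketch of that idea (the function name and data layout are illustrative, not Houdini API):

```python
FPS = 24.0  # assumed playback rate for the shot

def compute_point_velocities(prev_positions, curr_positions, fps=FPS):
    """Per-point velocity v = (P_curr - P_prev) / dt with dt = 1/fps,
    mirroring what the Trail SOP's 'Compute Velocity' mode produces
    on the deforming wave geometry."""
    dt = 1.0 / fps
    velocities = []
    for (px, py, pz), (cx, cy, cz) in zip(prev_positions, curr_positions):
        velocities.append(((cx - px) / dt, (cy - py) / dt, (cz - pz) / dt))
    return velocities
```

This per-point `v` attribute is what gets rasterized into the VDB velocity volume that feeds the FLIP source.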
Here is what this wave tank source should look like:
In the FLIP Solver, after a lot of wedging with the whitewater settings, I feel only a few settings are needed to make the beach FLIP simulation beautiful: Velocity Smoothing (0.01, default 0.1), Particle Separation (0.09), Add Vorticity (on), and Surface Extrapolation (0.3). Velocity Smoothing literally smooths out velocity variations in the waves; keeping this value low preserves a detailed look. If you are creating rock splashes, enabling Detect Droplets will give you a droplet attribute, so you can later filter particles by a droplet threshold and create a better mesh. Since we are talking about a beach simulation here, we do not need it on. More settings are shown in the GIF below.
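To see why a low Velocity Smoothing value keeps more detail, here is a toy sketch of the idea: each particle's velocity is blended toward its neighbors' average, and the smoothing value is the blend weight. This is only a conceptual model, not the solver's actual math; names and the neighbor lookup are illustrative:

```python
def smooth_velocity(velocities, neighbors, smoothing=0.01):
    """Blend each particle's velocity toward the average of its
    neighbors' velocities. smoothing=0 keeps full detail;
    smoothing=1 fully averages (flat, washed-out motion)."""
    out = []
    for i, v in enumerate(velocities):
        ns = neighbors[i]  # indices of nearby particles
        if not ns:
            out.append(v)
            continue
        avg = tuple(sum(velocities[j][k] for j in ns) / len(ns) for k in range(3))
        out.append(tuple(v[k] + smoothing * (avg[k] - v[k]) for k in range(3)))
    return out
```

With the default 0.1, ten times more of the neighborhood average bleeds into each particle than with 0.01, which is why the lower value reads as crisper waves.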
In addition, I used a POP Advect by Volumes node with Force Scale 1.5 and "Treat as Wind" off, in order to bring in the wave velocity volume mentioned above. After adding the VDB collisions, my FLIP simulation looks like this:
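Conceptually, with "Treat as Wind" off, the sampled velocity field acts like a force: it is scaled and accumulated into each particle's velocity rather than replacing it. A rough sketch of that idea under those assumptions (this is not the solver's exact math, and the function name is made up):

```python
def advect_by_volume(vel, sampled_vel, force_scale=1.5, dt=1.0/24.0):
    """Accumulate the velocity sampled from the wave VDB into the
    particle velocity as a scaled force over one timestep. With
    'Treat as Wind' on, the field would instead pull the velocity
    toward itself rather than add to it."""
    return tuple(v + force_scale * s * dt for v, s in zip(vel, sampled_vel))
```

The Force Scale of 1.5 simply makes the wave field push the FLIP particles a bit harder than a scale of 1 would.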
Creating Foam and Splashes
For foam and splashes, there are more aspects to take care of than in the core FLIP simulation. First comes the source: at what depth, curvature, acceleration, and vorticity of the fluid do I want the core FLIP fluid to emit foam? These can be controlled and visualized with the Whitewater Source node.
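The emission logic boils down to thresholding those fluid attributes: a particle emits whitewater when it is near the surface and its curvature, acceleration, or vorticity is high enough. A minimal sketch of that decision, with purely illustrative threshold values:

```python
def whitewater_emission_mask(particles, min_curvature=0.5, min_accel=5.0,
                             min_vorticity=2.0, max_depth=1.0):
    """Return True for FLIP particles that should emit whitewater:
    the particle must sit within max_depth of the surface, and at
    least one of curvature, acceleration, or vorticity must exceed
    its threshold. All thresholds are assumption values to wedge."""
    mask = []
    for p in particles:
        shallow = p["depth"] <= max_depth
        active = (p["curvature"] >= min_curvature
                  or p["accel"] >= min_accel
                  or p["vorticity"] >= min_vorticity)
        mask.append(shallow and active)
    return mask
```

Visualizing this mask on the fluid (as the Whitewater Source node lets you do) is how you dial in where foam will be born before running the expensive whitewater sim.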
Now we move on to the whitewater simulation. What matters most is how we break up the uniform look of the foam, and there are several parameters to tweak. In the Whitewater Solver, I consider "Repellants" (under the "Foam" tab) the most important parameter for breaking up the uniform patterns you get by default. The foam clumping/neighbor settings can be adjusted accordingly, as they control how many neighboring particles stay together. Adjusting the age of the bubbles, foam, and spray under "Aging Rates" is also vital: you do not want them to disappear too quickly or too slowly. And you do not want your point count to explode at some point, so keep an eye on the number of foam particles. As for the "Forces", they are all driven by depth, defining how much buoyancy and advection you want at a certain depth. The exact solver settings are in the R&D video below:
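The aging mechanic is worth internalizing, because it is your main lever against a runaway particle count. In essence, each whitewater particle accrues age every step, scaled by the aging rate, and dies when it reaches its lifespan. A toy sketch of that loop (data layout and names are illustrative):

```python
def age_and_cull(particles, aging_rate=1.0, dt=1.0/24.0):
    """Advance each particle's age by aging_rate * dt and drop any
    whose age reaches its lifespan. A higher aging rate makes foam
    vanish sooner, which directly caps the total particle count."""
    survivors = []
    for p in particles:
        age = p["age"] + aging_rate * dt
        if age < p["life"]:
            survivors.append({**p, "age": age})
    return survivors
```

If your foam count keeps climbing frame over frame, emission is outpacing this culling, so either raise the aging rate or emit less.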
The current FLIP and Whitewater Solvers have improved a lot and are much faster than they were several years ago. I remember that four years ago it would take me three days to generate this much FLIP and whitewater; now I do it within only a few hours. Be sure to take advantage of the new techniques!
Briefly, for the final rendering stage, I prepared my water meshes, volume splashes, and rock environment under HDR lighting. I have a sky dome HDR light and two directional lights: the key light is a warm orange, and the fill light has a cool bluish feel.
To mesh the hi-res FLIP fluid without flickering, I use the surface volume from inside the FLIP solver as the VDB base. The FLIP solver itself used this volume as part of the simulation, so it rarely flickers. On top of that, I add an extra VDB built from the original FLIP points using VDB from Particle Fluid. I then smooth the VDB according to the length of the velocity and the curl of the velocity from the surface volume (which can be computed with VDB Analysis). After converting to a mesh, I transfer the curvature and depth data so they can be rendered as extra passes for shader tweaking or compositing.
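One way to drive that velocity-dependent smoothing, sketched here as a per-voxel weight: smooth calm, slow regions more (where flicker is most visible) and back off where the flow is fast or swirly so detail survives. The mapping and gains below are my own illustrative choices, not the article's exact setup:

```python
import math

def smoothing_amount(vel, curl, base=1.0, vel_gain=0.5, curl_gain=0.5):
    """Per-voxel smoothing strength for the fluid surface VDB,
    derived from the length of velocity and the length of its curl
    (both computable with VDB Analysis). More motion -> less
    smoothing, clamped to [0, base]."""
    speed = math.sqrt(sum(c * c for c in vel))
    swirl = math.sqrt(sum(c * c for c in curl))
    amount = base / (1.0 + vel_gain * speed + curl_gain * swirl)
    return max(0.0, min(base, amount))
```

Whether fast regions get more or less smoothing is an artistic call; the point is that the smoothing field is driven by the simulation itself rather than a single global value.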
When rendering the mesh, I add full reflection, refraction with an index of 1.3, and also a bit of diffuse color, which I found improved the look in this test. Even straight out of the renderer it looks beautiful. I then introduced a lot of passes: velocity, vorticity, position, shallow depth, and curvature, along with lighting passes such as direct and indirect lighting, so I can tweak them later in compositing.
The foam shader uses a simple volume model with cloud density, scattering, and shadow density multiplier attributes.
The trick for splashes is that you can either convert them to volumes in SOPs or render them as volumes via the shader. Here I used the volume shader, which already gave a decent look together with the water mesh.
The final look after compositing everything in Nuke:
Fluid simulation is one of the most expensive visual effects in terms of time, hardware, and the talent that specializes in it. There is no way around the difficult parts: getting familiar with the solver's behavior and learning to render it well is already a challenge. Many TDs will wedge literally every parameter in a solver to see what it actually does from an artist's point of view. We all know that behind each parameter there is some math or algorithm doing its thing, but as artists, we observe the behaviors, understand them in artistic terms, and apply them to our simulations accordingly. Watching other TDs' R&D videos can also help us understand those solver parameters and behaviors.
Simulation effects are mostly an art of velocity. As long as you control the velocity and know how to use noise to layer or break up uniform patterns, you can manage the behavior. For rendering splashes, we simply convert them into volumes, either by rasterizing or via shaders, and they will look realistic. Lastly, output enough passes; that will save both you and the compositors a lot of time when refining the shots in post.
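The "layer noise onto velocity" idea can be sketched in a few lines: sum several octaves of a noise function at increasing frequency and decreasing amplitude, then use it to perturb the velocity. The sine-based noise below is a cheap stand-in for Houdini's curl or anti-aliased noise, purely for illustration:

```python
import math

def layered_noise(p, octaves=3, lacunarity=2.0, gain=0.5):
    """Fractal-style layering: each octave doubles the frequency
    (lacunarity) and halves the amplitude (gain). The sine/cosine
    basis is a toy substitute for a real noise function."""
    amp, freq, total = 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amp * math.sin(freq * p[0]) * math.cos(freq * p[2])
        amp *= gain
        freq *= lacunarity
    return total

def break_up_velocity(pos, vel, strength=0.3):
    """Perturb a velocity with position-driven layered noise so the
    motion loses its uniform, repetitive look."""
    n = layered_noise(pos)
    return tuple(v * (1.0 + strength * n) for v in vel)
```

Swapping in different octave counts, gains, and strengths is exactly the kind of wedging that teaches you what each layer contributes to the final motion.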
From what I have heard, big studios like ILM and Weta Digital want to see some expensive water shots in an FX reel, and that makes sense because such shots test one's understanding of fluid behavior. My advice? Keep experimenting and keep learning; nothing comes out perfect on the first try. Not just beach simulations, but also splash or river tests, such practice helps people understand FLIP behavior and how to make it look better.
There is a lot of back and forth in this industry to make things perfect. The time we spend on the first 90% of the work can be less than the time we spend on the last 10%. So keep iterating on your techniques, and one day you will find, or come up with, the one that feels comfortable for your own workflow.