Character Behavior Programming & Game Optimization in Unreal Engine 5

AIKODE's developer ACE has returned to 80 Level to tell us more about how the NPC behavior was programmed, explain how the recently shown scanning effect was set up, and share some insights on the game's optimization.

Intro

Hey there, it's ACE, the Solo Developer behind AIKODE.

Been keeping pretty busy since the previous article and demos dropped. But you know what they say – a picture's worth a thousand words. So, here's a sneak peek trailer I recently showcased. It's like a highlight reel of everything I've been working on since our last chat:

Getting Started With Unreal Engine

I first stumbled upon Unreal about two years ago, around the time I started developing AIKODE. Before settling on Unreal, I dabbled with a couple of other game engines. But it was when I really started putting Unreal Engine to the test – not just for animations but also digging into things like physics, lighting, and post-processing – that I realized it was tailor-made for what I had in mind for AIKODE.

And to be honest, it felt like Unreal was popping up everywhere – from casual chats at my university to YouTube videos and even discussions with my friends. At the outset, I was a bit hesitant about making the switch. Learning a whole new engine from scratch seemed like a daunting task. Looking back, though, going with Unreal was a smart move that seriously leveled up my game development skills.

Mastering the Software

When it comes to learning Unreal Engine, I've mostly been a hands-on learner. Trying things out, failing, and learning from my mistakes. There are a bunch of beginner-friendly videos out there, and the community on the forums is pretty lively. If you're dealing with basic stuff, finding help isn't too hard. But when you're tackling complex, head-scratching challenges, you've got to put on your thinking cap and mix your own creativity with the nuggets of wisdom you find in those forums. Sometimes, you need to combine different solutions and information to get the desired result.

I've also been lucky to have the awesome support of Zahid Ali Jeelani, the developer behind ENENRA, and Astrum Sensei, the developer of Workplace of Madness. These folks are not just friends but real gems of knowledge, and I've soaked up a ton from our interactions. Plus, I've also been able to return the favor by helping them out with stuff like optimization, art, and animation tips.

Having the right crew around can totally be a game-changer on your gamedev journey. So, don't hesitate to ask for help, even if the person has a lot of followers or looks like they've got a lot of things going on. You might just be surprised by how friendly and willing to help they can be!

Programming Character Behavior

Shaping character behavior in AIKODE is a complex process that uses a lot of tools.

The main characters in the game use root motion, a technique that lets the animations drive the movement of the character's root bone instead of relying on Unreal's physics engine. This gives the animators more control and precision over the character's locomotion and interactions with the environment. The gameplay animations are created with keyframe animation in Blender, and everything is then combined in the AnimGraph, a system that allows you to create complex animation logic for your characters using nodes and graphs. It's really easy to keep everything organized in Unreal Engine 5 since you can use State Aliases, a feature that lets you group multiple states into a single alias and then use that alias as a condition for transitions or blends. This way, you can simplify your AnimGraph and avoid repeating the same logic for different states.

I also use Masks, additive animations, and cached poses to add more detail on top of everything. Masks are nodes that let you isolate certain parts of the skeleton and apply different animations or effects to them. For example, you can use a mask to make the upper body follow a different animation than the lower body. Additive animations are added on top of another animation, modifying the final pose. For example, you can use an additive animation to add leaning while flying or running, or different aiming poses while shooting.

Cached poses are nodes that allow you to store a pose at a certain point in the AnimGraph and then reuse it later. For example, you can use a cached pose to blend between two different poses smoothly.
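AnimGraphs are visual assets, so there's no single snippet that captures them, but the C++ side of this kind of setup is typically a custom AnimInstance that computes the variables the state machine's transitions (and any State Aliases) read. A minimal sketch – the class and variable names here are illustrative, not AIKODE's actual code:

    // MyAnimInstance.h - illustrative AnimInstance feeding the AnimGraph.
    #pragma once

    #include "CoreMinimal.h"
    #include "Animation/AnimInstance.h"
    #include "GameFramework/Character.h"
    #include "GameFramework/CharacterMovementComponent.h"
    #include "MyAnimInstance.generated.h"

    UCLASS()
    class UMyAnimInstance : public UAnimInstance
    {
        GENERATED_BODY()

    public:
        // Read by transition rules; a State Alias lets several states share
        // one transition that checks these values instead of duplicating it.
        UPROPERTY(BlueprintReadOnly, Category = "Locomotion")
        float Speed = 0.f;

        UPROPERTY(BlueprintReadOnly, Category = "Locomotion")
        bool bIsInAir = false;

        virtual void NativeUpdateAnimation(float DeltaSeconds) override
        {
            Super::NativeUpdateAnimation(DeltaSeconds);

            if (const ACharacter* Character = Cast<ACharacter>(TryGetPawnOwner()))
            {
                Speed = Character->GetVelocity().Size();
                bIsInAir = Character->GetCharacterMovement()->IsFalling();
            }
        }
    };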

Here we can see an example of Aiko's AnimGraph:

Using root motion in a multiplayer game is not a good idea because it can cause synchronization issues and latency problems. But in a single-player game, it adds quite a few layers of realism and allows the animators to "direct" and control the character's movement.

Now, NPCs are pretty different. The main enemies, for example, are all set up through Blueprints – no behavior trees. Blueprints are very flexible and powerful and can be used to create anything from simple triggers to complex AI systems.
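Blueprint graphs don't paste into text, but the kind of logic such a graph encodes can be sketched in C++. A hypothetical, deliberately tiny state-driven enemy (the states, distances, and class names are illustrative, not AIKODE's actual setup):

    // EnemyBase.h - illustrative state machine, the C++ equivalent of
    // branching on an enum from a Blueprint's Event Tick.
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Character.h"
    #include "Kismet/GameplayStatics.h"
    #include "EnemyBase.generated.h"

    UENUM()
    enum class EEnemyState : uint8 { Idle, Chase, Attack };

    UCLASS()
    class AEnemyBase : public ACharacter
    {
        GENERATED_BODY()

        EEnemyState State = EEnemyState::Idle;

        virtual void Tick(float DeltaSeconds) override
        {
            Super::Tick(DeltaSeconds);

            const APawn* Player = UGameplayStatics::GetPlayerPawn(this, 0);
            if (!Player) return;

            const float Distance = FVector::Dist(GetActorLocation(), Player->GetActorLocation());

            // Hypothetical thresholds - tuned per enemy type in practice.
            if (Distance < 200.f)       State = EEnemyState::Attack;
            else if (Distance < 1500.f) State = EEnemyState::Chase;
            else                        State = EEnemyState::Idle;

            switch (State)
            {
            case EEnemyState::Chase:  /* steer toward the player */   break;
            case EEnemyState::Attack: /* play an attack animation */  break;
            default:                  /* idle, handled by AnimGraph */ break;
            }
        }
    };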

But the crowd NPCs, you know, the folks you see strolling around the cities? They take a cue from Unreal's Matrix demo and the Mass AI plugin, a system for creating large-scale crowds of autonomous agents that can navigate complex environments. With this system, the NPCs follow preset paths through the city while smoothly dodging obstacles, like the main character or other NPCs. It's kind of like giving them a mind of their own.
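Mass itself is data-oriented and set up largely through editor assets rather than code, but the behavior it produces here – agents walking preset paths while steering around obstacles – boils down to something like the following engine-agnostic sketch (this is the concept, not the Mass API):

    #include <cmath>
    #include <vector>

    struct Vec2 { float x = 0, y = 0; };

    static Vec2 Sub(Vec2 a, Vec2 b) { return {a.x - b.x, a.y - b.y}; }
    static float Len(Vec2 v) { return std::sqrt(v.x * v.x + v.y * v.y); }
    static Vec2 Norm(Vec2 v) { float l = Len(v); return l > 0 ? Vec2{v.x / l, v.y / l} : v; }

    struct CrowdAgent
    {
        Vec2 Position;
        size_t WaypointIndex = 0;
    };

    // One simulation step: head toward the current waypoint, push away from
    // nearby obstacles (other NPCs, the player), advance when close enough.
    void StepAgent(CrowdAgent& Agent, const std::vector<Vec2>& Path,
                   const std::vector<Vec2>& Obstacles, float Speed, float Dt)
    {
        const Vec2 Target = Path[Agent.WaypointIndex];
        Vec2 Desired = Norm(Sub(Target, Agent.Position));

        for (const Vec2& Obstacle : Obstacles)
        {
            const Vec2 Away = Sub(Agent.Position, Obstacle);
            const float D = Len(Away);
            if (D > 0 && D < 150.f) // avoidance radius (arbitrary)
            {
                // Repulsion grows as the obstacle gets closer.
                Desired.x += Away.x / (D * D);
                Desired.y += Away.y / (D * D);
            }
        }

        Desired = Norm(Desired);
        Agent.Position.x += Desired.x * Speed * Dt;
        Agent.Position.y += Desired.y * Speed * Dt;

        if (Len(Sub(Target, Agent.Position)) < 50.f)
            Agent.WaypointIndex = (Agent.WaypointIndex + 1) % Path.size();
    }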

I combine Mass AI with my NPC randomization blueprint, a system that allows you to create diverse and unique NPCs. It works by using booleans and datatables to select different values for each NPC, such as gender, race, hairstyle, clothing, accessories, etc. These values are then used to modify the appearance and properties of the NPC using the construction script. For example, you can use a boolean to decide the gender of the NPC, then use a datatable to select a random name, hairstyle, and clothing for that gender, and then change textures and parameters in the materials. This way, you can create hundreds or thousands of different NPCs without having to manually create each one.
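In C++ terms, that pattern looks roughly like this: a Data Table row struct describing one possible look, and a random pick applied in OnConstruction, the native counterpart of the Blueprint construction script. All row fields, component names, and material parameters below are hypothetical:

    // Illustrative appearance row - each Data Table entry is one possible look.
    USTRUCT(BlueprintType)
    struct FNPCAppearanceRow : public FTableRowBase
    {
        GENERATED_BODY()

        UPROPERTY(EditAnywhere) TObjectPtr<USkeletalMesh> HairMesh;
        UPROPERTY(EditAnywhere) TObjectPtr<UTexture2D> ClothingTexture;
        UPROPERTY(EditAnywhere) FLinearColor ClothingTint = FLinearColor::White;
    };

    // Construction-script equivalent on the NPC actor:
    void ANPCCharacter::OnConstruction(const FTransform& Transform)
    {
        Super::OnConstruction(Transform);
        if (!AppearanceTable) return;

        // A boolean (e.g. gender) could first decide which table to read from;
        // here we simply pick a random row from one table.
        TArray<FName> RowNames = AppearanceTable->GetRowNames();
        if (RowNames.Num() == 0) return;

        const FNPCAppearanceRow* Look = AppearanceTable->FindRow<FNPCAppearanceRow>(
            RowNames[FMath::RandRange(0, RowNames.Num() - 1)], TEXT("NPC randomization"));
        if (!Look) return;

        HairComponent->SetSkeletalMesh(Look->HairMesh);

        // Swap textures and parameters on a dynamic material instance.
        UMaterialInstanceDynamic* MID = GetMesh()->CreateAndSetMaterialInstanceDynamic(0);
        MID->SetTextureParameterValue(TEXT("ClothingTex"), Look->ClothingTexture);
        MID->SetVectorParameterValue(TEXT("Tint"), Look->ClothingTint);
    }

One practical wrinkle: the construction script reruns whenever the actor is moved or edited, so in practice you'd drive the random pick from a stored per-instance seed (an FRandomStream, for example) so each NPC keeps the same look.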

This system really helps when populating the cities of AIKODE because it allows you to create realistic and varied crowds of NPCs that make the world feel more alive and immersive. You can also use this system to create NPCs that fit the theme and mood of each city, such as futuristic, cyberpunk, steampunk, etc.

Here, you can see an example of this system. Keep in mind that the system isn't completely finished yet, and the end product will feature more age options, clothing choices, and materials.

And when it comes to animals and other unique NPCs that roam freely in the open world, I decided to combine behavior trees and root motion. Behavior trees are a graphical way of representing the decision-making process of an AI agent. They consist of nodes that represent actions, conditions, sequences, selectors, decorators, and more. By combining behavior trees with root motion, I can create animals and NPCs that have both realistic animations and dynamic behaviors.
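Wiring root motion into a behavior tree mostly amounts to a task that plays a root-motion animation and lets the animation move the capsule. A minimal custom task sketch (assuming the montage asset has root motion enabled; a production version would return InProgress and finish when the montage ends rather than succeeding immediately):

    // Illustrative BT task: plays a root-motion montage on the controlled pawn.
    #pragma once

    #include "CoreMinimal.h"
    #include "BehaviorTree/BTTaskNode.h"
    #include "AIController.h"
    #include "GameFramework/Character.h"
    #include "BTTask_PlayRootMotionMove.generated.h"

    UCLASS()
    class UBTTask_PlayRootMotionMove : public UBTTaskNode
    {
        GENERATED_BODY()

    public:
        UPROPERTY(EditAnywhere, Category = "Animation")
        TObjectPtr<UAnimMontage> Montage; // authored with root motion

        virtual EBTNodeResult::Type ExecuteTask(UBehaviorTreeComponent& OwnerComp,
                                                uint8* NodeMemory) override
        {
            const AAIController* Controller = OwnerComp.GetAIOwner();
            ACharacter* Character = Controller ? Cast<ACharacter>(Controller->GetPawn()) : nullptr;
            if (!Character || !Montage)
            {
                return EBTNodeResult::Failed;
            }

            // The montage's root motion moves the character; the behavior
            // tree only decides *when* that movement happens.
            Character->PlayAnimMontage(Montage);
            return EBTNodeResult::Succeeded;
        }
    };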

In the end, every character is different, and you've got to use the right tools from Unreal's toolbox to make them tick just the way you want.

Thoughts on Barriers to Entry in Game Development

Blender is one of the programs I use, and nowadays everything is definitely much more accessible. Tools like Quixel Megascans or MetaHumans, which I also use despite the "stylized" character approach, make a big difference.

But even though things are more accessible now, game development remains a complex undertaking. Unreal Engine and these tools surely make the process easier, but they don't automate it. They're more like aids that help teams or developers streamline some repetitive parts of their workflow.

However, I wouldn't say the barrier to entry is low these days. It's somewhat lower than before, but game development still involves a massive amount of work. As technology advances, player expectations rise, too. So, you can't settle for recreating what was done in the past. Nowadays, you need to aim higher and higher.

Combining Blender and Unreal Engine 

When it comes to environments, I usually have everything modeled in Blender, excluding small props. This means that I create the level design directly within Blender. While this approach might not be recommended if you're not intimately familiar with the character's movement, in my case, since I've programmed the movement, it's much simpler to craft it directly in a modeling application.

As I mentioned, in Blender, I create the initial layout along with the buildings.

After that, everything is imported into Unreal in different sections, and then I start the set dressing process with smaller props such as lights, benches, trees, and other similar details.

Regarding characters and other props, I always like to have a visual reference of how they will appear in Unreal. So, in Blender, I use my own shaders that allow me to visualize textures and other details. From a technical standpoint, the shaders are the same in Blender and Unreal.

The simplest part is importing animations. When an animation doesn't work as expected, it's as easy as doing a drag-and-drop in Unreal to reimport it, a process I tend to repeat multiple times. Additionally, Unreal has tools for making changes to animations, such as adjusting their speed or editing curves from facial mocap, so I can always make small tweaks there. Still, the quickest route is to reimport from Blender since Unreal streamlines the import process – although it's crucial to understand that units of measurement and even axis orientations differ between the two programs, so you need to set up your Blender scene and export/import settings to ensure everything works seamlessly. Once you've done that, moving between Blender and Unreal is quite straightforward.

I also work with Rokoko Studio (with the Smartsuit and Smartgloves) for capturing motion capture data, as well as the Live Link Face app for facial mocap. Simultaneously, I use other programs like Substance 3D Painter. So, I'm quite used to switching between software constantly and crafting a smooth workflow that functions seamlessly.

Setting Up the Scanning Effect

The scanning effect is a feature that allows you to see more information about the world and the characters in AIKODE, revealing hidden details and hints. But it's actually quite straightforward to execute, with the visual aspect being the trickiest part.

To create the scanning effect, you need to set up a flip-flop node in the blueprints. A flip-flop node is a node that alternates between two outputs every time it receives an input. In this case, the input is the button that activates the scanning mode. The first output activates a post-processing shader (a shader that applies an effect on the final rendered image) using a timeline (a node that allows you to animate a value using curves), followed by triggering a boolean (a variable that can have only two values: true or false) named Scan.

The post-processing shader is the core of the scanning effect. It creates a mask that expands from the position of the camera to "the end" of the world, covering everything in its path. The mask is used to modify the appearance of the objects behind it, such as adding a grid, an impulse, a chromatic aberration, or even outlines on the buildings. 
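The mask itself is just a comparison between each pixel's distance from the camera and a radius that grows over time – in material-node terms, a sphere mask. The underlying math, written out in C++ for clarity (parameter names are illustrative):

    #include <algorithm>
    #include <cmath>

    // 0 where the scan wave hasn't reached the pixel yet, 1 well inside it,
    // with a soft edge of `Feather` units around the wavefront.
    float ScanMask(const float PixelWorldPos[3], const float CameraPos[3],
                   float ScanRadius, float Feather)
    {
        const float dx = PixelWorldPos[0] - CameraPos[0];
        const float dy = PixelWorldPos[1] - CameraPos[1];
        const float dz = PixelWorldPos[2] - CameraPos[2];
        const float Distance = std::sqrt(dx * dx + dy * dy + dz * dz);

        // Smoothstep across the wavefront: inside -> 1, outside -> 0.
        const float T = std::clamp((ScanRadius - Distance) / Feather, 0.0f, 1.0f);
        return T * T * (3.0f - 2.0f * T);
    }

The post-process material then lerps between the normal scene color and the grid/outline treatment using this value, with the radius being the scalar the timeline animates.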

The mask is also used to control the visibility of certain elements in the world, like extra UI, character information, mission hints, etc. These elements are only shown when the mask covers them, and they are hidden when the mask moves away from them. This creates a dynamic and interactive scanning effect.

The boolean named Scan keeps track of whether the scanning mode is on or off. When it is set to true, it enables the post-processing shader and the extra elements; when it is set to false, it disables them. The boolean is also used to communicate with other blueprints or scripts that might need to know if the scanning mode is active or not.

If you press the input button again, the second output in the flip-flop node deactivates the post-processing effect through another timeline and turns off the Scan boolean, restoring everything to its normal state.
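In C++, the same flip-flop-plus-timeline pattern looks roughly like this (a sketch: the members – ScanTimeline, ScanCurve, ScanPostProcessMaterial, Camera, MaxScanRange – are assumed to be declared on the character, and the "ScanRadius" parameter name is a stand-in):

    // A bool replaces the Blueprint flip-flop; an FTimeline (from
    // "Components/TimelineComponent.h") animates the scan radius
    // on the post-process material, forward or in reverse.

    void AMyCharacter::BeginPlay()
    {
        Super::BeginPlay();

        ScanMID = UMaterialInstanceDynamic::Create(ScanPostProcessMaterial, this);
        Camera->PostProcessSettings.AddBlendable(ScanMID, 1.0f);

        FOnTimelineFloat Progress;
        Progress.BindUFunction(this, FName("OnScanProgress"));
        ScanTimeline.AddInterpFloat(ScanCurve, Progress);
    }

    void AMyCharacter::ToggleScan() // bound to the scan input
    {
        bScanActive = !bScanActive;           // the flip-flop
        bScanActive ? ScanTimeline.Play()     // first press: expand the mask
                    : ScanTimeline.Reverse(); // second press: collapse it
    }

    void AMyCharacter::OnScanProgress(float Value) // a UFUNCTION driven by the curve
    {
        ScanMID->SetScalarParameterValue(TEXT("ScanRadius"), Value * MaxScanRange);
    }

    void AMyCharacter::Tick(float DeltaSeconds)
    {
        Super::Tick(DeltaSeconds);
        ScanTimeline.TickTimeline(DeltaSeconds); // FTimeline is ticked manually
    }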

Another similar effect can be observed in this video:

Here, I showcase the "city generation". It's quite similar in essence but significantly more complex, as it also involves level streaming, a technique that allows you to load and unload parts of a large level dynamically, depending on the location and direction of the player.
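Level streaming can be driven by volumes, World Partition, or directly from code; the programmatic route goes through UGameplayStatics. A small sketch (the level name and trigger actor are placeholders):

    #include "Kismet/GameplayStatics.h"

    // Illustrative: stream a city district in as the player approaches.
    void AStreamTrigger::OnPlayerApproach()
    {
        FLatentActionInfo Latent;
        Latent.CallbackTarget = this;
        Latent.UUID = 1; // must be unique per in-flight latent action
        Latent.Linkage = 0;

        UGameplayStatics::LoadStreamLevel(this, TEXT("CityDistrict_01"),
                                          /*bMakeVisibleAfterLoad=*/ true,
                                          /*bShouldBlockOnLoad=*/ false, Latent);
    }

    // And the mirror image when the player moves away:
    // UGameplayStatics::UnloadStreamLevel(this, TEXT("CityDistrict_01"), Latent, false);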

Optimization

Optimization is my favorite part, and I use a multitude of tools to ensure the game runs smoothly. AIKODE runs perfectly on Ultra settings at 1080p, achieving 30-50 FPS on a laptop equipped with a GTX 1060 GPU, 8GB of RAM, and an i7 processor. Considering this, it's evident that optimization was one of my top priorities.

To start with, AIKODE doesn't use Lumen, VSM, or Nanite, despite these being incredible tools that ease the work of developers. However, employing these tools can also result in the game not performing as intended on mid-range to lower-end systems and even on older-generation consoles. Hence, disabling these options was the first step I took (though this heavily depends on your project and objectives).
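For reference, these systems map to a handful of renderer settings. A sketch of forcing them from code (in a real project you'd set the equivalents in DefaultEngine.ini or Project Settings instead; cvar names can vary by engine version):

    #include "HAL/IConsoleManager.h"

    // Illustrative: switch off Lumen GI and Virtual Shadow Maps at startup.
    static void DisableHeavyUE5Features()
    {
        if (IConsoleVariable* GIMethod =
                IConsoleManager::Get().FindConsoleVariable(TEXT("r.DynamicGlobalIlluminationMethod")))
        {
            GIMethod->Set(0); // 0 = no dynamic GI (Lumen off)
        }

        if (IConsoleVariable* VSM =
                IConsoleManager::Get().FindConsoleVariable(TEXT("r.Shadow.Virtual.Enable")))
        {
            VSM->Set(0); // back to traditional shadow maps
        }

        // Nanite is opted out simply by never enabling it on the meshes.
    }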

In the cities and the overall game world, instancing and LODs are heavily utilized. Instancing is a method of rendering multiple copies of the same mesh using a single draw call, which reduces CPU overhead and improves performance. LODs (Levels of Detail) are simplified versions of a mesh that are displayed at different distances from the camera, which reduces GPU load. On the other hand, collision detection is kept as simple as possible, since complex collision can quickly become a performance problem of its own.
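Instancing in Unreal usually means UInstancedStaticMeshComponent, or its hierarchical variant, which also handles per-instance LOD selection and culling. A minimal sketch (the actor and function are hypothetical):

    // Illustrative: scatter N copies of one mesh, batched into one draw
    // call per LOD instead of one draw call per copy.
    #include "Components/HierarchicalInstancedStaticMeshComponent.h"

    void ACityBlock::BuildInstances(UStaticMesh* Mesh, const TArray<FTransform>& Transforms)
    {
        UHierarchicalInstancedStaticMeshComponent* HISM =
            NewObject<UHierarchicalInstancedStaticMeshComponent>(this);
        HISM->SetStaticMesh(Mesh);
        HISM->SetupAttachment(GetRootComponent());
        HISM->RegisterComponent();

        for (const FTransform& T : Transforms)
        {
            HISM->AddInstance(T); // no extra draw call per copy
        }
    }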

Another technique involves the master material for buildings. AIKODE's buildings have a lot of materials and detail, often around 12-16 materials per building, which can lead to draw call problems. To put it simply, a draw call is a command that tells the GPU to render a batch of geometry with a certain set of state changes, so too many draw calls slow down the rendering process and hurt performance. To address this, a material was created that consolidates all the textures and materials into a single material, essentially incorporating a Material Atlas function within the material itself (this material was based on the work by PrismaticaDev). In more technical terms, a material atlas is a texture that contains multiple sub-textures that can be accessed by using different UV coordinates. I highly recommend employing this approach if your scenes involve numerous materials – your GPU will thank you.
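The heart of any atlas function is remapping a mesh's normal 0-1 UVs into the sub-rectangle of the atlas that holds the wanted texture. The math, written out in C++ for clarity (a conceptual sketch, not PrismaticaDev's actual material):

    // Remap a 0-1 UV into tile (TileX, TileY) of a TilesPerRow x TilesPerRow atlas.
    struct UV { float U, V; };

    UV AtlasUV(UV BaseUV, int TileX, int TileY, int TilesPerRow)
    {
        const float TileSize = 1.0f / TilesPerRow;
        return { (TileX + BaseUV.U) * TileSize,
                 (TileY + BaseUV.V) * TileSize };
    }

Per mesh section, something like vertex colors or a per-instance index then selects the tile, so a building's dozen logical "materials" resolve to a single real material.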

Another interesting trick relates to rotating objects. Some cities feature hundreds of rotating objects simultaneously. The conventional approaches involve animations/skeletal meshes (which can impact performance) or rotating components and tick events (both of which struggle when dealing with hundreds of rotating objects). To tackle this, I developed a shader that imparts a visual rotation effect on the object – it appears to rotate, while in reality, it does not. It's a visual trick similar to the way plants move in response to the character or the wind.
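Such a shader typically lives in the material's World Position Offset: each vertex is rotated around the object's pivot by an angle that advances with time, and the output is the difference between the rotated and original positions. The equivalent math in C++ (a Z-axis rotation sketch; names are illustrative):

    #include <cmath>

    struct Float3 { float X, Y, Z; };

    // Offset that makes a vertex appear rotated around `Pivot` by
    // Time * Speed radians about the Z axis - the mesh never moves on the CPU.
    Float3 RotationWPO(Float3 VertexPos, Float3 Pivot, float Time, float Speed)
    {
        const float Angle = Time * Speed;
        const float S = std::sin(Angle), C = std::cos(Angle);

        const float LX = VertexPos.X - Pivot.X;
        const float LY = VertexPos.Y - Pivot.Y;

        const Float3 Rotated = { Pivot.X + LX * C - LY * S,
                                 Pivot.Y + LX * S + LY * C,
                                 VertexPos.Z };

        return { Rotated.X - VertexPos.X, Rotated.Y - VertexPos.Y, 0.0f };
    }

Because the rotation happens per-vertex on the GPU, hundreds of spinning objects cost essentially no CPU time – the trade-off being that collision and gameplay logic still see the unrotated mesh, which is why "it appears to rotate, while in reality, it does not."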

I delve into this in much more detail in an interview I conducted on the Unreal Engine channel:

The Roadmap

Right now, I'm deep into crafting a demo that should be ready soon. And I'll be going full-time on this project now that I've wrapped up university.

What's next? Well, I'm gearing up to hit as many events as possible to let people experience the gameplay firsthand. That's a big step for me.

Looking ahead, you can expect some exciting things next year, including a more ambitious trailer than anything I've done before, which will give you a better taste of the voice acting, music, and the game's story. So, there's a lot to look forward to!

ACE, Game Developer

Interview conducted by Arti Burton
