The company has released a mind-blowing new demo, showing how a combination of generative AI, Unreal Engine 5, and NVIDIA's RTX and DLSS can be used to create lifelike non-playable characters.
During Computex 2023, NVIDIA unveiled Kairos, a real-time demo that showcases how generative AI, Unreal Engine 5, and NVIDIA's RTX and DLSS can be combined to create lifelike non-playable characters (NPCs).
Providing a look at the future of in-game NPCs, the demo showed a stunning interaction between the player character, Kai, and an NPC named Jin. Jin can reply to the player's spoken questions with coherent responses within just a few seconds, his facial animations playing in sync with the generated replies.
As stated by the developer, the intelligent NPC Jin was built in collaboration with ConvAI, a start-up developing a platform for creating and deploying AI characters in games and virtual worlds. The setup relies on NVIDIA's Avatar Cloud Engine (ACE) for Games, a suite of cloud-native AI models and services that makes it easier to build and customize lifelike virtual assistants and digital humans.
The toolchain behind the demo included NVIDIA NeMo, which provided foundation language models and customization tools for tuning the game character; NVIDIA Riva, which supplied automatic speech recognition (ASR) and text-to-speech (TTS) capabilities to enable live spoken conversation; and NVIDIA Omniverse Audio2Face, which instantly generated expressive facial animations for Jin from an audio source alone. These models were then combined through ConvAI's services platform and fed into Unreal Engine 5 and MetaHuman to bring the immersive NPC Jin to life.
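To make the flow of that pipeline concrete, here is a minimal Python sketch of one conversational turn. Every function in it is a hypothetical stub standing in for the corresponding stage (ASR, language model, TTS, audio-driven facial animation); none of these names are real NVIDIA ACE, Riva, NeMo, or ConvAI APIs, only placeholders illustrating how the stages chain together.

```python
# Sketch of a conversational-NPC turn: speech -> text -> LLM reply ->
# synthesized voice -> facial animation. All stages are hypothetical stubs.

def speech_to_text(audio: bytes) -> str:
    """Stand-in for an ASR service (the role Riva plays in the demo)."""
    return "What's on the menu tonight?"  # canned transcript for the sketch

def generate_reply(question: str, persona: str) -> str:
    """Stand-in for a tuned foundation language model (NeMo's role)."""
    return f"{persona}: We have fresh ramen today, spicy or mild."

def text_to_speech(text: str) -> bytes:
    """Stand-in for a TTS service; returns synthesized voice audio."""
    return text.encode("utf-8")  # placeholder bytes, not real audio

def audio_to_face(audio: bytes) -> list[str]:
    """Stand-in for audio-driven facial animation (Audio2Face's role):
    maps audio frames to animation keys the game engine can play."""
    return [f"viseme_{i % 4}" for i in range(len(audio) // 16 + 1)]

def npc_turn(player_audio: bytes, persona: str = "Jin") -> tuple[str, bytes, list[str]]:
    """One turn: the player speaks, the NPC answers with text,
    voice audio, and facial-animation keys synced to that audio."""
    question = speech_to_text(player_audio)
    reply_text = generate_reply(question, persona)
    reply_audio = text_to_speech(reply_text)
    animation = audio_to_face(reply_audio)
    return reply_text, reply_audio, animation

if __name__ == "__main__":
    text, audio, anim = npc_turn(b"\x00" * 1024)
    print(text)
    print(f"{len(anim)} animation keys generated")
```

The point of the sketch is the ordering: the animation stage consumes the synthesized reply audio, which is why the character's lip movements can stay in sync with responses that are generated on the fly rather than pre-authored.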
To set up the environment itself, the team reused the original NVIDIA Omniverse-made Ramen Shop project, showcased last year, and gave it a more "cyberpunk" look, changing the textures directly in the engine and creating additional assets with Blender and Substance 3D Painter. Furthermore, the team used the NVIDIA RTX Branch of Unreal Engine 5 (NvRTX 5.1) to build the scene, with RTX Direct Illumination (RTXDI) and NVIDIA DLSS 3 handling the environment's lighting and rendering. According to NVIDIA, the project took just 2.5 weeks from start to finish.