Making a Physically-Aware Audio System for a Penguin VR Game in Unity

Zi Ye told us about their game development journey, discussed how the penguin VR project came to be, and showed how the physically-aware audio system was created.

Introduction

Hi! My name is Zi Ye, and I'm a VR game developer. I've been making games for about 20 years, and VR games specifically for about 8 years. While at Owlchemy Labs, I worked on games such as Job Simulator, Rick and Morty VR, and Vacation Simulator. Over the years, I've worked on many fun VR side projects. I am currently working with a small team on an upcoming indie VR title about penguins!

Game Development

I first got into gamedev when I was in middle school. Up until that point in my life, I had wanted to be an artist or musician, so my first foray into gamedev was through 3D modeling and digital music composition. Eventually, it became clear to me that programming was a requirement, so I picked that up as well.

I was one of those kids that didn't really like to go outdoors, and so most of my spare time was spent learning, practicing, and experimenting with the various disciplines involved in game development. I was self-taught, learning everything by studying online tutorials, lurking forums and IRC servers, and reading reference books.

First Projects

My first projects were incredibly ambitious (as most first gamedev projects are). I wanted to make full-fledged 3D games with story and voice acting, despite having neither the ability, time, nor money to do so. I was also into making my own game engines. My biggest inspiration was definitely The Legend of Zelda: Ocarina of Time.

None of my early projects were ever completed. Nevertheless, all that stubborn pursuit of the impossible made me encounter some of the most daunting aspects of game development very early on, which definitely gave me an edge in later years.

My big switch to Unity came at university, for a project in a game development course. After years of working with OpenGL, DirectX, OGRE3D, and XNA, Unity had simply become a no-brainer if I wanted to actually finish a game project without getting bogged down in the backend. That school project eventually became my first indie game, Parallax.

I first got into VR in 2015 when Devin Reimer, a co-founder of Owlchemy Labs, invited me to his basement out in the middle of nowhere to "check something out". It turned out to be an early prototype of Job Simulator, and I was thoroughly unimpressed, haha. It took a lot more convincing, but eventually, I joined Owlchemy Labs and stayed there for 8 years!

The Penguin VR Project

This project started out as an experiment with a full-body, physics-based penguin VR avatar. I wanted to see what it's like to be a penguin in VR. As it turns out, it feels great! So I did some more experimentation with some penguin-y mechanics like igloo-building and soon realized that what was missing was the world that the penguin lives in – an actual place.

With this in mind, I started creating more fleshed-out environments befitting a penguin, and from there, a vision of a game slowly started to form – a vision I hope to validate soon. And that brings us to today!

Physically-Aware Audio System

Before I built this system, there was no impact audio in my prototypes. That meant that whenever I slapped my belly with my penguin flippers, smacked one hard object into another, or threw something at the ground, there was just... nothing! It was extremely disappointing and immersion-breaking. I mean, everyone knows belly slaps are a fundamental aspect of the penguin experience!

But there are just so many possible combinations of things-hitting-things that create different sounds, and it would be a nightmare for a tiny team like ours to have to manually enumerate them all. And so I built a system that makes some basic assumptions about the circumstances under which objects resonate, thus reducing the amount of work we have to do to account for new cases.

I achieved this in Unity by creating a custom ScriptableObject class called SurfaceTypeData. It lets me create assets for different object "surface types", e.g., Metal_SmallDense, Wood_LargeHollow, Flesh_Small, etc. Each of these assets serializes a list of AudioClips to randomly play from when that surface type is struck, along with a "hardness" and a "resonance hardness threshold". This lets me express things like "this surface is very hard, and only makes sound when hit by another hard surface".
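To make that concrete, here's a minimal sketch of what such an asset could look like. Only the clip list, the hardness, and the resonance hardness threshold come from my description above; the exact names and the small helper method are illustrative assumptions rather than the project's actual code:

using UnityEngine;

// Illustrative sketch of a SurfaceTypeData asset: impact clips plus the two
// hardness values described above. Names here are assumptions, not the shipped code.
[CreateAssetMenu(fileName = "SurfaceTypeData", menuName = "Audio/Surface Type Data")]
public class SurfaceTypeData : ScriptableObject
{
    [Tooltip("Clips to pick from at random when this surface type is struck.")]
    public AudioClip[] impactClips;

    [Tooltip("How hard this surface is; compared against the other surface's threshold.")]
    [Range(0f, 1f)] public float hardness = 0.5f;

    [Tooltip("Minimum hardness the striking surface needs for this surface to make a sound.")]
    [Range(0f, 1f)] public float resonanceHardnessThreshold = 0f;

    // Picks a random impact clip, or returns null if none are assigned.
    public AudioClip GetRandomClip()
    {
        if (impactClips == null || impactClips.Length == 0) return null;
        return impactClips[Random.Range(0, impactClips.Length)];
    }
}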

Finally, I assign these SurfaceTypeData assets to SurfaceImpactItem components on objects in the scene or on prefabs, and when two of them hit each other, the system resolves the resulting audio events.
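The collision side could look something like the sketch below. The threshold check follows the rule described above; the velocity-based volume scaling and the exact component internals are assumptions added for illustration:

using UnityEngine;

// Illustrative sketch of a SurfaceImpactItem component resolving impact audio.
// The threshold test mirrors the rule described above; everything else is an assumption.
[RequireComponent(typeof(Collider))]
public class SurfaceImpactItem : MonoBehaviour
{
    public SurfaceTypeData surfaceType;

    void OnCollisionEnter(Collision collision)
    {
        // Only resolve audio against objects that are also part of the system.
        var other = collision.collider.GetComponentInParent<SurfaceImpactItem>();
        if (other == null || other.surfaceType == null || surfaceType == null) return;

        // "This surface only resonates when struck by something hard enough."
        if (other.surfaceType.hardness < surfaceType.resonanceHardnessThreshold) return;

        AudioClip clip = surfaceType.GetRandomClip();
        if (clip == null) return;

        // Assumed refinement: scale volume with impact strength, clamped to [0, 1].
        float volume = Mathf.Clamp01(collision.relativeVelocity.magnitude / 10f);
        AudioSource.PlayClipAtPoint(clip, collision.GetContact(0).point, volume);
    }
}

Since both colliding objects carry the component, each one resolves its own surface's response, so a hard object striking a soft one can trigger the soft object's sound while staying silent itself.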

Tips for Beginners 

Make your fantasy come true. It might sound kinda cheesy, but I believe that in VR gamedev, even more so than in flat-screen gamedev, you have to be working from that personal and special place. Because VR is so hard to do and even harder to do well, it is very tempting to fall back on formulas or pre-canned solutions. But that is how VR becomes stale in a time when innovation is paramount. The road is long and tough, but if you push through, you could make something uniquely awesome that nobody in the world has experienced yet.

Plans

I am currently working with two awesome people – Megan and Victoria – on a proof-of-concept. We hope to find out whether or not this penguin VR game idea works and how we can make it happen. We can't say much else for certain, but we should have an official announcement in the next couple of months, so stay tuned!

Zi Ye, Game Developer

Interview conducted by Theodore McKenzie
