Irradiation: The Development Process Behind a UE-made Film

Director and 3D Artist Sava Zivkovic talked about the production process behind Irradiation, a short film made using Unreal Engine, Blender, and motion capture.

Introduction

My name is Sava Zivkovic, and I'm a Director based in Belgrade, Serbia. I studied interior and furniture design at the Belgrade Faculty of Applied Arts, which is where I was introduced to 3D, and I've been completely immersed in making moving images ever since I first opened a 3D package back in 2008.

In 2017, I started my directing career at Axis Studios, where I still work. Some of the projects I've directed include cinematics for Destiny 2, Gears of War 5, and Outriders. In my free time, I create independent films like IFCC, Playgrounds, Freight, and most recently Irradiation.

Getting into Filmmaking

I've always loved film, but it's one of those situations where you never really know whether it's something you could actually do for a living. In my early days of freelancing, I was doing a lot of architectural visualization, which was a natural area of CGI to get into given my academic background. Roughly around my first year in, Alex Roman released his masterpiece The Third & The Seventh, which still blows me away to this day. It was one of the catalysts that drove me towards making my own architecturally themed films, which were the first projects I ever completed. After years of archviz and a bit of motion design, I felt I really needed to step out of that field and do something character-driven. That eventually led to the directorial debut film I made for the 2017 IFCC conference, which is what got me noticed by Axis Studios and landed me a director's job.

Irradiation

I started thinking about the story around November 2020, and it was a lot of starts and stops since I was working on the film in my spare time. On the surface level, the film tells the story of Evgeniy, a scientist volunteering to examine a mysteriously irradiated site. But upon being escorted there, and after experiencing strange visions, Evgeniy finds himself questioning the very reason he was sent to investigate in the first place. On the deeper subtext level, the story is a very personal one, but I would rather leave the interpretation to the viewer than explain what it means to me.

The inspiration largely came from some personal struggles over the past few years, but visually the film was driven by the Big Medium Small asset pack I used, which naturally shaped the film's setting. A lot of viewers made connections to Chernobyl, Annihilation, Stalker, and Roadside Picnic, and although I can see how one would make those connections, they weren't the catalysts for the film by any means.

Motion Capture

I've been very lucky to have worked with TakeOne in the past, and this collaboration pushed both the technology and the creative side of shooting mocap even further. For this shoot, we actually had Unreal Engine running in real time on the stage, which is a massive help, both for the actors, who can see themselves in their final CG form, and from a directing perspective. We had a virtual camera that enabled me to frame up the shots in a completely lit environment and look at the performances from the camera's perspective.

I honestly wouldn't say there were any challenges with this shoot. TakeOne are absolute pros, they've been doing this for decades, and for the majority of the performances they only had to clean up some geometry intersections. I'd say it was one of the smoothest shoots we've ever done.

Assets and Environments

That's the true power of real-time engines such as Unreal: you're not limited by render times and you're not waiting weeks for renders. Instead, you're just reacting to what's in front of you and creating almost at the speed of thought. I do need to say that while the production in Unreal only took 4 weeks, the whole film, with writing, shooting, planning, sound, and music, took around 3 months.

The assets are also the aspect that really sped up the production, but that was always part of the plan. All of the assets, except for the graphite blocks, are premade and available off the shelf, and the environments came from MAWI with some additional Megascans.

As far as the process goes, getting everything into Unreal takes a bit of time, but once everything is in place, you really make up for that time in production. Speed is one aspect of the process, but the creativity gained by working in a real-time environment is the biggest value for me, and I personally think it's completely unmatched in any other software at the moment.
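For a sense of what that ingest step can look like when scripted, here is a minimal sketch using Unreal's built-in Python API to batch-import FBX files; the file paths and destination folder are hypothetical placeholders, not the film's actual project structure.

import unreal

def import_fbx_assets(fbx_paths, destination="/Game/Irradiation/Assets"):
    # Queue each source file as an automated import task.
    tasks = []
    for path in fbx_paths:
        task = unreal.AssetImportTask()
        task.filename = path                 # source file on disk
        task.destination_path = destination  # Content Browser folder
        task.automated = True                # suppress import dialogs
        task.save = True                     # save the imported packages
        tasks.append(task)
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks(tasks)

# Hypothetical usage:
import_fbx_assets(["C:/assets/graphite_block.fbx"])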

Unreal Engine in Filmmaking

The ability to see everything and not have to wait on renders, to be reactive to the performances and environments, and the impact that has on cinematography is the biggest creative value for me. I've been trying to make images move for over 13 years, and it always felt like a button-pushing technical nightmare; with Unreal, it felt like I was on a set filming with a camera. It's completely game-changing.

I'm still a novice in Unreal, and my knowledge is very basic. Essentially, I know how to import assets, animate cameras, and move lights around. That's about it. In this world of real-time workflows, your filmmaking style is what will set you apart, so I would very much encourage you to learn the various aspects of filmmaking first.
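Even those basics can be scripted. Below is a minimal sketch, again assuming Unreal's Python API, that creates a Level Sequence and adds a keyframable cine camera; the sequence name, content path, and frame range are hypothetical.

import unreal

# Create a new Level Sequence asset (name and path are placeholders).
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
sequence = asset_tools.create_asset(
    asset_name="SEQ_Example",
    package_path="/Game/Sequences",
    asset_class=unreal.LevelSequence,
    factory=unreal.LevelSequenceFactoryNew(),
)

# Add a spawnable cine camera that the sequence owns and animates.
camera_binding = sequence.add_spawnable_from_class(unreal.CineCameraActor)

# Give the camera a transform track and a section so it can be keyframed.
transform_track = camera_binding.add_track(unreal.MovieScene3DTransformTrack)
transform_section = transform_track.add_section()
transform_section.set_range(0, 240)  # active frame range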

I do have to mention one tool that is absolutely incredible, though: I made heavy use of tyFlow, a free plugin for 3ds Max, which was used for all of the main FX physics simulations in the film. The main graphite blocks, the branches for foot interaction, and the blood FX were all achieved in tyFlow and exported to Unreal.
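The interview doesn't state the exact export format, but a common route for bringing cached simulations like these into Unreal is an Alembic geometry cache. Here is a minimal sketch of that import step, assuming Unreal's Python API; the cache path and destination folder are placeholders.

import unreal

# Import settings for an Alembic (.abc) file as a Geometry Cache,
# i.e., baked per-frame geometry from an external simulation.
options = unreal.AbcImportSettings()
options.import_type = unreal.AlembicImportType.GEOMETRY_CACHE

task = unreal.AssetImportTask()
task.filename = "C:/sims/graphite_blocks.abc"   # placeholder cache path
task.destination_path = "/Game/Irradiation/FX"  # placeholder folder
task.options = options
task.automated = True
task.save = True

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])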

Thoughts About the Industry

Virtual production with LED walls for set extensions, as shown on The Mandalorian, is probably the biggest advancement in the industry when talking about big productions, but on a smaller, more independent filmmaking scale, just the free access to these incredible tools is a huge deal in my opinion. The bottleneck of such projects is that mocap tracking/solving and keyframe animation still take some time. I explained it briefly in my process video, but I can totally see a world where we get perfect mocap directly on the shoot, playing in real time within Unreal. If that were to happen, we would be able to shoot a scene shot by shot like a live-action film and output those shots from the virtual camera as final shots directly to editorial. That would streamline the process even more, and you'd still be able to open up the Unreal project and make changes after the fact. And like I said in my process video, I think that day is closer than we think.

Sava Živković, 3D Artist and Director

Interview conducted by Arti Sergeev
