The Game-Changing Technology of Virtual Production – Part I

Brief Overview of the Technology of Virtual Production

Introduction

During the Covid-19 pandemic, people were not allowed to meet each other, and huge sections of many people’s daily lives revolve around meeting and interacting with other people. Almost every business suffered. One major business, however, did not suffer; on the contrary, it thrived. The stock price of Zoom went from around $60 in 2019 to over $550 in 2020. Why? Because the usage of Zoom and similar services extended from calling your cousins in Costa Rica once a month to calling your boss and colleagues almost every day. During the pandemic, many businesses moved from the physical to the virtual. Suddenly we had:

Virtual Classes, Virtual Meetings, Virtual Conferences, Virtual Tours, Virtual Concerts, Virtual Piano Lessons, and the list just goes on and on.

There is, however, an industry that seems like it could not function virtually: the film industry. At the end of the day it can’t, at least not in the way a meeting can be held with each person at home. But it can do something similar: eliminate excessive travel. It might not seem like travel is a key element of the film industry, and with the rise of visual effects it often isn’t. But think about shooting on location. You need to get the entire cast and crew from one place to another. That is a form of travel, and as of late it is a form of travel that can be avoided.

What if I told you that you could create a scene set in any location from inside a studio, without any VFX in post-production? Perhaps you wouldn’t even believe me.

Well, with some great new technology it is now a reality. We call this technology Virtual Production. In the following pages we will look at what this is, and how it is changing the film industry. 

Before we dive into this, I wanted to share with you the reason I am writing this. Not long ago, I was working on material for my book “Unmasking Visual Effects” and I began writing about the innovative technology of virtual production. As I was writing and contemplating that section, I ran into a problem. I realised that I could not cram everything I wanted to say on the topic into the book. It would be like trying to cram my entire library into a small suitcase.

After reflecting on my situation for a while, I resolved to dedicate an entire work to the light and magic of Virtual Production.

-Samuel Collett

Author of 

“Unmasking Visual Effects”

The Technology of Virtual Production

Arthur C. Clarke formulated his famous Three Laws. The third, and most well known, is this: “Any sufficiently advanced technology is indistinguishable from magic”. People often view visual effects as some sort of magic, but today I want to briefly pull open the curtain and give you a backstage look at the Light and Magic of virtual production.

Before we jump into the technology, I want to briefly define virtual production.

Virtual Production is where XR and filmmaking meet, resulting in real-time VFX. Alright, let me explain that painfully knowledge-rich statement. We can start with XR. XR, or Extended Reality, is a technology that aims to combine or ‘extend’ the real world, or ‘reality’, with a digital, extended world.

Note that for this book, we will be looking at LED stage volumes, such as ILM’s StageCraft, and not the other uses of this technology. Although those other uses are also classified as virtual production, we won’t be covering them here.

In virtual production, we still use key parts of traditional filmmaking such as a physical camera, sets, and costumes. The basic setup is a huge LED screen showing a background behind a scene. In front of that screen you have a small piece of set with your actors; the screen and the space it encloses are together called ‘the volume’. All of this is recorded with a camera.

Imagine you were in front of a laptop screen, but you were the size of an ant. This may not be dimensionally accurate, but you get the gist of it. 

To understand the elements of virtual production, let’s think about how we would make a traditional visual effects shot. For a background replacement, you would shoot some footage in front of a green or blue screen and remove that background, keeping only the foreground element. Then you would digitally overlay that foreground over a background, either filmed footage or a CG render.
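To make the traditional approach concrete, here is a minimal sketch of a green-screen composite in Python using OpenCV and NumPy. The file names and the simple colour threshold are placeholders for illustration; a real keyer is far more sophisticated than this.

```python
import cv2
import numpy as np

# Load the green-screen foreground plate and the background plate.
# (File names here are placeholders for illustration.)
foreground = cv2.imread("actor_on_greenscreen.png")
background = cv2.imread("background_plate.png")
background = cv2.resize(background, (foreground.shape[1], foreground.shape[0]))

# Build a crude matte: pixels that are strongly green count as background.
hsv = cv2.cvtColor(foreground, cv2.COLOR_BGR2HSV)
green_lo = np.array([40, 60, 60])     # rough hue/saturation/value bounds
green_hi = np.array([80, 255, 255])
green_mask = cv2.inRange(hsv, green_lo, green_hi)   # 255 where green
fg_mask = cv2.bitwise_not(green_mask)               # 255 where the actor is

# Composite: keep the actor where the matte says "foreground",
# and fill the rest with the background plate.
actor = cv2.bitwise_and(foreground, foreground, mask=fg_mask)
plate = cv2.bitwise_and(background, background, mask=green_mask)
composite = cv2.add(actor, plate)

cv2.imwrite("composite.png", composite)
```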

These two elements are mirrored in virtual production, except in the physical world. The huge screen is the background, the actors and set are the foreground, and it’s all recorded using a real camera instead of being combined in a computer. Let’s briefly take a closer look at these parts separately.

The LED Wall or Screen (Background)

Let’s break it down, starting with the LED screen. This is like the shell of ‘the volume’. The Industrial Light & Magic StageCraft LED panels used for “The Mandalorian” Season 2 were two stories high, plus a ceiling. We can think of them as a huge display for a computer (two stories huge). There are multiple things that you can project onto them. For example, if you have a car chase scene, you could put a stationary car in front of the screen and project a video of a moving road onto the screen behind it.

Now, if you know your history (like, really know your history), you will probably be saying, “Wait a minute, this seems awfully like rear projection”. For those of you who don’t know your history, rear projection is a very old practice that involves projecting a video background behind a foreground. It was often used for driving scenes, because it removed the need to mount a camera on a moving vehicle and avoided the extreme noise of cars, which in those days often had no roofs.

Now to answer your question: yes, it is very similar. Some things about it, such as the sheer size, are far better now, but overall the idea is much the same. However, I mentioned that there are multiple uses for this screen, and the other one I want to mention is by far the more commonly used, the one most associated with virtual production, and something that most definitely didn’t exist until recently: 3D photorealistic worlds.

This method became popular with the arrival of game engines like Unreal Engine, which enabled artists to render photorealistic scenes in real time. Unreal Engine is a game engine, so if you think about it, it makes sense for it to have real-time realistic rendering capabilities. These days, realistic games are becoming more and more popular, and in order to have smooth gameplay, real-time rendering is necessary.
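As a rough illustration of what “real time” demands, here is a tiny Python sketch of the per-frame time budget a renderer has to hit at common frame rates; it is just arithmetic, nothing engine-specific.

```python
# Per-frame time budget for real-time rendering at common frame rates:
# the engine must produce a complete, final-quality frame inside this window.
for fps in (24, 30, 60):
    budget_ms = 1000.0 / fps
    print(f"{fps} fps -> roughly {budget_ms:.1f} ms to render each frame")
# 24 fps -> roughly 41.7 ms, 30 fps -> roughly 33.3 ms, 60 fps -> roughly 16.7 ms
```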

This technology has been adopted for virtual production. Artists can create photorealistic worlds and move around inside them, and this real-time rendering can be projected onto the LED screens. Unlike a pre-recorded video, a 3D scene projected onto the LED screen can shift with 3D parallax, which adds hugely to the realism. There is one more cool thing about this method, which we will look at in a moment.

Physical Set (Foreground)

Now that we’ve covered the LED screen, we move to what is inside ‘the volume’. Remember, the volume is the area within the huge screen. Oftentimes a chunk of set will be built to use alongside the LED screen, which adds an element of physical realism to the scene. This section is usually quite small and could be a piece of a building, a desert, or even an alien planet; as long as the set matches the background, it should work. Everything on this set is treated much like a traditional set, with lights, props, and everything else. You won’t need as much lighting on this set, but we’ll look more at that later on.

How We Record All This

Now on to a huge part of virtual production: the camera. There isn’t anything hugely different about the camera itself, but the rig it is used with is quite interesting. The rig has an interesting purpose: it has to track the camera’s movement. We’ll look at how in a minute, but now is the time to think about why.

Why would you need live camera tracking? First, what is it? Live camera tracking is a technology that takes the 3D location and rotation of a physical camera and feeds that information into a 3D program in real time (as it is happening). Why is this useful? Because if you already have a physical set and a corresponding 3D set, why not have a physical camera and a corresponding 3D camera? If you have a 3D scene, you can put a virtual camera in that scene and project its view onto the LED screens. That is all great, but the moment you look at the screen from a different angle, the effect is ruined, because there is no parallax; all you can see is a flat 2D image on the screen. This means you can’t have camera motion, because that would ruin the illusion, unless live camera tracking is used. That changes the situation.

Now, if you remember, we have a camera whose location and rotation are being transmitted to our software on the computer. That means we have the data to create a 3D object that mirrors the motion of a physical object: the camera. So now we can take our 3D camera and have it mirror the motion of our real camera. It might not seem like much, but what I just described is among the greatest breakthroughs in the history of visual effects and filmmaking: a near-perfect marriage of the physical world and the virtual.
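In code, the idea is simpler than the engineering behind it. Here is a minimal sketch of that mirroring loop in Python, assuming a hypothetical tracking system and engine: `tracker`, `virtual_camera`, and the `Pose` structure are stand-ins for illustration, not any real product’s API.

```python
import time
from dataclasses import dataclass

@dataclass
class Pose:
    """One tracking sample: position in metres, rotation in degrees."""
    position: tuple  # (x, y, z)
    rotation: tuple  # (pitch, yaw, roll)

def sync_virtual_camera(tracker, virtual_camera, fps=24):
    """Mirror the physical camera's motion onto the virtual camera.

    `tracker` and `virtual_camera` are hypothetical stand-ins for the
    real tracking hardware and the camera object inside the 3D engine.
    """
    frame_time = 1.0 / fps
    while True:
        pose = tracker.latest_pose()                  # where the real camera is right now
        virtual_camera.set_position(*pose.position)   # move the 3D camera to match
        virtual_camera.set_rotation(*pose.rotation)   # and turn it the same way
        # The engine then renders the background from this viewpoint,
        # and that render is what appears on the LED wall.
        time.sleep(frame_time)
```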

Now we have a camera looking at an LED screen, and the content projected onto the screen is moving in relation to the camera. Because the 3D camera is moving around in a three-dimensional world, you get the effects of motion blur and parallax. We will look at more advantages in the next post.

Now to take a brief look at the technology that makes this possible. It basically uses technology similar to a virtual reality controller (the thing you hold in your hands), but more advanced, to work out the camera’s location and other information. Often there are tracking base stations placed around the stage to improve accuracy. If you think about augmented reality on a phone, it’s like that: the phone can detect its own movement and send that data to an application. The same applies to camera tracking, except that there is usually an entire piece of hardware dedicated to the purpose. It is important that the tracking is fast and reliable, because if the background isn’t moving in sync with the camera, the effect is ruined.
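To give a feel for why speed matters, here is a small hypothetical Python sketch that checks how old each incoming tracking sample is before it is used; the `TrackingSample` structure and the 10-millisecond threshold are illustrative assumptions, not taken from any specific tracking system.

```python
import time
from dataclasses import dataclass

@dataclass
class TrackingSample:
    timestamp: float   # when the sample was captured, in seconds
    position: tuple    # (x, y, z) of the camera
    rotation: tuple    # (pitch, yaw, roll) of the camera

MAX_LATENCY = 0.010    # illustrative 10 ms threshold, not a real spec

def is_fresh(sample, now=None):
    """Return True if the tracking sample is recent enough to use.

    A stale sample means the background lags behind the real camera,
    which is exactly what breaks the illusion on the LED wall.
    """
    now = time.monotonic() if now is None else now
    return (now - sample.timestamp) <= MAX_LATENCY
```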

That’s about it for this post. Remember that we use images projected onto a huge LED screen as a background for a physical set. Often 3D worlds and real time camera tracking are used.

Video on Virtual Production
