What would a movie be without VFX? In this deep dive, we look at the stages involved in the art that shapes what we see on screen. Welcome to this ultimate guide to the VFX Pipeline.
Here’s a diagram of the VFX Pipeline to study:
What is the VFX Pipeline?
Have you ever seen a video of the inside of a factory? If so, you have probably noticed the assembly lines that many factories use. The VFX Pipeline is like an assembly line for the VFX shots within a project. There are many steps needed to create the stunning visuals that we get to see in the cinema.
The VFX Pipeline is aligned with how far along the film is in the filmmaking process. Think about it: if shooting hasn't begun yet but the VFX artists already need footage to work with, we have a problem. Because of this, the VFX Pipeline is paced to keep in step with the filmmaking process.
The Filmmaking Process is divided into 3 phases:
Pre-Production:
This is what happens before the actors even set foot on set. Pre-production is a surprisingly important part of the pipeline, full of planning and storyboarding.
Production:
This is the part where the actors shine as the actual footage is recorded. What place does the VFX team have in the production stage of a film? The sheer magnitude of their role might shock you!
Post-Production:
Editing time! At this point, all the talented editors, colorists, and sound designers step in to create something truly spectacular. This is where the magic happens. Most parts of the VFX Pipeline happen during this stage.
Now let’s look at each step of the Pipeline.
Planning
We begin at the Pre-Production phase of the pipeline. At this point there are no fancy simulations or complex CGI scenes, but failure here could endanger the entire project.
Planning is a super important part of the pipeline, and there are plenty of methods that are used in the planning of a VFX project. Let’s take a look at some of those now:
Previs/Storyboarding
Have you ever started a filmmaking project and then realized that you had no idea what you wanted your shots to look like? This could become a major problem for any film studio trying to produce a project. It would mean that precious time would be wasted on set while the crew decided on lighting and camera angles.
Fortunately, there are tools to make sure this never happens, and they play a major part in the VFX Pipeline. These are called pre-visualization (previs) tools. They help the VFX artists and filmmakers organize and visualize their ideas for shots and sequences throughout the film.
Storyboards:
Often there are many previs artists working to create material such as storyboards and animations. I'm sure you've seen a storyboard before. A storyboard is made up of a series of cells with artwork drawn to depict a certain scene or shot within a film. They are helpful for planning camera angles and lighting setups. They also help a VFX team see which effects will be needed for each shot. This means the team can plan for whatever technology (like a green screen or a performance capture suit) will be needed. This is extremely important in making sure that the production goes smoothly.
Animations:
This planning does not only take place in the form of storyboards; animations are also often used. Animations can show the movement of characters and cameras far more accurately than a storyboard or drawing, so they are often used for complex action scenes with lots of movement. These aren't Pixar-level animated feature films, but they get the job done.
Learn about someone who does a lot of planning in the VFX Pipeline.
Production
You might be thinking to yourself: what place does a VFX artist have in the filming stage of a movie? Maybe you thought that a VFX artist spent all their time behind a desk, and if you thought so, you technically would not be wrong. But members of the VFX team do in fact spend time on the set of a movie. Imagine what would happen if the crew (who knew nothing about VFX) had to shoot the whole movie without knowing how any of the VFX was going to be made. It would be a nightmare both for those on set and for those who had to try and create visual effects with inadequate information about the scale of the sets and the lighting being used. VFX is all about small details, and having a team from the VFX department on set ensures that all those details can be recorded and captured.
Coordination
A word you see a lot in the credits of movies is coordinator. The reason for this is that a huge amount of coordination is necessary for a movie to be made. Things like motion capture have to happen, which means that a system has to be set up. Someone needs a computer with the right software on it to receive and record data from the mocap suit. All of that software should already be installed, tested, and learned.
Think about a shot where, for example, the VFX team had to extend a small set. The film crew alone wouldn't know what was needed to make the shot possible. Maybe they knew to set up a blue screen, but as soon as the camera started to move, the problems would begin. The VFX team would need to track the scene, and without good tracking markers that would be nearly impossible. With VFX artists on set, all of this information is known and prepared for in advance. In a later section we will look at these concepts applied in practice.
Creating Digital Elements
Oftentimes, VFX projects need some kind of digital assets. This could be almost anything:
- A Digital Double of an Actor
- A CGI Environment or Set Extension
- A Stylized Character
- A Creature
- A Vehicle
These can all be created by hard-working 3D artists. They are the same types of people who would work on a Pixar movie, but they often have more of a focus on realism. In the video below, we dive into how this is done. By mastering these core techniques, artists can contribute significantly to the creation of compelling and immersive visual effects.
We are starting to gather the ingredients of a perfect VFX shot but something is missing. Motion.
Applying Motion
Imagine having a fancy car that just looked stunning. You show it off to all your friends and for a while, they are impressed. Eventually, one of them asks for a ride in the car. “Unfortunately this car came with no engine”, you reply. At this point, your car is suddenly far less cool than it was a minute ago.
The same applies to any digital element that lacks motion. A character without motion is dead, and many scenes without motion lack visual interest. This goes to show the importance of motion in a VFX shot. Let's briefly look at some methods of applying motion.
Animation
3D keyframe animation is a fundamental part of Visual Effects. You have probably seen this method of applying motion in an animated Pixar film. Let’s look at the two main elements of this method now.
- Keyframes
- Keyframes are (as the name suggests) key to making an animation. They act as defining points in an animation. A keyframe records the properties of an object in a 3D scene at a specific point in time, and those properties can change from keyframe to keyframe. This could be the location and scale of an object for motion, or the brightness of a light for effect.
- Inbetweens
- Generally it would be too time-consuming to create keyframes on every frame of an animation (typical animated shots can be thousands of frames long). This is where inbetweens step in. These are the frames in between the keyframes, and they are generated by the computer. They provide the motion needed to get smoothly from one keyframe to the next.
With a combination of these two elements, animators can create clean, complex, and complete animations.
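To make this concrete, here's a minimal sketch (in Python, the scripting language used across much of the VFX industry) of how a computer might generate an inbetween: it simply blends an object's properties between two keyframes. Real animation software uses far smoother interpolation curves, so treat this as the idea rather than the implementation:

```python
# A minimal sketch of inbetweening: linearly blend an object's
# properties between two keyframes. Real software uses smoother
# interpolation curves (e.g. Bezier), not just straight lines.

def inbetween(key_a, key_b, frame):
    """Blend the property values of two (frame, properties) keyframes."""
    frame_a, values_a = key_a
    frame_b, values_b = key_b
    t = (frame - frame_a) / (frame_b - frame_a)  # 0.0 at key A, 1.0 at key B
    return {prop: values_a[prop] + (values_b[prop] - values_a[prop]) * t
            for prop in values_a}

# Two keyframes: an object's x location and a light's brightness.
key_a = (1,  {"location_x": 0.0, "brightness": 1.0})
key_b = (25, {"location_x": 10.0, "brightness": 0.2})

print(inbetween(key_a, key_b, 13))  # halfway: {'location_x': 5.0, 'brightness': 0.6}
```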
Simulation
This is one of my favorite elements of the VFX Pipeline. It is a method that brings both unique and realistic motion to a VFX shot. It involves real-world physics and can be scaled to a massive level, used for anything from a glass smashing to a tsunami destroying a whole city. There are a few things you can simulate:
Cloth
Cloth simulations can be used to turn any 3D geometry into its cloth equivalent. This cloth can blow in the wind or fall onto the ground. This could be used to make a flag or some clothing.
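As a taste of how this works under the hood, here's a tiny sketch of the core idea: points connected by distance constraints, nudged by gravity each frame. A real cloth solver applies the same idea to a whole 2D grid of points, so this is only the one-dimensional skeleton of the technique:

```python
# A toy "cloth" in one dimension: a hanging chain of points joined by
# distance constraints, integrated with simple Verlet physics.

GRAVITY = -9.8
DT = 1.0 / 24.0        # one film frame per simulation step
REST_LENGTH = 0.5      # the distance each link tries to keep

# Each point stores its current and previous height.
points = [{"y": -i * REST_LENGTH, "prev_y": -i * REST_LENGTH} for i in range(5)]

for frame in range(48):                  # simulate two seconds
    for p in points[1:]:                 # point 0 stays pinned in place
        velocity = p["y"] - p["prev_y"]  # implied by the last two positions
        p["prev_y"] = p["y"]
        p["y"] += velocity + GRAVITY * DT * DT
    for _ in range(5):                   # relax the distance constraints
        for i in range(len(points) - 1):
            a, b = points[i], points[i + 1]
            correction = -REST_LENGTH - (b["y"] - a["y"])
            if i == 0:
                b["y"] += correction     # don't move the pinned point
            else:
                a["y"] -= correction / 2
                b["y"] += correction / 2

print([round(p["y"], 2) for p in points])  # settles near its rest shape
```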
Destruction
Using basic physics, a computer can calculate damage to 3D geometry. It can be broken into chunks and thrown across the screen. This is used in superhero films when the villain is at work, and it can be combined with the next types of simulation to create convincing destruction.
Particles
In order to create effects like fireworks, rain, and leaves blowing in the wind, VFX artists use particle systems. Particles are a simple yet flexible way to control a large number of 3D objects. In a particle simulation there are settings that control the behavior of all the particles. This, combined with randomization, can be an easy way to control a lot of moving objects within a 3D scene, whether that's bricks and dust during an explosion or leaves and papers during a storm.
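Here's a little sketch of that idea: one set of rules plus some randomization drives every particle, so hundreds of objects need no individual animation. The numbers are made up purely for illustration:

```python
# A minimal particle system: a few simple rules plus randomization
# control many objects at once (here, "leaves" falling from an emitter).
import random

GRAVITY = -9.8
DT = 1.0 / 24.0  # one film frame per step

particles = []
for frame in range(48):
    # Emit three new particles per frame with randomized velocities.
    for _ in range(3):
        particles.append({
            "pos": [0.0, 5.0],  # the emitter's location
            "vel": [random.uniform(-2.0, 2.0), random.uniform(0.0, 1.0)],
        })
    # The same rule moves every particle, whatever it represents.
    for p in particles:
        p["vel"][1] += GRAVITY * DT
        p["pos"][0] += p["vel"][0] * DT
        p["pos"][1] += p["vel"][1] * DT

print(f"{len(particles)} particles in flight after 48 frames")
```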
Smoke and Fire
There are some people who seem to think that explosions and fire make any scene better. While this isn't the best philosophy, I sort of see what they mean. There is nothing that pleases me more than a well-executed explosion. Because of the chaotic nature of smoke and fire, hand animation isn't a practical method for this type of effect. Simulations are the way to go, as they are great at handling this kind of dynamic motion.
Fluid
You can also simulate water and other fluids flowing in and around a scene full of objects. This can be helpful when it comes to floods or rivers. The sim artists have to define a domain object that contains the virtual water. They can then choose an object to either become water or produce a flow of water, and pick other 3D objects for the water to collide with. Water simulations are becoming so good that it is hard for an untrained eye to tell them apart from real water in a film.
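Purely as an illustration of that workflow (the names here are hypothetical, not any real software's API), the setup might be expressed as data like this:

```python
# Illustrative only: the fluid-sim setup described above as plain data.
# The names are hypothetical; tools like Blender or Houdini each have
# their own interface for the same three roles.
from dataclasses import dataclass, field

@dataclass
class FluidSetup:
    domain: str                                     # contains the virtual water
    inflows: list = field(default_factory=list)     # objects producing water
    colliders: list = field(default_factory=list)   # objects the water hits

setup = FluidSetup(domain="tank")
setup.inflows.append("faucet")              # this object emits a flow of water
setup.colliders.extend(["rocks", "boat"])   # the water collides with these
print(setup)
```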
As you can tell, the possibilities with simulations are near endless.
Tracking
Tracking could be the most realistic method of applying motion, because it involves extracting motion from reality. This is made possible by computer algorithms: the artist chooses a point in the footage (a high-contrast point works best) for the computer to follow through the shot. This data can then be used to create a set of keyframes that drive the motion of an object in the 3D scene.
This means we can get the most realistic motion possible because it is real motion that actually happened.
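Under the hood, a simple tracker can work by searching each new frame for the patch of pixels around the chosen point. Here's a minimal sketch using plain NumPy; production trackers are far more sophisticated and robust than this:

```python
# A bare-bones 2D point tracker: find where the pixel patch around
# (x, y) moved to in the next frame by minimizing squared differences.
import numpy as np

def track_point(prev_frame, next_frame, x, y, size=8, search=12):
    """Return the point's new (x, y) position in the next frame."""
    patch = prev_frame[y - size:y + size, x - size:x + size].astype(float)
    best_score, best_xy = np.inf, (x, y)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ny, nx = y + dy, x + dx
            candidate = next_frame[ny - size:ny + size,
                                   nx - size:nx + size].astype(float)
            if candidate.shape != patch.shape:
                continue  # the search window ran off the frame edge
            score = np.sum((candidate - patch) ** 2)
            if score < best_score:
                best_score, best_xy = score, (nx, ny)
    return best_xy  # this becomes a keyframe for the current frame

# A bright dot that moves 3 pixels to the right between two frames.
prev = np.zeros((64, 64)); prev[30, 30] = 255.0
nxt = np.zeros((64, 64)); nxt[30, 33] = 255.0
print(track_point(prev, nxt, 30, 30))  # -> (33, 30)
```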
Lighting
There are a few ways to light a scene. Each has its own benefits.
The first method is manual lighting, which involves positioning 3D virtual lights in your scene to create a desired effect. These lights have settings to change their color and intensity, and they can be used to create almost any lighting setup imaginable. This is great for creating customized cinematic lighting in shots that are mainly CGI.
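To give a feel for what those settings actually do, here's a tiny sketch of how one virtual light brightens a surface point: intensity falls off with the square of distance, and surfaces facing the light directly receive more of it (Lambert's cosine law). Renderers do far more than this, but this is the core:

```python
# How a single virtual light affects one surface point: inverse-square
# falloff plus Lambert's cosine law for how directly the surface faces it.
import math

def light_contribution(light_pos, intensity, point, normal):
    dx, dy, dz = (l - p for l, p in zip(light_pos, point))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    lx, ly, lz = dx / dist, dy / dist, dz / dist   # direction toward the light
    facing = max(0.0, lx * normal[0] + ly * normal[1] + lz * normal[2])
    return intensity * facing / (dist * dist)

# A light two units above a floor point that faces straight up.
print(light_contribution((0, 2, 0), 100.0, (0, 0, 0), (0, 1, 0)))  # 25.0
```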
It would be time-consuming to set up manual lights to recreate the lighting from an existing piece of footage, and this is something that has to happen often. Enter 360° imagery.
VFX artists use 360° High Dynamic Range Images (HDRIs) to light scenes. These capture color and lighting data from a set or location, which is valuable for added realism when lighting assets.
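The reason a flat 360° image can light a whole scene is that every direction in the world maps to exactly one pixel of the image, so the renderer can look up the incoming light for any direction. Here's a sketch of that lookup using the common latitude-longitude mapping (exact conventions vary between tools):

```python
# Map a 3D direction to a pixel in an equirectangular (lat-long) HDRI.
# Axis conventions differ between renderers; this is one common choice.
import math

def hdri_pixel(direction, width, height):
    """Return the (column, row) that a normalized direction looks up."""
    x, y, z = direction
    u = 0.5 + math.atan2(x, -z) / (2 * math.pi)  # longitude, 0..1
    v = 0.5 - math.asin(y) / math.pi             # latitude, 0..1
    return int(u * (width - 1)), int(v * (height - 1))

# Looking straight ahead lands in the middle of a 2048x1024 HDRI.
print(hdri_pixel((0.0, 0.0, -1.0), 2048, 1024))  # -> (1023, 511)
```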
Once the assets are created, motion is added, and they are nicely illuminated, it’s time to render them out.
Rendering
We have all these assets with nice motion and good lighting, but they're all stuck in the computer as 3D models. We need them as videos and images that can be used in the movie. This is where rendering comes in. Rendering requires a camera, and this camera's view determines what is rendered. A virtual camera has settings just like a real camera, and these can be adjusted to match the real camera used in the rest of the film or sequence. This is very important if the rendered assets are going to be composited on top of footage: if the camera angles and camera settings don't match, the result is visual dissonance.
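As a small example of why those settings matter: focal length and sensor size together determine the field of view, and if the CG camera's doesn't match the real one's, rendered elements won't line up with the footage. A quick check might look like this:

```python
# Field of view from focal length and sensor width: the reason a CG
# camera must copy the real camera's settings to line up with footage.
import math

def horizontal_fov(focal_length_mm, sensor_width_mm):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A 35mm lens on a full-frame (36mm wide) sensor.
print(round(horizontal_fov(35, 36), 1))  # ~54.4 degrees
```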
Once these settings are ready, it's time to render. Renders are usually output as image files (like .png or .exr files). If what is being rendered is a video, then an image sequence is used: a folder with every frame of the shot rendered separately. This enables an alpha channel and ensures maximum quality.
In a feature film, VFX renders have to be very high resolution, which takes an incredibly long time. For this reason, studios use render farms to speed up this step. A render farm is a network of computers with high-end graphics cards (GPUs) working together to render quickly. If a frame takes 5 hours to render and there are 100 frames, it will take 500 hours on one computer. With 100 computers it will only take 5 hours, as each one can render a single frame. Rendering leaves us with images that are almost ready to put in the final movie, but they aren't quite there yet.
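That arithmetic generalizes into a tiny helper, assuming every frame takes the same time and the farm splits frames evenly (real farm schedulers are smarter than this):

```python
# Back-of-the-envelope render-farm timing: total time is set by the
# machine that has to render the most frames.
import math

def render_time_hours(frames, hours_per_frame, machines):
    frames_per_machine = math.ceil(frames / machines)
    return frames_per_machine * hours_per_frame

print(render_time_hours(100, 5, 1))    # 500 hours on a single computer
print(render_time_hours(100, 5, 100))  # 5 hours across the whole farm
```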
Did you know that they don’t always follow this method of making VFX shots? Read more about a new way of doing things that is changing the game here.
Compositing
At this point we probably have some real footage, some CGI elements, and some greenscreen footage. These are all separate, but we want one final shot to send to the editors of the film. Time for some compositing.
Compositing is the process of combining images, videos, and CGI renders to create one composite image or video. Composite means that it is made up of multiple parts. At this point, video effects are also used to create a polished final shot: greenscreens and tracking markers are removed, colors from renders are tinted to match the footage, and images are merged and blurred for effect.
There are two major types of compositing systems:
Layer-Based Compositing
In this type of compositing, assets are layered over each other, as the name suggests. This is similar to your classic video editor with its video tracks: you can add text on the track above the main video track to make a title. In software like Adobe After Effects, the same is true. You can put a greenscreen shot over a background shot and remove the green to create a background replacement effect. This is a very intuitive system to use, but it can come with limitations and performance issues.
In the above image you can see how the interface is similar to a video editor.
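At the heart of all this layering sits one small formula, the "over" operation: the foreground's alpha (transparency) decides how much of the background shows through each pixel. Here's a minimal sketch, using premultiplied color as most compositors do:

```python
# The "over" operation behind layer-based compositing: blend a
# premultiplied foreground pixel onto a background using its alpha.
def over(fg, fg_alpha, bg):
    return fg + bg * (1.0 - fg_alpha)

# A half-transparent red pixel (premultiplied) over a solid blue one.
fg_rgb, fg_a = (0.5, 0.0, 0.0), 0.5
bg_rgb = (0.0, 0.0, 1.0)
print(tuple(over(f, fg_a, b) for f, b in zip(fg_rgb, bg_rgb)))  # (0.5, 0.0, 0.5)
```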
Node-Based Compositing
Node-based compositing is built around a network of small boxes, or nodes, that represent assets or effects. Each one has inputs and an output. The output of a footage node can be connected to the input of a chroma-keying node, which can then be connected to a merge node. The merge node has two inputs and one output, combining its inputs into a single image. This method is very different from a layer-based system, but the results are much the same.
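To make the wiring concrete, here's a toy sketch of the idea in Python. Real compositors like Nuke pass images between nodes rather than strings, but the shape of the network is the same:

```python
# A toy node graph: asking a node for its output pulls on its inputs,
# all the way up the chain, exactly like a compositor evaluating a tree.
class Node:
    def __init__(self, name, inputs=()):
        self.name, self.inputs = name, list(inputs)

    def output(self):
        if not self.inputs:
            return self.name
        upstream = ", ".join(n.output() for n in self.inputs)
        return f"{self.name}({upstream})"

footage = Node("GreenscreenFootage")
key = Node("ChromaKey", [footage])   # removes the green
plate = Node("BackgroundPlate")
merge = Node("Merge", [key, plate])  # two inputs, one output

print(merge.output())  # Merge(ChromaKey(GreenscreenFootage), BackgroundPlate)
```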
In this system of compositing you can see all the elements of a shot in a very visual way. The performance is great, and it is very easy to go frame by frame and make adjustments. There is so much more to learn about nodes (and layers), which you can access in the video below.
At the end of the compositing stage you have a completed shot, ready to add to a movie. I hope you enjoyed this Ultimate Beginners Guide to the Visual Effects Pipeline. It was quite a journey! If you want to study this in a different form, head over to UNMASK+ to watch the video series I am working on.
Cheers,
Samuel