A Brief Study of Innovative Technologies – Part I

“If I had asked the public what they wanted, they would have said a faster horse.”

Henry Ford

You might be wondering what horses have to do with VFX. The answer is: absolutely nothing. That is not the part you should be focusing on. Look at the quote again. Sometimes people just like the way things are. Ask them how to make their job easier and they’ll say something like: “Well, if someone made the tools faster, maybe I could do my job faster”. VFX artists may think that if the computers or VFX programs got faster, so would they. But jobs are significantly sped up by new and innovative technologies, not by incremental speed. Just as Henry Ford revolutionized the motorcar, many others have invented and adopted tools that speed up the VFX pipeline. That includes studios like ILM.

ILM has received over 24 Scientific and Technical Awards from the Academy of Motion Picture Arts and Sciences (AMPAS). Does that sound like a studio that is staying the same and shunning change and innovation? On the contrary, it shows that those at VFX studios strive for change and discovery. Just because VFX is a highly technical field doesn’t mean there is no room for technical growth.

In this study, we’ll look at some of the newer and more exciting VFX methods.

The first of these is motion capture, which we have already had a quick look at. Motion capture has brought many iconic characters to life, but how did this amazing technology come about? Mocap is associated with 3D work today, but a form of it existed long, long ago, back when it served 2D animation. Animators who wanted realistic movement turned to reality itself: they filmed live performers and traced over the footage frame by frame, a technique known as rotoscoping. This is technically an early form of mocap, since it captures a real performance and turns it into animation.

Now, this process has become simpler and, at the same time, far more complicated. A huge amount of it has been automated: you can now watch, in real time, a mesh or skeleton mimicking the actions of the actor. Most systems use a suit. Some suits carry inertial sensors, while others carry reflective markers for cameras to track. The cheaper and more common method uses the sensors. These are like little trackers/accelerometers that sense gravity and motion; most suits have around 20 of them, working together to create an accurate record of the performance. This is how characters like Gollum and the Hulk came to life. But those characters have another attribute we haven’t yet looked at: they talk. You can have a character with the most realistic motion on the planet, and it’s all worth nothing if you have a lifeless face.
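To make the “sensing gravity” idea concrete, here is a minimal Python sketch of how a single accelerometer reading can be turned into a tilt estimate for the body segment it is strapped to. The numbers and function are illustrative only; real suits fuse this with gyroscope data and per-joint calibration.

    import math

    # A minimal sketch: estimate a body segment's tilt from one accelerometer
    # reading. When the sensor is still, gravity dominates the measurement,
    # so the direction of the measured vector reveals the segment's orientation.
    # Real suits fuse this with gyroscope data; the values below are made up.

    def tilt_from_accel(ax, ay, az):
        """Return (pitch, roll) in degrees from a raw accelerometer reading."""
        pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
        roll = math.degrees(math.atan2(ay, az))
        return pitch, roll

    # Sensor lying flat: gravity (9.81 m/s^2) points straight down its z axis.
    print(tilt_from_accel(0.0, 0.0, 9.81))    # ~ (0.0, 0.0)
    # Same sensor pitched up 45 degrees:
    print(tilt_from_accel(-6.94, 0.0, 6.94))  # ~ (45.0, 0.0)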

Facial movement adds the sauce a character needs to feel real. The problem is that faces carry a huge amount of detail, too much to be captured from a distance. You need a high-resolution camera close to the face to capture it all. This is where the head-mounted camera comes into the picture: a camera attached to a helmet on the actor’s head. These systems usually track dots on the actor’s face, and the more dots there are to track, the more accurate the track will be. After the main performance capture, 3D artists and animators still have to fix and edit the animation to make it more lifelike. Even though the data is captured from a human, there will be minor glitches to fix. Sometimes the animation will be jerky and seem robotic; in that case smoothing helps, but manual adjustments are still needed.
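As an illustration of that smoothing step, here is a toy Python sketch that runs a moving-average filter over jittery animation samples. It’s only a sketch: production tools use far more sophisticated filters and editable curves, and animators still adjust the result by hand.

    # A toy moving-average filter over jittery animation samples. Each output
    # value is an average of its neighbours, which damps the jitter.

    def smooth(samples, window=5):
        """Return a moving-average-smoothed copy of a list of keyframe values."""
        half = window // 2
        out = []
        for i in range(len(samples)):
            lo = max(0, i - half)
            hi = min(len(samples), i + half + 1)
            out.append(sum(samples[lo:hi]) / (hi - lo))
        return out

    # Jittery joint-rotation samples (degrees) from a captured performance:
    raw = [10.0, 12.5, 9.8, 13.1, 10.2, 12.8, 11.0]
    print(smooth(raw))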

Another incredible technology that has been developed is virtual production. I have touched on this before, but let’s take a quick look. Virtual production (more specifically, ILM’s StageCraft) was really made famous by the show “The Mandalorian”. There are many technologies involved in virtual production. One of them is virtual reality. VR is often associated with games, but it can also be used to work with scenes in real time: it provides a quick and easy way to view a 3D scene with dynamic motion. If you are making a movie set in a location, it’s always best to be at that location, and VR allows filmmakers to be present in virtual ones.

Augmented reality, or AR, is another huge one. AR is basically producing real-time VFX and overlays, usually using a phone with a camera; with professional film gear, you can really take this to the next level. Virtual production has been used for sports broadcasts and is popular in the television industry precisely because it is real time: you can view, edit, and stream 3D work as it happens. The reason it stayed away from the film industry is that the results were stylized rather than realistic. Remember, realism is something all VFX strives for, so why bring in a technology known for its stylized look and feel? Well, realism is complicated to achieve, while it’s relatively easy to create something that is obviously meant to look fake. Not long ago it was very hard to create realistic scenes that could be viewed in high quality in real time. That all changed when people found ways to use the same tools that had previously been used to create realistic games to make VFX and virtual production.

Another term for this kind of production is XR, or Extended Reality. XR is made up of many elements. Usually you have a real camera, a set, and an LED screen. The LED screen is wrapped around the set to provide a dynamic background: you project a 3D world onto it and build a physical set to match that world. The only problem is that if you move the camera, the parallax breaks, ruining the illusion.
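Here is a tiny Python sketch of why that happens, using the standard first-order approximation that a point’s apparent shift on screen is proportional to the camera move and inversely proportional to the point’s distance. The numbers are made up for illustration.

    # Why parallax breaks on a flat screen: points at different depths should
    # shift by different amounts when the camera moves, but every pixel on an
    # LED wall sits at the wall's depth. To first order, a point's image shift
    # is proportional to the camera move and inversely proportional to depth.

    def apparent_shift(camera_move, depth, focal_length=35.0):
        """Approximate image shift (arbitrary units) of a point at `depth`
        when the camera translates sideways by `camera_move` metres."""
        return focal_length * camera_move / depth

    move = 0.5  # the camera slides half a metre to the right
    for depth in (2.0, 10.0, 100.0):  # near prop, mid-ground, distant mountain
        print(depth, apparent_shift(move, depth))
    # Near objects shift a lot; distant ones barely move. Unless the wall's
    # image is re-rendered for the new camera position, everything shifts
    # together and the illusion collapses.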

This is where trackers come into play: another technology that was originally used exclusively for gaming and VR. These trackers translate a real-world object’s coordinates into 3D space. When you see someone playing a virtual reality game, you often see them holding two of these trackers, one in each hand, so that the game knows what their hands are doing and where they are in 3D space. This ability to translate real-world position into 3D space has innumerable uses, one of which is real-time camera tracking.

If you attach one of these trackers to a camera, you can make the 3D camera copy the movement of the real-world camera in real time, so the background moves in proportion to the real camera’s movement. This adds a tremendous amount of flexibility and opportunity to Extended Reality.
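To show the shape of that idea, here is a minimal Python sketch of the per-frame loop: read the tracker’s pose and copy it onto a virtual camera. The tracker function is a made-up stand-in, not a real SDK call.

    import time

    # The per-frame loop of real-time camera tracking: read the tracker's pose
    # and copy it onto the virtual camera. `read_tracker_pose` is a made-up
    # stand-in for a real tracker SDK; it fakes a slow sideways dolly move.

    class VirtualCamera:
        def __init__(self):
            self.position = (0.0, 0.0, 0.0)  # metres
            self.rotation = (0.0, 0.0, 0.0)  # Euler angles, degrees

    def read_tracker_pose():
        """Stand-in for a tracker SDK call; returns (position, rotation)."""
        t = time.time() % 10.0
        return (t * 0.1, 1.7, 0.0), (0.0, t * 2.0, 0.0)

    camera = VirtualCamera()
    for _ in range(3):  # in production this runs once per rendered frame
        camera.position, camera.rotation = read_tracker_pose()
        print(camera.position, camera.rotation)
        time.sleep(1 / 60)  # ~60 fps update loop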

You can now have actors on a small set, surrounded by LED panels displaying a virtual world, while a moving camera makes the scene more dynamic. Those huge LED panels also cast light on both the set and the actors. The overall effect is a convincing one, and not only convincing but helpful: you can now film on location without changing your location. Truckloads of cast, crew, gear, and the rest of the list can set up in an indoor studio and start rolling. Instead of driving to another location, someone just opens the next file on a computer. This can really save time while keeping great realism for both the actor’s performance and the audience.
