You may not like westerns set in space, but The Mandalorian has certainly been a cultural phenomenon, bringing the cinematic universe of Star Wars into the realm of episodic TV.
Whilst it’s kept the fans happy, there’s a buzz about the series in the industry too. That’s because of its use of several new ‘virtual production’ techniques. There is now an inexorable movement towards wider adoption of virtual production, amplified by the pandemic and the need for fewer people on set.
The ABC of Virtual Production
Virtual production can be defined as a range of technologies that allow computer-generated imagery (CGI) and live footage to be assembled or combined on screen in real time, on the physical set itself. It means the director on set knows what they are going to get, rather than waiting for the VFX to be added later, or seeing an edit weeks afterwards and being unable to change things.
Virtual production is actually the confluence of several developments: real-time games software reaching a certain level of visual fidelity and being adopted beyond the games industry; domestic VR and positional-tracking technologies allowing camera movement to be fed in real time to computer-generated scenes; and, just maybe, a need to cut down on the number of human beings congregating on film sets.
If we look at the history of media, you’ll often find that technologies start off in the hands of big companies alone, and only become widespread when there is a groundswell of artists and inventive technicians taking the basic principles and pushing them further.
The Use of Virtual Production in Industry
More and more major feature films are festooned with CGI assets interacting with live footage. There has been a demand, mainly from the director and the DoP (Director of Photography), to see those interactions happening live in front of them on set. The old method of filming an actor in a sea of blue or green screen was frustrating, as was the restriction of keeping a certain camera angle locked because the CGI had been signed off and was being rendered elsewhere, possibly in another country.
Then whole scenes became mainly CGI: think of The Jungle Book (Favreau, 2016), with Mowgli dashing through vegetation and landslides when only a few sparse props existed in the studio. Directors like Jon Favreau demanded some way to control the virtual camera that would supply the angles, perspective and movement of the background CGI. The freedom to move the camera on set and see the virtual assets move in sync gives the result a more live-action feel.
“At that point we were starting to mess around – at the end of Jungle Book – with this consumer-facing VR equipment and started to develop camera tools, creating basically a multiplayer filmmaking game in VR,” stated Favreau at SIGGRAPH 2019.
Move forward a couple of years via The Lion King (2019) and Favreau is the executive producer on The Mandalorian. That’s a short amount of time for the technology to trickle down from huge feature budgets to a TV series.
No-one likes doing green screen; creatives would rather film things through the lens in situ, whether on set or on location, because then what you see is what you get. One alternative solution that is evolving is the LED wall, or ‘Volume’. Essentially, an arced wall of seamless LED panels or back projections adorns the set in place of a green screen. The live camera’s settings are communicated to the virtual camera in real time, so as the physical camera moves, the CGI background synchronises.
As you change focus or view on the real camera, the CGI wall recalibrates in real time. This is often preferred because the actors are bathed in real, interactive light from the LED background, rather than drenched in what is often called green spill – green light bouncing onto the actors that has to be removed later in costly post-production. In cases where characters are quite reflective – think of the Mandalorian’s shiny armour – this LED interaction can work well, adding lighting realism. Actors also feel they have a scene to respond to emotionally.
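For the more technically minded, the sync described above can be thought of as a simple per-frame loop: sample the physical camera’s tracked pose and lens settings, then mirror them onto the engine camera that renders the wall. The sketch below is purely illustrative – all class and field names are hypothetical stand-ins, not any real engine’s API.

```python
# A minimal sketch (hypothetical names throughout) of the per-frame sync an
# LED volume relies on: the tracked physical camera drives the virtual one.
from dataclasses import dataclass

@dataclass
class CameraState:
    """Pose and lens data sampled from the physical camera's tracker."""
    position: tuple          # (x, y, z) in stage space, metres
    rotation: tuple          # (pitch, yaw, roll) in degrees
    focal_length_mm: float   # current zoom
    focus_distance_m: float  # current focus pull

class VirtualCamera:
    """Stand-in for the game-engine camera rendering the wall content."""
    def __init__(self):
        self.state = None

    def apply(self, state: CameraState):
        # In a real engine this would update the render camera's transform
        # and lens model so parallax and depth of field match the live lens.
        self.state = state

def sync_frame(tracked: CameraState, virtual: VirtualCamera) -> CameraState:
    """One tick of the loop: mirror the physical camera onto the virtual one."""
    virtual.apply(tracked)
    return virtual.state

# The operator dollies in and racks focus; the background camera follows,
# so the CGI parallax stays locked to the physical move.
vcam = VirtualCamera()
frame = CameraState(position=(0.0, 1.6, -2.0), rotation=(0.0, 15.0, 0.0),
                    focal_length_mm=35.0, focus_distance_m=3.5)
synced = sync_frame(frame, vcam)
print(synced.focal_length_mm)  # 35.0
```

In production this loop runs at the camera’s frame rate, which is why any latency between the physical move and the wall’s redraw is so carefully engineered out.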
There still aren’t many LED walls around, but they are coming, and small companies are finding ways to create their own cheaper DIY approximations.
How Virtual Production & The Mandalorian inspired Escape Studios
The future of ambitious film and TV production is being shaped by a collaborative mix of games and VR technology, film craftspeople and technologists. It’s this exciting mix of art and tech that Escape Studios embraces.
Like Jon Favreau we have established a firm relationship with Epic Games’ Unreal Engine. In fact, Escape Studios is the only authorised Unreal training centre in the UK, and one of a handful across Europe. Our tutors are experimenting with this tech so our students get access to the latest developments.
Cinematography tutor Clem Gharini has been publishing his updates on Unreal for a while now, using his skills in lighting and lenses, as well as investigating motion-capture gloves and suits. Unreal is now part of our Cinematography module. Head of Games Simon Fenton has been leading on several industry briefs, such as building an explorable Athenian Parthenon. Over the summer of 2020, games programme leader Phil Meredith and tutor Tom Harle retrained furloughed VFX artists from major international companies in using Unreal to visualise scenes. Currently, Michael Davies (programme leader for Animation) is designing a two-week Unreal Engine course for freelancers across the UK. Animation tutor Amedeo Beretta has been experimenting with inertial motion-capture set-ups for students, and 3D tutor Dan Shutt has been experimenting with a giant LED space in Los Angeles. That’s all just in the last six months.
The Mandalorian TV series was recently recognised at the Visual Effects Society (VES) Awards, clinching the Outstanding Visual Effects in a Photoreal Episode and Outstanding Model in a Photoreal or Animated Project prizes. At Escape Studios we are working to make sure our students are the next generation of award winners for this new technology. Watch this space!