There is immense pressure on the screen sector to cut budgets. At the same time, audiences and creators want to conjure any time, any place and any mood in the galaxy onto a screen. There is a fierce hope that somehow this problem can be resolved by some combination of post production technology and games software.
Mercury CX lined up six people to consider this problem. The first impression is that Adelaide is trying to make itself a centre of excellence via Kojo, Rising Sun, Flinders University, the state agency and a heap of growing production and services companies. That is worth noting on a national scale.
Secondly, there are extremely clever people who have worked on this for decades. Like the boomers before them, a group of really young smarties, often women, took control of a new way of seeing the world, and are now very experienced. Their forebears liked power, but this generation seems to like puzzles.
The first rule is, hire experts and learn like fury. Like anything involving computers, dumb ignorance will ruin your day and your budget over and over again.
What is virtual production?
Computers are exploding everything about screen creation. While linear projects developed complex post production methods, the animators and the game creators built separate zones with similar technology and deep expertise. Now all these computerised processes have bled into each other, at the same time as post production for traditional linear storytelling started to move into production. On-set data management arrived, along with preliminary colour grading, green screen work, and the combination of actors with their environments.
Then the whole lot burst forth, so now we can create anything on screen, and post production technologies are turning up in planning and on set. But it takes a long time and costs a lot of money and involves enormous crews of specialists. VR collides with AR collides with XR, but NOT to create virtual reality programs. That is another world, at the moment.
The key to this seems to be the development of the Unreal Engine, originally created in 1998 by games company Epic. It has evolved continuously since, and is well described on the website as ‘the world’s most open and advanced real-time 3D creation tool’.
Bree Whitford-Smith, the Head of Production at ModelFarm, came sideways from the games sector with Unreal to create a business in architectural visualisations. She is now working on two children’s series in both green screen and pure animation, which have turned to virtual production because of COVID-19.
‘In animation and visual effects we are so used to having a wonderful pipeline with different artists in very niche skillsets. Because Unreal offers this nonlinear production workflow, suddenly you need people who can do all of them, so they can model, they can texture, they can light,’ she explained.
Learning that complex combination takes a long time, so they scrabbled for talent.
At least Unreal supplies a huge number of tutorials so entrants to the sector can show their willingness and smarts by learning the basics in their own time.
Meredith Meyer-Nichols was an early adopter who worked with James Cameron on the original Avatar and went to Framestore and Industrial Light and Magic in London. Now the Executive VFX Producer at Rising Sun, she has 15 years of history in her head.
‘At the time everything was bespoke,’ Meyer-Nichols explained. ‘So you had to have a major film studio involved and a major visual effects studio and we were building everything from the ground up.
‘And now there are costs and hurdles but there are so many more options. You don’t have to be James Cameron, but you do have to use the thing right.
‘Pick the pieces that work for you, whether it is motion capture for your animated character, whether it is using LED walls to get into interactive lighting and environment, whether it’s just using visualisation up front, as long as you’ve got someone guiding you and not somebody who is just a cheerleader for a particular technique,’ she told the audience.
And budgets?
The process can be ferociously expensive. Stage time running Unreal is effectively paid for by the minute, and it is not cheap. But the session featured Heath Ryan, an Australian who has built Pace Pictures, a small virtual studio in Hollywood. He comes from an editing background, cut his first project on film, and is determined to use the technology to create opportunities in the low budget space. As his website says, ‘His 9000 sq foot studio boasts a Dolby Certified Atmos Mix stage, 4k HDR Color theatre, ADR sound stage, sound and picture editing suites and a virtual green screen shooting stage running on Unreal engines.’
Ryan is old enough to joke with Dale Roberts, the CEO of Kojo, about all the technical toys they have bought which didn’t make money, but were instructive. He took the audience through the basic moves of virtual production, in which an actor is filmed in a green screen studio and pops up immediately in the image of the whole space. It is then possible to manipulate the apparent shot position and the lighting, as if there is a crew inside the computerised environment.
His first discipline is to use Unreal for the absolute minimum of time. Plenty of other tools, such as Maya, can do parts of the job, even though those programs are harder to integrate.
His second is to use the pre-planning process to create the shortest possible shoot, so pre-production includes a much more leisurely, actor-free studio period in which all those assets are constructed. Ryan managed to shoot a feature in five days, attracting better actors than the budget would suggest, because they knew the commitment was short and simply went to the studio.
Meredith Meyer-Nichols said, ‘One of the great things about virtual production and about Rising Sun is being involved in those really early pre production conversations and planning, all of which helps us immensely, both in terms of look and financially at the end.’
Creating a fluid, nimble process
The group worked through the idea that this creates a kind of continuum in which producers can work out particular pathways to reduce costs. Is that table real? With his editing hat on, Ryan talked about the way a scene can be constructed. The wide shot, of course, establishes the environment, which is the bit with all the bells and whistles. After that, scenes tend to be singles and two shots, with limited backgrounds which may be out of focus anyway. He admitted that he goes through scripts to see whether anyone touches anything. It’s a lot cheaper if they don’t, because an untouched object does not have to be physically integrated.
In the past, the rigid pathway from production to post meant that altering a shot required extensive changes, unzipping files from way back in the process. Now that problem is disappearing, because changes can be rippled through the virtual environment. Have to change an actor? It’s a lot easier, and even when you have to shoot a substitute, all the setups can be recreated immediately, since they may never have been physically real.
We are seeing the evolution of a digital land grab, in which a company can shoot an enormous amount of footage of a particular location, deep in the desert or up on the Eiffel Tower, and simply sell the stock footage to filmmakers who can then manipulate the images and the point of view to their heart’s content. You want a dragon? We have lots of dragons.
Heath Ryan recounted a conversation with James Cameron. ‘It’s just gonna all gel together and there’s not going to be pre-production and post.
‘And, obviously, you still have to have a script and development but you’re going to end up with a very fluid way of making films, and everyone that’s involved in the end outcome is going to be part of every process.
‘But that has not happened yet in virtual production. By really pushing this, independent film is going to boom.’
Dan Thorsland, a former comic book writer who is responsible for the VR setup at Flinders University, pointed out that, ‘I think everybody would agree it comes down to that essential film, and the emotional experience of viewing a film has been around for over a hundred years. The language of shots and editing and performance and music hasn’t changed that much.
‘It’s a really big part of our culture and these are just tools, they won’t necessarily change it.’
While Thorsland sees these as storytelling systems, they are also techniques which shape budgets, who gets to spend them, and ultimately how projects are distributed. Simply by gathering together, the panel shows that people with different affinities and passions are taking control. Is this a form of democratisation, or is a technically abstruse sector becoming an endless hall of mirrors?