Visual effects helped director Robert Zemeckis tell Philippe Petit’s story in The Walk. Not only did the VFX recreate a long-gone time and place – 1974 New York and the World Trade Center’s Twin Towers – they did so using technologies never before employed at this scale.
Kevin Baillie is co-founder and VFX Supervisor at Atomic Fiction, which pioneered the use of cloud computing for rendering on The Walk. He is also CEO of the cloud-rendering platform ConductorIO, which will be released to the public by the end of 2015.
The Walk covers the ground previously addressed in James Marsh’s documentary Man On Wire, following Petit back and forth between the Twin Towers on a tightrope 1400 feet above the ground. The movie’s total budget was $35 million, roughly a tenth of what the VFX alone would normally cost given the need to recreate 1974 New York, convert the film to 3D and tie it in with a proposed virtual-reality experience. The brief, Baillie admitted, demanded some lateral thinking.
Atomic Fiction did the vast majority of the VFX work, including the shots between the towers. Other companies handled the recreation of Notre Dame, the lobby and the virtual-reality tie-in.
Early in the shoot, Baillie had the opportunity to go up in a helicopter above “ground zero” and came away awed by what Petit had achieved. He used that experience as a guide for the subsequent VFX work.
To keep costs down, the film used a high number of long shots – 826 in total – to reduce constant set changes; 672 of those were VFX shots.
Originally it was thought that the rooftops of both Notre Dame, where Petit did some of his earlier tightrope walks, and the Twin Towers could be built as sets for the movie. However, budget constraints limited the builds to just the railing of Notre Dame and the one corner of the Twin Towers where most of the action took place.
Atomic Fiction went through thousands of reference photos from the 1970s, spending considerable time striking the balance between being realistic and being appropriate for the film in terms of texture, colour and so on.
The focus was on the areas of the city relevant to the film, not the detail of the whole of Lower Manhattan. However, Atomic Fiction quickly discovered that they needed to plan for 360-degree camera views even when only 180 degrees would appear on screen, because of the different lighting scenarios that came into play across day and night.
Atomic Fiction could not afford the hardware to do all the rendering The Walk required, so they decided to render in the cloud, using computing capacity available online such as Google’s. To manage the whole process, they developed the software ConductorIO. It meant they paid only for the time spent rendering, not for equipment that would in any case be obsolete within a short period.
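The pay-for-what-you-use economics behind that decision can be sketched with a simple cost comparison. Everything below is a hypothetical back-of-envelope model – the function names, rates and frame counts are illustrative assumptions, not figures from Atomic Fiction or details of ConductorIO:

```python
def cloud_render_cost(frames, core_hours_per_frame, rate_per_core_hour):
    """Total cost when paying only for the render time actually used.

    All parameters are illustrative; real render farms bill in more
    complex ways (spot pricing, minimum increments, storage, egress).
    """
    return frames * core_hours_per_frame * rate_per_core_hour


def break_even_frames(hardware_cost, core_hours_per_frame, rate_per_core_hour):
    """Number of frames at which buying hardware outright would cost the
    same as renting – beyond this point ownership starts to pay off,
    ignoring depreciation, power and maintenance."""
    return hardware_cost / (core_hours_per_frame * rate_per_core_hour)


if __name__ == "__main__":
    # Hypothetical numbers: 10,000 frames at 20 core-hours each,
    # $0.05 per core-hour, versus a $250,000 farm purchase.
    print(cloud_render_cost(10_000, 20, 0.05))    # 10000.0
    print(break_even_frames(250_000, 20, 0.05))   # 250000.0
```

The point of the model is the one Baillie makes: for a studio whose rendering demand spikes per show, renting by the core-hour avoids sinking capital into machines that sit idle between projects and age out quickly.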
A company called Create made the virtual-reality experience, which enables users to “walk” between the towers. Atomic Fiction were able to share assets with Create to help reduce costs.
Again, the budget was small: only one developer and off-the-shelf tools were used. The experience ran on a PlayStation 4 console (which drew a collective groan from game developers in the audience, who noted that a fairly standard PC would have ten times the processing power) and a Sony Project Morpheus VR headset. Resolution was high, at 4K and 8K. Again, all rendering was done in the cloud, which was demanding for a fully immersive 3D experience.
Among the issues requiring resolution were safety and user confusion about the boundaries of the experience. The VR images were so real that users who “fell off” the tightrope were literally falling over and had to be steadied by a safety handler.