Visual Effects Pros Are Virtual Reality's Vanguard

When visual effects powerhouse Industrial Light & Magic announced its ILMxLAB division in June 2015, people very quickly got excited about the prospect of seeing much of the Star Wars universe in VR. But behind the announcement was also a recognition that the skills and artistry inside a visual effects (VFX) studio are extremely well suited to crafting VR, AR and MR experiences, as well as 360-degree video.

We talked to ILM, and several other heavy-hitting visual effects studios now in the virtual reality and immersive experience space, about why VFX and VR are so inextricably linked.

VR is VFX all over again

VR may be stunning the world with some groundbreaking tech, but for visual effects studios, many of the challenges behind VR turn out to be quite similar to ones they have been tackling for some time. ILM, for example, made what it says was one of its first forays into AR and VR with Steven Spielberg’s A.I. Artificial Intelligence (2001).

“ILM developed a system where we could visualize sets in real-time on location,” explains ILMxLAB visual effects supervisor Tim Alexander. “It involved a combination of real-time graphics and camera tracking. We saw the value in this and have continued to improve the tools to enable what we call Virtual Production.”

ILMxLAB and Magic Leap have joined forces, though their final collaboration has yet to be demonstrated. This is a “Lost Droids” Mixed Reality test.

ILM has continued to add technologies like motion capture, real-time facial capture and camera tracking, and real-time rendering into the mix for film visual effects, and now for immersive experiences such as Trials on Tatooine. Of course, ILM was not the first VFX studio to jump into VR.

One of the early adopters was Framestore, which won an Oscar for its work on Gravity (2013). It launched head-first into VR with the Oculus Rift Game of Thrones: Ascend the Wall experience and credits the ability to tackle VR with a rich history of real-time experience at the studio.

“We’ve had 3D artists working in real-time pipelines prior to getting involved in VR,” says Christine Cattano, the Global Head of VR at Framestore. “We’ve long targeted game engines as a way to bring high-end visuals into an interactive space. In addition to being excited by the technical challenges, creatively speaking they have spent much of their careers thinking in three-dimensional space.”

https://www.youtube.com/watch?v=bBZhuqPRx9c

The Nike Hypervenom II – The Neymar Jr. Effect VR experience, which Digital Domain worked on.

That understanding of 3D is echoed by other VFX studios, too, in their comments about why they got into VR.

“Our compositors, lighters, and trackers have excellent spatial awareness,” states VR Supervisor Aruna Inversin from Digital Domain, a studio famous for work on Titanic and The Curious Case of Benjamin Button, among countless other films. “They understand the limitations of 360 video, and how to build such worlds effectively within our software packages.”

Indeed, VFX artists probably understand these issues better than most, partly because they’ve been dealing with them for a long time in traditional effects work.

“We often joke that every problem we crack is immediately revealed to be an issue that was solved in the ’90s and just forgotten by time,” shares John Fragomeni, President of Mirada, a studio that was heavily into visual effects for commercials before VR also became a strong focus, including the recent Mr. Robot VR experience. “We immediately realized that 360-degree video uses the same workflow and tools as high dynamic range (HDR) imagery; obviously it’s not that simple, but the foundational issues are the same.”
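The HDR parallel is concrete: HDR environment maps and monoscopic 360-degree video frames are both commonly stored in the same latlong (equirectangular) projection, so the same direction-to-pixel math underpins both. As a rough illustration (a minimal sketch; the function name is ours, not any studio’s tool):

```python
import math

def direction_to_latlong_uv(x, y, z):
    """Map a unit view direction to (u, v) coordinates in a latlong
    (equirectangular) image -- the projection shared by HDR environment
    maps and mono 360 video frames."""
    # Longitude: angle around the vertical axis, wrapped to u in [0, 1).
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)
    # Latitude: angle from the zenith, mapped to v in [0, 1].
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi
    return u, v

# A direction looking straight ahead (down -Z) lands at the frame's center.
print(direction_to_latlong_uv(0.0, 0.0, -1.0))  # (0.5, 0.5)
```

Because the projection is identical, panorama tooling built for HDR lighting work (viewers, seam handling at the horizontal wrap, pole distortion fixes) transfers almost directly to 360 footage.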

Uncorporeal’s ‘Holographic Photography’ demo.

Ollie Rankin, a visual effects supervisor with credits on films such as White House Down, X-Men: Days of Future Past, and the upcoming Miss Peregrine’s Home for Peculiar Children, also sees the cross-over between VFX and VR as part of a larger convergence involving gaming. Rankin is part of a new VR studio called Uncorporeal, whose goal is creating visually accurate volumetric performance and environment capture in real time.

“GPUs and realtime lighting/rendering engines are getting more and more photorealistic,” he says, “while visual effects production schedules are getting closer and closer to real-time – I’m kidding. But visual effects companies are doing more and more GPU-based computation and are using real-time engines for things like previz and virtual production.”

Interestingly, like Rankin, several artists with experience in VFX have moved into the VR space, capitalizing on their knowledge of compositing, 3D capture, rendering and content creation. Examples include members of the team at 8i, Oscar-winning visual effects supervisor Kevin Mack’s Shape SpaceVR, and those at Felix & Paul Studios.

Along with several VR experiences, Framestore has also adapted VR and real-time game engine techniques into other experiences, such as this Mars bus trip in partnership with Lockheed Martin.

Skills for the job

VFX artists have the technical skills for VR, but they also have another asset up their sleeve: problem solving. In visual effects they are hit with complex issues all the time, where it’s all about getting the shot done, and VR is certainly no different. One visual effects studio, Luma Pictures, sees the VFX industry as the perfect place to take advantage of these skills for VR.

“There are no entities better suited to harness and advance the VR medium than visual effects studios,” said Luma Managing Director Jay Lichtman. “We’re world builders who are now able to generate fully immersive and interactive worlds that VR allows for.”

https://www.youtube.com/watch?v=I3LaFY7_Z9A

Luma Pictures contributed to an Infiniti ‘Pencil to Metal’ experience – this is the making-of video.

Of course, when visual effects studios saw how applicable their work could be to VR, it made business sense to enter the field, partly for fear of being left behind. That’s something The Mill’s Group Director of Emerging Technology Boo Wong recognizes.

“Never before in our industry has it been so important to join concept development with technological expertise,” Wong said. “Traditional film experiences offer a dictatorship of attention for the storyteller. This century-old relationship is broken by VR and how we develop, direct and craft has therefore changed remarkably.”

By being in both visual effects and VR, effects studios can also re-use and re-purpose the assets they make for films in other mediums, with the same artists contributing to the process.

“To be able to transfer a film asset that has been fully developed in modeling, texture, animation, simulation and look directly in a VR experience is a key goal so that we can leverage all of the skill and artistry of our proven production visual effects workflows,” says ILMxLAB’s Alexander. “Specifically, we find that lighting and simulation artists gravitate to the real-time tools well; modeling, texture and animation artists are able to work in a familiar realm as well to produce immersive content.”

The Mill has been involved in several VR and immersive films, and has used some of its VFX background to help develop an on-set stitching tool called Mill Stitch.

Still a lot to solve

It may be that VFX skills are easily transferable to VR but, as many of the studios we spoke to stated, that doesn’t necessarily make crafting VR experiences any easier. And there are new problems to solve, too, such as planning a VR experience, editing in VR, seamlessly stitching footage from multi-camera rigs, real-time and GPU rendering, reviewing VR material and re-thinking VFX techniques for VR.
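Stitching alone shows why these problems are not trivial. The simplest approach cross-fades two cameras’ already-projected images across their overlap, which hides the hard seam but not the parallax error. A toy sketch in NumPy (the fixed overlap band is our simplifying assumption; real stitchers also calibrate, warp and compensate for parallax):

```python
import numpy as np

def feather_blend(left, right, overlap_px):
    """Naively stitch two already-projected images of shape (h, w, 3)
    that share a band of `overlap_px` columns. Cross-fading hides the
    hard seam, but any parallax mismatch still ghosts."""
    ramp = np.linspace(1.0, 0.0, overlap_px)[None, :, None]  # 1 -> 0 across the band
    return np.concatenate([
        left[:, :-overlap_px],                       # left-only region
        left[:, -overlap_px:] * ramp
        + right[:, :overlap_px] * (1.0 - ramp),      # feathered seam
        right[:, overlap_px:],                       # right-only region
    ], axis=1)
```

Because adjacent lenses on a rig can never share a nodal point, the correct seam position shifts with subject distance, which is why close-to-camera action is the hardest thing to stitch.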

One of those problems is an amalgam of visual effects and VR needs: compositing. Ollie Rankin notes that this final step in the VR process (particularly relevant for live-action experiences) is still incredibly difficult. “All of the 2D camera-space corner-cutting tricks go straight out the window. You have to make a complete and seamless world.” ILMxLAB’s Alexander also recognizes those same VR compositing challenges, and wishes that “real-time compositing tools become the norm so that we can achieve more filmic and photographic experiences.” Certainly, software companies seem to be working on such tools, including The Foundry with its CARA VR plugin for its industry-leading compositing toolset NUKE.
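To see why the corner-cutting disappears, consider the merge itself. In a flat shot a compositor can cheat anything outside the camera frustum; a 360 plate has no offscreen, so even a basic premultiplied “over” has to hold up across the entire sphere. A minimal sketch (illustrative only, not studio pipeline code):

```python
import numpy as np

def over_latlong(fg_rgb, fg_alpha, bg_rgb):
    """Premultiplied 'over' of a CG element onto a stitched 360 plate.
    Unlike a flat comp there is no frame edge to hide behind: the merge
    spans the full latlong frame, and elements must stay continuous
    across the left/right wrap and consistent at the poles."""
    a = fg_alpha[..., None]          # (h, w) -> (h, w, 1), broadcast over RGB
    return fg_rgb + (1.0 - a) * bg_rgb
```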

Mirada worked on post production for the Mr. Robot VR experience.

Plus, there are other things that VFX studios feel they can help bring to the table for VR. Digital Domain’s Inversin says his studio is “developing a number of tools that will help bring live action photography into real-time cinematic engines,” while Framestore’s Cattano suggests “increased content creation will spawn increased use, which will encourage new tools and improvements to be developed to solve some of the content creation and usability hurdles” – with VFX studios poised to do that development.

Still, as Mirada’s Fragomeni observes, VFX studios routinely rely on workarounds to make VR a reality, another example of the problem solving at work.

“There is no magic bullet that allows one to film a no-stitch-required stereo 360 degree video at 90fps-plus with similar quality as digital cinema cameras,” he said. “So everything is a compromise between quality, size of rig, distance of subject, light level, stereo vs. mono, etc. Some day there will be a hovering camera the size of a marble that shoots perfect 360 degree lightfields at 120fps, but until then we have a lot of duct tape and bubblegum to get through!”
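The compromise Fragomeni describes is easy to quantify. Even a modest stereo 360 target implies an enormous raw pixel throughput; a back-of-envelope estimate (the resolution and bit depth below are illustrative assumptions, not any camera’s spec):

```python
# Rough raw data rate for stereo 360-degree capture at 90 fps.
width, height = 4096, 2048   # assumed per-eye latlong resolution
eyes = 2                     # stereo pair
fps = 90
bits_per_pixel = 30          # assumed 10-bit RGB, before any compression

bits_per_second = width * height * eyes * fps * bits_per_pixel
print(f"{bits_per_second / 1e9:.1f} Gbit/s raw")  # ~45.3 Gbit/s
```

That is far beyond what on-set storage and headset decoders comfortably handle, hence the duct tape and bubblegum.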
