Session Reviews the Risks, Rewards of Immersive Video

From left: Ted Schilowitz, Kim Libreri, Rob Bredow, David Alpert

The Creative Master Series session “Virtual Reality: Immersive Storytelling Meets Interactive Technology” gathered four experts to share their experiences in the rapidly developing realm of immersive reality. Moderated by Ted Schilowitz, a futurist/evangelist for both 20th Century Fox (“The Martian VR Experience”) and Barco projectors, the panel included three pioneers in this emerging industry: David Alpert, president, Skybound Entertainment; Rob Bredow, CTO, Lucasfilm; and Kim Libreri, CTO, Epic Games, makers of the Unreal Engine.

Each panelist spoke briefly about VR’s history: “Disney World 20 years ago had ‘immersive entertainment,’” recalled Bredow, “but the headsets cost $50,000 each and it required a computer that had to be lowered [into the venue] on cables.”

Schilowitz identified the key difference between those early days and today: “Now we have the technology to strap a theme park to our face. There will be millions of devices in a year that let us do that; there’ll be hundreds of millions in five to 10 years.”

Panelists screened some of their recent projects, but Schilowitz offered the caveat that this type of presentation comes with an inherent flaw: it demonstrates immersive content without the immersive element. “Gluing it to a rectangle is not the way to show this,” he said, explaining that “camera moves” added later for the demo to approximate an immersive experience have little connection to the experience of actually moving through a virtual environment.

The first demo was of “Bullet Train,” a game coming this summer from Epic Games for Oculus Rift devices. The CG visuals bring the user into an unnamed city’s subway station to confront massive waves of assaults, from squads of soldiers to hovercraft.

Bredow then spoke about ILMxLab, an immersive reality unit composed of members of VFX house Industrial Light & Magic and Skywalker Sound. ILMxLab’s experience for the HTC Vive, “Trials on Tatooine,” puts the user in the place of a Jedi Padawan undergoing extensive training.

Alpert showed Skybound’s “Gone,” a live-action piece following the disappearance of a child in a playground and the subsequent search. This was followed by a short “making of” piece that touched on the issues related to shooting live action for 360-degree video, where the crew and equipment must be hidden or digitally removed later.

Looking to share both the cons and the pros of immersive video with attendees, Schilowitz asked the panelists to share lessons learned in the process of making these projects. Alpert noted that scripted live-action material like “Gone” is possibly the least effective use of the technology; while he’s proud of the piece, he admitted he might not do scripted live action again for the medium.

He noted that portions of “Gone” were shot in the Sequoia National Forest, “and when we had people test out the game, they wanted to look up and around because the area is beautiful. We were frustrated because they were missing story points.”

“Don’t be precious about story,” Libreri advised. “[Users] do things that might get them lost in the story. When you test it, if that’s happening, be willing to change the story.”

These projects are most successful when they are “first-person” narratives, not passive, third-person stories. Bredow recalled that early iterations of “Trials” had testers feeling confused by dialogue that didn’t pertain directly to what the user’s character was supposed to be doing. As a result, a Lucasfilm story executive removed massive blocks of dialogue. Although Bredow was chagrined to see dialogue he’d written so quickly annihilated, he admitted “it played much better after that. The lines that work are the ones that are about you, the user and the thing that you’re doing.”

Schilowitz, who discovered similar issues on Fox’s Martian project, said this was a universal experience among the panelists: “You have to believe this is happening to you. If you feel like you’re watching something happen to [someone else], we call that watching television … or seeing a movie.”

Of course, the more responsive the virtual environment is to user commands, the more powerful the real-time engines must be for rendering. From a technical standpoint, this is both the key to progress and its chief obstacle: there must be an engine that can render 360-degree, high-resolution imagery at 120 fps and eventually beyond. “The hunger for realism continues to outpace the technology,” Libreri noted, adding that his company is always seeking ways to upgrade its Unreal Engine’s capacity.

“If you miss one frame,” Bredow elaborated, “it doesn’t make the experience a little bit worse. It makes it 90 percent worse! If you miss a few frames, your users get sick.”