Published: Feb 26, 2020
Updated: Feb 27, 2020

Using AI to Create Immersive Films

Author: Ed Schmit


Today’s moviegoers have a variety of ways to watch their favorite films: Digital AVX, 3D, DBox™, and more. Each one offers a unique way to draw in audiences, giving them an opportunity to experience the movie differently. New technology like virtual reality (VR) gives filmmakers a chance to create experiences that are more immersive than traditional films.

Ed Schmit, AVP Product Marketing Management, AT&T Developer Program

Ed tracks new technologies for the AT&T Developer Program. His specialties include network technologies, technology enablement, and strategic marketing.

The potential of filmmaking is expanding as fast as the technology is evolving. Directors have exchanged traditional cameras for 360-degree cameras that let them capture actors from all angles. Editors and special effects professionals use computers and software to power the detailed worlds we’re used to seeing on-screen today. And it’s not just sci-fi and action films that use new technology to tell their stories; even movies set in the everyday world use software and technology to create fully immersive stories (and to save money on location shoots or subtle special effects).

All of which raises an interesting question: what will films be like in 10 years? In 20 years? The rise of AR (augmented reality), AI (artificial intelligence), and related software is expanding the type of stories filmmakers can tell, so where will it take us?

How AI Impacts Film

Technology, and AI specifically, is having a significant impact on filmmaking today. From helping us tap into more emotional stories to creating “volumetric” experiences, AI is driving the new frontier of films.

Getting personal

In 1979, the Choose Your Own Adventure book series debuted. It was one of the first times readers got to drive the story they were reading, and they loved being able to tailor the experience for themselves. It was the start of story “living” instead of story “telling”: the reader was in charge, not the author.

In today’s world, that experience can be translated to the screen with VR, augmented reality (AR), and other software. Viewers will be able to craft the story in real time based on their likes and dislikes, creating a truly unique experience just for them. Advances in AI will keep stories as unique as each viewer by having the computer-created characters in these films respond to viewers in real time and react to the choices they make.

Karen Palmer, the “Storyteller from the Future”, is an award-winning artist and speaker who created RIOT, an emotionally responsive, live-action film in which viewers experience a riot at a protest march. Using facial recognition and AI technology, the film responds to viewers’ emotional state in real time, changing what they experience throughout the film. Characters in the film react to viewers’ facial expressions, influencing the outcome of each interaction. The RIOT digital experience uses branching video scenarios, triggered by specially written facial recognition and neurogaming software, to give viewers a personal experience every time.
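To make the mechanics concrete, here is a minimal sketch of how an emotion-driven branching film like this might be wired up: a facial-expression classifier samples the viewer at each branch point and picks the next video segment. The segment names, branch structure, and classifier stub are illustrative assumptions, not Palmer’s actual implementation.

```python
import random
from dataclasses import dataclass, field

@dataclass
class Segment:
    clip: str                                     # path to this segment's video file
    branches: dict = field(default_factory=dict)  # detected emotion -> next segment id

# A tiny branching structure: one decision point, three outcomes.
STORY = {
    "march":    Segment("march.mp4", {"calm": "dialogue", "fear": "flee", "anger": "confront"}),
    "dialogue": Segment("dialogue.mp4"),
    "flee":     Segment("flee.mp4"),
    "confront": Segment("confront.mp4"),
}

def classify_emotion(frame):
    # Stand-in for a real facial-expression model running on webcam frames.
    return random.choice(["calm", "fear", "anger"])

def play(segment_id, get_frame):
    while segment_id:
        seg = STORY[segment_id]
        print(f"playing {seg.clip}")
        if not seg.branches:
            break
        emotion = classify_emotion(get_frame())  # sample the viewer at the branch point
        segment_id = seg.branches.get(emotion)

play("march", get_frame=lambda: None)
```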

Getting “volumetric”

The Star Trek universe had the holodeck, a fully immersive environment where people could interact with any kind of situation or object they wanted. While we’re not at that level yet, the technology that brings us immersive experiences is evolving at a rapid pace. But we’re still in a “flat” mode when it comes to our experiences. Sure, it’s not traditionally flat media like books or posters, but we’re still experiencing a lot of media in two dimensions on a screen.

Add in the creativity of filmmakers, however, and the idea of film staying “flat” goes out the window. They are used to bringing audiences into their worlds in a more visceral and real way than even book authors are, and they will rely on evolving technology to make those worlds even more realistic in the future.

Nonny de la Peña, the “Godmother of VR” according to The Guardian and Engadget, and founder of Emblematic, an award-winning VR/AR company, uses VR, AR, and mixed reality (MR) to break down the framework of films and engage directly with viewers in what she calls “volumetric” experiences. She uses these technologies to convey stories that “create empathy and preserve our humanity,” on the premise that we remember and understand stories better when we experience them with our entire bodies, not just our minds. “With VR,” she told VFX Voice, “I can put you in a scene in the middle of a story you normally see on the news and lessen the gap between actual events and personal experience.”

As more people grow up in this online world, they’re going to want more immersive experiences. They’re used to the idea of watching a movie in VR or playing video games with AR. They’re going to want more of their media, whether it’s entertainment or news reports, in these fully immersive, fully “embodied” environments. And those experiences are here.

  • Allumette, a VR film written and directed by Eugene Chung of Penrose Studios, makes use of VR technology from both a filmmaking and an audience perspective. Newer VR headsets like the Oculus Rift and HTC Vive offer positional tracking (six degrees of freedom, or 6DoF; see the sketch after this list) that makes VR movies more immersive and more intriguing for storytellers. “Six degrees of freedom is huge,” Chung told Wired. “It adds so much to what we call ‘presence,’ which in many ways is the holy grail of virtual reality. How do we move and think in this medium?” Viewers with these headsets can lean in for a closer look at objects in the 20-minute film or see the protagonist from all sides as she walks down a spiral staircase.
  • Reach, from de la Peña’s Emblematic Group, is a web platform for creating, remixing, and sharing volumetric VR experiences using real people and places. People film themselves in front of a green screen and then “transport” the conversation to a volumetric environment. The resulting film can be shared online and viewed anywhere through a web browser, or, ideally, through a VR headset so viewers get the full experience. When Reach was showcased at the Sundance Film Festival, over 200 visitors created videos in locations such as outer space, a fantasy world, and a sci-fi corridor.
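As a rough illustration of what the jump from rotation-only tracking to 6DoF means in code, here is a minimal sketch of the two kinds of pose data a headset might report; the field names are generic placeholders, not a specific SDK’s API.

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    # Rotational tracking only: enough to look around from a fixed point.
    pitch: float  # look up/down, in degrees
    yaw: float    # look left/right
    roll: float   # tilt head side to side

@dataclass
class Pose6DoF(Pose3DoF):
    # Positional tracking adds three more axes, so the renderer can move
    # the virtual camera when the viewer leans, crouches, or steps.
    x: float      # step/lean left and right, in meters
    y: float      # crouch and stand
    z: float      # lean in and out

def supports_leaning_in(pose):
    # Leaning in for a closer look only works when position is tracked.
    return isinstance(pose, Pose6DoF)
```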

Getting emotional

Other filmmakers have moved beyond VR and are using AI to create “virtual beings” and believable human characters that are governed by AI. These virtual beings marry the artistic work of computer animators with the rapid progress of AI to create characters that behave like actual humans.

Edward Saatchi, VR pioneer and co-founder of Fable, has pulled together some of the best minds in storytelling and machine learning to “create and nurture characters with style and approachability.” Their first virtual beings experience, Whispers in the Night, stars Lucy, a digital character from an adapted story by Neil Gaiman. In it, viewers and Lucy share an emotionally connected moment through a conversation powered by AI natural language processing. “You are part of (her) life,” Saatchi told VentureBeat. “A virtual being is a character you know isn’t real, but you are building a two-way emotional relationship with them.” He explained at Lucy’s unveiling at Sundance that in future chapters of her story, she will remember who you are, what you talked about, and what she said, and that memory will shape those chapters. “What you do in Whispers in the Night takes place before Wolves in the Walls, (and) will carry over to Wolves in the Walls. The decisions you make will have an impact,” he explained.
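The persistence Saatchi describes boils down to conversational state that outlives a single session. Here is a minimal sketch of that idea, assuming a simple JSON file as the memory store; the file name, record format, and respond() stub are illustrative assumptions, not Fable’s actual system.

```python
import json
from pathlib import Path

MEMORY_FILE = Path("lucy_memory.json")  # hypothetical on-disk memory store

def load_memory():
    return json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []

def save_memory(memory):
    MEMORY_FILE.write_text(json.dumps(memory, indent=2))

def respond(utterance, memory):
    # Stand-in for a language model conditioned on earlier chapters.
    if memory:
        return f"Last time we spoke about {memory[-1]['topic']}."
    return "Hello, I'm Lucy. What's your name?"

def play_chapter(title, viewer_lines, topic):
    memory = load_memory()
    print(f"-- {title} --")
    for line in viewer_lines:
        print(respond(line, memory))
    memory.append({"chapter": title, "topic": topic})
    save_memory(memory)  # carried forward into the next chapter

play_chapter("Whispers in the Night", ["Hi, Lucy."], topic="the wolves")
play_chapter("Wolves in the Walls", ["Do you remember me?"], topic="the house")
```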

Magic Leap, a mixed reality headset company, recently released Mica, a new MR AI-powered assistant. She’s not like the current set of assistants that “only” turn on lights or give people directions. Mica aims to be more than an assistant: an “educator, agitator, companion, artist, and guide” for people. This appears to be the industry’s first foray into avatar-powered assistants, where we talk to a near-physical embodiment of our assistant instead of just speaking into the air. While Mica doesn’t yet speak, she looks and acts like a human, smiling, shaking her head, and yawning in response to your yawns. Digital Bodies said that “all of what is missing in Siri and company is present in Mica.” According to Magic Leap, the talking will come soon.

The technology behind AI in film

While some filmmakers are using technology to push the boundaries of what makes a movie a movie, others are using it to enhance conventional movies, blurring the line between live action and CGI.

AI in Pre-production

Getting ready to shoot a film involves a lot of work: location scouting, casting decisions, set decoration, and more. It takes a lot of people and hundreds of hours of work, both on set and on location around the world. IBM’s Watson can help compress this time by processing location and casting images quickly and suggesting the best options for each decision the team must make.

AI in Storytelling

IBM’s Watson AI is enhancing the way filmmakers work on their films. In 2016, Watson helped create the trailer for 20th Century Fox’s sci-fi movie Morgan, which follows an artificially created humanoid. Watson “watched” 100 films and their corresponding trailers to learn what makes a good movie trailer. After identifying patterns in the visuals and sounds of these trailers, Watson “watched” Morgan and quickly suggested 10 scenes to include in its trailer. The film editor working on the trailer used nine of the 10, creating the trailer in a single day, instead of the 10-30 days it typically took.
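Reduced to its essentials, this is a ranking problem: score every scene with a model trained on examples of trailer-worthy footage, then hand the top candidates to a human editor. The sketch below shows that pattern under simplified assumptions; the feature set and linear scoring are placeholders, not IBM’s actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    start: float    # seconds into the film
    end: float
    features: list  # e.g. audio tension, cut rate, number of faces on screen

def score(scene, weights):
    # Stand-in for a model trained on scenes that made it into real trailers.
    return sum(w * f for w, f in zip(weights, scene.features))

def suggest_trailer_scenes(scenes, weights, k=10):
    # Rank every scene and return the k most "trailer-like" for the editor.
    return sorted(scenes, key=lambda s: score(s, weights), reverse=True)[:k]

candidates = suggest_trailer_scenes(
    [Scene(0, 45, [0.2, 0.5, 1]), Scene(45, 90, [0.9, 0.8, 2])],
    weights=[1.0, 0.5, 0.1],
)
```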

To create interactive films in AR and VR, Watson can help directors create stories and dialog on the fly by using the same conversational technology that powers customer service chatbots. Watson would transform story ideas into voice-based interactions that would complement both AR and VR experiences for viewers.

AI in VFX

Production companies are using AI to accomplish moviemaking tasks that were previously extremely labor-intensive. Arraiy’s AI can seamlessly add photorealistic CGI objects to scenes, even when the object and camera are moving; a single VFX artist can now make these changes to a film in hours, instead of a team of artists working for months. Digital Domain, a visual effects (VFX) production company co-founded by James Cameron, uses its AI technology to incorporate people into the design of its CGI characters more efficiently. Instead of doing a full CG recreation of an actor’s head and spending months getting it to look photo-real, Digital Domain uses AI to transfer an actor’s performance directly onto digital characters. It used this technology to recreate Josh Brolin’s facial expressions for Thanos in Avengers: Infinity War and Avengers: Endgame, producing a more realistic performance and reducing the films’ overall post-production time.

AI in Post-production

Watson also helps directors with rapid-fire editing, where nothing is left on the cutting room floor. Digital cameras are kept rolling all the time because Watson can scan through all the recorded content to help directors identify which scenes are best in a film. Directors know which performance works best to tell the story, and Watson helps them deliver it more efficiently.

Once a film is “wrapped”, the production is only half done. Next up is distribution and marketing of the film, another area where Watson shines. Watson is a pro at analyzing audiences and fan bases to create a marketing strategy to promote films most effectively. From identifying scenes or story points fans will enjoy most to suggesting social media messages, Watson can do it all in post-production.

Technology is changing the way movies are made. From start to finish, it is helping shape the way filmmakers and other creative artists express themselves, making the entire process smoother and more efficient. As artists dive into technology to create new experiences for audiences, who knows where films will take us in the next 20 years?

Learn more about technology and storytelling with Karen Palmer’s session for AT&T Shape, AI Voyage: Can Conscious Storytelling Save Us?

