5 Things You Need to Know About AI, AR, and VR
Adobe recently hosted a panel discussion on AI and immersive design at its headquarters during San Francisco Design Week. Led by Jamie Myrold, vice president of design at Adobe, the panel also featured AI designer and creative leader Ana Arriola, who recently joined Microsoft as partner and design director and previously worked at Facebook, Samsung, and Apple (where she worked on the first iPhone); Saschka Unseld, co-founder and creative director at Oculus Story Studio, previously at Pixar; and interaction and graphics programmer Sagar Patel, who specializes in the design and creation of generative, audio-reactive experiences.
Technology has progressed significantly in recent years, the panel of pioneers agreed, and all the different disciplines — artificial intelligence (AI), augmented reality (AR), and virtual reality (VR) — are merging and evolving, not least because of how fast machine learning is improving.
Here are five things we took away from the panel.
1. VR revolutionizes storytelling.
The opportunities of immersive design for storytelling are huge.
“The general consensus used to be that you can’t tell stories in VR, and that was just four years ago,” Saschka pointed out. “The first pieces we created were meant to prove that wrong and show that it can be like immersive theater. Then we convinced people it wasn’t just for gamers but that it can be like Pixar, and that was [the Emmy Award-winning VR film] ‘Henry.’ ‘Dear Angelica’ was then the first step toward telling stories differently.”
“Dear Angelica,” which Saschka directed, was Oculus’s first VR experience made with the company’s Quill tool, which lets illustrators create immersive 3D animations directly within VR. The short film is a journey through the magical and dreamlike ways we remember our loved ones.
As a storyteller, Saschka explained, you’re not so much in control of the viewer’s micro experience (because they can look around and do things), but rather you control the macro experience as you curate spaces, decide what people can interact with, and shift them to new environments.
“There is no medium that could be more of this time to tell stories or be utilized for creativity,” Saschka said. It’s the reason behind content, technology, and research studio Tomorrow Never Knows, which he co-founded to focus exclusively on emerging technology and explore the future of storytelling.
2. You can create immersive experiences without headsets.
Sony conducted an R&D experiment called Immersive Space Entertainment that explored new possibilities for people to immerse themselves in VR and AR environments. The company showcased three experiences for the concept at SXSW last year: 360 Movie, which gives users intuitive control over 360-degree video; Music Visualizer, which renders sound in a visual format; and Cyber Gym, an interactive space that integrates a stationary bike with moving visual projections on a dome-shaped screen.
“We wanted to create an immersive VR experience without requiring you to wear anything — no headset, no glasses, no gloves,” Sagar said. “We had a chair, an exercise bike, a two-meter-wide screen, and two 4K projectors.”
The projectors cast images onto the dome around the user, who could see 180 degrees of the visuals at any moment and the rest by turning the chair. For the Cyber Gym, the system tracked the user’s leaning motion, pedaling speed, and other movements, then used that input to modify the video surrounding the rider.
“It was incredible because it was a really social experience,” Sagar remembered. “It really involved people, whereas VR can be fairly isolating. Our interaction points were fairly limited, though, while in VR the tracking is always on point. For example, the chair needed a sensor to detect the orientation. We really had to fine-tune it to make sure it felt really good. We also spent a lot of time on making sure that when you move the pedals of the bike, you feel like you’re going through that space. Getting that right was an incredible step forward. Combined with the frame rate it made a huge difference in you feeling immersed in the space and really losing yourself in it.”
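As a rough illustration of the input-to-visuals mapping Sagar describes, here is a minimal Python sketch. The function name, gains, and sensor units are all hypothetical assumptions for illustration, not Sony's actual implementation:

```python
import math

def visual_params(pedal_rpm, lean_deg, chair_yaw_deg):
    """Map rider input to parameters that drive the projected scene.

    pedal_rpm: pedaling cadence from the bike sensor (hypothetical)
    lean_deg: lean angle from the seat sensor (hypothetical)
    chair_yaw_deg: chair orientation, so the visible 180-degree
        window can follow the rider as they turn
    """
    # Faster pedaling moves the rider through the virtual space faster;
    # a cap keeps the motion comfortable.
    speed = min(pedal_rpm / 60.0, 2.0)
    # Leaning steers the view; a small gain avoids jarring camera swings.
    steer = math.radians(lean_deg) * 0.5
    # The dome shows 180 degrees at a time, centered on the chair's heading.
    view_center = chair_yaw_deg % 360
    return {"speed": speed, "steer": steer, "view_center": view_center}
```

The fine-tuning the panel mentions would live in exactly these constants: the speed cap, the steering gain, and how responsively the view follows the chair.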
3. AI can improve people’s lives.
Creative AI means using AI to improve people’s lives rather than productizing it for its own sake.
Ana said that AI can have wonderful applications in the fashion, furniture, and automotive industries, and, by way of example, pointed to Samsung’s new lifestyle TV, The Frame, which she helped develop. Rather than just being a black rectangle, the TV displays digital art pieces and looks like a picture frame hanging on a wall. Its ambient mode, which shipped on all Q-series devices this year, can even mimic the texture of the wall behind it.
“We looked for an innovation to a problem that existed in people’s lives,” Ana said. “Now the TV becomes like a fireplace. It’s a gathering point in the home. This once black rectangle is now bringing culture into one’s family or environment.”
“The ambient mode was actually a happy accident. A lot of it was blood, sweat, and tears of research and testing. Fortunately, we had really rich testing suites to emulate a home or a kitchen. It was crucial to the design process.”
4. There are sunglasses that know what you’re doing.
Imagine wearing sunglasses that know what you’re looking at. It’s now possible with Audio AR, which adds an audible layer of information to everyday activities. Your experience changes as you turn your head.
“It’s amazing,” Saschka said. “AR headphones were joked about but they’re actually like glasses that can register rotation. The millimeter-accurate positioning means I could sit down on a chair and hear thoughts of a person that in my narrative sat there a year ago, and as I walk away, the sound becomes quieter. You could create a whole narrative overlay of a city purely in audio. It’s hugely fascinating.”
The technology was announced by Bose, mostly known for its high-end headphones, at SXSW this year. Use cases of Bose AR (see what they did there?) include the simulation of historic events at landmarks while you’re traveling, providing more information about a painting you’re approaching in a museum, translating a sign you’re reading, or helping you navigate an airport.
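To make the "sound fades as you walk away" idea concrete, here is a toy spatialization sketch in Python. The inverse-distance falloff and sine-based panning are simplified assumptions for illustration, not Bose's actual algorithm:

```python
import math

def spatial_gain(listener_xy, listener_yaw_deg, source_xy):
    """Toy audio-AR spatialization: volume falls off with distance,
    and left/right balance follows head rotation."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    dist = math.hypot(dx, dy)
    # Inverse-distance falloff: the voice on the chair fades as you walk away.
    volume = 1.0 / (1.0 + dist)
    # Angle of the source relative to where the listener is facing.
    bearing = math.degrees(math.atan2(dy, dx)) - listener_yaw_deg
    pan = math.sin(math.radians(bearing))  # -1 and +1 are the two ears
    return volume, pan
```

Because the glasses register head rotation, only `listener_yaw_deg` changes as you turn your head, which is enough to keep the sound anchored to a fixed point in the world.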
5. Our data is biased.
AI, which requires large data sets to power neural networks, is flawed because all of our data is inherently biased.
Algorithms will, for example, link an image of a kitchen with a woman, a bias that needs to be addressed. “At Facebook we always asked ourselves, ‘Why are we doing this?’ and ‘Should we be doing this?,’” Ana said, revealing that a whole organization inside Facebook has been set up to look at societal issues. Companies are now hiring consultants to retrain the systems and data sets.
“We might think a data-based system is neutral,” Saschka agreed, “but unless you actively countersteer, the data will just reinforce what’s there.”
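A tiny sketch of what "the data will just reinforce what's there" means in practice, using a deliberately skewed toy dataset and one simple way to countersteer by reweighting (the numbers and labels are illustrative, not from any real dataset):

```python
from collections import Counter

# Toy annotation data: the skew (kitchens mostly tagged "woman")
# mirrors the kind of bias found in real image datasets.
labels = ["woman"] * 80 + ["man"] * 20
counts = Counter(labels)

# A frequency-based model reproduces whatever skew it was trained on.
majority = counts.most_common(1)[0][0]  # the stereotype wins: "woman"

# Countersteering: reweight so each group contributes equally
# instead of letting the majority class dominate.
weights = {label: 1.0 / n for label, n in counts.items()}
balanced = {label: n * weights[label] for label, n in counts.items()}
# After reweighting, both groups carry equal total weight.
```

Without the reweighting step, nothing in the pipeline is "wrong" in a technical sense; the system is neutral only in the way a mirror is neutral.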
“We need to ask ourselves if the next round of capital that we’re going to raise is going to change the direction or the ethics of the company,” Ana said. “Don’t be silent, express your voice, and be sure you maintain the authenticity within your craft. That’s going to be really important.”
Without a doubt, technologies like AI, AR, and VR are driving the next wave of design and opening up creative opportunities that were the realm of science fiction just a few years ago. Some of the technology is still rudimentary, and we need to be mindful of what data we feed our systems, but traditionally tech-heavy industries are now turning to creatives to build experiences that have an impact on the outside world and bring a human aspect to the technology.
The time is now to really embrace the technology, the panel agreed. “When I pitched a project, people used to ask why I was using VR,” Saschka said. “I always found that so offensive because no one asks why you’ve written a book and not directed a movie. You need to embrace what’s unique about the medium.”
For more UX insights sent straight to your inbox, sign up for Adobe’s experience design newsletter.