A university-sponsored research center funded by the Office of Naval Research, U.S. Army Research Laboratory and the National Science Foundation, the Institute for Creative Technologies has been pushing the envelope of virtual and mixed reality for 15 years.
Blueshark incorporates technology from dozens of projects developed by ICT scientists and engineers working at the institute’s Playa Vista headquarters and its Del Rey warehouse laboratory.
ICT’s ultimate focus, however, is less about the microchips and wires that make virtual reality work than it is about exploring how that technology is experienced by the end user.
THE FUTURE IS CALLING
“Nice to meet you,” says Brad, a reassuringly calm and nondescript 35-year-old therapist who has just placed a Skype call to Mixed Reality Lab Associate Director David Michael Krum’s cell phone.
Brad goes on to say he’s a good listener, and he is — not once interrupting our facetious answers to his increasingly personal questions about our lives.
Only Brad isn’t real. At least not in the traditional sense of the word. He’s a “digital human” who interacts with his therapy patients through a realistic speech algorithm.
“People react to virtual humans as if it’s a person, but you also have this notion that it’s a computer, so surprisingly you’re more willing to talk about your personal issues. For our study, the questions actually become uncomfortably personal to see how far people are willing to go,” Krum says.
Back at ICT’s Playa Vista headquarters, one of Brad’s colleagues — a digital human trained to screen soldiers and Marines returning from the battlefields of Iraq and Afghanistan for symptoms of post-traumatic stress disorder — speaks through a flat-screen monitor. Only she’s equipped with a camera that scans her subject’s face for nonverbal cues such as eye contact, smiles and muscle tension.
In addition to his therapy role, Brad could also be programmed to answer customer service calls at credit card and cable companies.
Using other ICT technology, Brad might instead become you.
BRINGING AVATARS TO LIFE
It won’t be long before anyone with a $99 Microsoft Kinect and a basic PC can scan a recognizable digital copy of him or herself and import it into a video game, social media site or YouTube video.
ICT researchers Evan A. Suma and Ari Shapiro are already doing it.
In five minutes or less.
“The real thing we’re excited about is not just being able to capture what you look like, but to actually bring that figure to life and start doing real behaviors, becoming almost a virtual surrogate for yourself. You buy an Xbox game, do a scan, and the next thing you know you’re the Jedi swinging a lightsaber,” Suma says.
ICT was already a leader in avatar and motion-capture technology.
A 20-foot-tall spherical body-scanning light stage in the Del Rey laboratory uses lasers and 6,666 controllable LED lights to create photorealistic digital avatars that can be imported into computer design software, explains ICT Graphics Lab Producer Kathleen Haase. That’s how Michael Caine and Dwayne “The Rock” Johnson rode a bee in the 3-D film “Journey 2: The Mysterious Island,” she adds.
Haase’s team is also working with Steven Spielberg’s USC Shoah Foundation to record lifelike avatars of Holocaust survivors as they recount their experiences in Nazi Germany.
Using natural speech algorithms like those that power Brad, the technology will eventually allow people to converse with a survivor’s digital avatar in a mixed reality setting, getting precise answers to unique questions based on keyword triggers.
“We’re recording responses to about 1,000 different questions so you can preserve the ability to ask these questions and get answers to them in a personal way long after that survivor passes,” Haase says.
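The article doesn’t spell out ICT’s matching logic, but the keyword-trigger idea it describes can be sketched in a few lines. The function, clip names, and trigger words below are illustrative assumptions, not the foundation’s actual data or code:

```python
import re

# Hypothetical index: each pre-recorded answer clip is tagged with the
# trigger keywords that should call it up. Real systems would use far
# richer language understanding; this is the bare keyword-trigger idea.
clips = {
    "family_answer.mp4": {"family", "parents", "siblings"},
    "liberation_answer.mp4": {"liberation", "liberated", "freed"},
    "camp_answer.mp4": {"camp", "arrived", "conditions"},
}

def best_response(question, clips):
    """Return the clip whose trigger keywords best overlap the question."""
    words = set(re.findall(r"[a-z']+", question.lower()))
    best, best_score = None, 0
    for clip, triggers in clips.items():
        score = len(words & triggers)
        if score > best_score:
            best, best_score = clip, score
    return best

print(best_response("Tell me about your family and your parents", clips))
# -> family_answer.mp4
```

With roughly 1,000 recorded responses per survivor, even this crude overlap scoring gives a visitor’s free-form question a reasonable chance of landing on a relevant answer.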
The purpose of rapid avatar scanning is not to attain such perfection, Shapiro explains, but to bring the technology to the masses.
“We see it as a democratization of this ability,” Shapiro says. “What we are showing is that using this ‘old’ technology, you can get a serviceable self-simulation and participate in virtual reality at almost no cost.”
After our conversation with Brad, Nelson and Krum contemplate whether advertisers might in the future adopt direct-media strategies in which consumers could witness a digital avatar of themselves participating in promoted activities — wearing a new outfit, for example, or drinking a Coke.
“There are some weird advertising things people are going to try,” says Nelson. “It could mess with your memory.”
The synthetic humans of “Blade Runner” come to mind.
“I agree those are kind of nefarious purposes,” says Krum, but he adds that this is the whole point of exploring the realm of possibilities in an academic setting.
“We want to be aware of these things before we get deep into it. We don’t want to be surprised,” he says.
BRINGING THE STORY FULL CIRCLE
Digital people like Brad could also be made into Wild West villains.
As part of ICT’s work to develop interactive virtual training environments for soldiers and Marines, researchers have designed an 1800s saloon in which users can interact — and to some degree converse — with a bartender, a damsel in distress and an evil gun-slinging outlaw projected on the walls. It’s a choose-your-own-adventure progression, but most sequences end with a duel in which users draw and shoot a toy gun to settle the score.
The technology’s primary limitation is that it can accommodate only one user at a time. But Nelson’s team is working on a solution for that — creating what basically amounts to a prototype of the Star Trek holodeck.
In this space, motion trackers and head-mounted projectors, their images bounced back by retro-reflective cloth, replace the more typical virtual reality goggles. That lets groups of people interact with digital characters, and with each other, on a one-to-one basis.
The technology is already being put to use for a military “shoot house” training simulator in San Diego, Nelson says.
“We all come into the room and there’s a person projected on that screen, and if they’re a good guy we don’t do anything and if they’re a bad guy we can shoot. But as you know from [standard] filmmaking, if the character has eye contact with you he has eye contact with everyone, and that has an implication if the guy is pointing a gun at you,” he says. But with this technology, “the figure on screen can switch gaze from one person to another and they’ll both see that accurately. Because we know where you are in the space, we’re generating a separate image.”
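The trick Nelson describes, rendering a separate image for each tracked viewer so only one person receives direct eye contact, can be sketched roughly as follows. The function names, coordinates, and data layout are assumptions for illustration, not ICT’s implementation:

```python
import math

def gaze_vector(character_pos, viewer_pos):
    """Unit vector from the character's eyes toward a tracked position."""
    dx = viewer_pos[0] - character_pos[0]
    dy = viewer_pos[1] - character_pos[1]
    dz = viewer_pos[2] - character_pos[2]
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / norm, dy / norm, dz / norm)

def render_for_each_viewer(character_pos, target, viewers):
    """One render pass per tracked viewer.

    The character's gaze is aimed at the real position of `target`,
    but each frame is drawn from that viewer's own head position, so
    everyone correctly sees who is (and isn't) being looked at.
    """
    frames = {}
    look_at = gaze_vector(character_pos, viewers[target])
    for name, pos in viewers.items():
        frames[name] = {"camera": pos, "character_gaze": look_at}
    return frames

# Two tracked trainees standing in the room (meters, made-up numbers).
viewers = {"alice": (1.0, 0.0, 2.0), "bob": (-1.0, 0.0, 2.0)}
frames = render_for_each_viewer((0.0, 1.6, 0.0), "bob", viewers)
```

The design point is simply that there is no single shared screen image: because the system knows where each head is, it can draw the same scene once per viewer, which is what makes viewer-specific eye contact (or a gun aimed at one person) possible.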
Mark Bolas, director of ICT’s Mixed Reality Lab and an associate professor at the USC School of Cinematic Arts, believes writing the script for virtual and mixed reality experiences is as important as the technology itself.
“As humans we can’t help but generate story, because that’s how we make sense out of the world. So as we synthesize more world, we have to be sensitive to the story we’re creating as well,” Bolas says.
“This lab is all about how to merge the virtual world with the real world — that’s why we call it mixed reality. You don’t want to let go of the real world. It’s always there with you, so you have to figure out how to use it in what you’re doing,” he adds.
Just outside ICT’s Mixed Reality Lab is Silicon Beach, the burgeoning Westside tech sector that already meshes creative content with technological advances. Inseparable from the entertainment industry, L.A.’s tech sector is the perfect setting for mixed reality exploration, Bolas says.
“Los Angeles is about people who create experiences, not things. When you’re watching a movie, they don’t want you to know they used the latest and greatest technology to get that shot; they just want you to experience it,” he says. “In the end, that’s what we’re about, too. Even for hardcore Department of Defense training applications, it’s not the technology in front of the person but whether they have the experience you need them to have.”