At Soul Machines, a company that uses artificial intelligence to create lifelike avatars that respond to human emotion, much of the work could unsettle anyone who fears a coming takeover by our AI-robot overlords.
It’s a company that pretty much lives in the uncanny valley, that space between fake and real that can creep people out, but that’s not usually what happens when people meet BabyX, said Soul Machines founder Mark Sagar.
Instead, he says, when the baby begins to whimper or cry, some respond in human ways, demonstrating what appears to be sympathy similar to the kind they may lavish on a human baby.
“I’ll probably get about 10 or 15 percent of people respond with ‘that’s creepy,’ and others it doesn’t bother them at all. Ultimately it’s about creating an emotional connection and then people jump right into that,” he said.
Sagar is an associate professor at the University of Auckland in New Zealand. He won an Academy Award for constructing lifelike animated faces for movies like King Kong and Spider-Man 2, and that work continued at Sagar's Laboratory for Animate Technologies, which creates facial movement driven not by recordings of actual human movement but by neural networks. There, Sagar and his staff combine fields like affective computing, bioengineering, theoretical neuroscience, and AI.
Soul Machines makes avatars ranging from a cartoon strawberry for a kids’ TV show to Nadia, an avatar voiced by actress Cate Blanchett that serves Australian citizens looking for assistance from the National Disability Insurance Scheme.
Other potential applications range from autonomous characters for VR, education, entertainment, and gaming to virtual assistants and customer service agents.
Within the coming year to 18 months, Soul Machines plans to open a platform for people to create their own avatars, like a more realistic Bitmoji.
Potential applications of its tech are numerous, but Soul Machines chose to create an AI baby because babies are natural “learning machines,” and because the company wants to explore the field of social learning by training AI the same way humans raise children.
“It’s like really looking at the basics of how parents teach children,” Sagar said. “How does that interaction loop work? Because if we can create that with a computer, we’ve actually created a very natural way for people to teach computers.”
It also helps lower performance expectations for the AI — Sagar believes there won’t be anything that approaches adult levels of cognition for a long time.
Avatars are made with biologically inspired cognitive models to make interactions as lifelike as possible. Sagar isn’t as concerned about entering the uncanny valley as he is focused on avatars establishing a deep connection.
“The brain reacts differently to something it perceives to be alive versus something which it perceives to be inanimate,” he said. “If you ever see a realistic eye looking at you, you’re much more likely to respond than if you see a cartoon eye looking at you.”
Computers that can look you in the eye and analyze your feelings
Soul Machines avatars can also respond to human emotion, which they detect through cameras that track facial expressions.
So should a Soul Machines avatar be in a store window or kiosk, it might look you in the eye. Look away and it could respond with body language meant to convey disappointment or sadness as a way to get your attention. A shop owner could also choose a more humorous or intellectual appeal, or portray a personality associated with their brand.
With biometrics, Soul Machines can remember faces and use AI to determine the best response based on previous interactions.
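The interaction loop described above — detect an emotion, pick a response, and remember returning visitors — can be sketched in a few lines of Python. This is a hypothetical illustration, not Soul Machines’ actual system: the emotion labels, behaviors, and the `AvatarSession` class are all assumptions standing in for the company’s proprietary models.

```python
# Hypothetical sketch of an avatar's response loop. A face ID stands in
# for biometric recognition, and the emotion labels are illustrative.

# Maps a detected emotion label to an avatar behavior (assumed labels).
RESPONSES = {
    "attention": "make eye contact and greet",
    "looking_away": "show disappointment to regain attention",
    "smiling": "smile back",
}

class AvatarSession:
    def __init__(self):
        # Biometric memory: face ID -> number of prior interactions.
        self.known_faces = {}

    def respond(self, face_id, emotion):
        visits = self.known_faces.get(face_id, 0)
        self.known_faces[face_id] = visits + 1
        behavior = RESPONSES.get(emotion, "idle")
        # Returning visitors get a tailored response, echoing the idea of
        # using previous interactions to pick the best reaction.
        if visits > 0:
            return f"welcome back; {behavior}"
        return behavior

session = AvatarSession()
print(session.respond("face-42", "attention"))     # first encounter
print(session.respond("face-42", "looking_away"))  # recognized on return
```

In a real system the emotion label would come from a facial-expression model fed by the camera, and the face ID from a biometric matcher; the loop structure, though, is the same.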
In time, affective computing could be used to create personality profiles that follow you around the world the way cookies follow you on the web — across apps, games, virtual reality, and shop windows to develop an understanding of how to best serve your customer service needs (or manipulate you).
Affective computing is technology that can detect human emotion. It’s being used to serve people ads in supermarkets, to improve sales and boardroom performance, and, as Cogito does, to help identify when veterans with PTSD need support.
In case you needed things to get even more futuristic or sci-fi, Sagar said in the future he may consider combining affective computing, avatars, and AI designed to mimic the tone, style, and word usage of people both alive and dead.
It’s an idea that has been part of the popular imagination for some time but is now coming to life in a series of products and projects.
The New Dimensions in Testimony initiative from the University of Southern California interviews Holocaust survivors and creates holograms of them. Combined with natural language processing, a hologram lets a person ask virtually any question about the survivor’s life experience; it’s like a memoir that can talk to you.
Using old chat conversation transcripts, the startup Luka created a neural network chatbot that mimics Roman Mazurenko, a close friend of CEO Eugenia Kuyda, after his death. A near-identical scenario played out on the TV show Black Mirror, in which a woman allows a company to scrape her late husband’s old emails and place the resulting avatar in a lifelike robot so she can be with him again.
It’s a phenomenon Sagar calls the creation of “digital ghosts” and a “virtual spirit world.”
“I’d be very interested in combining Luka-type technology with ours and just seeing the implications of that. I think it’s really a fascinating thing,” he said. “Once you’ve built an avatar, plus you’ve got the transcripts, you essentially created a digital ghost. Essentially we’re creating this kind of virtual spirit world.”
Demos of BabyX 5.0 will begin in late 2017 or 2018, a company spokesperson said.
This post by Khari Johnson originally appeared on VentureBeat.