In a world where we have become increasingly reliant on applications like Zoom and social media platforms for connectivity, the lines are blurring between us and our digital avatars – the icons, profile images, and 3D characters that represent us online. In this climate, it’s easy to see the benefits of a more physical avatar, one that exists beyond the digital realm and can transport our physical presence to a remote location in real time: robotic avatars that would allow us to attend meetings or hug our families from hundreds of miles away, as though we were really there.
These avatars are not just an idea, but a part of our future. Dr. Jacquelyn “Jacki” Morie has been working on the ANA Avatar XPRIZE – our competition to accelerate this technology – as a technical consultant for three years. “Some kids have grown up using digital avatars all their lives, they’re part of a generation that's used to being an avatar in their experiences, but what a digital avatar does not give you is that feeling of physicality,” she explains, “and interacting with someone’s digital avatar does not give the sense that there is a physical being right there with you.”
How do we achieve this physicality? The answer, Jacki explains, lies in how humans achieve a sense of presence anywhere: not just by being in a place, but by perceiving it through the senses. A remote multi-sensory experience is key to feeling like you’re really in the location, and it’s what will distinguish the experience of using physical avatars from digital ones. “We miss so much in our digital worlds. We miss that sense of touch… particularly over the last year.” According to Jacki, this is where physical robotic avatars can give us “so much more.”
Sound and vision
Right now, we are at the very early stages of avatar technology. There is a long road ahead when it comes to making avatars more human-like, for a better sense of human-to-human interaction, and to improving the capabilities of remote vision, hearing, touch, scent, and taste. Avatars currently tend to “look a little sci-fi”, jokes Jacki, “they're metallic and they've got a lot of gears”, meaning there is a lot of exciting potential for improvement.
Sound and vision via avatars are already possible – not so different from cell phones, right? – but some teams in the ANA Avatar XPRIZE are working on a spatialized component, says Jacki: if someone whispers in your avatar’s ear, you hear the whisper up close, as though the person were right beside you.
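To make that “spatialized component” concrete, here is a minimal sketch – in Python, with a toy attenuation model and made-up coordinates, not any team’s actual implementation – of how a voice picked up near the avatar could be given a sense of direction and proximity for the operator:

```python
import math

SPEED_OF_SOUND = 343.0   # metres per second
HEAD_RADIUS = 0.0875     # approximate human head radius in metres

def spatialize(source_xyz, listener_xyz):
    """Return (left_gain, right_gain, itd_seconds) for a sound source
    relative to a listener facing the +y axis.

    A toy model: loudness falls off with distance, and the interaural
    time difference (ITD) is estimated from the source's bearing, which
    is a key cue our ears use to localise a whisper as 'right there'."""
    dx = source_xyz[0] - listener_xyz[0]
    dy = source_xyz[1] - listener_xyz[1]
    dz = source_xyz[2] - listener_xyz[2]
    distance = max(math.sqrt(dx * dx + dy * dy + dz * dz), 0.05)

    # Simple inverse-distance attenuation, clamped so a whisper at the
    # avatar's ear is loud and a voice across the room is quiet.
    gain = min(1.0, 0.2 / distance)

    # Bearing of the source: 0 rad = straight ahead, +pi/2 = hard right.
    azimuth = math.atan2(dx, dy)

    # Woodworth's classic approximation of interaural time difference.
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth + math.sin(azimuth))

    # Pan between ears based on bearing (0 = fully left, 1 = fully right).
    pan = (math.sin(azimuth) + 1.0) / 2.0
    return gain * (1.0 - pan), gain * pan, itd

# A whisper 10 cm to the right of the avatar's head:
print(spatialize((0.1, 0.0, 0.0), (0.0, 0.0, 0.0)))
```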
As for vision, “that could be just as simple as seeing through a video camera attached to the avatar,” Jacki explains, “or it could be technology that gives you a broader field of view, like 180-degree vision, or tech that gives you more sensory awareness of the spatial qualities of the remote location.” In other words, we can already see and hear via avatars, and this will only sharpen over time as audiovisual technologies improve.
Touch
When it comes to the other senses, things get more challenging. Haptics refers to technology that stimulates the senses of touch and motion, and it is crucial to a sense of presence that goes beyond what we can already see or hear through a video call.
The reason this area of technology is so complex is that there are numerous types of touch – a touch on the surface of the hand requires a different mechanism than a hug, Jacki explains, describing the challenges researchers face.
Many of these experiences are “a very hot research topic” at present, Jacki says. Vibrotactile haptics – recreating sensations through vibration and buzzing – is an area researchers have largely mastered, but there is a long way to go, particularly when it comes to conveying the finer details of touch remotely. “Our skin is also very high resolution, if you will, so if I wanted to feel the texture of something, like how soft a sweater is, and my avatar robot touches that soft sweater, the question is: what am I feeling as the person who is operating that robot?”
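As a rough illustration of the gap Jacki describes – coarse buzzing versus the fine “resolution” of skin – here is a minimal Python sketch (the 4x4 fingertip sensor array and the pressure values are hypothetical) that maps pressure readings from a robot fingertip to vibration-motor drive levels in an operator’s glove:

```python
def pressure_to_vibration(pressure_grid, max_pressure=5.0):
    """Map a grid of fingertip pressure readings (newtons, hypothetical
    4x4 tactile array on the robot's finger) to drive levels (0.0-1.0)
    for a matching grid of vibration motors in the operator's glove.

    This is the coarse, vibrotactile end of haptics: each motor simply
    buzzes harder where the robot presses harder.  What it cannot convey
    is fine texture - the sweater-softness problem - because a handful
    of motors is far lower 'resolution' than human skin."""
    drive = []
    for row in pressure_grid:
        drive.append([min(p / max_pressure, 1.0) for p in row])
    return drive

# The robot's fingertip brushes the edge of a soft sweater:
readings = [
    [0.0, 0.1, 0.4, 0.2],
    [0.0, 0.3, 1.2, 0.5],
    [0.0, 0.2, 0.9, 0.3],
    [0.0, 0.0, 0.1, 0.0],
]
for motor_row in pressure_to_vibration(readings):
    print(["%.2f" % level for level in motor_row])
```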
Smell
The next big challenge in avatar tech will be recreating smell and taste remotely. “Those senses are kind of based on chemical signatures. So they're not as easy to digitize as things like sound and vision or even haptics,” says Jacki. While we can create a particular smell, reproducing arbitrary smells on demand is far harder, especially when you consider that most people experience smell differently: “There is no RGB of smell like there is for computer graphics, where you have red, blue, and green, and you can make all colors with that. We can recognize thousands of different smells and each one of those has a unique molecular signature, so it's not like we can just store a few components and make anything out of them.”
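Jacki’s RGB analogy can be made concrete with a small sketch: three numbers are enough to mix any on-screen colour, whereas a scent has to be stored as its own recipe of odour molecules. The compounds and proportions below are purely illustrative, not real lab data:

```python
# Any colour can be encoded as three primary intensities...
def mix_rgb(red, green, blue):
    """Every colour on screen is just a weighted mix of three primaries."""
    return (red, green, blue)

teal = mix_rgb(0, 128, 128)

# ...but there is no comparable small basis for smell.  Each scent is,
# in effect, its own recipe of odour molecules (illustrative values only).
scent_library = {
    "fresh-cut grass": {"cis-3-hexenal": 0.7, "hexanol": 0.3},
    "coffee":          {"2-furfurylthiol": 0.4, "guaiacol": 0.2, "pyrazines": 0.4},
}
# To reproduce a new smell remotely, an avatar would need the right
# molecules on hand - it cannot synthesise them from a few primaries.
```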
That being said, it is currently possible to release scents so that someone can get the aroma of another space or another human being. We can also copy the signature smell of a place with Headspace technology, a way to collect the molecules in a space, analyze them in a lab to see what components are present, and then recreate the scent. The catch is that this is not instant. “So it's not like we can do it right away when we're using an avatar,” Jacki muses, “but it's coming along and it's something that I'm looking forward to, especially because scent adds so much to an experience.”
Taste
Finally, taste may be the hardest sense of all to recreate. “Really, we need the sensors in our tongues to come in contact with something, at least as we understand taste today,” Jacki says. Over the past ten years, researchers have run an array of experiments attempting this with electricity, color, heat, and scent. A combination of gels and electrolytes, for instance, can be used to simulate the intensity of the five basic flavors: sour, sweet, bitter, salty, and umami.
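As a simple illustration of that idea, a taste could be described to an avatar system as intensities of those five basic flavors. Here is a minimal Python sketch with a hypothetical 0-to-1 actuator range – a simplification, not a description of any existing device:

```python
# Hypothetical taste profile: intensities of the five basic flavors,
# each on a 0.0-1.0 scale, which a (purely illustrative) electrode/gel
# actuator might try to reproduce on the tongue.
BASIC_TASTES = ("sour", "sweet", "bitter", "salty", "umami")

def taste_profile(sour=0.0, sweet=0.0, bitter=0.0, salty=0.0, umami=0.0):
    """Bundle flavor intensities and clamp them to the actuator's range."""
    values = (sour, sweet, bitter, salty, umami)
    return {name: max(0.0, min(1.0, v)) for name, v in zip(BASIC_TASTES, values)}

# Something like lemonade: strongly sour and sweet, a hint of bitterness.
print(taste_profile(sour=0.8, sweet=0.6, bitter=0.1))
```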
“Many studies have had a neural focus. For instance, when you bite down on something that squirts into your mouth and that is then coupled with smells – what's in your mouth takes on a different meaning because of the smell associated with it,” explains Jacki. “There's also research into how we can stimulate the brain itself to recreate taste in our mouth, whereby our brain understands that we're tasting something even if we haven't put a physical component into the mouth.” Ultimately, though, it's difficult to digitize taste, and it's going to take a while: “That may be the final thing that we have to overcome in the future to have a fully sensory avatar being.”
Coming to your senses… soon
If and when experts can nail this technology, the human benefits are manifold. “Overall, I think recreating these senses is important for the future because we're not going to be wanting to settle for a partial experience,” says Jacki. “If I'm going to choose between a virtual experience of a beach that lets me feel the sand on my toes and the breeze on my skin and one that is a video window into the space, I'm going to pick the more full sensory experience.” This will drastically improve not just communication but also travel and cultural exchange. Imagine, say, being able to visit Egypt, walk into a pyramid, and feel the atmosphere with all of your senses. In helping us achieve this, avatar technology will open up travel for those who currently do not have access, as well as giving people with disabilities the opportunity to reach places they might not otherwise.
So how far away is integrating all five senses into avatar technology? “I think we’re a few years off,” Jacki concludes. “I certainly would like to see smell incorporated in the next ten years. It may have some research issues that keep it from being widely distributed, but I think we can get there. Taste, I think, might need some brain hacking and I don't think we're going to see that in my lifetime, but if we had smell, I'd be happy.”
In the meantime, as for sound, vision, and touch, she says: “We're excited to see how the teams in the ANA Avatar XPRIZE incorporate these different mechanisms into their avatars – it's going to be an exciting show of what's possible and where the technology is headed.”