A new way for people to receive tactile feedback in virtual reality uses the lips.
Lips, together with the gums and tongue, are second only to the fingertips in nerve density.
The new system uses airborne ultrasound waves to create sensations on the lips, teeth, and tongue, and is small and light enough to attach to the bottom of virtual reality (VR) goggles.
Imagine a VR world that has a drinking fountain in it. “Every time you lean down and think you should be feeling the water, all of a sudden you feel a stream of water across your lips,” says Vivian Shen, a second-year PhD student in the Robotics Institute at Carnegie Mellon University. “It’s cool. It makes the experience much more immersive.”
Working in the Future Interfaces Group (FIG), Shen and Craig Shultz, a postdoctoral fellow in the Human-Computer Interaction Institute (HCII), have also used the system to create such haptic effects as raindrops, mud splatter, and crawling bugs.
Shen and Shultz developed the system with Chris Harrison, associate professor in the HCII and director of the FIG Lab. Shen will present the team’s research on Monday, May 2, at the Association for Computing Machinery’s Conference on Human Factors in Computing Systems (CHI 2022).
Though the mouth is known for its sensitivity, researchers have had difficulty finding a means for rendering haptic effects on it, Shen says. VR users don’t like to put things in or cover their mouths. Some devices are large and unwieldy. One recent effort employed a tiny robotic arm that could flick a feather across the lips or spray them with water, an approach Shen notes would not be practical for widespread use.
Researchers have, however, used ultrasound to deliver sensations to the hands, creating haptic effects such as virtual buttons that users can feel themselves pushing. Because ultrasound waves can travel through the air, though only for short distances, they seemed a possible solution for mouth-based haptics, Shen says.
Most people are familiar with medical ultrasound imaging and probes, though these devices are not known for vibrating or otherwise stimulating the skin. It is nevertheless possible to use these acoustic waves, whose frequencies lie above the range of human hearing, to create sensations by focusing them into a small area.
This effect is achieved by using multiple ultrasound-generating modules, or transducers. Like any sort of wave, the ultrasound waves produced by one transducer can interfere with those of other transducers—constructively, to amplify the waves, or destructively, to nullify them. “If you time the firing of the transducers just right you can get them all to constructively interfere at one point in space,” Shen says. In this case, they targeted those points of peak amplitude on the lips, teeth, and tongue. Subtly modulating the ultrasonic output also heightened the effect.
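As a rough illustration of that timing principle, the sketch below computes the per-transducer firing delays, and the equivalent phase offsets, that make the waves from every element arrive at a chosen focal point at the same instant. The 40 kHz carrier frequency, the flat 8 x 8 grid, and the 10 mm element spacing are assumptions made for the example, not details from the paper.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 C
CARRIER_FREQ = 40_000.0  # Hz; a typical airborne-ultrasound carrier (assumed)

def focus_delays(element_positions, focal_point):
    """Firing delay for each transducer so every wavefront reaches the
    focal point at the same instant and interferes constructively.

    element_positions : (N, 3) array of transducer positions in metres
    focal_point       : (3,) target point in metres
    returns           : (N,) delays in seconds (the farthest element fires first)
    """
    distances = np.linalg.norm(element_positions - focal_point, axis=1)
    time_of_flight = distances / SPEED_OF_SOUND
    return time_of_flight.max() - time_of_flight

def focus_phases(element_positions, focal_point):
    """Equivalent per-element phase delays (radians) when every element is
    driven with the same continuous sinusoid at CARRIER_FREQ."""
    delays = focus_delays(element_positions, focal_point)
    return (2 * np.pi * CARRIER_FREQ * delays) % (2 * np.pi)

# Illustration only: an 8 x 8 flat grid with 10 mm pitch, focusing on a
# point 5 cm in front of the array centre.
xs, ys = np.meshgrid(np.arange(8), np.arange(8))
elements = np.stack([xs.ravel(), ys.ravel(), np.zeros(64)], axis=1) * 0.01
elements -= elements.mean(axis=0)          # centre the array at the origin
target = np.array([0.0, 0.0, 0.05])        # 5 cm above the array plane
print(focus_phases(elements, target)[:4])  # phase delays for the first elements
```

Driving each element with its computed delay makes the waves add up constructively only at the target, which is what lets the array place a sharp point of pressure on the lips, teeth, or tongue.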
But the sensations are limited primarily to the hands and mouth, Shen says. “You can’t really feel it elsewhere; our forearms, our torso—those areas lack enough of the nerve mechanoreceptors you need to feel the sensation,” she adds.
The device is a phased array of 64 tiny transducers. The flat, half-moon-shaped array is attached to the bottom of VR goggles so it rests just above the mouth.
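The article does not give the array’s exact geometry, but a flat, half-moon-shaped layout of 64 elements could be approximated, purely for illustration, as a few concentric semicircular arcs. The positions from a sketch like the one below would feed the focusing routine shown earlier.

```python
import numpy as np

def half_moon_layout(n_elements=64, radius=0.035, rows=4):
    """A hypothetical flat, half-moon-shaped layout: `rows` concentric
    semicircular arcs of transducers spanning 180 degrees in the z = 0
    plane. The real array's geometry is not described in this article."""
    per_row = n_elements // rows
    positions = []
    for r in range(rows):
        arc_radius = radius * (r + 1) / rows
        for angle in np.linspace(0.0, np.pi, per_row):
            positions.append([arc_radius * np.cos(angle),
                              arc_radius * np.sin(angle), 0.0])
    return np.array(positions)

elements = half_moon_layout()  # (64, 3) element positions in metres
# These positions would be passed to the focusing routine above, e.g.
# focus_delays(elements, focal_point), to aim the array at the lips.
```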
The haptic effects consist of point impulses, swipes, and persistent vibrations targeted on the mouth and synchronized with visual images. The team evaluated a variety of effects with 16 volunteers, all of whom reported that the mouth haptics enhanced their VR experience.
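One plausible way to render two of those effect types, again as a sketch rather than the authors’ implementation: a swipe can be produced by re-aiming the focal point along a path frame by frame, and a persistent vibration by holding the focus still while amplitude-modulating the ultrasound output at a low, tactile frequency. The 200 Hz update rate, the 200 Hz modulation frequency, and the coordinates below are illustrative assumptions.

```python
import numpy as np

def swipe_trajectory(start, end, duration, update_rate=200):
    """Focal-point positions for a 'swipe': the focus is re-aimed at each
    update so the point of peak pressure travels from `start` to `end`
    (both (3,) arrays, metres) over `duration` seconds."""
    steps = max(2, int(duration * update_rate))
    t = np.linspace(0.0, 1.0, steps)[:, None]
    return (1.0 - t) * start + t * end

def vibration_envelope(duration, mod_freq=200.0, sample_rate=40_000):
    """Amplitude envelope for a persistent vibration at a fixed focal point.
    The carrier itself is far too fast for skin to follow, so the output is
    modulated at a slow, tactile frequency that mechanoreceptors can sense."""
    t = np.arange(int(duration * sample_rate)) / sample_rate
    return 0.5 * (1.0 + np.sin(2.0 * np.pi * mod_freq * t))

# Example: a 0.5 s swipe across the lips (coordinates are illustrative).
path = swipe_trajectory(np.array([-0.02, 0.0, 0.05]),
                        np.array([0.02, 0.0, 0.05]), duration=0.5)
# Each position in `path` would be handed to the focusing routine above to
# recompute the per-transducer delays for that frame.
```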
“Without haptics, it was difficult to tell when things were supposed to be touching my face,” one volunteer said.
Not all effects were equally useful. Mouth-specific effects, such as brushing teeth, feeling raindrops from an open window, or feeling a bug walk across the lips, were the most successful. Others, such as the feel of walking through cobwebs, proved interesting, but because people expected to feel those sensations over a large part of the body and not just the mouth, the effect was less powerful.
Even drinking from a water fountain, which is mouth-oriented, could be a little disorienting, Shen observed. “It’s weird because you feel the water but it’s not wet,” she says.
“Our phased array strikes the balance between being really expressive and being affordable,” Shen says. Further work could add new haptic effects to the catalog and make the device smaller and lighter.
Source: Carnegie Mellon University