I'm Vivian, a PhD Student at Carnegie Mellon University.

I'm a third-year PhD student in the Robotics Institute, doing research with the Future Interfaces Group, advised by Prof. Chris Harrison. My research builds on my background in embedded systems, sensing, and computer vision, and currently focuses on haptics and interaction.

I am a Swartz Entrepreneurial Fellow, received an NSF GRFP Honorable Mention, and have won two Best Paper Awards at premier venues in human-computer interaction.

In my free time you can find me taking photos and chasing plastic!


Selected Research

V Shen, C Shultz, C Harrison

CHI 2022

🏆 Best Paper Award

Mouth Haptics in VR using a Headset Ultrasound Phased Array

Today’s consumer virtual reality systems offer limited haptic feedback via vibration motors in handheld controllers. Rendering haptics to other parts of the body is an open challenge, especially in a practical and consumer-friendly manner. The mouth is of particular interest, as it is a close second in tactile sensitivity to the fingertips. In this research, we developed a thin, compact, beamforming array of ultrasonic transducers, which can render haptic effects onto the mouth. Importantly, all components are integrated into the VR headset, meaning the user does not need to wear an additional accessory or place any external infrastructure in their room. Our haptic sensations can be felt on the lips, teeth, and tongue, which can be incorporated into new and interesting VR experiences.
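The core idea behind focusing a phased array is to drive each transducer with a phase offset so that all wavefronts arrive at the target point simultaneously. A minimal sketch of that phase computation (not the paper's actual implementation; the array geometry, 40 kHz frequency, and function names here are illustrative assumptions):

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
FREQ = 40_000.0          # Hz; a common ultrasonic transducer frequency (assumption)

def focus_phases(elements, focal_point):
    """Per-element phase offsets (radians) that focus a planar
    transducer array onto a single point in space."""
    wavelength = SPEED_OF_SOUND / FREQ
    dists = [math.dist(e, focal_point) for e in elements]
    ref = max(dists)  # reference: the element farthest from the focus
    # Elements closer to the focus are delayed so every wavefront
    # arrives in phase; convert each path difference to a phase offset.
    return [(2 * math.pi * (ref - d) / wavelength) % (2 * math.pi)
            for d in dists]

# Hypothetical 4x4 array with 10 mm pitch, focusing 5 cm above its center
elements = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
phases = focus_phases(elements, (0.015, 0.015, 0.05))
```

Steering the focal point over time (e.g., tracing a path across the lips) just means recomputing these phases each frame for a moving target.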

K Ahuja, V Shen, C Fang, N Riopelle,

A Kong, C Harrison

CHI 2022

ControllerPose: Inside-Out Body Capture with VR Controller Cameras

We present a new and practical method for capturing user body pose in virtual reality experiences: integrating cameras into handheld controllers, where batteries, computation, and wireless communication already exist. Because the hands operate in front of the user during many VR interactions, our controller-borne cameras can capture a superior view of the body for digitization. We developed a series of demo applications illustrating the potential of our approach, including more leg-centric interactions such as balancing games and kicking soccer balls.

V Shen, J Spann, C Harrison

SUI 2021

🏆 Best Paper Award

FarOut Touch: Extending the Range of ad hoc Touch Sensing with Depth Cameras

The ability to co-opt everyday surfaces for touch interactivity has been an area of HCI research for several decades. In the past, advances in depth sensors and computer vision led to step-function improvements in ad hoc touch tracking. However, progress has slowed in recent years. We surveyed the literature and found that the very best ad hoc touch sensing systems are able to operate at ranges up to around 1.5 m. This limited range means that sensors must be carefully positioned in an environment to enable specific surfaces for interaction. Furthermore, the size of the interactive area is more table-scale than room-scale. In this research, we set ourselves the goal of doubling the sensing range of the current state-of-the-art system.
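The usual starting point for depth-camera touch sensing is background subtraction: capture a depth model of the bare surface, then flag pixels where something sits within a small height band just above it. A minimal sketch of that idea (the thresholds and function name are illustrative assumptions, not the paper's pipeline):

```python
import numpy as np

TOUCH_MIN_MM = 5    # finger must rise at least this far above the surface (assumption)
TOUCH_MAX_MM = 25   # ...but no higher than this to still count as a touch (assumption)

def touch_mask(depth_frame, background):
    """Boolean mask of pixels where something hovers just above the
    surface captured in the background depth model (values in mm)."""
    # Signed height above the surface; cast to int32 so the uint16
    # subtraction cannot wrap around.
    height = background.astype(np.int32) - depth_frame.astype(np.int32)
    return (height >= TOUCH_MIN_MM) & (height <= TOUCH_MAX_MM)

# Flat surface at 1500 mm with a simulated fingertip 10 mm above it
bg = np.full((8, 8), 1500, dtype=np.uint16)
frame = bg.copy()
frame[3:5, 3:5] = 1490
mask = touch_mask(frame, bg)
```

At longer ranges the challenge is that per-pixel depth noise grows toward the size of this height band, which is why range extension requires more than simply moving the camera back.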

Get in touch at vhshen@cmu.edu