PEER-REVIEWED CONFERENCE PUBLICATIONS

Wizard of Props: Mixed Reality Prototyping with Physical Props to Design Responsive Environments

Driven by the vision of future responsive environments, where everyday surroundings can perceive human behaviors and respond through intelligent robotic actuation, we propose Wizard of Props (WoP): a human-centered design workflow for creating expressive, implicit, and meaningful interactions. This collaborative experience prototyping approach integrates full-scale physical props with Mixed Reality (MR) to support ideation, prototyping, and rapid testing of responsive environments. We present two design explorations that showcase our investigations of diverse design solutions based on varying technology resources, contextual considerations, and target audiences. Design Exploration One focuses on mixed environment building, where we observe fluid prototyping methods. In Design Exploration Two, we explore how novice designers approach WoP, and illustrate their design ideas and behaviors. Our findings reveal that WoP complements conventional design methods, enabling intuitive bodystorming, supporting flexible prototyping fidelity, and fostering expressive environment-human interactions through in-situ improvisational performance.

Yuzhen Zhang*, Ruixiang Han*, Ran Zhou*, Peter Gyory, Clement Zheng, Patrick C. Shih, Ellen Yi-Luen Do, Malte F. Jung, Wendy Ju, and Daniel Leithinger. 2024. Wizard of Props: Mixed Reality Prototyping with Physical Props to Design Responsive Environments. In Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '24). Association for Computing Machinery, New York, NY, USA, Article 47, 1–15. https://doi.org/10.1145/3623509.3633395 (*equal contribution)

[PDF] [Presentation] [Demo1] [Demo2]


TactorBots: A Haptic Design Toolkit for Out-of-lab Exploration of Emotional Robotic Touch

Emerging research has demonstrated the viability of emotional communication through haptic technology inspired by interpersonal touch. However, the meaning-making of artificial touch remains ambiguous and contextual. We see this ambiguity, caused by robotic touch’s "otherness," as an opportunity for exploring alternatives. To empower emotional haptic design in longitudinal out-of-lab exploration, we devise TactorBots, a design toolkit consisting of eight wearable hardware modules, controlled by a web-based application, that render robotic touch gestures. We deployed TactorBots to 13 designers and researchers to validate its functionality, characterize its design experience, and analyze what, how, and why alternative perceptions, practices, contexts, and metaphors emerged. Based on our experiences, we provide suggestions for designing future toolkits and field studies. Reflecting on the findings, we derive design implications for further enhancing the ambiguity and shifting designers' mindsets to expand the design space.

Ran Zhou, Zachary Schwemler, Akshay Baweja, Harpreet Sareen, Casey Lee Hunt, and Daniel Leithinger. 2023. TactorBots: A Haptic Design Toolkit for Out-of-lab Exploration of Emotional Robotic Touch. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23), April 23–28, 2023, Hamburg, Germany. ACM, New York, NY, USA, 19 pages. https://doi.org/10.1145/3544548.3580799

[PDF] [Presentation] [Teaser] [Video Figure]


EmotiTactor: Exploring How Designers Approach Emotional Robotic Touch

In this work, we bring designers into the exploration of emotional robotic touch, discuss their design decisions, and reflect on their insights. Prior psychology findings show that humans can communicate distinct emotions solely through touch. We hypothesize that similar effects might also apply to robotic touch. To enable designers to easily generate and modify various types of affective touch for conveying emotions (e.g., anger, happiness), we developed a platform consisting of a robotic tactor interface and a software design tool. In an elicitation study with eleven interaction designers, we discovered common patterns in the tactile sensations they generated for each emotion. We also illustrate the strategies, behaviors, and metaphors that the designers deployed in the design process. Our findings reveal that the “otherness” of robotic touch broadens the design possibilities of emotional communication beyond mimicking interpersonal touch.

Ran Zhou, Harpreet Sareen, Yufei Zhang, and Daniel Leithinger. 2022. EmotiTactor: Exploring How Designers Approach Emotional Robotic Touch. In Designing Interactive Systems Conference (DIS '22). Association for Computing Machinery, New York, NY, USA, 1330–1344. https://doi.org/10.1145/3532106.3533487. Best Pictorial Award Honorable Mention.

[PDF] [Presentation] [Teaser] [Demo]


PEER-REVIEWED JOURNAL PUBLICATIONS

Cultivating Visualization Literacy for Children Through Curiosity and Play

Fostering data visualization literacy (DVL) as part of childhood education could lead to a more data-literate society. However, most work in DVL for children relies on a formal educational context (i.e., a teacher-led approach) that limits children's engagement with data to classroom-based environments and, consequently, children's ability to ask questions about and explore data on topics they find personally meaningful. We explore how a curiosity-driven, child-led approach can provide more agency to children when they are authoring data visualizations. This paper explores how informal learning through crafting physicalizations with play and curiosity may foster increased literacy and engagement with data. Employing a constructionist approach, we designed a do-it-yourself toolkit made of everyday materials (e.g., paper, cardboard, mirrors) that enables children to create, customize, and personalize three different interactive visualizations (bar, line, and pie charts). We used the toolkit as a design probe in a series of in-person workshops with 5 children (ages 6 to 11) and interviews with 5 educators. Our observations reveal that the toolkit helped children creatively engage and interact with visualizations. Children with prior knowledge of data visualization reported that the toolkit served as more of an authoring tool they envision using in their daily lives, while children with little to no experience found it an engaging introduction to data visualization. Our study demonstrates the potential of using the constructionist approach to cultivate children's DVL through curiosity and play.

S. Sandra Bae, Rishi Vanukuru, Ruhan Yang, Peter Gyory, Ran Zhou, Ellen Yi-Luen Do, and Danielle Albers Szafir. 2022. Cultivating Visualization Literacy for Children Through Curiosity and Play. IEEE Transactions on Visualization and Computer Graphics (IEEE VIS 2022). https://doi.org/10.1109/TVCG.2022.3209442

[PDF] [Video]


PEER-REVIEWED DEMO AND WORK-IN-PROGRESS PUBLICATIONS

Demonstrating TactorBots: A Haptic Design Toolkit for Exploration of Emotional Robotic Touch

Emerging research has demonstrated the viability of emotional communication through haptic technology inspired by interpersonal touch. However, the meaning-making of artificial touch remains ambiguous and contextual. We see this ambiguity, caused by robotic touch’s "otherness," as an opportunity for exploring alternatives. To empower designers to explore emotional robotic touch, we devise TactorBots. It contains eight plug-and-play wearable tactor modules, driven by servo motors, that render a series of social touch gestures. Our specialized web GUI allows easy control, modification, and storage of tactile patterns to support fast prototyping. Treating emotional haptics as a "design canvas" with broad opportunities, TactorBots serves as the first "playground" for designers to try out various tactile sensations, explore the nuanced connection between touch and emotion, and spark creative imagination.

Ran Zhou, Zachary Schwemler, Akshay Baweja, Harpreet Sareen, Casey Lee Hunt, and Daniel Leithinger. 2023. Demonstrating TactorBots: A Haptic Design Toolkit for Exploration of Emotional Robotic Touch. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems (CHI EA ’23), April 23–28, 2023, Hamburg, Germany. ACM, New York, NY, USA, 5 pages. https://doi.org/10.1145/3544549.3583897. Best Demo Award Runner-Up.

[PDF] [Video]


HexTouch: A Wearable Haptic Robot for Complementary Interactions to Companion Agents in Virtual Reality

We propose a forearm-mounted robot that performs complementary touches in relation to the behaviors of a companion agent in virtual reality (VR). The robot consists of a series of tactors driven by servo motors that render specific tactile patterns to communicate primary emotions (fear, happiness, disgust, anger, and sympathy) and other notification cues. We showcase this through a VR game with physical-virtual agent interactions that facilitate the player-companion relationship and increase user immersion in specific scenarios. The player collaborates with the agent to complete a mission while receiving affective haptic cues with the potential to enhance sociality in the virtual world.

Ran Zhou, Yanzhe Wu, and Harpreet Sareen. 2020. HexTouch: A Wearable Haptic Robot for Complementary Interactions to Companion Agents in Virtual Reality. In SIGGRAPH Asia 2020 Emerging Technologies (SA '20). Association for Computing Machinery, New York, NY, USA, Article 8, 1–2. https://doi.org/10.1145/3415255.3422881

[PDF] [Video]


HexTouch: Affective Robot Touch for Complementary Interactions to Companion Agents in Virtual Reality

There is a growing need for social interaction in Virtual Reality (VR). Current social VR applications enable human-agent or interpersonal communication, usually by means of visual and audio cues. Touch, which is also an essential channel for affective communication, has not received as much attention. To address this, we introduce HexTouch, a forearm-mounted robot that performs touch behaviors in sync with a companion agent's actions to complement visual and auditory feedback in virtual reality. The robot consists of four robotic tactors driven by servo motors, which render specific tactile patterns to communicate primary emotions (fear, happiness, disgust, anger, and sympathy). We demonstrate HexTouch through a VR game with physical-virtual agent interactions that facilitate the player-companion relationship and increase the immersion of the VR experience. The player receives affective haptic cues while collaborating with the agent to complete a mission in the game. This multisensory system for affective communication also has the potential to enhance sociality in the virtual world.

Ran Zhou, Yanzhe Wu, and Harpreet Sareen. 2020. HexTouch: Affective Robot Touch for Complementary Interactions to Companion Agents in Virtual Reality. In 26th ACM Symposium on Virtual Reality Software and Technology (VRST '20). Association for Computing Machinery, New York, NY, USA. https://doi.org/10.1145/3385956.3422100. Best Demo Award.

[PDF] [Video]


EmotiTactor: Emotional Expression of Robotic Physical Contact

The study of affective communication through robots has primarily focused on facial expression and vocal interaction. However, communication between robots and humans can be significantly enriched through haptics. To improve the relationship between robotic artifacts and humans, we posed a design question: what if robots had the ability to express their emotions to humans via physical touch? We created a robotic tactor (tactile organ) interface that performs haptic stimulation on the forearm. We modified the timing, movement, and touch of the tactors on the forearm to create a palette of primary emotions. Through a preliminary case study, our results indicate varied success in individuals' ability to decode the primary emotions through robotic touch alone.

Ran Zhou and Harpreet Sareen. 2020. EmotiTactor: Emotional Expression of Robotic Physical Contact. In Companion Publication of the 2020 ACM Designing Interactive Systems Conference (DIS ’20 Companion). Association for Computing Machinery, New York, NY, USA, 151–156. https://doi.org/10.1145/3393914.3395891

[PDF] [Video]


WORKSHOP ORGANIZATION

Electro-actuated Materials for Future Haptic Interfaces

Electro-actuated materials (EAMs) have received wide attention within materials science and soft robotics for their ability to dynamically change physical properties, such as shape and stiffness, in response to electrical stimuli. While researchers have begun exploring the haptic characteristics of EAMs, their integration into Human-Computer Interaction (HCI) faces challenges, including limited commercial availability and a lack of interdisciplinary knowledge exchange. This workshop specifically focuses on electrostatic (ES), soft electrohydraulic (SEH), and electroosmotic (EO) actuators. By bringing together researchers in the field, we aim to facilitate the exchange of findings, techniques, fabrication practices, and tacit knowledge within the HCI community. The workshop combines interactive demos, focused discussions, and hands-on ideation, providing a platform to explore the haptic potential of EAMs, identify key challenges and opportunities, and envision how these programmable materials can unlock new haptic interactions and interfaces.

Daniel Leithinger, Ran Zhou, Eric Acome, Ahad Mujtaba Rauf, Teng Han, Craig Shultz, and Joe Mullenbach. 2023. Electro-actuated Materials for Future Haptic Interfaces. In The 36th Annual ACM Symposium on User Interface Software and Technology (UIST ’23 Adjunct), October 29–November 01, 2023, San Francisco, CA, USA. ACM, New York, NY, USA, 3 pages. https://doi.org/10.1145/3586182.3617434

[PDF] [Website]