TactorBots Prototypes

Early Prototypes and Formative Study to Find Design Heuristics

Before developing the hardware toolkit, we wanted to investigate the design heuristics of gesture-based robotic touch. Building on EmotiTactor, we developed robotic tactors that perform the target social gestures. Whereas EmotiTactor fixed the tactors' locations on an integrated device without a principled rationale, we proposed studying the perception and placement of robotic gestures to inform the toolkit design. The table-grounded prototypes were designed with 1 DoF, each driven by a micro servo (AGFrc B11DLS), as these are low-cost and easy to control precisely. We defined sample parameter settings for motion, rhythm, and intensity for each gesture and conducted an identification study. We also developed a testing interface with rails and sliders to explore each tactor's appropriate placement. We invited 12 participants (7 female, 4 male, 1 other) aged 22 to 30 (M = 24.83, SD = 2.03).

(a-h) Design of Tactor modules (right side) that perform each social touch gesture (left side); (i) Testing interface with sliders and rails

Line graph showing how the rotational angle changes over time for each gesture, visualizing the motion functions
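As a rough illustration, a 1-DoF gesture's motion function can be expressed as a parameterized angle-over-time profile that the servo loop samples. The waveforms, frequencies, and amplitudes below are hypothetical placeholders, not the study's actual parameter settings:

```python
import math

# Hypothetical motion functions mapping time (s) to a servo angle (degrees).
# All parameter values here are illustrative, not the study's settings.
def gesture_angle(t, waveform="sine", freq_hz=1.0, amplitude_deg=30.0,
                  center_deg=90.0):
    """Return the target angle of a 1-DoF tactor servo at time t."""
    phase = 2 * math.pi * freq_hz * t
    if waveform == "sine":       # smooth oscillation, e.g. rub or shake
        offset = math.sin(phase)
    elif waveform == "square":   # abrupt strikes, e.g. tap or hit
        offset = 1.0 if math.sin(phase) >= 0 else -1.0
    elif waveform == "ramp":     # press-and-hold, e.g. push or squeeze
        offset = min(1.0, t * freq_hz) * 2 - 1
    else:
        raise ValueError(f"unknown waveform: {waveform}")
    return center_deg + amplitude_deg * offset

# Example: sample a 15 s "rub"-like trajectory at 50 Hz for the servo loop.
trajectory = [gesture_angle(i / 50, "sine", freq_hz=2.0, amplitude_deg=20.0)
              for i in range(15 * 50)]
```

Varying the waveform changes the perceived motion, the frequency changes the rhythm, and the amplitude changes the intensity, which matches the three parameter dimensions described above.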

 

Study Procedure

Gesture Identification Session

On arrival, the participant sat at a table with the testing setup and received the study instructions. The study protocol was as follows: (1) Demo of studied gestures: the researcher performed all the gestures with their hand and forearm to convey the gesture definitions. (2) The participant was then blindfolded, wore noise-canceling headphones, and placed their left forearm on the testing interface. (3) Familiarization: the researcher ran through all gestures at a fast pace (5 s/gesture) on the participant's forearm. (4) The researcher ran the nine predefined touch behaviors in random order, manually swapping the tactor module and triggering its function. Each behavior was performed for 15 s. (5) After each round, the participant chose one of the nine gestures on the response sheet.

Placement Investigation Session

After finishing the first session and saving the results, we moved to the second session. The study protocol was: (1) Following the same random order as the previous session, the participant was introduced to the target gesture for each tactor. (2) The tactor was activated, and the participant was asked to explore and report where they would like to receive this stimulation along their forearm. They could place the tactor on either side of the slider and slide it along the rail. (3) The researcher documented the results by measuring the position of the motor shaft.

 

Results

Gesture Identification Session

Overall, participants identified the touch gestures with high accuracy (M = 83.4%, SD = 16%). Gestures such as push, rub, shake, squeeze, and stroke were decoded with over 90% accuracy. In the confusion matrix, confusion occurred mainly between pat and hit and between pat and tap, likely because of the gestures' similar features; pat and hit also share the same tactor. Furthermore, participants showed interest and amazement after finishing the study. One participant said: “The sensations were so natural and easy to understand. They were really distinguished from each other. It is hard to imagine that they were all rendered by one micro servo!” Encouraged by these results and feedback, we decided to continue using the 1-DoF servo-driven strategy and 3D-printed plastic to build our robotic haptic toolkit.
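Identification results of this kind are typically summarized as a confusion matrix over (presented, chosen) pairs, from which per-gesture and overall accuracy follow. A minimal sketch, using toy responses rather than the study's data:

```python
from collections import defaultdict

def confusion_matrix(trials):
    """Count how often each presented gesture was identified as each label.

    trials: iterable of (presented, chosen) gesture-name pairs.
    Returns counts[presented][chosen].
    """
    counts = defaultdict(lambda: defaultdict(int))
    for presented, chosen in trials:
        counts[presented][chosen] += 1
    return counts

def per_gesture_accuracy(counts):
    """Fraction of trials in which each gesture was correctly identified."""
    return {g: row[g] / sum(row.values()) for g, row in counts.items()}

# Toy responses (not the study data): pat is sometimes confused with tap/hit.
trials = [("pat", "pat"), ("pat", "tap"),
          ("rub", "rub"), ("rub", "rub"),
          ("hit", "hit"), ("hit", "pat")]
cm = confusion_matrix(trials)
acc = per_gesture_accuracy(cm)   # e.g. acc["pat"] == 0.5, acc["rub"] == 1.0
overall = sum(p == c for p, c in trials) / len(trials)
```

Off-diagonal cells of `cm` directly expose confusion pairs such as pat/tap described above.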

Placement Investigation Session

The figure below shows the appropriate touch contact locations reported for each robotic gesture. Some participants provided a specific contact spot, while others provided a range. We noticed that some people did not care about the placement of certain tactors; for instance, after sliding them from wrist to elbow, some told us that any forearm area was acceptable (see the large highlighted ranges in the figure). Others were very particular, sliding back and forth to discern differences between touch areas and providing the single point they felt was most appropriate. In the figure, the darker the orange pattern, the more often that location was chosen; the recommended range derived from the results is highlighted with the black stroke pattern.

Our results indicated that participants preferred to place the tactors about 12-17 cm from the wrist. They commented that the muscle there (brachioradialis) is strong, which made them feel more secure and comfortable being touched, especially for intense gestures such as push. Moreover, the thicker muscle layer allowed greater skin deformation under normal force, producing a more perceptible sensation.

Overall, we found many overlaps in contact-range preferences, revealing that integrating all tactors into a single device with a fixed position and order could be challenging. Moreover, the table-grounded device restricted arm movement, leading to numbness and fatigue. We therefore designed the final TactorBots in a wearable, modular form factor that allows flexible body placement and limb movement. Modularity also enables a partial, on-demand combination of gestures for different applications.
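One way to turn per-participant placement reports into a recommended range is a simple coverage sweep along the forearm. The sketch below assumes each report is an interval in cm from the wrist (a single spot being a zero-width interval); the sample values are hypothetical, not the study's measurements:

```python
def overlap_counts(intervals, step_cm=1.0, arm_len_cm=25.0):
    """At each step along the forearm (wrist = 0 cm), count how many
    reported placement ranges cover that position."""
    positions = [i * step_cm for i in range(int(arm_len_cm / step_cm) + 1)]
    return {p: sum(lo <= p <= hi for lo, hi in intervals) for p in positions}

def recommended_range(counts):
    """Span of the positions covered by the most participants."""
    peak = max(counts.values())
    covered = [p for p, c in counts.items() if c == peak]
    return min(covered), max(covered)

# Hypothetical reports (cm from wrist); a point report is a zero-width range.
reports = [(10, 18), (12, 20), (14, 14), (5, 25)]
counts = overlap_counts(reports)
lo, hi = recommended_range(counts)   # densest region of overlap
```

The darker regions in the figure correspond to higher counts in this sweep, and the black-stroked recommended range corresponds to the peak-coverage span.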