We are pleased to inform you that the following paper has been accepted at XR Salento 2026 and will be published in Lecture Notes in Computer Science (LNCS) by Springer:
Schrottenbacher, Patrick; Mehler, Alexander; Bernhardt, Vivienne; Rohe, Leon; Abrami, Giuseppe (2026). ReEmote: Towards Emotion Representation in VR Through Va.Si.Li-Lab. Proceedings of XR Salento 2026. Accepted.
BibTeX
@inproceedings{Schrottenbacher:et:al:2026:a,
author = {Schrottenbacher, Patrick and Mehler, Alexander and Bernhardt, Vivienne
and Rohe, Leon and Abrami, Giuseppe},
title = {ReEmote: Towards Emotion Representation in {VR} Through {Va.Si.Li}-Lab},
booktitle = {Proceedings of XR Salento 2026},
year = {2026},
publisher = {Springer International Publishing},
keywords = {VR, XR, affective computing, virtual humans, emotion detection},
abstract = {Human social interactions are inherently multimodal, shaped not
only by what speakers convey but also by cues such as facial expressions,
posture, and gestures. Together, these channels shape both participants'
perceptions and behaviors, further reinforcing conversational
feedback loops. This multimodal system extends to VR, where avatars
serve as proxies for human interaction, making both visual and
auditory fidelity essential for engagement. To properly utilize
the emotional expression space that virtual environments allow,
we introduce ReEmote. ReEmote extends the capabilities of Va.Si.Li-Lab,
a collaborative, multi-user VR platform built on Ubiq. While Va.Si.Li-Lab
supports user emotional expression through facial and hand tracking,
ReEmote extends this by introducing schema-based emotion mappings
that affect both avatars and their environments. This fosters
immersive, emotionally aware environments that are beneficial
for human and chatbot agent interactions, where human users and
virtual agents share an emotional expression space. By enabling
richer emotional dynamics, ReEmote opens up new ways of designing
affective and engaging virtual experiences. In this paper, we describe
the design choices behind ReEmote and present an evaluation of
the graphical validity of the emotion representation introduced
by ReEmote. Our results indicate that emotions can be validly
represented through avatar facial expressions that users can quickly
identify as Ekman's basic emotions. This opens up several possibilities
for extending emotion-related text-to-speech (TTS) applications
in Extended Reality (XR) with ReEmote. The paper also outlines
use cases for XR-based TTS applications.},
note = {accepted}
}
