The following new publication has been accepted:
- [1] A Multimodal Data Model for Simulation-based Learning with Va.Si.Li-Lab
Total: 1
A. Mehler, M. Bagci, A. Henlein, G. Abrami, C. Spiekermann, P. Schrottenbacher, M. Konca, A. Lücking, J. Engel, M. Quintino, J. Schreiber, K. Saukel, and O. Zlatkin-Troitschanskaia, “A Multimodal Data Model for Simulation-Based Learning with Va.Si.Li-Lab,” in Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management, Cham, 2023, pp. 539-565.
Abstract: Simulation-based learning is a method in which learners learn to master real-life scenarios and tasks from simulated application contexts. It is particularly suitable for the use of VR technologies, as these allow immersive experiences of the targeted scenarios. VR methods are also relevant for studies on online learning, especially in groups, as they provide access to a variety of multimodal learning and interaction data. However, VR leads to a trade-off between technological conditions of the observability of such data and the openness of learner behavior. We present Va.Si.Li-Lab, a VR-Lab for Simulation-based Learning developed to address this trade-off. Va.Si.Li-Lab uses a graph-theoretical model based on hypergraphs to represent the data diversity of multimodal learning and interaction. We develop this data model in relation to mono- and multimodal, intra- and interpersonal data and interleave it with ISO-Space to describe distributed multiple documents from the perspective of their interactive generation. The paper adds three use cases to motivate the broad applicability of Va.Si.Li-Lab and its data model.
@inproceedings{Mehler:et:al:2023:a,
  abstract  = {Simulation-based learning is a method in which learners learn to master real-life scenarios and tasks from simulated application contexts. It is particularly suitable for the use of VR technologies, as these allow immersive experiences of the targeted scenarios. VR methods are also relevant for studies on online learning, especially in groups, as they provide access to a variety of multimodal learning and interaction data. However, VR leads to a trade-off between technological conditions of the observability of such data and the openness of learner behavior. We present Va.Si.Li-Lab, a VR-Lab for Simulation-based Learning developed to address this trade-off. Va.Si.Li-Lab uses a graph-theoretical model based on hypergraphs to represent the data diversity of multimodal learning and interaction. We develop this data model in relation to mono- and multimodal, intra- and interpersonal data and interleave it with ISO-Space to describe distributed multiple documents from the perspective of their interactive generation. The paper adds three use cases to motivate the broad applicability of Va.Si.Li-Lab and its data model.},
  address   = {Cham},
  author    = {Mehler, Alexander and Bagci, Mevl{\"u}t and Henlein, Alexander and Abrami, Giuseppe and Spiekermann, Christian and Schrottenbacher, Patrick and Konca, Maxim and L{\"u}cking, Andy and Engel, Juliane and Quintino, Marc and Schreiber, Jakob and Saukel, Kevin and Zlatkin-Troitschanskaia, Olga},
  booktitle = {Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management},
  editor    = {Duffy, Vincent G.},
  isbn      = {978-3-031-35741-1},
  pages     = {539--565},
  publisher = {Springer Nature Switzerland},
  title     = {A Multimodal Data Model for Simulation-Based Learning with Va.Si.Li-Lab},
  year      = {2023}
}
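The abstract describes a hypergraph-based data model in which multimodal learning and interaction data are tied together. As a rough intuition for what such a representation can look like, the following is a minimal, hypothetical Python sketch in which each hyperedge links a participant, several modality-specific observations, and a time span; the class and field names are illustrative assumptions and are not taken from the paper or from Va.Si.Li-Lab itself.

```python
# Illustrative sketch only: a minimal hypergraph where each hyperedge ties
# together a participant, a time span, and several modality-specific nodes.
# All names here are hypothetical and NOT taken from the Va.Si.Li-Lab model.
from dataclasses import dataclass, field
from typing import FrozenSet, List, Tuple


@dataclass(frozen=True)
class Node:
    kind: str   # e.g. "participant", "speech", "gaze", "object"
    label: str  # e.g. "learner_1", "utterance_17", "whiteboard"


@dataclass
class Hypergraph:
    # Each hyperedge is a set of nodes plus the time interval it covers.
    edges: List[Tuple[FrozenSet[Node], Tuple[float, float]]] = field(default_factory=list)

    def add_event(self, nodes: List[Node], start: float, end: float) -> None:
        self.edges.append((frozenset(nodes), (start, end)))

    def events_involving(self, node: Node) -> List[Tuple[FrozenSet[Node], Tuple[float, float]]]:
        return [edge for edge in self.edges if node in edge[0]]


if __name__ == "__main__":
    hg = Hypergraph()
    learner = Node("participant", "learner_1")
    # One multimodal event: learner_1 speaks while gazing at the whiteboard.
    hg.add_event(
        [learner, Node("speech", "utterance_17"),
         Node("gaze", "fixation_3"), Node("object", "whiteboard")],
        start=12.4, end=15.1,
    )
    print(len(hg.events_involving(learner)))  # -> 1
```

The point of the sketch is only that a single hyperedge can connect observations from several modalities and participants at once, which is what makes hypergraphs attractive for mono- and multimodal, intra- and interpersonal data; the paper's actual model additionally interleaves such structures with ISO-Space annotations.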