Project

Multimodal Interaction in Virtual Reality

Workgroup: Multimodal Interaction Lab
Duration: 2018-open
Funding: IWM budget resources
Project description

Technical development in the field of Extended Realities (e.g., Virtual Reality or 360° visualizations) is currently highly dynamic. The high degree of immersion of these technologies makes it possible to create an intense feeling of presence in the virtual world for the user. It is assumed that cognitive processes create a mental model of the self in the virtual world, which generates a feeling of "being in the virtual world". However, from a cognitive-science and learning-psychology perspective it is unclear in what ways this form of immersion can benefit the user in knowledge contexts.


Interaction in highly immersive virtual reality also poses requirements and offers opportunities that may call for innovative interface designs. For example, the body can move freely in space and be used accordingly for interaction. This freedom considerably expands the interaction spectrum compared, e.g., to touch interfaces. Application development offers various approaches to realizing such interactions, which, however, have rarely been investigated from a learning-psychology or cognitive-science perspective. Virtual reality technology thus offers interesting and novel possibilities for the design of learning environments and for supporting information processing by reducing the barriers between the agent and the information object.


The question therefore arises how interaction designs should be created in this environment to support information processing optimally. The following example illustrates this point: previous research with affective stimuli has shown that certain interactions, e.g., moving affective images on a touch display, can influence the subsequent valence evaluation of these images (e.g., Cervera Torres, Ruiz Fernández, Lachmair, & Gerjets, 2018; cf. Casasanto, 2009). In virtual reality, however, much more expansive interaction patterns are possible, e.g., opening the arms functionally linked to zooming affective images. One of our recent studies showed that positive and negative images were evaluated more positively and more negatively, respectively, when they were zoomed in after an opening arm gesture than when they were zoomed out after a closing arm gesture. This effect did not occur when the opening arm gesture was functionally linked to making the image smaller and the closing gesture to making it larger (Lachmair, Ruiz Fernandez, Cervera Torres, & Gerjets, 2017). This shows that interactions should not be designed arbitrarily. Further studies will investigate how such interactions also affect learners' memory and learning performance.
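The two functional mappings contrasted above (arm opening zooms the image in vs. arm opening zooms it out) can be sketched as a simple mapping from hand distance to image scale. This is a minimal, hypothetical illustration; all names, thresholds, and gain values are our own assumptions, not the code of the cited study.

```python
def zoom_factor(hand_distance_m: float, mapping: str = "congruent") -> float:
    """Map the distance between the user's hands to an image scale factor.

    mapping="congruent":   opening the arms (larger distance) zooms the image IN.
    mapping="incongruent": opening the arms zooms the image OUT, i.e. the
    reversed functional link under which the valence effect was not observed.

    All constants are illustrative assumptions.
    """
    baseline = 0.5   # hand distance (metres) at which the scale is 1.0
    gain = 2.0       # how strongly distance changes the scale
    delta = hand_distance_m - baseline
    if mapping == "incongruent":
        delta = -delta  # reverse the functional link between gesture and zoom
    # Clamp so the image neither vanishes nor grows without bound
    return max(0.25, min(4.0, 1.0 + gain * delta))


# With arms opened to 1.0 m, the congruent mapping enlarges the image,
# while the incongruent mapping shrinks it toward the lower clamp.
print(zoom_factor(1.0, "congruent"))    # → 2.0
print(zoom_factor(1.0, "incongruent"))  # → 0.25
```

In an actual VR application the hand distance would come from the tracking system each frame; the point of the sketch is only that the gesture-to-effect link is a deliberate design choice, not an arbitrary one.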

Publications
  • Lachmair, M., Fischer, M. H., & Gerjets, P. (2022). Action-control mappings of interfaces in virtual reality: A study of embodied interaction. Frontiers in Virtual Reality, 3, Article 976849. https://dx.doi.org/10.3389/frvir.2022.976849
  • Lachmair, M., Ruiz Fernández, S., & Gerjets, P. (2018). Does Grammatical Number influence the semantic priming between number cues and words related to vertical space? An investigation using Virtual Reality. Frontiers in Psychology, 9, Article 573. https://dx.doi.org/10.3389/fpsyg.2018.00573
  • Gerjets, P., Brucker, B., & Lachmair, M. (2023, April 5). Embodied processing in VR. Invited Talk at the Colloquium Future of Education in Virtual and Augmented Realities (FEVAR) series. Arizona State University, USA. [Talk]
  • Daltoè, T., Ruth-Herbein, E., Brucker, B., Jaekel, A.-K., Trautwein, U., Fauth, B., Gerjets, P., & Göllner, R. (in press). Immersive insights: Unveiling the impact of 360-degree videos on preservice teachers’ classroom observation experiences and teaching-quality ratings. Computers & Education. https://dx.doi.org/10.1016/j.compedu.2023.104976