XRAI-E: Extended Reality with Artificial Intelligence in Education
Project Overview
The XRAI-E project aims to develop an intelligent virtual agent–assisted eXtended Reality (XR) platform, encompassing both Virtual and Augmented Reality, to enhance learning experiences and task performance in immersive online education and training environments. By integrating cognitive-affective modeling with advanced AI and adaptive spatial interaction, the platform will enable the next generation of online education systems to respond dynamically to individual learners, improving engagement, learning efficiency, and overall performance. Insights from this research will be translated into real-world applications, advancing both educational practice and professional training.
Research Objectives
Pillar 1: Holistic Cognitive-Affective Modeling
The first pillar focuses on establishing a comprehensive cognitive-affective model that integrates key human factors, including emotion, cognition, attention, and perception. This model leverages non-invasive biometric data such as brain activity, physiological responses, and non-verbal behavioral cues like eye gaze patterns and facial expressions.
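As a rough illustration of how such multimodal cues might be fused into a learner-state estimate, the Python sketch below combines a few normalized signals with a simple weighted heuristic. The signal names, weights, and thresholds are illustrative assumptions, not the project's actual model.

```python
# Illustrative sketch only: the feature names, weights, and the simple
# weighted fusion below are assumptions, not the project's actual model.
from dataclasses import dataclass


@dataclass
class BiometricSample:
    """One time step of non-invasive signals, each normalized to 0..1."""
    eeg_engagement: float   # e.g. a band-power ratio from an EEG headset
    heart_rate_norm: float  # physiological arousal proxy
    gaze_on_target: float   # fraction of recent gaze samples on the task area
    smile_intensity: float  # facial-expression valence proxy


def estimate_state(sample: BiometricSample) -> dict[str, float]:
    """Fuse multimodal cues into coarse cognitive-affective estimates."""
    attention = 0.6 * sample.gaze_on_target + 0.4 * sample.eeg_engagement
    arousal = sample.heart_rate_norm
    valence = sample.smile_intensity
    # Crude cognitive-load heuristic: high arousal paired with low attention.
    cognitive_load = max(0.0, arousal - attention)
    return {
        "attention": attention,
        "arousal": arousal,
        "valence": valence,
        "cognitive_load": cognitive_load,
    }


if __name__ == "__main__":
    sample = BiometricSample(
        eeg_engagement=0.7, heart_rate_norm=0.8,
        gaze_on_target=0.3, smile_intensity=0.4,
    )
    print(estimate_state(sample))
```

In practice the fusion would likely be a learned model trained on labeled sessions rather than fixed weights; the sketch only shows how heterogeneous signals could feed a shared state representation.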
Pillar 2: Adaptive Spatial User Interface and Interaction
The second pillar focuses on developing adaptive spatial user interfaces for designing immersive XR experiences. Integrated with advanced AI models, these interfaces support dynamic spatial interactions that respond intelligently to user behavior and context.
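Continuing the assumptions above, the sketch below shows one way an XR interface layer might translate learner-state estimates into interface adjustments. The adaptation rules, thresholds, and action names are illustrative assumptions, not the project's design.

```python
# Illustrative sketch only: the adaptation rules and thresholds below are
# assumptions about how an XR interface might respond to learner state.
def adapt_interface(state: dict[str, float]) -> dict[str, object]:
    """Map cognitive-affective estimates to spatial UI adjustments."""
    actions: dict[str, object] = {
        "panel_density": "normal",       # how many panels surround the learner
        "agent_prompt": None,            # optional message from the virtual agent
        "highlight_task_object": False,  # visual cue drawing gaze back to the task
    }
    if state["cognitive_load"] > 0.5:
        # Reduce visual clutter and slow the pace when the learner seems overloaded.
        actions["panel_density"] = "minimal"
        actions["agent_prompt"] = "Let's slow down and review this step."
    if state["attention"] < 0.4:
        # Draw attention back to the task object when gaze/engagement drops.
        actions["highlight_task_object"] = True
    return actions


if __name__ == "__main__":
    print(adapt_interface({"attention": 0.3, "arousal": 0.9,
                           "valence": 0.4, "cognitive_load": 0.6}))
```

A deployed system would presumably replace these hand-written rules with a policy learned from interaction data, but the sketch conveys the intended loop: sensed state in, spatial interface adjustments out.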

