XRAI-E: Extended Reality with Artificial Intelligence in Education

Project Overview

The XRAI-E project aims to develop an intelligent virtual agent–assisted eXtended Reality (XR) platform, encompassing both Virtual and Augmented Reality, to enhance learning experiences and task performance in immersive online education and training environments. By integrating cognitive-affective modeling with advanced AI and adaptive spatial interactions, the platform will enable the next generation of online education systems to dynamically respond to individual learners, significantly improving engagement, learning efficiency, and overall performance. Insights from this research will be translated into real-world applications, advancing both educational practices and professional training.

Research Objectives

Pillar 1: Holistic Cognitive-Affective Modeling
The first pillar focuses on establishing a comprehensive cognitive-affective model that integrates key human factors, including emotion, cognition, attention, and perception. This model leverages non-invasive biometric data such as brain activity, physiological responses, and non-verbal behavioral cues like eye gaze patterns and facial expressions. 
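To make the fusion idea concrete, here is a minimal sketch of how multimodal biometric signals might be combined into coarse cognitive-affective estimates. All field names, weights, and thresholds are illustrative assumptions, not values from the project:

```python
from dataclasses import dataclass

@dataclass
class BiometricSample:
    """One synchronized frame of multimodal biometric input.
    All fields are assumed normalized to [0, 1]; names are illustrative."""
    eeg_engagement: float   # e.g., a band-power engagement index from EEG
    heart_rate_norm: float  # physiological arousal proxy
    gaze_on_target: float   # fraction of gaze samples on the task area
    facial_valence: float   # 0 = negative, 1 = positive expression

def estimate_state(sample: BiometricSample) -> dict:
    """Fuse modalities into coarse cognitive-affective estimates.
    The linear weights are placeholders for a learned model."""
    # Higher EEG engagement and off-target gaze both suggest higher load.
    cognitive_load = 0.6 * sample.eeg_engagement + 0.4 * (1 - sample.gaze_on_target)
    # Affect combines facial valence with moderate (mid-range) arousal.
    affect = 0.5 * sample.facial_valence + 0.5 * (1 - abs(sample.heart_rate_norm - 0.5) * 2)
    return {"cognitive_load": cognitive_load, "affect": affect}
```

In practice such a model would be trained on labeled data rather than hand-weighted; the sketch only shows the shape of the fusion step.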

Pillar 2: Adaptive Spatial User Interface and Interaction
The second pillar focuses on developing adaptive spatial user interface systems for immersive XR experiences. By integrating advanced AI models, these interfaces support dynamic spatial interactions that respond intelligently to user behavior and context.
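As a minimal sketch of adaptation, a spatial UI could adjust its layout from the user's estimated state. The thresholds and returned layout parameters below are illustrative assumptions only:

```python
def adapt_ui(cognitive_load: float, gaze_on_target: float) -> dict:
    """Choose spatial-UI parameters (panel count, distance, hints)
    from estimated user state. All numbers are placeholders."""
    # High load: declutter and bring content into the focal field of view.
    if cognitive_load > 0.7:
        return {"panel_count": 1, "panel_distance_m": 1.2, "hints": True}
    # Low load but wandering gaze: add content to re-engage the learner.
    if gaze_on_target < 0.4:
        return {"panel_count": 3, "panel_distance_m": 1.8, "hints": True}
    # Otherwise keep a balanced default layout.
    return {"panel_count": 2, "panel_distance_m": 1.5, "hints": False}
```

A real implementation would drive scene objects in an engine such as Unity rather than return a dictionary; the sketch only shows the decision logic.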

Experience Gained

Students and researchers participating in this project will gain hands-on experience through:

  • Developing XR and AI solutions integrating cognitive-affective models
  • Designing and deploying adaptive XR algorithms and applications
  • Working with multimodal biometric data (eye gaze, facial expression, physiological signals)
  • Disseminating results through conferences, workshops, symposia, and journals

Majors and Interests Needed

  • Computer Science 
  • Game Design and Development
  • Software Engineering
  • Information Technology/Systems
  • Data Science
  • Social Sciences
  • Psychology
  • Nursing
  • Health Sciences
  • All Engineering
  • Programming in Unity, Unreal, or NVIDIA Omniverse using C++, C#, or Python
  • Experimental design, data collection, and analysis
  • Interest in XR, AI, cognitive science, and human-computer interaction

Team Advisors

Sungchul Jung 
Assistant Professor, Game Design and Development  
Director of Immersive Empathic Interface (IEI) Lab  
Email: sjung11@kennesaw.edu

[Image: virtual reality classroom]