IDP Projects
We welcome students to complete their IDP at our chair. Feel free to send your application documents (CV, transcript) to the e-mail address listed for the respective project.
You can find the list of open IDPs below.
VR Classroom Design with Generative Models

Virtual Reality (VR) technology has revolutionized the way we learn and interact with information. In this project, we aim to design a VR classroom that leverages the power of generative behavior and text models to enhance the learning experience for students. The project is primarily suited to students from Computer Science: Games Engineering, but it is also open to all other students interested in developing VR applications that integrate artificial intelligence.
For further information, write an email to ozdelsuleyman@tum.de.
Assisting the remote video learner with self-regulation support

Online video learning is becoming increasingly important in educational contexts. However, remote video learning challenges not only students' self-regulation but also teachers' ability to detect self-regulation problems. The project addresses this problem at the intersection of psychology, educational science, and computer science. To this end, potential self-regulation problems will be detected automatically, and support measures, for example optimized instructional videos, will be developed.
For further information, write an email to anna.bodonhelyi@tum.de.
IDP: ArtisanVR

Preserving and transmitting intangible cultural heritage (ICH) practices and knowledge is crucial for protecting cultural diversity and human creativity. Innovative tools such as large language models (LLMs) and virtual reality (VR) are increasingly being used to develop immersive learning environments that help disseminate this knowledge. This IDP project investigates the pedagogical approaches and design ideas that support the creation of effective VR- and LLM-based learning environments for ICH.
For further information, write an email to carrie.lau@tum.de.
VR Locomotion

VR locomotion is one of the most important design features of VR applications and is widely studied, especially due to the increasing popularity of VR in entertainment and education. However, users' subconscious behaviors during VR locomotion, which offer important insights for evaluating and improving locomotion methods, have rarely been studied. The goal of this project is to fill this gap by studying users' eye movements during VR locomotion in scenarios drawn from our daily lives.
For further information, write an email to hong.gao@tum.de.
Usable Privacy for Immersive Learning Settings

The importance of data privacy and security has recently been emphasized by regulations such as the GDPR and the CCPA. With current technological developments, the amount of human data collected from tools and environments such as VR/AR is growing, and due to the biometric nature of such data, tools that provide novel ways of interaction might be perceived as privacy-invasive by the public. To address these issues and develop human-centered solutions and regulations, users' privacy concerns, preferences, and behaviors need to be understood. In this project, these aspects will be researched specifically for immersive learning settings.
For further information, write an email to efe.bozkir@tum.de.
Human-centered XAI: Algorithm and User Experience

Explainable AI (XAI) is widely viewed as a sine qua non for the ever-expanding field of AI research. A better understanding of the needs of XAI users, as well as human-centered evaluations of explainable models, are both a necessity and a challenge. In this project, we explore how AI and HCI researchers design XAI algorithms to optimize the user experience of interacting with black-box machine learning models. Specifically, we discuss how XAI technology improves different aspects of ML models, such as trustworthiness, fairness, and usability.
If you are interested in our project, please contact yao.rong@tum.de.
Machine learning meets education and VR
Implementing a Virtual Reality Classroom using Unity with Multimodal Data Collection and Generative Models.
The objective of this project is to create a virtual reality classroom using Unity and integrate it with multimodal data collection features such as eye tracking. Additionally, generative models will be used to create virtual avatars for teachers and students, providing a more personalized and immersive experience for users.
For further information, write an email to mengi.wang@tum.de.
Intelligent Math Tutor
Personalizing education is the way forward, as each child learns at their own pace, possesses unique abilities, and may encounter their own set of challenges. In order to provide equal opportunities for all students, it is important to offer easily accessible resources. Our objective is to assist children in learning mathematics by creating an intelligent math tutor that utilizes the latest advances in generative artificial intelligence. This tutor will be friendly, helpful, and patient, with no question considered foolish. Children can ask as many questions as they like and will always receive a friendly response. Our ultimate goal is to personalize the tutor to the individual needs of each child over time.
If you are interested in working on this project, please contact kathrin.sessler@tum.de.
The Museum Gaze

The use of mobile eye tracking in museums has opened up new possibilities for studying how visitors interact with artworks and perceive them. By capturing eye movements in a natural and uncontrolled setting, we can gain valuable insights into visual attention and behavior. This not only advances the field of art perception but also contributes to the development of eye-tracking technology and computer vision methods.
For further information, write an email to enkeleda.thaqi@tum.de.