XR engineer and PhD student at the University of Cambridge specialising in Social VR and embodied conversational agents. I build intelligent, full-stack XR systems that turn novel human–AI interaction ideas into working immersive experiences.
Building immersive experiences that push the boundaries of interactive technology
A social VR prototype combining large language models with embodied avatars, deployed on mobile hardware to explore real-time, in-headset interaction.
View case study
A stereoscopic super-resolution pipeline for head-mounted displays that fuses RGB, depth, and motion vectors to upscale XR scenes while keeping performance viable. My main contribution was generating the training dataset from scene RGB, depth, and motion-vector buffers.
GitHub Source
The engine behind my redirected walking research: a non-Euclidean museum environment that lets users walk endlessly through a finite tracked space.
GitHub Source
Advancing the field of Human-Computer Interaction through rigorous research
In press · CHI 2026 Extended Abstracts (Poster)
Exploring how personalised, face-tracked avatars shape identity expression and social presence in social VR.
Virtual Worlds 4 (3), 39
ACM CHI PLAY 2024
IVCNZ 2023 · First author