Research

My research sits at the intersection of Human-Computer Interaction (HCI) and Applied Machine Learning, guided by two core questions: How can user interfaces be augmented with intelligence to support interaction? And what insights about human behavior can be derived from interaction and sensor data? To address these questions, my work spans multimodal and implicit interaction, intelligent interfaces, reading and note-taking in digital environments, physiological user modeling, and creativity support tools, with the goal of designing systems that are both technically robust and human-centered.


How can user interfaces be augmented with intelligence to support interaction?

Gaze & Multimodal Interaction

Eye tracking · Gaze input · Voice interaction · Head movement · Multimodal HCI

I investigate how multiple input modalities, such as gaze and speech, can be combined to enable more natural and expressive interaction. My work focuses on implicit and multimodal interaction, where systems infer user intent from natural behavior rather than relying solely on explicit input.

Key publications


Reading, Note-Taking & Digital Learning

Digital reading · Note-taking · Learning technologies · Comprehension

This work investigates how input modality, task difficulty, and interaction design shape learning and knowledge retention in digital environments, with the goal of designing tools that actively support learning rather than simply digitize existing workflows.

Key publications


Creativity Support Tools

XR · Creativity support · Intelligent tools · Human–AI collaboration

This work explores how intelligent systems can support creativity and expressive work, particularly in immersive environments. It focuses on designing tools that augment ideation, collaboration, and creative workflows using AI and XR technologies, extending my prior work in interaction and user modeling toward systems that enhance human expression.

Selected work


What insights about human behavior can be derived from interaction and sensor data?

Eye–Head Dynamics in XR

Virtual reality · Eye–head coordination · Hands-free interaction · Attention modeling

This research investigates how eye and head movements coordinate dynamically and how that coordination can be leveraged to design efficient, hands-free interaction techniques. It introduces methods for distinguishing gaze and head intent, modeling attention shifts, and enabling adaptive interaction techniques that respond to user behavior in real time.

Key publications


Physiological & Sensor-Based User Modeling

Thermal imaging · EDA · Attention · Cognitive modeling

This work explores how physiological and behavioral signals can be used to infer users’ identity as well as their cognitive and affective states. It leverages modalities such as eye tracking, electrodermal activity (EDA), and thermal imaging to detect attention, cognitive load, and mind wandering. The goal is to enable adaptive, context-aware systems that respond intelligently to users’ internal states and support more effective interaction.

Key publications


Full Publications

For a complete list of publications, see my Google Scholar profile.