
I’m an Associate Professor and Sapere Aude Research Leader in Human-Computer Interaction (HCI) at Aarhus University in Denmark. I lead the XI team, where we explore how to make technology more natural to use – especially through multimodal, spatial, and AI-driven techniques on emerging computing devices (mobiles, wearables, AR/VR).
Our research is funded by AUFF, the Danish Pioneer Centre for AI, and DFF Sapere Aude.
📚 Publications: Google Scholar
🏢 Previous Affiliations: Usable Security and Privacy, Lancaster University, Microsoft, Google.
✉️ Contact: ken [at] cs [dot] au [dot] dk
Research Vision
Today, we mostly control computers with a mouse, keyboard, or touchscreen in graphical user interfaces (GUIs) – tools that have not evolved much in decades. But we humans have much more to give than simple finger taps on flat, rectangular buttons. What if interfaces could understand what you are thinking and feeling, and where you are looking – and exploit this to transform computer interaction into a more intuitive and seamless experience?
As an example, much of my prior work has focused on integrating our eye movements into the computer interface. Through my vision of an Eye-Hand Symbiosis, I have established the scientific foundations for eyes+hands computer control in various contexts.

This approach lets users interact with any object they see—not just where their finger is. For example:
- Gaze + Touch: Touch anywhere while looking at a target – no need to aim by hand.
- Gaze + Pinch: With your bare hands only, look at an object and pinch your fingers to select it.
- Gaze-Shifting: Use gaze direction to switch between direct and indirect gesture types on demand.
Input devices such as touchscreens or hand tracking typically involve a 1:1 mapping between the user’s hand position and the manipulated object: you manipulate the one object that coincides with your hand’s position in space. In contrast, eye+hand interfaces let you control any object you see, from any hand position (an N:N mapping).
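To make the difference concrete, here is a minimal, hypothetical sketch in Python (not tied to any specific XR framework or to our published implementations) that contrasts direct 1:1 selection with gaze+pinch N:N selection; the object names, data structures, and distances are illustrative assumptions.

```python
from dataclasses import dataclass

# Hypothetical scene objects with a 3D position and a selection radius.
@dataclass
class SceneObject:
    name: str
    position: tuple  # (x, y, z)
    radius: float

def distance(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

def direct_selection(hand_pos, objects):
    """1:1 mapping: select the object that coincides with the hand's position."""
    hits = [o for o in objects if distance(hand_pos, o.position) <= o.radius]
    return min(hits, key=lambda o: distance(hand_pos, o.position), default=None)

def gaze_pinch_selection(gaze_target, pinch_detected, objects):
    """N:N mapping: on pinch, select whatever object the gaze rests on,
    regardless of where the hand is in space."""
    if not pinch_detected:
        return None
    return next((o for o in objects if o.name == gaze_target), None)

# Usage: the hand rests in the lap, far from every object,
# yet a pinch still selects the object under the user's gaze.
scene = [SceneObject("button_a", (0.0, 1.5, -2.0), 0.1),
         SceneObject("button_b", (0.5, 1.5, -2.0), 0.1)]
print(direct_selection((0.0, 0.8, -0.3), scene))       # None: hand touches nothing
print(gaze_pinch_selection("button_b", True, scene))   # button_b
```

The key design point is that the pinch merely confirms the selection, while the gaze determines the target – so the hand can rest anywhere.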

These methods are already being adopted in next-generation XR devices such as the Apple Vision Pro. In principle, Eye-Hand Symbiosis could transform how we interact with all kinds of digital devices, and exploring this potential is the subject of our ongoing research.

The Eye-Hand Symbiosis research project spans a range of research efforts since 2013, as illustrated below. See the drop-down list in the homepage menu to explore the individual articles published over the years in scientific venues.

Since 2023, I have also written popular science articles aimed at a broader audience, offering high-level overviews of various scientific articles and reflections on the field’s past and ongoing evolution.
- Eyes & hands in AR: A sci-fi-inspired mobile UI research (Oct 2023, Medium, LinkedIn)
- Design Principles & Issues for Gaze and Pinch Interaction (Jan 2024, Medium, LinkedIn)
- History of Eyes and Hands for Computer Control (Mar 2024, Medium, LinkedIn)
- The Role of Eyes and Hands in the Evolution of Computer Interfaces (Nov 2024, Future of Text)
- Blending Direct and Indirect Interaction: A Concept for Seamless Computer Interfaces (Dec 2024, Medium, LinkedIn)
Talks (on Video)
See full list of talks here.
Evolution of XR Input, Future of Text, May 19, 2025
How to design for gaze + hand tracking, XR Design Community, March 9, 2024
Gaze + Pinch Interaction in Virtual Reality, ACM Symposium on Spatial User Interaction 2017 (SUI’17)
Gaze and Touch Interaction on Tablets, ACM Symposium on User Interface Software and Technology 2016 (UIST’16)
Partially-indirect Bimanual Input with Gaze, Pen, and Touch for Pan, Zoom, and Ink Interaction, ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ’16)
Extended Interaction (XI) Team
We are a multidisciplinary team at Aarhus University exploring how people interact with digital worlds, by studying the synergy between fundamental human modalities (body, arm, hand, finger, and gaze) and digital tools (applications, interface types, UI objects, and interaction tasks).
Please reach out regarding potential internships, bachelor’s/master’s thesis projects, research stays, collaborations, and more!
| Name | Role | Research Focus |
|---|---|---|
| Ken Pfeuffer | Associate Professor | HCI, XR, UI Design, Eye-tracking (Team Lead) |
| Hans Gellersen | Professor | HCI, XR, Eye-tracking, Ubicomp |
| Jens Emil Grønbæk | Assistant Professor | Collaborative XR |
| Qiushi Zhou | Postdoctoral Researcher | HCI in Extended Reality |
| Mathias N. Lystbæk | PhD student | Visual Attention in AR for Manufacturing |
| Pavel Manakhov | PhD student | Mobile Spatial Interaction in XR |
| Christoph Johns | PhD student | Adaptive Interaction & Multi-objective Optimization |
| Juan Sánchez Esquivel | PhD student | Spatial Interaction for XR |
| Thorbjørn Mikkelsen | PhD student | Eye-hand Manipulation in 3D |
| Uta Wagner | PhD student (Alumni) | Gaze-based 3D Interaction – now Postdoc at University of Konstanz |