Eyes on Multimodal Interaction

In a conversation, much can be sensed from a person's eye gaze—interested or uninterested, attentive or preoccupied, focused or distracted, engaged or unmindful, wanting to continue or trying to get away, and so on. With advances in eye-tracking technology, it may be possible to use eye-gaze information in conversations with computers. The research presented in this talk first investigates whether there are eye-gaze patterns in natural dialogues that can be detected and used by a multimodal interactive system, and second describes a computer system that embodies the eye-gaze patterns found in natural dialogues. The results demonstrate that eye gaze can play an assistive role in managing future multimodal human–computer dialogues.

Speaker Bio

Pernilla Qvarfordt recently received her Ph.D. in Computer Science from Linköping University, Sweden. She also holds an M.A. in Cognitive Science from the same university. Her dissertation work crosses the research fields of human-computer interaction, cognitive psychology, and linguistics. During her graduate studies she spent time at Université Paris-Sud and IBM Almaden Research Center as a visiting scientist. At Linköping University, Pernilla has been active in course development and teaching in the areas of human-computer interaction and interaction design.

Date:
Speaker:
Pernilla Qvarfordt
Affiliation:
Linköping University, Sweden