Multimodal interaction with wearable computers
As computing devices become smaller, people increasingly carry and use them while moving. This trend explains the growing attention to wearable devices such as eyewear computers and smartwatches, and it seems feasible that these unobtrusive devices will eventually play a role in everyday tasks. However, most mobile systems do not support interaction in motion, and users need to stop to interact with their devices. One of the main reasons users stop is that most mobile devices provide only static channels for interaction. The touchscreen is the main channel of interaction between user and device. To perceive the information displayed on a touchscreen, users must devote visual attention to the device, which means sharing their visual perception between the real world and the screen. Likewise, to provide input through a touchscreen, at least one hand must be dedicated to holding and touching the device. Apart from these physical and perceptual challenges, interaction in motion also involves cognitive load: even if mobile users do not need to dedicate visual attention to their devices, interacting through other channels such as the auditory modality can still be cognitively demanding.
Supporting interaction with wearable devices through several modalities such as speech, gesture, and gaze (multimodal interaction) in different situations can reduce the perceptual and cognitive problems of mobile interaction. The main goal of this master's thesis is to design an interaction manager middleware that supports multimodal interaction with wearable devices. The middleware is a wearable platform that adapts the multimodal UI to different situations based on contextual cues; a model-driven approach can be used to model the UI elements. A first architecture of the interaction manager middleware was proposed in a previous master's thesis (http://pitlab.itu.dk/sites/default/files/reports/MS-thesis_1.pdf).
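As a rough illustration of the adaptation idea, the following Python sketch shows how an interaction manager might map contextual cues to an output modality. All names, context fields, and thresholds here are hypothetical placeholders, not part of the proposed middleware; the real design would be derived from the model-driven UI description.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Modality(Enum):
    """Output channels a wearable device could use."""
    VISUAL = auto()
    AUDITORY = auto()
    HAPTIC = auto()

@dataclass
class Context:
    """Contextual cues the middleware might sense (hypothetical fields)."""
    user_is_moving: bool
    ambient_noise_db: float
    hands_free: bool

def select_output_modality(ctx: Context) -> Modality:
    """Pick an output modality from contextual cues (illustrative rules only)."""
    if ctx.user_is_moving:
        # While moving, avoid demanding the user's visual attention.
        if ctx.ambient_noise_db > 70:
            # Loud environments make audio unreliable; fall back to haptics.
            return Modality.HAPTIC
        return Modality.AUDITORY
    # Stationary users can devote visual attention to the device.
    return Modality.VISUAL
```

In a full system, such rules would be one small part of the interaction manager, which would also adapt input modalities (speech, gesture, gaze) and re-render the model-driven UI for the chosen channel.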