Egocentric Interaction

What of all this is part of the interactive system, here and now?
As computing hardware becomes smaller, cheaper, and more aware of the surrounding physical environment, new ways of interacting with computer systems become possible. Real-world entities such as everyday objects increasingly become part of interactive systems that span many devices and embedded systems, which are to some degree included and dropped on the fly as the activity performed by the human agents evolves.

How can the system and human agent best communicate in a given situation?
By knowing something about the cognitive and perceptual state of a human agent, the system could (in theory) adapt its communication in time, space, and modality to minimize cognitive load and interruption costs.
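As a rough illustration of this idea (a minimal sketch, not an implemented or published system), a system could weigh a crude estimate of the agent's perceptual load and current task against the available output channels before delivering a message. All names, fields, and thresholds below are illustrative assumptions.

    # Hypothetical sketch: pick when and how to deliver a notification based on a
    # rough estimate of the user's current perceptual/cognitive state.
    from dataclasses import dataclass

    @dataclass
    class AgentState:
        visual_load: float      # 0.0 (idle) .. 1.0 (saturated), e.g. estimated from gaze tracking
        auditory_load: float    # 0.0 .. 1.0, e.g. estimated from ambient sound sensing
        in_critical_task: bool  # crude proxy for interruption cost

    def choose_delivery(state: AgentState) -> str:
        """Pick a modality, or defer, so as to keep load and interruption cost low."""
        if state.in_critical_task:
            return "defer"      # postpone until interruption cost drops
        if state.visual_load < state.auditory_load:
            return "visual"     # route the message to the least-loaded channel
        return "audio"

    print(choose_delivery(AgentState(visual_load=0.8, auditory_load=0.2, in_critical_task=False)))
    # -> "audio"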

“Our future is prosthetic: a world of nuanced feedback and control through enhanced interaction” (Kirsh, 2013).

The egocentric interaction design and analysis approach addresses questions such as the ones above in a body- and mind-centric way. As such, it is a reaction to the device-centric (exocentric) approaches typically taken in the Ubiquitous Computing community. It is a futuristic design approach which assumes that many existing interoperation and real-world sensing challenges will eventually be solved 5, 10, or 15 years from now, and that personal computer systems carried by human agents will be able to operate at a higher level of abstraction than is possible today, so that computation and interaction capabilities can be easily shared among co-located devices. The idea is that these personal egocentric interaction systems will work cognitively and perceptually much more intimately with their users, knowing what they are attending to and to what degree, so that computational power can be accessed with significantly less human effort than current personal computing systems demand. They might become as natural to wear and use as our clothes, and we will feel naked without them.

We aim to help define this new egocentric interaction paradigm (Pederson et al., 2010), which will likely complement the existing WIMP paradigm for desktop/laptop computers, the maturing small-scale multitouch paradigm for handheld devices, and the still-emerging paradigm for full-body interaction with, for instance, large visual displays.

In the PIT Lab, we are currently exploring the use of new mobile sensor and actuation technology (e.g. gaze tracking, object tracking, visual display technology) for determining and affecting what human agents can see and act on while moving about in real-world environments. We use the Situative Space Model (Pederson et al., 2011) as a conceptual tool for this purpose.
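To give a flavour of how such sensing could feed a situative description of an individual agent, the sketch below maintains a per-agent snapshot of which objects are currently perceivable and actable-upon. It is only loosely inspired by the Situative Space Model; the space names, sensor inputs, and the rule combining them are simplifying assumptions, not the published model or any PIT Lab API.

    # Hypothetical sketch: combine gaze tracking and object tracking into a
    # per-agent snapshot of perceivable and actable objects.
    from dataclasses import dataclass, field

    @dataclass
    class SituativeSnapshot:
        perceivable: set[str] = field(default_factory=set)  # objects the agent can currently perceive
        actable: set[str] = field(default_factory=set)       # objects the agent can currently act on

    def update_snapshot(gaze_hits: set[str], tracked_nearby: set[str]) -> SituativeSnapshot:
        """Build one snapshot from current gaze hits and nearby tracked objects."""
        snapshot = SituativeSnapshot()
        snapshot.perceivable = set(gaze_hits)                        # within the agent's field of view
        snapshot.actable = tracked_nearby & snapshot.perceivable     # nearby AND perceivable (assumed rule)
        return snapshot

    snap = update_snapshot(gaze_hits={"door", "light_switch"},
                           tracked_nearby={"light_switch", "coffee_mug"})
    print(snap.actable)   # {'light_switch'}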

In the IxD Lab, we are investigating ways to subliminally and supraliminally induce low-effort body actions, such as eye and head movements, in order to affect the structure of the spaces in the Situative Space Model for a given individual in a given situation. Possible applications include future guide systems that steer human agents, consciously or unconsciously, in the right direction, or make them press the “right” switch on a machine even when prior knowledge is lacking.

References

      Kirsh, D. (2013). Embodied cognition and the magical future of interaction design. ACM Transactions on Computer-Human Interaction, 20(1), Article 3 (April 2013), 30 pages. DOI=10.1145/2442106.2442109. http://doi.acm.org/10.1145/2442106.2442109

      Pederson, T., Janlert, L.-E., Surie, D. (2011). A Situative Space Model for Mobile Mixed-Reality Computing. IEEE Pervasive Computing, 10(4), 73-83, Oct. 2011. http://dx.doi.org/10.1109/MPRV.2010.51

      Pederson, T., Janlert, L.-E., Surie, D. (2010). Towards a model for egocentric interaction with physical and virtual objects. In Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries (NordiCHI ’10). ACM, New York, NY, USA, 755-758. DOI=10.1145/1868914.1869022. http://doi.acm.org/10.1145/1868914.1869022