Cognition is a very broad term used across disciplines such as neuroscience, cognitive science, psychology and education. Studies of cognition address processes such as perception, memory, attention, problem solving, decision-making and interpretation, which underlie thinking and experiencing and lead to forms of knowing. Cognitive studies concern the nature of human thought, in particular how people attend to, process, understand and remember information that others convey. Although much work in cognition seeks to understand how the mind works, two recent approaches increasingly recognize and highlight the central role of bodily experience and of interaction with the outside world in cognition. In embodied cognition, cognitive functioning is seen as part of, and influenced by, physical body-world interaction (for embodied cognition, see ‘Embodiment’ and Wilson 2002). According to situated cognition, knowledge and learning are acquired situationally and are grounded in everyday actions (cf. Lave 1996, Brown et al. 1989, Lave & Wenger 1991).
In the context of multimodality, one strand of cognitive research focuses on how different modes of representation or action might influence, or shape, human interaction, attention, perception, interpretation and meaning making. Research suggests that different modes of representation (visual, auditory or tactile), interaction (gestural, physical, visual) and communication change the way that information is perceived, attended to and understood. Another strand focuses on users’ active role in the interaction with multimodal messages. It investigates general and individual patterns in the reception of multimodality, particularly the factors that influence perception and interpretation, such as users’ interests, attitudes, goals and motives, prior knowledge, experiences, individual preferences, emotions and expertise. Perception and interpretation of multimodal messages are seen as an interactive meeting between the user, the multimodal message and the situational context (Holsanova 2010). Meaning is thus not universal but co-created by individual recipients, shaped by the factors that affect how they integrate the various modes of representation.
Cognitive approaches to multimodality include the following application areas:
- Interplay between speech and gestures (Goldin-Meadow 2003)
- Pointing in interaction (Goodwin 2003)
- Users’ interaction with multimodal messages in various media: entry points and reading paths (Holsanova et al. 2006)
- Exploration of users’ visual perception and cognitive processing of multimodal messages (Scheiter & van Gog 2009)
- Principles for reducing cognitive load, promoting multimodal integration and supporting efficient learning, cf. Cognitive Load Theory (Chandler & Sweller 1991) and Multimedia Learning Theory (Mayer 2005)
- Procedures for multimodal document design (Bateman 2008)
- Testing design principles and studying the efficiency of multimodal presentations (Holsanova et al. 2009, Holsanova & Nord 2010)
- Examining text-picture integration (Hegarty & Just 1993, Hannus & Hyönä 1999, Holsanova et al. 2009)
- Use of multimodal interfaces (Oviatt 1999)
To study these issues, an interdisciplinary approach and a combination of methods are essential (Holsanova 2012). Eye tracking is one methodology that provides data on perceptual and cognitive processes by giving insight into the allocation of visual attention: which elements are attended to, for how long, in what order and how carefully (Holsanova et al. 2006). However, since eye movement protocols do not reveal how recipients understand the messages, it is important to triangulate eye tracking with complementary methods such as knowledge tests, comprehension tests, retrospective or concurrent verbal protocols and interviews.
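As a minimal illustration of the kind of attention measures eye tracking yields, the sketch below aggregates fixation records into dwell time, fixation count and first-visit order per area of interest (AOI). The data, AOI labels and field names are invented for the example and are not tied to any particular eye tracker, study or analysis package.

```python
# Minimal sketch (hypothetical data): summarising fixations per area of interest (AOI)
# to approximate which elements are attended to, for how long, in what order,
# and how often they are revisited.
from collections import OrderedDict

# Each record: (onset in ms, AOI label, fixation duration in ms) - illustrative only.
fixations = [
    (0,    "headline",  220),
    (240,  "image",     310),
    (570,  "caption",   180),
    (770,  "image",     260),
    (1050, "body_text", 540),
]

def summarise(fixations):
    """Return dwell time, fixation count and first-visit order for each AOI."""
    summary = OrderedDict()
    for onset, aoi, duration in sorted(fixations):
        entry = summary.setdefault(
            aoi, {"dwell_ms": 0, "fixations": 0, "first_visit": len(summary) + 1}
        )
        entry["dwell_ms"] += duration   # how long the element is attended to
        entry["fixations"] += 1         # how often it is (re)inspected
    return summary

for aoi, stats in summarise(fixations).items():
    print(aoi, stats)
```

Such summaries describe only the allocation of visual attention; as noted above, they say nothing about comprehension, which is why triangulation with verbal and test data remains necessary.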
Editor: Sara Price
Other contributor: Jana Holsanova
Key References
Holsanova, J. (2008). Discourse, vision, and cognition. Amsterdam/Philadelphia: Benjamins.
Price, S. & Pontual Falcao, T. (2011). Where the attention is: Discovery learning in novel tangible environments. Interacting with Computers, 23(5), 499–512.
Scheiter, K. & van Gog, T. (2009). Using eye tracking in applied research to study and stimulate the processing of information from multi-representational sources. Applied Cognitive Psychology, 23, 1209–1214.