C. Spence and S. Squire. 2003. Multisensory integration: Maintaining the perception of synchrony. Current Biology, 13:R519–R521. DOI: 10.1016/S0960-9822(03)00445-7. 24
B. E. Stein, editor. 2012. The New Handbook of Multisensory Processing, 2nd ed. MIT Press, Cambridge, MA. 20
B. E. Stein and M. Meredith. 1993. The Merging of the Senses. MIT Press, Cambridge, MA. 20, 21
J. Sweller. 1988. Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2):257–285. 32
S. Tindall-Ford, P. Chandler, and J. Sweller. 1997. When two sensory modes are better than one. Journal of Experimental Psychology: Applied, 3(4):257–287. DOI: 10.1037/1076-898X.3.4.257. 32
J. van Merrienboer and J. Sweller. 2005. Cognitive load theory and complex learning: Recent developments and future directions. Educational Psychology Review, 17(2):147–177. DOI: 10.1007/s10648-005-3951-0. 32
F. Varela, E. Thompson, and E. Rosch. 1991. The Embodied Mind: Cognitive Science and Human Experience. The MIT Press, Cambridge, MA. 35
L. Vygotsky. 1962. Thought and Language. MIT Press, Cambridge, MA (Translated by E. Hanfmann and G. Vakar from 1934 original). 33, 34
L. Vygotsky. 1978. Mind in Society: The Development of Higher Psychological Processes. M. Cole, V. John-Steiner, S. Scribner, and E. Souberman, editors. Harvard University Press, Cambridge, MA. 33
L. Vygotsky. 1987. The Collected Works of L. S. Vygotsky, Volume I: Problems of General Psychology, Edited and translated by N. Minick. Plenum, New York. 33
N. Waugh and D. Norman. 1965. Primary memory. Psychological Review, 72:89–104. 30
J. Welkowitz, G. Cariffe, and S. Feldstein. 1976. Conversational congruence as a criterion of socialization in children. Child Development, 47:269–272. 37
M. Wertheimer. 1938. Laws of organization of perceptual forms. Translation published in W. Ellis, editor, A Sourcebook of Gestalt Psychology, pp. 71–88. Routledge and Kegan Paul, London. 21
C. Wickens, D. Sandry, and M. Vidulich. 1983. Compatibility and resource competition between modalities of input, central processing, and output. Human Factors, 25(2):227–248. 31
C. Wickens. 2002. Multiple resources and performance prediction. Theoretical Issues in Ergonomics Science, 3(2):159–177. DOI: 10.1080/14639220210123806. 31
B. Xiao, C. Girand, and S. L. Oviatt. 2002. Multimodal integration patterns in children. In Proceedings of the International Conference on Spoken Language Processing (ICSLP), pp. 629–632. 26
B. Xiao, R. Lunsford, R. Coulston, M. Wesson, and S. L. Oviatt. 2003. Modeling multimodal integration patterns and performance in seniors: Toward adaptive processing of individual differences. In Proceedings of the Fifth International Conference on Multimodal Interfaces (ICMI), ACM, Vancouver. DOI: 10.1145/958432.958480. 33
G. Yang, F. Pan, and W. B. Gan. 2009. Stably maintained dendritic spines are associated with lifelong memories. Nature, 462:920–924. DOI: 10.1038/nature08577. 34
J. Zhang and V. Patel. 2006. Distributed cognition, representation, and affordance. In I. Dror and S. Harnad, editors, Cognition Distributed: How Cognitive Technology Extends Our Mind, pp. 137–144. John Benjamins, Amsterdam. DOI: 10.1075/pc.14.2.12zha. 39
J. Zhou, K. Yu, F. Chen, Y. Wang, and S. Arshad. 2017. Multimodal behavioral and physiological signals as indicators of cognitive load. In S. Oviatt, B. Schuller, P. Cohen, D. Sonntag, G. Potamianos, and A. Krüger, editors, The Handbook of Multimodal-Multisensor Interfaces, Volume 2: Signal Processing, Architectures, and Detection of Emotion and Cognition. Morgan & Claypool Publishers, San Rafael, CA. 29
E. Zoltan-Ford. 1991. How to get people to say and type what computers can understand. International Journal of Man-Machine Studies, 34:527–547. DOI: 10.1016/0020-7373(91)90034-5. 37, 38
1. A lag of approximately 250 ms between speech and the corresponding lip movements is required before asynchrony is perceived.
2 The Impact of Multimodal-Multisensory Learning on Human Performance and Brain Activation Patterns
Karin H. James, Sophia Vinci-Booher, Felipe Munoz-Rubke
2.1 Introduction
The human brain is inherently a multimodal-multisensory dynamic learning system. All information processed by the brain must first be encoded through the sensory systems, and this sensory input can be attained only through motor movement. Although each sensory modality processes qualitatively different signals from the environment (sound waves, light waves, pressure, and so on), these signals are transduced into a common language in the brain. The signals are then associated and combined to produce our phenomenology of a coherent world. The brain thus processes a seemingly unlimited amount of multisensory information for the purpose of interacting with the world. This interaction with the world, through the body, is multimodal: the body allows one to affect the environment through multiple motor movements (hand movements, locomotion, speech, gestures, etc.). These various actions, in turn, shape the multisensory input that the brain subsequently receives. The feedforward-feedback loop that occurs every millisecond among sensory and motor systems reflects these multisensory and multimodal interactions among the brain, body, and environment. As an aid to comprehension, readers are referred to this chapter’s Focus Questions and to the Glossary for definitions of terminology.
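To make this loop concrete, the following toy sketch (ours, purely illustrative, not from the chapter; all names and values are invented) caricatures a perception-action cycle: each motor command changes the state of the environment, which changes the sensory input that drives the next command.

```python
def sensorimotor_loop(target, position=0.0, gain=0.5, steps=8):
    """Caricature of a feedforward-feedback loop: each action changes
    the environment (here, the agent's position), which changes the next
    sensory input (the perceived error), which shapes the next action."""
    for step in range(steps):
        sensed_error = target - position   # sensory input reflects the last action
        action = gain * sensed_error       # motor command based on the current percept
        position += action                 # the action alters the environment
        print(f"step {step}: sensed error = {sensed_error:+.3f}, "
              f"new position = {position:.3f}")
    return position

sensorimotor_loop(target=1.0)  # the sensed error shrinks on every pass
```

Each iteration stands in for one pass around the loop described above: sensing, acting, and then sensing the consequences of that action.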
In the following, we begin by delving deeper into how sensory signals are transduced in the brain and how multimodal activity shapes signal processing. We then provide examples of research demonstrating that multimodal interactions with the world, through action, facilitate learning. An overview of research on performance measured by overt behavioral responses in adult and developing populations is followed by examples of research on the effects of multimodal learning on brain plasticity in adults and children. Together, the behavioral and neuroimaging literatures underscore the importance of learning through multimodal-multisensory interactions throughout human development.
2.2 The Multimodal-Multisensory Body
The ultimate utility of a sensory mechanism is to convey information to an organism in the service of eliciting environmentally appropriate action. An interesting question arises in consideration of the inherently multisensory nature of behavior:
How is it that the human perceptual system provides us with seamless experiences of objects and events in our environment?
The difficulty in answering this question lies in one’s conception of the role of the human perceptual system. Approaching this question as a constructivist would lead to a major impasse: How is it that the brain is able to infer meaning from sensory input and translate among sensory modalities, given that these signals have little fidelity to the environmental stimulus by which they were evoked? Further, how are the signals combined, given that the signal of one sense is not directly comparable to the signal of another? This impasse is referred to as the binding problem and is a logical outcome of a constructivist approach to the role of the human perceptual system [Bahrick and Lickliter 2002]. If each sensory modality is transduced into a unique neuronal firing pattern, then the only way to infer the appropriate response to that particular set of sensory inputs is to effectively combine them into a unified percept. On the other hand, recent theories of perceptual development portray the human perceptual system as a multimodal system that responds to unisensory and multisensory inputs with differential weighting on modality-specific stimulus properties and amodal stimulus properties, respectively [Stein and Rowland 2011, Stein et al. 2014]. Formally, this theory is referred to as intersensory redundancy (see Figure 2.1).
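To make the idea of differential weighting concrete, the sketch below (ours, purely illustrative, not a model proposed in this chapter) implements one standard formalization of combining two unisensory estimates into a unified percept: variance-weighted (maximum-likelihood) cue combination, in which the more reliable modality receives the larger weight. All stimulus values and noise levels are invented for the example.

```python
def combine_estimates(mean_a, var_a, mean_b, var_b):
    """Variance-weighted (maximum-likelihood) combination of two
    unisensory estimates: the more reliable (lower-variance) cue
    receives the larger weight, and the combined estimate has lower
    variance than either cue alone."""
    w_a = var_b / (var_a + var_b)   # weight on cue A grows as cue B gets noisier
    w_b = var_a / (var_a + var_b)   # and vice versa
    combined_mean = w_a * mean_a + w_b * mean_b
    combined_var = (var_a * var_b) / (var_a + var_b)
    return combined_mean, combined_var

# Hypothetical example: estimating an event's location (in degrees)
# from vision (precise) and audition (noisier).
visual_mean, visual_var = 10.0, 1.0
auditory_mean, auditory_var = 14.0, 4.0
mean, var = combine_estimates(visual_mean, visual_var,
                              auditory_mean, auditory_var)
print(f"combined estimate: {mean:.1f} deg, variance: {var:.2f}")
# -> combined estimate: 10.8 deg, variance: 0.80
```

Under this scheme, redundant information carried by both modalities is reinforced, while a noisy modality contributes proportionally less to the combined percept, which is one computational reading of differential weighting.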
Intersensory redundancy is based upon the observation that an organism and its environment are structured