A Study of Multimodal Cognition and Interaction Based on Touch, Audition and Vision
Abstract
We propose a cognition model of how information from touch, audition and vision is integrated in the human mind, and a layered processing model of how computers process multimodal information. The proposed cognition model builds on the classic cognition model of multimedia learning and on ideas from recent cognition theories. The proposed layered processing model separates the processing route of the interface from the main function of the application, and the interactive device from the information processing. This separation avoids the redundant work caused by high coupling and simplifies the study of the interface from different perspectives. The simplification made it possible to implement a simple multimodal interaction sample integrating touch, audition and vision. The experimental results show that the proposed models can improve performance in interaction studies.
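To make the layered separation concrete, the following minimal sketch (not taken from the paper; all names such as InputDevice, Interpreter and Application are hypothetical) shows one way the decoupling could be realized in Python: the device layer emits raw events, the processing layer maps them to semantic commands, and the application layer consumes commands without depending on any concrete device or modality.

    from abc import ABC, abstractmethod
    from dataclasses import dataclass

    # Normalized event: the only contract shared between layers.
    @dataclass
    class InteractionEvent:
        modality: str   # "touch", "audition" or "vision"
        payload: dict

    # Device layer: each concrete device only knows how to emit raw events.
    class InputDevice(ABC):
        @abstractmethod
        def read(self) -> InteractionEvent: ...

    class TouchScreen(InputDevice):
        def read(self) -> InteractionEvent:
            # Stub reading; a real device would poll hardware here.
            return InteractionEvent("touch", {"x": 120, "y": 45})

    # Processing layer: turns device events into semantic commands,
    # independent of which device produced them.
    class Interpreter:
        def interpret(self, event: InteractionEvent) -> str:
            if event.modality == "touch":
                return f"select@({event.payload['x']},{event.payload['y']})"
            return "noop"

    # Application layer: consumes semantic commands only; it never
    # touches devices or modality-specific details.
    class Application:
        def handle(self, command: str) -> None:
            print(f"application received: {command}")

    if __name__ == "__main__":
        device, interpreter, app = TouchScreen(), Interpreter(), Application()
        app.handle(interpreter.interpret(device.read()))

Because each layer depends only on the shared event and command contracts, a new modality (e.g. a microphone for audition) can be added by implementing another InputDevice and extending the Interpreter, without changing the application code; this is the reduction of redundant work from high coupling that the abstract refers to.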