Fu Xinyi, Cai Tianyang, Xue Cheng, Zhang Yuxiang, Xu Yingqing. Research on Body-Gesture Affective Computing Method Based on BGRU-FUS-NN Neural Network[J]. Journal of Computer-Aided Design & Computer Graphics, 2020, 32(7): 1070-1079. DOI: 10.3724/SP.J.1089.2020.18353

Research on Body-Gesture Affective Computing Method Based on BGRU-FUS-NN Neural Network

  • Affective computing has been a hot research direction in human-computer interaction in recent years. Existing work focuses largely on the facial-expression and speech modalities, while emotion computing based on the body-gesture modality remains relatively underexplored. This paper proposes a novel body-gesture-based emotion recognition method: virtual reality equipment is used to stimulate the user's emotions, a camera collects the user's non-acted motion data, and 19 key points of human movement are redefined. The user's motion data are then converted into the 3D coordinates of the corresponding skeleton points. On top of the existing basic features, advanced dynamic features are added to construct an 80-dimensional feature list that describes limb movement more fully. Building on the FUS-NN neural network model, GRU units replace LSTM units, Layer-Normalization and Dropout layers are added, and the number of stacked layers is reduced, yielding the BGRU-FUS-NN model. An emotion model based on arousal and valence is used for emotion classification. On the four-class and eight-class tasks, accuracy improves by 7.22% and 5.15%, respectively, over the FUS-NN model.
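The pipeline described in the abstract (3D coordinates of 19 key points, augmented with dynamic features, classified in an arousal-valence space) might be sketched as follows. This is a minimal illustration, not the paper's method: the frame rate, the specific dynamic features, and the quadrant-to-class mapping are assumptions, and the paper's actual 80-dimensional feature list is not reproduced here.

```python
import numpy as np

def dynamic_features(keypoints, fps=30.0):
    """Illustrative dynamic features from a (T, 19, 3) sequence of
    3D body key points: mean per-joint speed and acceleration magnitude.
    Returns a (38,) vector (2 statistics x 19 joints); the paper's
    80-dimensional feature list would add many more such descriptors."""
    vel = np.diff(keypoints, axis=0) * fps       # (T-1, 19, 3) velocities
    acc = np.diff(vel, axis=0) * fps             # (T-2, 19, 3) accelerations
    speed = np.linalg.norm(vel, axis=-1)         # (T-1, 19) per-joint speed
    accel = np.linalg.norm(acc, axis=-1)         # (T-2, 19) per-joint |accel|
    return np.concatenate([speed.mean(axis=0), accel.mean(axis=0)])

def quadrant_class(valence, arousal):
    """Map a (valence, arousal) prediction to one of four emotion
    quadrants, a plausible reading of the four-class task:
    0: positive valence / high arousal, 1: negative / high,
    2: negative / low, 3: positive / low."""
    if arousal >= 0:
        return 0 if valence >= 0 else 1
    return 2 if valence < 0 else 3

# Example: a static (all-zero) 10-frame pose sequence yields zero dynamics.
kp = np.zeros((10, 19, 3))
feats = dynamic_features(kp)   # shape (38,), all zeros
```

The eight-class task could analogously split each quadrant by intensity, but the paper's exact class definitions are not stated in the abstract.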
