Zhong Xi, Chen Yiqiang, Yu Hanchao, Yang Xiaodong, Hu Ziang. Context-Aware Information Based Ultrasonic Gesture Recognition Method[J]. Journal of Computer-Aided Design & Computer Graphics, 2018, 30(1): 173-179. DOI: 10.3724/SP.J.1089.2018.16176


Context-Aware Information Based Ultrasonic Gesture Recognition Method


    Abstract: Existing ultrasonic gesture recognition methods are vulnerable to unintended user gestures and can hardly correct misclassified gestures in real time. This paper presents an ultrasonic gesture recognition method that incorporates context-aware information. The method extracts effective gesture features using the fast Fourier transform, then computes the confidence of each gesture using the extreme learning machine algorithm and a softmax function. Simultaneously, context-aware information is transformed into a per-gesture context confidence by a custom probability transformation function. Finally, the gesture confidence and the context confidence are fused to filter out unintended gestures, correct misclassified ones, and output recognition results consistent with user intent. Extensive experiments show that the recognition accuracy of this method reaches 94.7%, which is 33.2% higher than that of ultrasonic gesture recognition methods without context information.
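The fusion step described in the abstract can be illustrated with a minimal sketch: classifier scores (e.g. from an extreme learning machine) are mapped to probabilities with softmax, a per-class context confidence is normalized, and the two are combined to pick, or reject, a gesture. The weighted-sum fusion rule, the weight `alpha`, and the rejection threshold below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def softmax(scores):
    """Map raw classifier output scores to class probabilities."""
    e = np.exp(scores - np.max(scores))  # subtract max for numerical stability
    return e / e.sum()

def fuse_confidences(gesture_scores, context_conf, alpha=0.6, reject_below=0.5):
    """Fuse gesture confidence with context confidence (hypothetical rule).

    Returns the winning class index, or -1 when the fused confidence is too
    low, i.e. the gesture is filtered out as unintended.
    """
    p_gesture = softmax(np.asarray(gesture_scores, dtype=float))
    p_context = np.asarray(context_conf, dtype=float)
    p_context = p_context / p_context.sum()  # normalize context confidence
    fused = alpha * p_gesture + (1.0 - alpha) * p_context
    best = int(np.argmax(fused))
    return best if fused[best] >= reject_below else -1

# Example: the classifier slightly prefers class 1, but strong context
# support for class 0 corrects the decision.
print(fuse_confidences([1.0, 1.2, 0.1], [0.8, 0.1, 0.1]))  # → 0
```

With uniform scores and uniform context, no class reaches the threshold and the gesture is rejected (`-1`), which mirrors how fusion can filter unintended gestures.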

