基于表面肌电和加速度信号融合的动作识别和人体行为分析研究
摘要
以人为中心的智能化人机交互一方面要求计算机具有自动检测、分析和理解人类更多的姿势、行为动作、生理心理状态、语言、情感和触觉等自然能力;另一方面要求其应用和服务具备感知周围情境信息的能力,并且能够根据感知情境的变化提供相应的服务。基于人体行为动作识别技术的非键盘输入方式既适用于微型化可移动设备使用环境和双手无法空闲的交互场合,又适用于聋哑人与正常人交流的手语识别系统和其他新颖的人机交互技术研究平台。同时,作为情境感知中的重要因素之一,行为感知在移动健康监护中具有重要研究意义:为空巢独居老年人或慢性疾病患者提供日常行为监测服务,发现异常情况及时报警并采取救援行动,这对保障老年人和患者的生活很有裨益。
     本文以精细手指按键动作、手语手势动作以及下肢步态动作为研究对象,对基于表面肌电(Surface electromyography, SEMG)信号或/和加速度(Acceleration, ACC)信号的多类行为动作的感知与识别进行了深入研究,并开展了一定规模的用户测试实验。其中,基于按键动作识别和虚拟键盘模拟手机交互平台的实现可促进智能化人机交互接口技术的发展和应用推广;融合SEMG和ACC信号的下肢动态步态动作的识别研究一方面提高了步态行为动作识别的准确性,另一方面将人体行为动作SEMG识别技术推广到智能化的情境感知应用领域,对指导人类行为理解、康复医学工程等领域的研究具有重要意义;而中国普乐手语手势动作的识别研究实现了较小训练负担下中国汉字较高的识别结果,为聋人与健听人的交流提供了强有力的桥梁,其研究成果将直接造福于聋人群体。本文主要的研究工作和创新点包括:
     1.基于SEMG信号的手指按键动作识别和虚拟键盘交互实现。此研究旨在探索实现随时随地“无形”键盘输入方式的可行性。主要研究工作如下:1)以右手16类手指按键动作和4类控制动作为研究对象,对包含信号采集、活动段分割、特征提取和分类识别在内的手势动作SEMG信号识别方法进行了研究,提出了一种适用于实时交互平台的算法;2)结合神经肌肉控制生理学知识确定了多通道SEMG电极安放位置;3)构建了基于手势动作识别的虚拟键盘进行模拟手机交互,并开展用户调查实验。基于多类手势动作识别的虚拟键盘交互技术,其手势动作的平均识别率可达94%,且用户经过适当的动作训练后,即可在任意平面上实现SEMG无形虚拟键盘的“随身携带”。同时,用户调查实验结果显示该交互方式具有一定的新颖性,是一种用户可完全接受的人机交互方式。
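     下面给出一个基于滑动平均能量阈值进行SEMG活动段分割的简化示意(Python)。其中采样率、窗长与阈值系数等均为假设的示例参数,并非论文中的实际实现,仅用于说明"活动段分割"这一处理环节的基本思路:

```python
# 示意性代码(非论文原始实现):基于滑动平均能量阈值的 SEMG 活动段分割草图。
# 假设:采样率 1 kHz、窗长 64、阈值系数 3 等均为示例取值。
import numpy as np

def detect_active_segments(semg, fs=1000, win=64, k=3.0, baseline_s=1.0):
    """semg: (n_samples, n_channels) 多通道 SEMG;返回 [(start, end), ...] 样本下标。"""
    envelope = np.abs(semg).sum(axis=1)          # 多通道整流求和得到能量包络
    kernel = np.ones(win) / win
    smooth = np.convolve(envelope, kernel, mode="same")   # 滑动平均平滑
    n0 = int(baseline_s * fs)                    # 以起始静息段估计基线
    thr = smooth[:n0].mean() + k * smooth[:n0].std()
    active = smooth > thr
    segments, start = [], None                   # 提取连续超阈值区间作为活动段
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            segments.append((start, i))
            start = None
    if start is not None:
        segments.append((start, len(active)))
    return segments
```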
     2.融合SEMG和ACC信号的人体日常行为动作感知和跌倒检测研究。此研究的目标是通过对用户日常行为动作感知与识别,实现独居老年人或者慢性疾病患者的健康监护,保障其生活质量。主要研究工作包括:1)从健康监护平台的实时性和低计算复杂度要求出发,引入身体姿态的概念,将日常行为动作分解为静态行为动作、各静态行为动作转换形成的动态转移动作,以及一类特殊的由“站”到“站”产生的步态行为动作。2)提出直方图负熵的概念,利用熵表征时间序列不确定度和复杂度的能力,实现了基于ACC信号直方图熵的静态动作和动态动作活动段分割。对于静态动作活动段信号,采用多级夹角阈值决策算法实现了不同身体姿态的识别;对于动态行为动作,结合动态动作活动段的前后身体姿态变化信息将其成功识别为动态转移动作和步态行为动作。3)采用SEMG和ACC信号融合的双流HMMs进行了特定步态行为动作模式识别,同时结合身体姿态变化信息和合加速度幅值阈值信息实现了正常转移动作和跌倒动作的判断。4)设计连续日常行为动作和跌倒动作数据采集实验,验证了该方法进行日常行为感知和跌倒检测的有效性。在行为感知和识别中引入身体姿态信息,有效节约了分类识别系统的计算资源开销;前后身体姿态变化信息和活动段内合加速度幅值阈值相结合的方法提高了跌倒检测的准确性,为未来独居老年人和慢性疾病患者的健康监护提供保障方案奠定了研究基础;融合SEMG和ACC信号进行步态行为动作分类识别研究,一方面提高了步态行为动作的识别率,另一方面将下肢行为动作SEMG引入情境感知领域的行为感知应用中,开启了情境感知领域中行为感知的应用思路。
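     作为对上述利用ACC信号直方图熵区分静态/动态活动段思路的示意,下面给出一个简化草图(Python)。其中窗长、直方图箱数与熵阈值均为假设的示例取值,"直方图负熵"的具体定义以论文正文为准:

```python
# 示意性代码(非论文原始实现):利用合加速度直方图熵区分静态/动态窗口的草图。
import numpy as np

def window_histogram_entropy(acc_xyz, bins=16):
    """acc_xyz: (n, 3) 三轴加速度窗口;返回合加速度幅值直方图的香农熵。"""
    magnitude = np.linalg.norm(acc_xyz, axis=1)          # 合加速度幅值
    hist, _ = np.histogram(magnitude, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))                       # 香农熵(比特)

def label_dynamic_windows(acc_stream, fs=100, win_s=1.0, entropy_thr=1.5):
    """按固定窗滑动,熵高于阈值的窗口记为动态,否则记为静态。"""
    win = int(win_s * fs)
    labels = []
    for start in range(0, len(acc_stream) - win + 1, win):
        h = window_histogram_entropy(acc_stream[start:start + win])
        labels.append("dynamic" if h > entropy_thr else "static")
    return labels
```

     静态动作窗口的合加速度集中在重力附近,直方图集中、熵值低;动态动作窗口幅值分散、熵值高,由此可将连续数据流粗分为静态与动态活动段。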
     3.融合SEMG和ACC信号的词汇量可扩展的连续中国普乐手语识别研究。此研究旨在采用较小负担的用户训练样本,实现词汇量可扩展的中国普乐手语识别方法。这部分的研究工作主要包括:1)采用要素概念,充分利用SEMG在检测精细手形动作方面的优势和ACC在检测大尺度的运动轨迹方面的优点,创新性地提出了融合SEMG和ACC信号的中国普乐手语手势动作执行方案。该方案涉及的手形和运动轨迹要素规模较小且数目恒定,并不会随着中国汉字词汇量的扩展而发生变化,从而保障了较小的用户训练负担。2)针对连续手势动作相对于孤立手势动作在活动段分割中存在的两类低信噪比情况,提出模糊熵算法,实现了连续手势动作活动段的有效分割。3)分别采用各要素分类器并加入决策融合机制,对由223个中国汉字构成的504个日常情境句子进行分类识别。实验结果表明,提出的模糊熵算法可以有效解决连续手势动作活动段分割问题;确定要素分类器的先后顺序可以进一步降低用户训练负担;其决策融合机制在一定程度上减少了中国汉字识别的传递误差。该手语识别方法实现了较小训练负担基础上中国汉字较高的识别结果,为中国手语手势动作识别提供了一种补充形式,为连续手语识别系统的应用推广提供了新思路。
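     下面给出一个常见形式的模糊熵(FuzzyEn)计算草图(Python),用于说明其刻画时间序列复杂度与不确定度的基本思路;参数 m、r、n 均为示例取值,论文中针对连续手势活动段分割所用模糊熵算法的细节以正文为准:

```python
# 示意性代码(非论文原始实现):单通道时间序列模糊熵(FuzzyEn)计算草图。
import numpy as np

def fuzzy_entropy(x, m=2, r=0.25, n=2):
    """x: 一维时间序列;r 以序列标准差的倍数给出容限。"""
    x = np.asarray(x, dtype=float)
    N = len(x)
    r = r * x.std()

    def phi(dim):
        # 构造去均值后的 dim 维重构向量
        vectors = np.array([x[i:i + dim] - x[i:i + dim].mean()
                            for i in range(N - dim)])
        # 切比雪夫距离 -> 模糊隶属度 exp(-(d^n)/r)
        d = np.max(np.abs(vectors[:, None, :] - vectors[None, :, :]), axis=2)
        sim = np.exp(-(d ** n) / r)
        np.fill_diagonal(sim, 0.0)
        return sim.sum() / (len(vectors) * (len(vectors) - 1))

    # FuzzyEn = ln(phi_m) - ln(phi_{m+1}),值越大序列越复杂、越不规则
    return np.log(phi(m)) - np.log(phi(m + 1))
```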
     本论文的研究得到了国家863高科技研究发展计划“基于肌电传感器和加速计的手势交互设备研究”(2009AA01Z322)、中央高校基本科研业务费专项资金“基于情境感知的多源信息分析和理解”(WK2100230002)、Nokia芬兰赫尔辛基研究中心和北京研究院合作项目,以及中国科学技术大学研究生创新基金的资助。
On the one hand, human-centered intelligent human-computer interaction (HCI) requires a computer to be capable of automatically detecting, analyzing and understanding natural human abilities such as posture, behavioral actions, physiological and psychological states, language, emotion and tactile sensation. On the other hand, it demands that applications and services perceive the surrounding contextual information and provide the appropriate services according to changes in the perceived context. Non-keyboard input based on human behavior recognition technology applies not only to miniaturized mobile device environments and occasions where both hands are occupied, but also to sign language recognition systems for communication between the deaf and the hearing, as well as other novel HCI research platforms. Meanwhile, as one of the most important factors in context awareness, the awareness of behaviors and activities is of great importance to mobile health care applications. Providing daily behavior awareness services to the empty-nest elderly and patients with chronic diseases, so that abnormal situations trigger timely alarms and rapid rescue actions, is beneficial to safeguarding their daily lives.
     This thesis takes fine finger key-press actions, sign language gestures, and lower limb gait activities as research objects, conducting in-depth research on multi-class gesture and behavior awareness and classification using surface electromyography (SEMG) and/or acceleration (ACC) signals, together with user testing experiments of a certain scale. Firstly, a virtual keyboard interacting with a simulated phone platform, driven by key-press gestures recognized from SEMG signals, was realized to contribute to the development and promotion of intelligent HCI interface technology. Secondly, the research on identifying lower limb gait activities using both SEMG and ACC signals not only improves the classification accuracy of gait activities, but also extends SEMG-based behavior recognition techniques to intelligent context-aware applications; besides, it provides guidance for research on human behavior understanding and rehabilitation engineering. Finally, the research on Chinese Pule sign language gesture recognition interpreted from SEMG and ACC signals achieved good recognition results with a small user training set, and will directly benefit the deaf community by establishing a strong communication bridge between the deaf and the hearing. The main research work and innovation points are as follows:
     a. Research on key-press gesture recognition and virtual keyboard interaction using SEMG signals. This study aimed at exploring the feasibility of an invisible keyboard input style available anytime and anywhere. The main work is as follows: a) SEMG processing and recognition algorithms, including active segmentation, feature extraction and classifier design, were studied to classify a total of 16 key-press gestures and 4 control gestures acquired from the right forearm; the algorithms were proposed with real-time interaction in mind. b) Physiological knowledge of neuromuscular control was employed to determine the multi-channel SEMG electrode placement. c) A virtual keyboard was constructed, a simulated phone interaction platform was designed, and user survey experiments were conducted. The results demonstrated that the average classification accuracy over all interaction gestures reached 94%. After appropriate training, the invisible SEMG-based virtual keyboard can be realized on any flat surface and effectively carried along with the user. Meanwhile, the survey findings showed that this HCI style is novel and fully acceptable to users.
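     As an illustration of the kind of feature extraction and classification pipeline described above, the following sketch pairs Hudgins-style time-domain features with an off-the-shelf linear discriminant classifier; the specific feature set, window handling and classifier are assumptions for demonstration and not necessarily those used in the thesis:

```python
# Illustrative sketch (not the thesis implementation): time-domain SEMG features
# per channel, fed to a linear discriminant classifier for gesture recognition.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def td_features(window):
    """window: (n_samples, n_channels). Returns concatenated per-channel features."""
    feats = []
    for ch in window.T:
        mav = np.mean(np.abs(ch))                                      # mean absolute value
        wl = np.sum(np.abs(np.diff(ch)))                               # waveform length
        zc = np.sum(np.abs(np.diff(np.signbit(ch).astype(int))))       # zero crossings
        ssc = np.sum(np.abs(np.diff(np.signbit(np.diff(ch)).astype(int))))  # slope sign changes
        feats.extend([mav, wl, zc, ssc])
    return np.array(feats, dtype=float)

def train_gesture_classifier(X_windows, y):
    """X_windows: list of segmented gesture windows; y: gesture labels."""
    X = np.vstack([td_features(w) for w in X_windows])
    return LinearDiscriminantAnalysis().fit(X, y)
```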
     b. Research on daily activity awareness and fall detection based on the fusion of SEMG and ACC signals. The goal of this part was to provide health care services for the empty-nest elderly and patients with chronic diseases, promoting their quality of life by perceiving their daily activities. The studies include: a) To meet real-time and low computational complexity requirements, the concept of body posture was introduced to separate daily activities into static activities and dynamic activities; the dynamic activities were further classified as dynamic transition activities and dynamic gait activities by judging whether both the pre- and post-body postures were standing. b) Histogram negative entropy was proposed to segment static and dynamic active segments. For static active segments, a multi-level angle threshold algorithm was used to identify the different body postures, which further assisted in sorting dynamic activities into dynamic gait activities and dynamic transition ones. c) A double-stream HMM classifier using both SEMG and ACC signals was applied to classify specific gait activity categories. Body posture change information and resultant acceleration threshold information were combined to distinguish normal dynamic transition actions from abnormal fall events. d) A continuous daily activity and fall data acquisition experiment was designed to demonstrate the performance of the proposed framework. The outcomes showed that the framework effectively saves computing resources thanks to the body posture information, and that combining body posture change with resultant ACC threshold information improves the accuracy of fall detection, laying a research foundation for security solutions for the healthcare of the elderly and patients. Meanwhile, the research on gait activity recognition based on the fusion of SEMG and ACC signals, on the one hand, improved gait activity classification accuracy; on the other hand, it opened up the idea of SEMG-based activity awareness in context-aware applications.
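     A minimal sketch of the fall decision rule described in c), combining the pre-/post-segment body postures with the peak resultant acceleration of the dynamic segment, is given below; the posture labels and the 2 g threshold are assumed example values rather than the thesis parameters:

```python
# Illustrative sketch (not the thesis implementation): a fall is flagged only when a
# large impact coincides with an abnormal posture transition; ordinary transfers
# (e.g. sit-to-stand) fail at least one of the two cues.
import numpy as np

def is_fall_candidate(acc_segment, posture_before, posture_after, peak_thr_g=2.0):
    """acc_segment: (n, 3) accelerometer samples (in g) over one dynamic segment."""
    peak = np.max(np.linalg.norm(acc_segment, axis=1))   # peak resultant acceleration
    posture_changed_to_lying = (posture_before in ("standing", "sitting")
                                and posture_after == "lying")
    return posture_changed_to_lying and peak > peak_thr_g
```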
     c. A novel phonology- and component-code-based framework for Chinese sign language recognition using SEMG and ACC sensors. The purpose of this study was to propose a vocabulary-scalable Chinese sign language recognition (SLR) scheme without increasing the training burden. This part of the studies includes: a) An innovative sign gesture execution scheme using SEMG and ACC signals was presented, taking advantage of SEMG signals for characterizing hand-shape elements and ACC signals for characterizing movement elements. The scale of the hand-shape and movement elements is small and their number remains constant even as the Chinese vocabulary grows, which ensures a relatively low user training burden. b) Owing to its good ability to characterize the complexity and uncertainty of time series, fuzzy entropy was proposed for segmenting the active segments of continuous sign gestures, which exhibit a relatively low signal-to-noise ratio (SNR) compared with isolated gesture segmentation. c) The feature vector of each element was first extracted and classified by the corresponding element classifier, and the results were then fused at the decision level. Classification tasks were conducted on 504 daily-situation sentences composed of 223 Chinese characters. Determining the order of the element classifiers could further reduce the user training burden, and the decision fusion scheme helped to reduce the transmission error of Chinese character recognition. With the proposed scheme, Chinese characters could be well recognized based on a small user training set, which provides a supplementary form of Chinese sign language recognition; besides, it offers new ideas for the application and promotion of continuous SLR systems.
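     To make the decision-level fusion in c) concrete, here is a minimal sketch that ranks lexicon entries by summing the log-scores of their hand-shape and movement component classifiers; the lexicon contents, component identifiers and the log-sum fusion rule are hypothetical and serve only to illustrate the idea:

```python
# Illustrative sketch (not the thesis implementation): decision-level fusion of
# per-component classifier scores over a sign lexicon.
import math

# Hypothetical lexicon: sign word -> (hand-shape id, movement id)
LEXICON = {"hello": ("HS_05", "MV_12"), "thanks": ("HS_02", "MV_07")}

def fuse_components(handshape_scores, movement_scores):
    """Each argument maps a component id to its classifier posterior probability.
    Returns the lexicon entries ranked by fused (log-sum) score."""
    ranked = []
    for word, (hs, mv) in LEXICON.items():
        score = (math.log(handshape_scores.get(hs, 1e-6))
                 + math.log(movement_scores.get(mv, 1e-6)))
        ranked.append((score, word))
    return sorted(ranked, reverse=True)
```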
     The research was supported by the National High Technology Research and Development Program of China (the 863 Program) "Research on the Gesture Input Devices Based on Accelerometers and Surface EMG Sensors" (2009AA01Z322), the Fundamental Research Funds for the Central Universities of China under Grant No. WK2100230005, cooperation projects with the Nokia Research Center (Helsinki & Beijing), and the Graduate Innovation Foundation of the University of Science and Technology of China.