Design and application of vibration emoticons in wearable computing devices
CHEN Shi, ZHENG Kai Hong, SUN Ling Yun, LI Yan
Modern Industrial Design Institute, Zhejiang University, Hangzhou 310027, China


The possibility of using vibration to express emotions in wearable computing devices was investigated, exploiting the tactile stimulation produced by vibration. Based on an analysis of tactile stimulation, tactile expression and vibration parameters, six vibration models were designed by configuring the intensity, rhythm and duration of vibration. These six models, called vibration emoticons, corresponded to six basic emotions: anger, fear, happiness, sadness, disgust and surprise. The vibration emoticons were applied to four body parts where wearable computing devices are commonly worn: the finger, wrist, upper arm and ankle. The recognition rates of the vibration emoticons were then compared across these body parts; users' recognition accuracies indicated whether they could link the emotion perceived at each body part to the emotion encoded in each vibration model. A wearable vibration ring was designed to evaluate the emoticons' effectiveness by exploring the role of vibration emoticons in emotion expression in a long-distance communication scenario. Results show that wearable computing devices with vibration emoticons can improve the recognition accuracy of emotions and enhance their perceived intensity by expressing emotion through vibration stimulation. Therefore, vibration emoticons can be applied to wearable computing devices to convey emotional features in the form of vibration, improving the efficiency of emotion transfer.
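The abstract does not give the concrete parameter values used for each vibration emoticon. As an illustrative sketch only (all numeric values and names below are hypothetical, not the paper's actual design), the idea of encoding an emotion as a combination of intensity, rhythm and duration can be expressed as a small lookup table plus a rendering function:

```python
# Hypothetical sketch of "vibration emoticons": each emotion maps to an
# intensity (0-1, e.g. a PWM duty cycle) and a rhythm, expressed as an
# alternating on/off pulse pattern in milliseconds. The total duration
# falls out of the pattern. None of these values come from the paper.

VIBRATION_EMOTICONS = {
    # emotion:   (intensity, pulse pattern [on, off, on, off, ...] in ms)
    "anger":     (1.0, [200, 50, 200, 50, 200, 50]),  # strong, fast, insistent
    "fear":      (0.8, [80, 80] * 4),                 # rapid tremor-like pulses
    "happiness": (0.6, [120, 60, 120, 60, 240]),      # light, bouncy rhythm
    "sadness":   (0.3, [600, 300, 600]),              # weak, slow, drawn out
    "disgust":   (0.5, [300, 100, 100, 100]),         # uneven, jarring
    "surprise":  (0.9, [400]),                        # single sharp burst
}

def render_pattern(emotion):
    """Return (intensity, pulse pattern, total duration in ms) for an emotion.

    A wearable device driver would feed the pattern to its vibration motor,
    scaling motor power by the intensity value.
    """
    intensity, pattern = VIBRATION_EMOTICONS[emotion]
    return intensity, pattern, sum(pattern)
```

This framing makes the three design dimensions from the abstract explicit: intensity is a scalar, rhythm is the pulse sequence, and duration is derived from the rhythm, so any of the three can be varied independently when designing a new emoticon.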

Published: 31 December 2015
CLC:  TP 391.42  
Cite this article:

CHEN Shi, ZHENG Kai Hong, SUN Ling Yun, LI Yan. Design and application of vibration emoticons in wearable computing devices. JOURNAL OF ZHEJIANG UNIVERSITY (ENGINEERING SCIENCE), 2015, 49(12): 2298-2304.




