Journal of ZheJiang University (Engineering Science)  2023, Vol. 57 Issue (4): 683-692    DOI: 10.3785/j.issn.1008-973X.2023.04.005
    
Tactile slip detection method based on neuromorphic modeling
Chao-fan ZHANG1,3, Yi-ming QIAO2, Lu CAO2,*, Zhi-gang WANG2, Shao-wei CUI1, Shuo WANG1,3
1. State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China
2. Intel Labs China, Beijing 100190, China
3. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China

Abstract  

A tactile perception platform was constructed, focusing on neuromorphic tactile perception, to study slip detection methods based on spiking neural networks (SNNs). The GelStereo sensor was used to capture the tactile information flow in the form of the marker displacement field on the contact surface. Two schemes, address-displacement representation (ADR) and address-event representation (AER), were analyzed for pulse-encoding the marker displacement field. Slip detection networks were built on the spike response model (SRM), and the proposed networks were deployed on the Intel Loihi neuromorphic chip. The experimental results show that the proposed SNN method with ADR achieved 94.8% accuracy and a 95.7% F1 score with the SRM, and 93.8% accuracy and a 94.8% F1 score with the Loihi model (an SRM variant specialized for the Loihi hardware). The well-trained SNNs for slip detection achieved accuracy and F1 scores comparable to artificial neural networks (ANNs) with shorter inference time, and showed a significant advantage in power consumption.
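To make the two encodings concrete, the following is a minimal Python sketch of how a marker displacement field might be pulse-encoded in an AER-like (event-driven) and an ADR-like (rate/unary) fashion. The thresholds, spike-window length, and array shapes are illustrative assumptions, not the encoders actually used in the paper.

    # Illustrative sketch only: thresholds, grid size, and time base are assumptions,
    # not the paper's actual ADR/AER encoder parameters.
    import numpy as np

    def encode_aer(disp_field, threshold=0.5):
        """AER-style encoding (assumed scheme): a marker emits a spike event at step t
        when the magnitude of its displacement change since the previous frame
        exceeds a threshold.

        disp_field: array of shape (T, N, 2), per-frame 2-D displacements of N markers.
        Returns a binary spike tensor of shape (T, N).
        """
        T, N, _ = disp_field.shape
        spikes = np.zeros((T, N), dtype=np.uint8)
        prev = disp_field[0]
        for t in range(1, T):
            delta = np.linalg.norm(disp_field[t] - prev, axis=-1)  # per-marker change
            spikes[t] = (delta > threshold).astype(np.uint8)        # event if change is large
            prev = disp_field[t]
        return spikes

    def encode_adr(disp_field, max_disp=3.0, n_steps=8):
        """ADR-style encoding (assumed scheme): each marker's displacement magnitude
        per frame is rate-coded into a short spike train whose spike count is
        proportional to the magnitude."""
        T, N, _ = disp_field.shape
        mag = np.linalg.norm(disp_field, axis=-1)                 # (T, N)
        counts = np.clip(mag / max_disp, 0.0, 1.0) * n_steps      # spikes per window
        spikes = np.zeros((T, n_steps, N), dtype=np.uint8)
        for t in range(T):
            for k in range(n_steps):
                spikes[t, k] = (counts[t] > k).astype(np.uint8)   # unary/rate code
        return spikes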



Key words: brain-inspired information processing; spiking neural network; artificial neural network; tactile perception; neuromorphic; slip detection
Received: 07 December 2022      Published: 21 April 2023
CLC:  TP 242  
Fund: Science and Technology Innovation 2030 "New Generation Artificial Intelligence" Major Project (2018AAA0103003)
Corresponding author: Lu CAO     E-mail: zhangchaofan2020@ia.ac.cn; lu.cao@intel.com
Cite this article:

Chao-fan ZHANG,Yi-ming QIAO,Lu CAO,Zhi-gang WANG,Shao-wei CUI,Shuo WANG. Tactile slip detection method based on neuromorphic modeling. Journal of ZheJiang University (Engineering Science), 2023, 57(4): 683-692.

URL:

https://www.zjujournals.com/eng/10.3785/j.issn.1008-973X.2023.04.005     OR     https://www.zjujournals.com/eng/Y2023/V57/I4/683


Tactile slip perception method based on neuromorphic modeling

Focusing on neuromorphic tactile perception, a tactile perception experimental platform was constructed to study spiking-neural-network-based methods for detecting slip of contacted objects. The GelStereo tactile sensor was used to collect the tactile information flow in the form of the marker displacement field on the contact surface, and two schemes, address-displacement representation (ADR) and address-event representation (AER), were used to pulse-encode the displacement field stream. Slip detection networks were built on the spike response model (SRM), and the networks were deployed on Intel's neuromorphic hardware Loihi. The experimental results show that the ADR-based spike response model reached 94.8% accuracy and a 95.7% F1 score, and the Loihi model (a spike response model specialized for the Loihi neuromorphic hardware) reached 93.8% accuracy and a 94.8% F1 score. In the tactile slip perception task, the constructed spiking neural networks achieved prediction accuracy comparable to artificial neural networks (ANNs) with shorter inference time, and showed a significant advantage in power consumption.


Key words: brain-inspired information processing; spiking neural network; artificial neural network; tactile perception; neuromorphic; slip detection
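As background for the SRM-based networks above and the PSP kernel length swept in Fig. 8, a minimal sketch of a spike response model neuron is given below. The alpha-shaped post-synaptic potential (PSP) kernel and the simple decaying reset are common textbook choices; the kernel length, time constant, and threshold are assumed values, not the paper's exact formulation.

    # Hedged SRM sketch: kernel shape, tau, and threshold are assumptions.
    import numpy as np

    def psp_kernel(length=20, tau=4.0):
        # Alpha-shaped PSP kernel: eps(t) = (t / tau) * exp(1 - t / tau), t = 0..length-1
        t = np.arange(length, dtype=float)
        return (t / tau) * np.exp(1.0 - t / tau)

    def srm_neuron(in_spikes, weights, threshold=1.0, kernel_len=20, tau=4.0):
        # in_spikes: (T, N_in) binary spike trains; weights: (N_in,) synaptic weights.
        T = in_spikes.shape[0]
        eps = psp_kernel(kernel_len, tau)
        drive = in_spikes @ weights              # weighted presynaptic activity, shape (T,)
        u = np.convolve(drive, eps)[:T]          # membrane potential from PSP convolution
        out = np.zeros(T, dtype=np.uint8)
        refractory = 0.0
        for t in range(T):
            if u[t] + refractory > threshold:    # fire when potential crosses threshold
                out[t] = 1
                refractory -= threshold          # reset via a decaying refractory term
            refractory *= np.exp(-1.0 / tau)     # refractory contribution decays over time
        return out

Varying kernel_len in this sketch corresponds to the PSP kernel length on the horizontal axis of Fig. 8: a longer kernel lets each input spike influence the membrane potential over a longer time window.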
Fig.1 Experimental system for neuromorphic-based tactile slip perception
Fig.2 Diagram of GelStereo tactile sensing
Fig.3 Diagram of address-displacement representation
Fig.4 Diagram of address-event representation
Fig.5 Network architecture of SNN-ADR
Fig.6 Network architecture of SNN-AER
Fig.7 Data collection platform and some objects in dataset
Response model    A    P    R    F1
SRM    94.8    93.8    97.7    95.7
Loihi    93.8    94.3    95.3    94.8
Tab.1 Prediction results of SNN-ADR (unit: %)
Network model    A    P    R    F1
CNN    97.5    96.5    97.4    96.9
CNN-LSTM    96.3    97.2    93.6    95.4
Tab.2 Prediction results of ANNs (unit: %)
Fig.8 Accuracy curve with PSP kernel length
Method    Device    Pc/mW    t/ms    E/mJ
CNN    TX2(Q)    129.50    20.39    2.64
CNN-LSTM    TX2(Q)    163.18    37.96    6.19
CNN    TX2(N)    957.34    8.44    8.08
CNN-LSTM    TX2(N)    963.26    15.28    14.72
SNN-ADR    Loihi    23.40    7.913    0.185
Tab.3 Power consumption and inference speed results for different methods/devices
Response model    A    P    R    F1
SRM    97.42    98.43    97.03    97.72
Loihi    93.80    98.19    90.85    94.37
Tab.4 Prediction results of SNN-AER (unit: %)