1. State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China
2. Intel Labs China, Beijing 100190, China
3. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China
A tactile perception platform focusing on neuromorphic tactile perception was constructed to analyze slip detection methods based on spiking neural networks (SNNs). A GelStereo sensor was used to capture the tactile information flow in the form of a marker displacement field on the contact surface. Address-displacement representation (ADR) and address-event representation (AER) were analyzed to pulse-encode the marker displacement field. Slip detection networks based on the spike response model (SRM) were proposed and deployed on neuromorphic hardware, the Intel Loihi chip. The experimental results show that the proposed SNN method with ADR achieved 94.8% accuracy and a 95.7% F1 score with the SRM, and 93.8% accuracy and a 94.8% F1 score with the Loihi model (an SRM variant specialized for Loihi hardware). The well-trained SNNs for slip detection achieved accuracy and F1 scores comparable to those of artificial neural networks (ANNs) with shorter inference time, and showed a significant advantage in power consumption.
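As a rough illustration of the two encodings (the paper's exact schemes, thresholds, and time scales are not reproduced here, so the functions below are hypothetical), an ADR-style encoder can rate-code each marker's displacement magnitude at a fixed address, while an AER-style encoder emits (time, address, polarity) events whenever a marker's displacement change crosses a threshold:

```python
import numpy as np

rng = np.random.default_rng(0)

def adr_encode(disp, t_steps=20, d_max=2.0):
    """Rate-code each marker's displacement magnitude at its fixed address.

    disp: (N, 2) array of per-marker (dx, dy) displacements.
    Returns an (N, t_steps) binary spike train: larger displacement
    -> higher per-step firing probability. (Illustrative encoding only.)
    """
    mag = np.linalg.norm(disp, axis=1)
    p = np.clip(mag / d_max, 0.0, 1.0)  # firing probability per time step
    return (rng.random((disp.shape[0], t_steps)) < p[:, None]).astype(np.uint8)

def aer_encode(disp_seq, thresh=0.1):
    """Emit (t, address, polarity) events when a marker's displacement
    magnitude changes by more than `thresh` between consecutive frames."""
    events = []
    prev = np.linalg.norm(disp_seq[0], axis=1)
    for t, frame in enumerate(disp_seq[1:], start=1):
        mag = np.linalg.norm(frame, axis=1)
        delta = mag - prev
        for addr in np.flatnonzero(np.abs(delta) > thresh):
            events.append((t, int(addr), int(np.sign(delta[addr]))))
        prev = mag
    return events
```

ADR yields a dense, frame-like spike tensor suitable for convolutional SNN layers, while AER yields a sparse event stream, which is the trade-off the two network architectures (Figs. 5 and 6) are built around.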
Fig.1 Experimental system for neuromorphic-based tactile slip perception
Fig.2 Diagram of GelStereo tactile sensing
Fig.3 Diagram of address-displacement representation
Fig.4 Diagram of address-event representation
Fig.5 Network architecture of SNN-ADR
Fig.6 Network architecture of SNN-AER
Fig.7 Data collection platform and some objects in dataset
Response model   A      P      R      F1
SRM              94.8   93.8   97.7   95.7
Loihi            93.8   94.3   95.3   94.8
Tab.1 Prediction results of SNN-ADR (%)
Network model   A      P      R      F1
CNN             97.5   96.5   97.4   96.9
CNN-LSTM        96.3   97.2   93.6   95.4
Tab.2 Prediction results of ANNs (%)
Fig.8 Accuracy curve with PSP kernel length
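Fig. 8 sweeps the length of the postsynaptic potential (PSP) kernel. In the SRM, the membrane potential is a weighted sum of input spike trains filtered by a PSP kernel; the sketch below assumes a common alpha-shaped kernel (the paper's actual kernel shape and time constant are not specified here):

```python
import numpy as np

def psp_kernel(length, tau=4.0, dt=1.0):
    """Alpha-shaped PSP kernel eps(t) = (t/tau) * exp(1 - t/tau),
    truncated to `length` time steps; peaks at t = tau with value 1."""
    t = np.arange(length) * dt
    return (t / tau) * np.exp(1.0 - t / tau)

def membrane_potential(spikes, weights, kernel):
    """u(t) = sum_i w_i * (eps * s_i)(t): convolve each input spike
    train with the PSP kernel, then take the weighted sum over synapses.

    spikes: (n_syn, T) binary spike trains; weights: (n_syn,) array.
    """
    psps = np.array([np.convolve(s, kernel)[: spikes.shape[1]] for s in spikes])
    return weights @ psps
```

A longer kernel lets each spike influence the membrane potential for more time steps, which is why accuracy in Fig. 8 varies with kernel length.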
Method     Device   Pc/mW    t/ms    E/mJ
CNN        TX2(Q)   129.50   20.39   2.64
CNN-LSTM   TX2(Q)   163.18   37.96   6.19
CNN        TX2(N)   957.34   8.44    8.08
CNN-LSTM   TX2(N)   963.26   15.28   14.72
SNN-ADR    Loihi    23.40    7.913   0.185
Tab.3 Power consumption and inference speed results for different methods/devices
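The energy column in Tab. 3 is per-inference energy, consistent with E = Pc · t; a quick unit check:

```python
def energy_mj(power_mw, time_ms):
    # Per-inference energy: E [mJ] = P [mW] * t [ms] / 1000
    return power_mw * time_ms / 1000.0

# e.g. CNN on TX2(Q): 129.50 mW over 20.39 ms -> about 2.64 mJ,
# SNN-ADR on Loihi:   23.40 mW over 7.913 ms -> about 0.185 mJ
```

This is the roughly 14x per-inference energy advantage of SNN-ADR on Loihi over the CNN on TX2(Q) noted in the abstract.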
Response model   A       P       R       F1
SRM              97.42   98.43   97.03   97.72
Loihi            93.80   98.19   90.85   94.37
Tab.4 Prediction results of SNN-AER (%)