Journal of Zhejiang University (Engineering Science)  2022, Vol. 56 Issue (1): 26-35    DOI: 10.3785/j.issn.1008-973X.2022.01.003
Computer Technology, Information and Electronic Engineering
Recognition method of submarine operation posture based on convolutional pose machine
Jing-luan WANG(),Deng-kai CHEN*(),Meng-ya ZHU,Han-yu WANG,Yi-wei SUN
Shaanxi Engineering Laboratory for Industrial Design, Northwestern Polytechnical University, Xi'an 710072, China
Abstract:

A posture recognition and analysis method based on the convolutional pose machine was proposed to address the complicated recognition process and low recognition accuracy of existing methods for analyzing submariners' operation postures. Human body posture features were structurally encoded, and spatial and projected coordinate systems were constructed to interpret the posture. Calculation formulae for limb angles and judgment processes for special limb states were defined. By building a submariner operation posture recognition algorithm, the spatial and texture features of RGB operation posture images were extracted, and the joint points, limb angles and state data of the submariner's operation posture were output. The proposed method was verified on a sample data set constructed by collecting images of submariners' operation postures. In the algorithm test, the percentage of correct keypoints (PCK) of the recognition algorithm reached 81.2%. In the application verification experiment, the average accuracy of the algorithm in identifying joint points reached 87.7%. The experimental results show that the method is reliable for recognizing and analyzing submariner operation postures, and can effectively identify and analyze negative factors in them.

Key words: industrial design    posture recognition    submariner’s operation posture    convolutional pose machine    posture analysis
Received: 2021-05-12  Published: 2022-01-05
CLC: TB 472
Funding: Fundamental Research Funds for the Central Universities (31020190504007); Leading Talents Project of the Shaanxi Special Support Plan (w099115)
Corresponding author: Deng-kai CHEN  E-mail: wangjingluan1@163.com;chendengkai@nwpu.edu.cn
About the author: Jing-luan WANG (1997—), female, doctoral candidate, engaged in research on computer vision and ergonomics. orcid.org/0000-0001-9408-3537. E-mail: wangjingluan1@163.com

Cite this article:

Jing-luan WANG, Deng-kai CHEN, Meng-ya ZHU, Han-yu WANG, Yi-wei SUN. Recognition method of submarine operation posture based on convolutional pose machine. Journal of Zhejiang University (Engineering Science), 2022, 56(1): 26-35.

Link to this article:

https://www.zjujournals.com/eng/CN/10.3785/j.issn.1008-973X.2022.01.003        https://www.zjujournals.com/eng/CN/Y2022/V56/I1/26

Code  Meaning          Code  Meaning
1     Head             9     Left wrist
2     Neck             10    Left fingers
3     Right shoulder   11    Right hip
4     Right elbow      12    Right knee
5     Right wrist      13    Right ankle
6     Right fingers    14    Left hip
7     Left shoulder    15    Left knee
8     Left elbow       16    Left ankle
Table 1  Joint point coding of human posture
Fig. 1  Definition of the spatial coordinate system and sagittal-plane projection
Code      Meaning          Code      Meaning
1-2       Neck             2-9-12    Waist
2-3-4     Right upper arm  3-4-5     Right forearm
4-5-6     Right wrist      2-7-8     Left upper arm
7-8-9     Left forearm     8-9-10    Left wrist
11-12-13  Right leg        14-15-16  Left leg
Table 2  Human limb coding
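The three-code entries of Table 2 describe a limb by the angle formed at its middle joint. A minimal sketch of that computation, using hypothetical 2D joint coordinates keyed by the codes of Table 1 (in the paper, the coordinates come from the recognition algorithm itself):

```python
import math

# Hypothetical sagittal-plane joint positions, keyed by the joint codes
# of Table 1. These coordinates are illustrative, not from the paper.
joints = {
    3: (0.0, 1.0),   # right shoulder
    4: (0.0, 0.0),   # right elbow
    5: (1.0, 0.0),   # right wrist
}

def limb_angle(a, b, c):
    """Angle (degrees) at joint b between segments b->a and b->c,
    matching the three-code limb entries of Table 2 (e.g. 3-4-5)."""
    ax, ay = joints[a]; bx, by = joints[b]; cx, cy = joints[c]
    v1 = (ax - bx, ay - by)
    v2 = (cx - bx, cy - by)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

print(limb_angle(3, 4, 5))  # right forearm angle (code 3-4-5), about 90 deg here
```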
Fig. 2  Comparison of normal and twisted waist states
Fig. 3  Schematic diagram of raised shoulders
Fig. 4  Principle of the submariner operation posture recognition algorithm
Fig. 5  Structure of the submariner operation posture recognition algorithm
No. Input Operation Parameters Output
1 Original image 368×368×3
2 368×368×3 Conv.+ReLU k = 9, n = 128, s = 1 368×368×128
3 368×368×128 max_pool 2×2, s = 2 184×184×128
4 184×184×128 Conv.+ReLU k = 9, n = 128, s = 1 184×184×128
5 184×184×128 max_pool 2×2, s =2 92×92×128
6 92×92×128 Conv.+ReLU k = 9, n = 128, s = 1 92×92×128
7 92×92×128 max_pool 2×2, s = 2 46×46×128
8 46×46×128 Conv.+ReLU k = 5, n = 32, s = 1 46×46×32
9 46×46×32 Conv.+ReLU k = 9, n = 512, s = 1 46×46×512
10 46×46×512 Conv.+ReLU k = 1, n = 512, s = 1 46×46×512
11 46×46×512 Conv.+ReLU k = 1, n = 15, s = 1 46×46×15
Table 3  Parameters of the g1-1 convolution group
No. Input Operation Parameters Output
1 Original image 368×368×3
2 368×368×3 Conv.+ReLU k = 9, n = 128, s = 1 368×368×128
3 368×368×128 max_pool 2×2, s = 2 184×184×128
4 184×184×128 Conv.+ReLU k = 9, n = 128, s = 1 184×184×128
5 184×184×128 max_pool 2×2, s = 2 92×92×128
6 92×92×128 Conv.+ReLU k = 9, n = 128, s = 1 92×92×128
7 92×92×128 max_pool 2×2, s = 2 46×46×128
8 46×46×128 Conv.+ReLU k = 5, n = 32, s = 1 46×46×32
9 46×46×32 + 46×46×15 + 46×46×1 Concat ( ) 46×46×48
10 46×46×48 Conv.+ReLU k = 11, n = 128, s = 1 46×46×128
11 46×46×128 Conv.+ReLU k = 11, n = 128, s = 1 46×46×128
12 46×46×128 Conv.+ReLU k = 11, n = 128, s = 1 46×46×128
13 46×46×128 Conv.+ReLU k = 1, n = 128, s = 1 46×46×128
14 46×46×128 Conv.+ReLU k = 1, n = 15, s = 1 46×46×15
Table 4  Parameters of the g2-1 and g2-2 convolution groups
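The spatial sizes in Tables 3 and 4 can be reproduced from the listed operations, assuming 'same' padding for the stride-1 convolutions (padding is not given in the tables, so it is inferred from the unchanged output sizes):

```python
# Operation sequence of rows 2-11 of Table 3 (g1-1 group).
ops = ["conv", "pool", "conv", "pool", "conv", "pool",
       "conv", "conv", "conv", "conv"]

def trace(size, ops):
    """Spatial size after each operation: 'same'-padded stride-1
    convolutions keep the size, 2x2 stride-2 max pooling halves it."""
    sizes = [size]
    for op in ops:
        sizes.append(sizes[-1] // 2 if op == "pool" else sizes[-1])
    return sizes

print(trace(368, ops))  # [368, 368, 184, 184, 92, 92, 46, 46, 46, 46, 46]
```

The trace matches the 368 → 184 → 92 → 46 progression of both tables.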
Fig. 6  Training process of the submariner operation posture recognition algorithm
Fig. 7  Feature maps of human joint points over six stages
Method Head Neck Shoulder Elbow Wrist Hip Knee Ankle Mean
FC[19] 75.8 71.3 67.3 64.3 60.4 63.7 60.1 55.7 64.8
DC[20] 83.6 74.6 73.9 71.1 66.0 70.9 64.1 59.4 70.5
CPN[21] 85.4 77.1 79.6 74.5 69.9 73.6 70.9 64.8 74.5
PAF[13] 82.6 75.8 75.9 71.3 68.5 68.9 65.9 60.9 71.2
Ref. [17] 91.8 82.1 80.6 70.9 82.1 81.9 74.6 80.2 80.5
Ref. [18] 90.6 84.3 84.5 75.0 78.0 83.8 78.9 79.2 81.8
Proposed method 93.6 83.2 90.5 83.2 73.6 77.9 76.2 71.7 81.2
Table 5  PCK comparison of different algorithms
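The PCK values of Table 5 count a predicted keypoint as correct when it falls within a distance threshold of the ground truth. A simplified sketch (a fixed pixel threshold is assumed here; the paper's exact normalization is not restated):

```python
import math

def pck(pred, gt, threshold):
    """Fraction of predicted keypoints within `threshold` pixels of the
    ground truth: a simplified form of the PCK metric of Table 5."""
    correct = sum(
        1 for (px, py), (gx, gy) in zip(pred, gt)
        if math.hypot(px - gx, py - gy) <= threshold
    )
    return correct / len(gt)

pred = [(10, 10), (50, 52), (102, 98)]  # illustrative predictions
gt   = [(12, 11), (50, 50), (90, 90)]   # illustrative ground truth
print(pck(pred, gt, threshold=5))  # 2 of 3 keypoints within threshold
```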
Fig. 8  Interface layout of the manned submersible simulated operation
Fig. 9  Human skeleton generated by the submariner operation posture recognition algorithm
Fig. 10  Accuracy scores for recognition of the 16 joint points
Posture $\bar \theta (\sigma ) $/(°) ICC
Neck  Waist  Right upper arm  Right forearm  Right wrist  Left upper arm  Left forearm  Left wrist  Right leg  Left leg
1 1.2(1.3) 2.2(2.9) 14.5(10.7) 72.5(9.8) 1.8(1.2) 14.8(12.3) 70.0(5.8) 1.7(1.7) 105(8.0) 107.2(11.5) 0.966
2 8.6(6.8) 6.9(1.7) 29.4(9.7) 42.2(14.0) 6.3(4.5) 19.4(10.0) 65.0(4.3) 1.7(1.5) 103.1(10.1) 105.4(10.2) 0.951
3 −6.7(3.9) 8.2(1.9) 39.3(7.4) 30.7(13.1) 7.7(5.9) 22.6(9.2) 67.7(10.2) 1.9(1.5) 104.4(10.5) 105.2(11.6) 0.953
4 −17.9(10.0) 28.2(3.7) −9.9(33.2) 74.4(32.0) 2.2(2.0) 77.7(11.2) 18.5(6.8) 8.4(4.5) 101.8(10.8) 104.6(10.3) 0.879
5 −19.7(8.1) 35.9(6.1) −11.4(4.4) 21.5(11.4) 16.6(4.5) 82.8(8.9) 18.1(9.0) 12.5(4.5) 67.3(21.9) 68.9(24.9) 0.888
6 7.7(1.5) 5.6(0.2) 21.2(7.6) 40.5(19.3) 22.2(2.0) 23.6(7.1) 34.7(10.5) 22.0(0.5) 108.2(12.7) 106.4(10.5) 0.944
7 20.4(10.0) 30.5(14.0) −19.6(10.5) 119.5(22.9) 2.2(1.2) −14.1(13.4) 112.6(18.3) 2.0(1.8) 106.1(9.2) 109.3(11.3) 0.950
8 −4.9(9.6) 19.5(13.9) −10.5(9.5) 91.4(9.7) 2.0(1.0) −5.0(8.1) 87.6(9.4) 1.9(1.6) 104.8(2.4) 107.1(4.7) 0.971
9 −16.9(9.9) 16.0(10.9) 8.1(9.4) 82.6(12.2) 3.3(2.0) 10.3(7.5) 82.1(8.6) 1.4(0.7) 104.2(1.0) 104.5(3.0) 0.976
10 −6.3(3.2) 16.6(8.4) 17.0(17.7) 61.7(27.6) 9.5(3.1) 8.9(2.5) 76.9(11.9) 1.8(1.1) 108.4(4.9) 107.6(7.2) 0.921
11 −17.4(3.9) 15.9(7.1) 16.8(15.4) 65.1(24.2) 10.7(5.7) 8.8(4.9) 71.6(15.6) 1.9(1.6) 106.2(5.4) 107.1(3.5) 0.933
12 −14.8(5.3) 26.2(5.6) −10.0(6.8) 82.8(15.2) 1.7(1.7) 75.6(11.1) 20.0(6.5) 9.1(3.7) 108.9(11.7) 109.4(12.7) 0.963
13 −23.4(6.3) 24.1(6.8) −10.4(4.4) 90.9(13.3) 1.9(1.8) 78.4(10.8) 16.8(8.2) 8.3(3.4) 103.7(10.2) 107.4(12.3) 0.965
14 −14.0(11.1) 39.2(7.7) 10.7(10.2) 19(6.6) 16.7(5.3) 79.8(13.6) 23.1(14.3) 12.7(5.4) 65.0(26.0) 67.7(28.7) 0.832
15 −24.9(9.0) 37.3(6.3) 10.3(4.6) 21(5.3) 16.1(4.7) 81.2(10.8) 21.2(11.1) 12.2(4.7) 68.0(27.9) 69.8(27.8) 0.874
ICC 0.935 0.923 0.976 0.974 0.853 0.881 0.956 0.887 0.852 0.819
Table 6  Mean (standard deviation) of limb angles and ICC results of submariner operation posture recognition
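Table 6 reports intraclass correlation coefficients (ICC) for agreement between recognized and reference angles. The paper does not restate which ICC form was used; a sketch assuming the one-way random-effects ICC(1,1):

```python
def icc_1_1(ratings):
    """One-way random-effects ICC(1,1). `ratings` is a list of n targets,
    each with k measurements (e.g. recognized vs. reference angles).
    ICC = (MSB - MSW) / (MSB + (k-1) * MSW)."""
    n = len(ratings)
    k = len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    # between-target and within-target mean squares
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2 for r, m in zip(ratings, row_means) for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Perfect agreement between the two measurements gives ICC = 1.
print(icc_1_1([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]))  # 1.0
```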
Fig. 11  Limb angle variation in the operation video
1 CHEN D K, FAN Y, LI W H, et al Human reliability prediction in deep-sea sampling process of the manned submersible[J]. Safety Science, 2019, 112 (2): 1- 8
2 ZHANG Shuai, HE Wei-ping, CHEN Deng-kai, et al Compound evaluation method for the space comfort of manned submersible[J]. Journal of Harbin Institute of Technology, 2019, 51 (10): 83- 89
doi: 10.11918/j.issn.0367-6234.201810148
3 ZHANG Ning, LI Ya-jun, DUAN Qi-jun, et al Man-machine comfortable designing for bending operations of elderly[J]. Journal of Zhejiang University: Engineering Science, 2017, 51 (1): 95- 105
4 TRASK C, MATHIASSEN S E, ROSTAMI M, et al Observer variability in posture assessment from video recordings: The effect of partly visible periods[J]. Applied Ergonomics, 2017, 60 (4): 275- 281
5 ZHU De-wei, LI Zhi-hai, WU Zhen-wei. Abnormal behavior monitoring based method for safe human-robot collaboration [EB/OL]. [2021-04-21]. http://kns.cnki.net/kcms/detail/11.5946.TP.20210329.1645.018.html.
6 MANGHISI V M, UVA A E, FIORENTINO M, et al Real time RULA assessment using Kinect v2 sensor[J]. Applied Ergonomics, 2017, 65 (11): 481- 491
7 PLANTARD P, SHUM H P, PIERRES A S, et al Validation of an ergonomic assessment method using kinect data in real workplace conditions[J]. Applied Ergonomics, 2017, 65 (11): 562- 569
8 TOPLEY M, RICHARDS J G A comparison of currently available optoelectronic motion capture systems[J]. Journal of Biomechanics, 2020, 106 (6): 109820
9 YAN X Z, LI H, ANGUS R L, et al Wearable IMU-based real-time motion warning system for construction workers' musculoskeletal disorders prevention[J]. Automation in Construction, 2017, 74 (2): 2- 11
10 HUYNH-THE T, HUA C H, NGO T T, et al Image representation of pose-transition feature for 3D skeleton-based action recognition[J]. Information Sciences, 2020, 513 (5): 112- 126
11 YUAN Gong-ping, TANG Yi-ping, HAN Wang-ming, et al Vehicle category recognition based on deep convolutional neural network[J]. Journal of Zhejiang University: Engineering Science, 2018, 52 (4): 694- 702
12 LECUN Y, BENGIO Y, HINTON G Deep learning[J]. Nature, 2015, 521 (3): 436- 444
13 CAO Z, SIMON T, WEI S E, et al. Realtime multi-person 2D pose estimation using part affinity fields [C]// Proceedings of the 2018 IEEE Conference on Computer Vision and Pattern Recognition. Salt Lake City: IEEE, 2018: 1302-1310.
14 OBERWEGER M, WOHLHART P, LEPETIT V Hands deep in deep learning for hand pose estimation[J]. Computer Sciences, 2015, 24 (12): 21- 30
15 WEI S E, RAMAKRISHNA V, KANADE T, et al. Convolutional pose machines [C]// Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas: IEEE, 2016: 4724-4732.
16 PROMSRI A, HAID T, FEDEROLF P How does lower limb dominance influence postural control movements during single leg stance?[J]. Human Movement Science, 2018, 58 (4): 165- 174
17 PFISTER T, CHARLES J, ZISSERMAN A. Flowing ConvNets for human pose estimation in videos [C]// Proceedings of the 2015 IEEE International Conference on Computer Vision. Santiago: IEEE, 2015: 1913-1921.
18 PISHCHULIN L, INSAFUTDINOV E, TANG S, et al. DeepCut: joint subset partition and labeling for multi person pose estimation [C]// Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas: IEEE, 2016: 4929-4937.
19 CHEN Y, WANG Z C, PENG Y X, et al. Cascaded pyramid network for multi-person pose estimation [C]// Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Salt Lake City: IEEE, 2018: 7103-7112.
20 SU K, YU D D, XU Z Q, et al. Multi-person pose estimation with enhanced channel-wise and spatial information [C]// Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Long Beach: IEEE, 2019: 5674-5682.
21 ZHANG J, CHEN Z, TAO D. Towards high performance human keypoint detection [C]// Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Seattle: IEEE, 2020.