Recognition method of submariner operation posture based on convolutional pose machine
Jing-luan WANG, Deng-kai CHEN*, Meng-ya ZHU, Han-yu WANG, Yi-wei SUN
Shaanxi Engineering Laboratory for Industrial Design, Northwestern Polytechnical University, Xi'an 710072, China |
Abstract A posture recognition and analysis method based on convolutional pose machines (CPM) was proposed to address the complicated recognition procedures and low recognition accuracy of existing methods for analyzing submariners' operation postures. Human body posture features were structurally encoded, and spatial and projected coordinate systems were constructed to interpret the posture; calculation formulae for limb angles and judgment procedures for special limb states were defined. A submariner operation posture recognition algorithm was built to extract the spatial and texture features of RGB operation posture images and to output the joint points, limb angles and limb-state data of the operation posture. The method was verified on a sample data set constructed from collected images of submariners' operation postures. In the algorithm test, the percentage of correct keypoints (PCK) of the recognition algorithm reached 81.2%; in the application verification experiment, the average accuracy of joint-point identification reached 87.7%. The experimental results show that the method is reliable for recognizing and analyzing submariners' operation postures and can effectively identify and analyze the negative factors in those postures.
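The limb angles are computed from the joint points output by the recognition algorithm. The paper's own formulae are defined in its spatial and projected coordinate systems and are not reproduced in this abstract; the following is a minimal Python sketch of the underlying idea, computing the angle at a joint from three detected 2D keypoints. All names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def limb_angle(a, b, c):
    """Angle (in degrees) at joint b formed by segments b->a and b->c.

    a, b, c: 2D keypoints as (x, y), e.g. shoulder, elbow, wrist for the
    elbow angle. Illustrative only; the paper defines its own limb-angle
    formulae in projected coordinate systems.
    """
    a, b, c = (np.asarray(p, dtype=float) for p in (a, b, c))
    u, v = a - b, c - b
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8)
    return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))

# Example: elbow angle from pixel coordinates of shoulder, elbow and wrist
print(limb_angle((120, 80), (150, 140), (210, 150)))
```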
Received: 12 May 2021
Published: 05 January 2022
Fund: Fundamental Research Funds for the Central Universities (31020190504007); Shaanxi Province Special Support Program for Leading Talents (w099115)
Corresponding Author:
Deng-kai CHEN
E-mail: wangjingluan1@163.com; chendengkai@nwpu.edu.cn
Keywords:
industrial design,
posture recognition,
submariner operation posture,
convolutional pose machine,
posture analysis
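The 81.2% reported in the abstract is the percentage of correct keypoints (PCK), a standard keypoint-detection metric. The abstract does not state the paper's exact normalization length, so the sketch below uses the conventional definition with an assumed per-image reference length (e.g. torso size).

```python
import numpy as np

def pck(pred, gt, ref_len, alpha=0.5):
    """Percentage of Correct Keypoints (PCK).

    pred, gt: arrays of shape (N, K, 2) holding predicted and ground-truth
              keypoints for N images with K joints each.
    ref_len:  array of shape (N,), per-image normalization length
              (e.g. torso or head-segment size; the paper's choice is assumed).
    alpha:    a keypoint counts as correct if its error <= alpha * ref_len.
    """
    pred, gt = np.asarray(pred, dtype=float), np.asarray(gt, dtype=float)
    dist = np.linalg.norm(pred - gt, axis=-1)               # (N, K) pixel errors
    thresh = alpha * np.asarray(ref_len, dtype=float)[:, None]
    return float(np.mean(dist <= thresh))
```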