VR sickness estimation model based on 3D-ResNet two-stream network
Wei QUAN, Yong-qing CAI, Chao WANG, Jia SONG, Hong-kai SUN, Lin-xuan LI
School of Computer Science and Technology, Changchun University of Science and Technology, Changchun 130013, China
Abstract A VR sickness estimation method based on a 3D two-stream convolutional neural network was proposed to accurately estimate the VR sickness induced by VR videos. Two sub-networks, an appearance stream and a motion stream, were constructed to mimic the two pathways of the human visual system. The 2D-ResNet50 model was converted into a 3D model by adding a depth dimension, so that the temporal information in the videos could be learned. A 3D-CBAM attention module was introduced to improve the spatial correlation among the channels of each frame, enhancing key information and suppressing redundant information. A back-end fusion method was used to fuse the results of the two sub-networks. Experiments were conducted on a public video dataset. The results showed that introducing the attention mechanism improved the accuracy of the appearance stream network and the motion stream network by 1.7% and 3.6%, respectively. The accuracy of the fused two-stream network reached 93.7%, outperforming the methods reported in the existing literature.
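To make the described architecture concrete, the following is a minimal sketch, assuming PyTorch, of a 3D-CBAM attention block (channel attention followed by spatial attention over 5D video tensors) and of score-level (back-end) fusion of the two sub-network outputs. The class and function names (ChannelAttention3D, SpatialAttention3D, CBAM3D, back_end_fusion) and the fusion weight are hypothetical; the reduction ratio of 16 and spatial kernel size of 7 follow the original CBAM defaults [15], not necessarily the authors' exact configuration.

# Minimal sketch, assuming PyTorch; names and hyper-parameters are illustrative assumptions.
import torch
import torch.nn as nn


class ChannelAttention3D(nn.Module):
    """Channel attention: squeeze the T x H x W volume by avg/max pooling, excite per channel."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool3d(1)
        self.max_pool = nn.AdaptiveMaxPool3d(1)
        self.mlp = nn.Sequential(
            nn.Conv3d(channels, channels // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv3d(channels // reduction, channels, kernel_size=1, bias=False),
        )

    def forward(self, x):  # x: (N, C, T, H, W)
        attn = self.mlp(self.avg_pool(x)) + self.mlp(self.max_pool(x))
        return x * torch.sigmoid(attn)


class SpatialAttention3D(nn.Module):
    """Spatial attention: pool over channels, then a 3D conv yields a T x H x W mask."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv3d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        avg_map = torch.mean(x, dim=1, keepdim=True)       # (N, 1, T, H, W)
        max_map, _ = torch.max(x, dim=1, keepdim=True)
        attn = self.conv(torch.cat([avg_map, max_map], dim=1))
        return x * torch.sigmoid(attn)


class CBAM3D(nn.Module):
    """CBAM lifted to 3D: channel attention followed by spatial attention."""

    def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
        super().__init__()
        self.channel = ChannelAttention3D(channels, reduction)
        self.spatial = SpatialAttention3D(kernel_size)

    def forward(self, x):
        return self.spatial(self.channel(x))


def back_end_fusion(appearance_logits, motion_logits, w: float = 0.5):
    """Score-level (back-end) fusion of the two sub-networks; the weight w is a hypothetical choice."""
    return w * appearance_logits.softmax(dim=1) + (1 - w) * motion_logits.softmax(dim=1)

In the full model, a block of this kind would presumably be inserted within the 3D-ResNet50 backbone of each stream, with the two streams trained on appearance and motion inputs and their class scores combined only at the back end.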
Received: 20 August 2022
Published: 17 July 2023
Fund: Key Research and Development Project of the Jilin Province Science and Technology Development Plan (20210203218SF)
Keywords:
virtual reality,
VR sickness,
deep learning,
attention mechanism,
3D convolutional neural network
[1] GUNA J, GERŠAK G, HUMAR I, et al. Influence of video content type on users’ virtual reality sickness perception and physiological response [J]. Future Generation Computer Systems, 2019, 91: 263-276. doi: 10.1016/j.future.2018.08.049
[2] MCCAULEY M E, SHARKEY T J. Cybersickness: perception of self-motion in virtual environments [J]. Presence: Teleoperators and Virtual Environments, 1992, 1(3): 311-318. doi: 10.1162/pres.1992.1.3.311
[3] GUNA J, GERŠAK G, HUMAR I, et al. Virtual reality sickness and challenges behind different technology and content settings [J]. Mobile Networks and Applications, 2020, 25(4): 1436-1445. doi: 10.1007/s11036-019-01373-w
[4] CHEN S, WENG D. The temporal pattern of VR sickness during 7.5-h virtual immersion [J]. Virtual Reality, 2022, 26(3): 817-822. doi: 10.1007/s10055-021-00592-5
[5] KIM H G, LEE S, KIM S, et al. Towards a better understanding of VR sickness: physical symptom prediction for VR contents [C]// Proceedings of the AAAI Conference on Artificial Intelligence. Washington: AAAI, 2021: 836-844.
[6] LIM K, LEE J, WON K, et al. A novel method for VR sickness reduction based on dynamic field of view processing [J]. Virtual Reality, 2021, 25(2): 331-340. doi: 10.1007/s10055-020-00457-3
[7] NG A K T, CHAN L K Y, LAU H Y K. A study of cybersickness and sensory conflict theory using a motion-coupled virtual reality system [C]// 2018 IEEE Conference on Virtual Reality and 3D User Interfaces. Reutlingen: IEEE, 2018: 643-644.
[8] KENNEDY R S, LANE N E, BERBAUM K S, et al. Simulator sickness questionnaire: an enhanced method for quantifying simulator sickness [J]. International Journal of Aviation Psychology, 1993, 3(3): 203-220. doi: 10.1207/s15327108ijap0303_3
[9] KIM H G, BADDAR W J, LIM H, et al. Measurement of exceptional motion in VR video contents for VR sickness assessment using deep convolutional autoencoder [C]// 23rd ACM Conference on Virtual Reality Software and Technology. Gothenburg: ACM, 2017: 1-7.
[10] KIM H G, LIM H T, LEE S, et al. VRSA Net: VR sickness assessment considering exceptional motion for 360 VR video [J]. IEEE Transactions on Image Processing, 2018, 28(4): 1646-1660.
[11] LEE T M, YOON J C, LEE I K. Motion sickness prediction in stereoscopic videos using 3D convolutional neural networks [J]. IEEE Transactions on Visualization and Computer Graphics, 2019, 25(5): 1919-1927. doi: 10.1109/TVCG.2019.2899186
[12] GOODALE M A, MILNER A D. Separate visual pathways for perception and action [J]. Trends in Neurosciences, 1992, 15(1): 20-25. doi: 10.1016/0166-2236(92)90344-8
[13] HE K, ZHANG X, REN S, et al. Deep residual learning for image recognition [C]// Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas: IEEE, 2016: 770-778.
[14] HUANG C, WANG F, ZHANG R. Sign language recognition based on CBAM-ResNet [C]// Proceedings of the 2019 International Conference on Artificial Intelligence and Advanced Manufacturing. New York: ACM, 2019: 1-6.
[15] WOO S, PARK J, LEE J Y, et al. CBAM: convolutional block attention module [C]// Proceedings of the European Conference on Computer Vision. Munich: Springer, 2018: 3-19.
[16] QUAN Wei, WANG Chao, GENG Xue-na, et al. Research on VR experience comfort based on motion perception [J]. Journal of System Simulation, 2023, 35(1): 169-177. (in Chinese) doi: 10.16182/j.issn1004731x.joss.21-0966
[17] KIM J, KIM W, AHN S, et al. Virtual reality sickness predictor: analysis of visual-vestibular conflict and VR contents [C]// Proceedings of 2018 10th International Conference on Quality of Multimedia Experience. Sardinia: IEEE, 2018: 1-6.
[18] PADMANABAN N, RUBAN T, SITZMANN V, et al. Towards a machine-learning approach for sickness prediction in 360 stereoscopic videos [J]. IEEE Transactions on Visualization and Computer Graphics, 2018, 24(4): 1594-1603. doi: 10.1109/TVCG.2018.2793560
[19] HELL S, ARGYRIOU V. Machine learning architectures to predict motion sickness using a virtual reality rollercoaster simulation tool [C]// IEEE International Conference on Artificial Intelligence and Virtual Reality. New York: IEEE, 2018: 153-156.
[20] PORCINO T, RODRIGUES E O, SILVA A, et al. Using the gameplay and user data to predict and identify causes of cybersickness manifestation in virtual reality games [C]// IEEE 8th International Conference on Serious Games and Applications for Health. Vancouver: IEEE, 2020: 1-8.
[21] YILDIRIM C. A review of deep learning approaches to EEG-based classification of cybersickness in virtual reality [C]// 2020 IEEE International Conference on Artificial Intelligence and Virtual Reality. Utrecht: IEEE, 2020: 351-357.
[22] LI Y, LIU A, DING L. Machine learning assessment of visually induced motion sickness levels based on multiple biosignals [J]. Biomedical Signal Processing and Control, 2019, 49: 202-211. doi: 10.1016/j.bspc.2018.12.007
[23] SELVARAJU R R, COGSWELL M, DAS A, et al. Grad-CAM: visual explanations from deep networks via gradient-based localization [J]. International Journal of Computer Vision, 2020, 128(2): 336-359. doi: 10.1007/s11263-019-01228-7
[24] GARCIA-AGUNDEZ A, REUTER C, BECKER H, et al. Development of a classifier to determine factors causing cybersickness in virtual reality environments [J]. Games for Health Journal, 2019, 8(6): 439-444. doi: 10.1089/g4h.2019.0045