Multimodal sentiment analysis based on multi-head self-attention mechanism and MLP-Interactor

Yishan LIN, Jing ZUO, Shuhua LU
Tab. 3 Comparison of performance on the CMU-MOSEI dataset with other benchmark models
| Model | MAE | Corr | Acc-2 (%) | Acc-7 (%) | F1 (%) |
|---|---|---|---|---|---|
| TFN[40] (2017) | 0.593 | 0.700 | —/82.5 | 50.2 | —/82.1 |
| LMF[41] (2018) | 0.623 | 0.677 | —/82.0 | 48.0 | —/82.1 |
| MulT[20] (2019) | 0.580 | 0.703 | —/82.5 | 51.8 | —/82.3 |
| BBFN[11] (2021) | 0.529 | 0.767 | —/86.2 | 54.8 | —/86.1 |
| Self-MM[24] (2021) | 0.530 | 0.765 | 82.81/85.17 | — | 82.53/85.30 |
| MISA[7] (2020) | 0.555 | 0.756 | 83.6/85.5 | 52.2 | 83.8/85.3 |
| MAG-BERT[43] (2020) | 0.543 | 0.755 | 82.51/84.82 | — | 82.77/84.71 |
| CubeMLP[31] (2022) | 0.529 | 0.760 | —/85.1 | 54.9 | —/84.5 |
| PS-Mixer[30] (2023) | 0.537 | 0.765 | 83.1/86.1 | 53.0 | 83.1/86.1 |
| MTSA[44] (2022) | 0.541 | 0.774 | —/85.5 | 52.9 | —/85.3 |
| AOBERT[10] (2023) | 0.515 | 0.763 | 84.9/86.2 | 54.5 | 85.0/85.9 |
| TETFN[25] (2023) | 0.551 | 0.748 | 84.25/85.18 | — | 84.18/85.27 |
| TMRN[45] (2023) | 0.535 | 0.762 | 83.39/86.19 | 53.65 | 83.67/86.08 |
| MTAMW[46] (2024) | 0.525 | 0.782 | 83.09/86.49 | 53.73 | 83.48/86.45 |
| MIBSA[47] (2024) | 0.568 | 0.753 | —/86.70 | 52.40 | —/85.80 |
| FRDIN[48] (2024) | 0.525 | 0.778 | 83.30/86.30 | 54.40 | 83.70/86.20 |
| CRNet[49] (2024) | 0.541 | 0.771 | —/86.20 | 53.80 | —/86.10 |
| Ours | 0.512 | 0.794 | 83.0/86.8 | 54.5 | 82.5/86.8 |