Computer Technology, Control Engineering
Bimodal software classification model based on bidirectional encoder representation from transformer
Xiaofeng FU1, Weiqi CHEN2, Yao SUN2, Yuze PAN2
1. School of Computer Science, Hangzhou Dianzi University, Hangzhou 310018, China
2. School of Automation, Hangzhou Dianzi University, Hangzhou 310018, China
Cite this article:
Xiaofeng FU, Weiqi CHEN, Yao SUN, Yuze PAN. Bimodal software classification model based on bidirectional encoder representation from transformer[J]. Journal of Zhejiang University (Engineering Science), 2024, 58(11): 2239-2246.
Link to this article:
https://www.zjujournals.com/eng/CN/10.3785/j.issn.1008-973X.2024.11.005
or
https://www.zjujournals.com/eng/CN/Y2024/V58/I11/2239