[1] Hong Wei, Li Min. Survey of text sentiment analysis methods[J]. Computer Engineering & Science, 2019, 41(4): 180-187
[2] Kim Y. Convolutional neural networks for sentence classification[C]// Proc of the 2014 Conf on Empirical Methods in Natural Language Processing. Stroudsburg, PA: ACL, 2014: 1746-1751
[3] Kalchbrenner N, Grefenstette E, Blunsom P. A convolutional neural network for modelling sentences[C]// Proc of the 52nd Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA: ACL, 2014: 655-665
[4] Zhang Qian, Gao Zhangmin, Liu Jiayong. Research on microblog short text classification based on Word2vec[J]. Netinfo Security, 2017, 17(1): 57-62
[5] Baziotis C, Pelekis N, Doulkeridis C. DataStories at SemEval-2017 Task 4: Deep LSTM with attention for message-level and topic-based sentiment analysis[C]// Proc of the 11th Int Workshop on Semantic Evaluation. Stroudsburg, PA: ACL, 2017: 747-754
[6] Kumar A, Kawahara D, Kurohashi S. Knowledge-enriched two-layered attention network for sentiment analysis[C]// Proc of the 2018 Conf of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers). Stroudsburg, PA: ACL, 2018: 253-258
[7] Feng Xingjie, Zhang Zhiwei, Shi Jinchuan. Text sentiment analysis based on convolutional neural network and attention model[J]. Application Research of Computers, 2018, 35(5): 1434-1436
[8] Hu Ronglei, Rui Lu, Qi Xiao, et al. Text sentiment analysis based on recurrent neural network and attention model[J]. Application Research of Computers, 2019, 36(11): 3282-3285
[9] Bengio Y, Ducharme R, Vincent P, et al. A neural probabilistic language model[J]. Journal of Machine Learning Research, 2003, 3: 1137-1155
[10] Mikolov T, Chen K, Corrado G, et al. Efficient estimation of word representations in vector space [EB/OL]. (2013-01-16) [2020-01-17]. http://arxiv.org/abs/1301.3781
[11] Mikolov T, Sutskever I, Chen K, et al. Distributed representations of words and phrases and their compositionality[C]// Proc of the 26th Int Conf on Neural Information Processing Systems. Nevada: NIPS, 2013: 3111-3119
[12] Pennington J, Socher R, Manning C. GloVe: Global vectors for word representation[C]// Proc of the 2014 Conf on Empirical Methods in Natural Language Processing. Stroudsburg, PA: ACL, 2014: 1532-1543
[13] Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding[EB/OL]. (2018-10-11) [2020-01-17]. https://arxiv.org/abs/1810.04805
[14] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need[C]// Proc of the 31st Int Conf on Neural Information Processing Systems. California: NIPS, 2017: 5998-6008
[15] Socher R, Perelygin A, Wu J, et al. Recursive deep models for semantic compositionality over a sentiment treebank[C]// Proc of the 2013 Conf on Empirical Methods in Natural Language Processing. Stroudsburg, PA: ACL, 2013: 1631-1642
[16] Tai K S, Socher R, Manning C D. Improved semantic representations from tree-structured long short-term memory networks[C]// Proc of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th Int Joint Conf on Natural Language Processing (Volume 1: Long Papers). Stroudsburg, PA: ACL, 2015: 1556-1566
[17] Zhou P, Qi Z, Zheng S, et al. Text classification improved by integrating bidirectional LSTM with two-dimensional max pooling[C]// Proc of the 26th Int Conf on Computational Linguistics. Stroudsburg, PA: ACL, 2016: 3485-3495