[1] Apple Siri. 2019. [2019-06-20], https://www.apple.com/siri
[2] Microsoft Cortana. 2019. [2019-06-20], https://www.microsoft.com/en-us/cortana
[3] Google Now. 2019. [2019-06-20], https://www.google.com/landing/now
[4] Aho A V, Corasick M J. Efficient string matching: an aid to bibliographic search[J]. Communications of the ACM, 1975, 18(6): 333-340
[5] Hopcroft J E, Motwani R, Ullman J D. Introduction to Automata Theory, Languages, and Computation[M]. Addison-Wesley, 2001: 68-72
[6] Xue Pengqiang, Nurbol, Wushour Silamu. A sensitive information filtering algorithm based on web text information[J]. Computer Engineering and Design, 2016, 37(9): 2447-2452 (in Chinese)
[7] Hochreiter S, Schmidhuber J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780
[8] Mikolov T, Chen K, Corrado G, et al. Efficient estimation of word representations in vector space[J]. arXiv preprint arXiv:1301.3781, 2013
[9] Mikolov T, Sutskever I, Chen K, et al. Distributed representations of words and phrases and their compositionality[C]//Advances in Neural Information Processing Systems. 2013, 26: 3111-3119
[10] Merity S, Keskar N S, Socher R. Regularizing and optimizing LSTM language models[J]. arXiv preprint arXiv:1708.02182, 2017
[11] Chen P, Sun Z, Bing L, et al. Recurrent attention network on memory for aspect sentiment analysis[C]//Proceedings of the 2017 conference on empirical methods in natural language processing. 2017: 452-461
[12] Peters M E, Ammar W, Bhagavatula C, et al. Semi-supervised sequence tagging with bidirectional language models[J]. arXiv preprint arXiv:1705.00108, 2017
[13] He L, Lee K, Lewis M, et al. Deep semantic role labeling: What works and what’s next[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2017: 473-483
[14] Lee K, He L, Lewis M, et al. End-to-end neural coreference resolution[J]. arXiv preprint arXiv:1707.07045, 2017
[15] Wang Y, Huang M, Zhao L. Attention-based LSTM for aspect-level sentiment classification[C]//Proceedings of the 2016 conference on empirical methods in natural language processing. 2016: 606-615
[16] Vaswani A, Shazeer N, Parmar N, et al. Attention is all you need[C]//Advances in neural information processing systems. 2017: 5998-6008
[17] Shaw P, Uszkoreit J, Vaswani A. Self-attention with relative position representations[J]. arXiv preprint arXiv:1803.02155, 2018
[18] Shen T, Zhou T, Long G, et al. DiSAN: Directional self-attention network for RNN/CNN-free language understanding[C]//Thirty-Second AAAI Conference on Artificial Intelligence. 2018: 423-438