[1] Chiang Weilin, Li Zhuohan, Lin Zi, et al. Vicuna: An open-source chatbot impressing GPT-4 with 90%* ChatGPT quality [EB/OL]. 2023 [2025-08-13]. https://vicuna.lmsys.org
[2] Levine Y, Lenz B, Dagan O, et al. SenseBERT: Driving some sense into BERT [EB/OL]. 2019 [2025-08-13]. https://arxiv.org/abs/1908.05646
[3] Tian Hao, Gao Chenyu, Xiao Xinyan, et al. SKEP: Sentiment knowledge enhanced pre-training for sentiment analysis [EB/OL]. 2020 [2025-08-13]. https://arxiv.org/abs/2005.05635
[4] Lyu Bole, Chen Ling, Zhu Su, et al. LET: Linguistic knowledge enhanced graph transformer for Chinese short text matching [EB/OL]. 2021 [2025-08-13]. https://arxiv.org/abs/2102.12671
[5] Xu Yige, Zhu Chen, Wang Sihao, et al. Human parity on CommonsenseQA: Augmenting self-attention with external attention [EB/OL]. 2021 [2025-08-13]. https://arxiv.org/abs/2112.03254
[6] Grave E, Joulin A, Usunier N. Improving neural language models with a continuous cache [EB/OL]. 2016 [2025-08-13]. https://arxiv.org/abs/1612.04426
[7] Murty S, Koh P W, Liang P. ExpBERT: Representation engineering with natural language explanations [EB/OL]. 2020 [2025-08-13]. https://arxiv.org/abs/2005.01932
[8] Oguz B, Chen X, Karpukhin V, et al. UniK-QA: Unified representations of structured and unstructured knowledge for open-domain question answering [EB/OL]. 2020 [2025-08-13]. https://arxiv.org/abs/2012.14610
[9] Lee J, Yoon W, Kim S, et al. BioBERT: A pre-trained biomedical language representation model for biomedical text mining [J]. Bioinformatics, 2020, 36(4): 1234-1240
[10] Zhao Yifan, Zhou Hongjin, Zhang Aiping, et al. Connecting embeddings based on multiplex relational graph attention networks for knowledge graph entity typing [J]. IEEE Trans on Knowledge and Data Engineering, 2022, 34(5): 2151-2162
[11] Wang Jian, Wang Chengyu, Qiu Ming, et al. KECP: Knowledge enhanced contrastive prompting for few-shot extractive question answering [EB/OL]. 2022 [2025-08-13]. https://arxiv.org/abs/2205.03071
[12] Yamada I, Asai A, Shindo H, et al. LUKE: Deep contextualized entity representations with entity-aware self-attention [EB/OL]. 2020 [2025-08-13]. https://arxiv.org/abs/2010.01057