Chinese LLaMA & Alpaca LLMs with Local CPU/GPU Training and Deployment
Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm series models)
Pre-Trained Chinese XLNet
Pre-Trained Chinese ELECTRA
Revisiting Pre-trained Models for Chinese Natural Language Processing (MacBERT)
A Span-Extraction Dataset for Chinese Machine Reading Comprehension (CMRC 2018)
PERT: Pre-training BERT with Permuted Language Model
A Sentence Cloze Dataset for Chinese Machine Reading Comprehension (CMRC 2019)
Empirical Evaluation on Current Neural Networks on Cloze-style Reading Comprehension
Cross-Lingual Machine Reading Comprehension (EMNLP 2019)