UNILM topic

[UNILM] Paper implementation: Unified Language Model Pre-training for Natural Language Understanding and Generation

Contents: 1. Complete code · 2. Paper walkthrough (2.1 Introduction, 2.2 Architecture, 2.3 Input, 2.4 Results) · 3. Step-by-step implementation · 4. Overall summary
Paper: Unified Language Model Pre-training for Natural Language Understanding and Generation
Authors: Li Dong, Nan Yang, Wenhui Wang, Furu Wei, et al.
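The paper's central idea is a single shared Transformer that realizes bidirectional, left-to-right, and sequence-to-sequence LM objectives purely through different self-attention masks. Below is a minimal NumPy sketch of those three masks; the function name and the `src_len`/`tgt_len` segmentation are illustrative assumptions, and a real implementation would also handle padding and batch dimensions:

```python
import numpy as np

def unilm_attention_mask(src_len, tgt_len, mode):
    """Self-attention mask for one shared Transformer, UNILM-style.
    1 = attention allowed, 0 = blocked. Minimal sketch only."""
    total = src_len + tgt_len
    if mode == "bidirectional":
        # BERT-style objective: every token attends to every token.
        return np.ones((total, total), dtype=np.int32)
    if mode == "left-to-right":
        # GPT-style objective: token i attends only to tokens <= i.
        return np.tril(np.ones((total, total), dtype=np.int32))
    if mode == "seq2seq":
        # Source segment attends bidirectionally to itself; target
        # tokens attend to the full source plus the target prefix.
        mask = np.zeros((total, total), dtype=np.int32)
        mask[:, :src_len] = 1  # all tokens see the whole source
        mask[src_len:, src_len:] = np.tril(
            np.ones((tgt_len, tgt_len), dtype=np.int32))
        return mask
    raise ValueError(f"unknown mode: {mode}")

# Example: a 3-token source followed by a 2-token target.
print(unilm_attention_mask(3, 2, "seq2seq"))
```

In the printed seq2seq mask, the upper-right block is all zeros (the source never peeks at the target) while the lower-right block is lower-triangular (the target decodes causally), which is how one set of weights serves both understanding and generation tasks.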

A few BERT-related pre-trained models: ERNIE, XLM, LASER, MASS, UNILM

Overview of Transformer-based pre-trained models
1. ERNIE: Enhanced Language Representation with Informative Entities (THU)
Key idea: it learns the semantic associations present in the corpus by fusing a knowledge graph into BERT. The paper tackles two challenges: structured knowledge encoding and heterogeneous information fusion.
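As a rough illustration of what "heterogeneous information fusion" means here, the sketch below shows an aggregator-style layer in the spirit of ERNIE's fusion of token and entity embeddings: the two streams are projected into a shared hidden state and then split back out. The class name, dimensions, and activation choice are assumptions for illustration, not the paper's exact configuration:

```python
import torch
import torch.nn as nn

class EntityTokenFusion(nn.Module):
    """Fuses token representations with aligned knowledge-graph
    entity embeddings, in the spirit of ERNIE's (THU) aggregator."""
    def __init__(self, token_dim=768, entity_dim=100, hidden_dim=768):
        super().__init__()
        self.token_proj = nn.Linear(token_dim, hidden_dim)
        self.entity_proj = nn.Linear(entity_dim, hidden_dim)
        self.token_out = nn.Linear(hidden_dim, token_dim)
        self.entity_out = nn.Linear(hidden_dim, entity_dim)
        self.act = nn.GELU()

    def forward(self, token_h, entity_h):
        # Merge the two heterogeneous sources into one hidden state...
        fused = self.act(self.token_proj(token_h) + self.entity_proj(entity_h))
        # ...then project back into separate token / entity streams.
        return self.act(self.token_out(fused)), self.act(self.entity_out(fused))

# Example: one token aligned with one entity mention.
layer = EntityTokenFusion()
tok, ent = layer(torch.randn(1, 768), torch.randn(1, 100))
print(tok.shape, ent.shape)  # torch.Size([1, 768]) torch.Size([1, 100])
```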