Paper: Cross-lingual Language Model Pretraining
Model: XLM
ArXiv: https://arxiv.org/abs/1901.07291
This is a NeurIPS 2019 paper; in essence it is a cross-lingual BERT. The main contribution is multilingual BERT-style pretraining, plus a change in how the input data is arranged (TLM, Translation Language Modeling: a parallel sentence pair is concatenated into one input and tokens are masked on both sides, so a masked word in one language can be predicted from its translation's context).
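TLM is easiest to see as input construction. Below is a minimal, self-contained sketch of how one TLM training example might be assembled from a parallel sentence pair, following the paper's description: the two sentences are concatenated, position ids restart at the target sentence, every token carries a language id, and tokens on both sides are randomly masked. The masking is simplified here (always replace with `[MASK]`, skipping the BERT-style 80/10/10 scheme), and the special symbols, mask rate, and helper function are illustrative assumptions, not the authors' code.

```python
import random

MASK, BOS, SEP = "[MASK]", "[BOS]", "[SEP]"  # assumed special symbols

def make_tlm_example(src_tokens, tgt_tokens, src_lang, tgt_lang,
                     mask_prob=0.15, seed=0):
    """Build one TLM input: a concatenated parallel pair with position ids
    that reset at the target sentence, per-token language ids, and random
    masking across BOTH sentences."""
    rng = random.Random(seed)
    src = [BOS] + src_tokens + [SEP]
    tgt = [BOS] + tgt_tokens + [SEP]
    tokens = src + tgt
    positions = list(range(len(src))) + list(range(len(tgt)))  # reset for target
    langs = [src_lang] * len(src) + [tgt_lang] * len(tgt)
    inputs, labels = [], []
    for tok in tokens:
        if tok not in (BOS, SEP) and rng.random() < mask_prob:
            inputs.append(MASK)
            labels.append(tok)   # predict the original token
        else:
            inputs.append(tok)
            labels.append(None)  # not a prediction target
    return inputs, labels, positions, langs

inputs, labels, positions, langs = make_tlm_example(
    ["the", "cat", "sleeps"], ["le", "chat", "dort"], src_lang=0, tgt_lang=1)
print(inputs)     # e.g. ['[BOS]', 'the', '[MASK]', 'sleeps', '[SEP]', '[BOS]', 'le', ...]
print(positions)  # [0, 1, 2, 3, 4, 0, 1, 2, 3, 4]
```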
Motivation
Problem setting: a) one source language with rich labeled data; b) no labeled data in the target language. Existing cross-lingual NER methods fall into two broad categories: a) label projection (generate labeled data in the target language) …
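As a concrete illustration of label projection, here is a minimal sketch (my own illustration, not code from any of the papers above) that transfers BIO tags from a source sentence to a target sentence through word alignments; the alignment pairs are assumed to come from an external word aligner (e.g., fast_align).

```python
def project_labels(src_tags, tgt_len, alignments):
    """Project BIO NER tags from source to target via word alignments.

    src_tags:    BIO tags for the source sentence, e.g. ['B-PER', 'I-PER', 'O'].
    tgt_len:     number of target tokens.
    alignments:  (src_index, tgt_index) pairs from a word aligner (assumed given).
    Unaligned target tokens default to 'O'.
    """
    tgt_tags = ["O"] * tgt_len
    for s, t in alignments:
        tgt_tags[t] = src_tags[s]
    return tgt_tags

# "John Smith lives here" -> a hypothetical 4-token target with reordering.
print(project_labels(["B-PER", "I-PER", "O", "O"], 4,
                     [(0, 0), (1, 1), (2, 3), (3, 2)]))
# ['B-PER', 'I-PER', 'O', 'O']
```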
Jiateng Xie, Neural Cross-Lingual Named Entity Recognition, CMU
Abstract: This paper proposes two methods to address the challenges of cross-lingual NER under the unsupervised transfer setting: lexical mapping (STEP 1-3) and word ordering (STEP …)
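The lexical-mapping idea (map target-language words into the source embedding space, then translate word by word to the nearest source word) can be sketched in a few lines. This is a MUSE-style bilingual embedding mapping, not necessarily the paper's exact procedure; the mapping matrix `W` is assumed to have been learned offline (e.g., by adversarial training or Procrustes alignment), and the tiny embedding tables are fabricated for illustration.

```python
import numpy as np

def nearest_source_word(tgt_vec, W, src_vocab, src_emb):
    """Map a target-language embedding into the source space with a
    pre-learned linear map W, then return the cosine-nearest source word."""
    mapped = W @ tgt_vec
    sims = src_emb @ mapped / (
        np.linalg.norm(src_emb, axis=1) * np.linalg.norm(mapped) + 1e-9)
    return src_vocab[int(np.argmax(sims))]

# Toy 2-d example: W is the identity and the vectors are hand-picked so
# the hypothetical target word lands nearest to "cat".
src_vocab = ["cat", "dog"]
src_emb = np.array([[1.0, 0.1], [0.1, 1.0]])
W = np.eye(2)
print(nearest_source_word(np.array([0.9, 0.2]), W, src_vocab, src_emb))  # cat
```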
Unsupervised cross-lingual representation learning at scale (ACL 2020)
[Paper]: https://aclanthology.org/2020.acl-main.747.pdf
[Code]: https://github.com/facebookresearch/XLM
[Source]: Facebook AI Research
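For reference, the XLM-R checkpoints from this paper are also available through Hugging Face `transformers` (a different codebase from the linked facebookresearch/XLM repo). A minimal usage sketch, assuming the `transformers` and `torch` packages are installed:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# One tokenizer/model covers all 100 pretraining languages.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModel.from_pretrained("xlm-roberta-base")

batch = tokenizer(["Hello world", "Bonjour le monde"],
                  padding=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**batch).last_hidden_state  # (batch, seq_len, 768)
print(hidden.shape)
```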