BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

North American Chapter of the Association for Computational Linguistics (2019)

Cited by 130,783
Keywords
Language Modeling, Word Representation, Machine Translation, Neural Machine Translation, Syntax-based Translation Models