A Comparison of Supervised and Unsupervised Pre-Training of End-to-end Models

INTERSPEECH 2021(2021)

Cited by 18 | Viewed 37

Keywords
speech recognition,cross-domain,cross-lingual,low-resource,pre-training,self-supervised learning,supervised training,unsupervised training