Learning Critically: Selective Self-Distillation in Federated Learning on Non-IID Data

IEEE TRANSACTIONS ON BIG DATA (2024)

Cited by 11 | Viewed 49

Keywords
Data models,Training,Servers,Collaborative work,Adaptation models,Convergence,Feature extraction,Federated learning,knowledge distillation,non-identically distributed,deep learning,catastrophic forgetting