Learning Critically: Selective Self-Distillation in Federated Learning on Non-IID Data
IEEE Transactions on Big Data (2024)
Keywords
Data models, Training, Servers, Collaborative work, Adaptation models, Convergence, Feature extraction, Federated learning, Knowledge distillation, Non-identically distributed, Deep learning, Catastrophic forgetting