DenseFormer-MoE: A Dense Transformer Foundation Model with Mixture of Experts for Multi-Task Brain Image Analysis

Rizhi Ding, Hui Lu, Manhua Liu

IEEE Transactions on Medical Imaging (2025)

Keywords
Foundation Model, Mixture of Experts, Self-Supervised Learning, Multi-Task Learning, Transformer, Brain Disease