DenseFormer-MoE: A Dense Transformer Foundation Model with Mixture of Experts for Multi-Task Brain Image Analysis
IEEE Transactions on Medical Imaging (2025)
Keywords
Foundation Model, Mixture of Experts, Self-Supervised Learning, Multi-Task Learning, Transformer, Brain Disease