Assessing Algorithmic Fairness with a Multimodal Artificial Intelligence Model in Men of African and Non-African Origin on NRG Oncology Prostate Cancer Phase III Trials.
JCO Clinical Cancer Informatics (2025)
UCSF Medical Center | Artera | University of California San Francisco | Northwestern University | University of Michigan Comprehensive Cancer Center | Hematology-Oncology Medical Group of Fresno Inc | Horizon Health Network-Saint John Regional Hospital | Ingalls Memorial Hospital | CHUM-Centre Hospitalier de l'Universite de Montreal | Nova Scotia Cancer Centre | Nova Scotia Health | QEII Health Sciences Centre | VA Boston Healthcare System | University of Missouri-Ellis Fischel | The Research Institute of the McGill University Health Centre (MUHC) | Keck School of Medicine of USC | Cedars-Sinai Medical Center | University Hospitals Seidman Cancer Center | NRG Oncology Statistics and Data Management Center | Johns Hopkins University | Sidney Kimmel Cancer Center
- Pretraining has recently driven major advances in natural language processing (NLP).
- We show that M6 outperforms the baselines on multimodal downstream tasks, and that the large-scale M6 with 10 billion parameters achieves even better performance.
- We propose a method called M6 that can process information from multiple modalities and perform both single-modal and cross-modal understanding and generation.
- The model is scaled to 10 billion parameters with sophisticated deployment, and the 10-billion-parameter M6-large is the largest pretrained model in Chinese.
- Experimental results show that our proposed M6 outperforms the baselines on a number of downstream tasks involving both single and multiple modalities. We will continue pretraining extremely large models on increasing amounts of data to explore the limits of their performance.
