PREDICTION OF BIOCHEMICAL RESPONSE IN PATIENTS WITH PRIMARY BILIARY CHOLANGITIS TREATED WITH OBETICHOLIC ACID: DERIVATION AND EXTERNAL VALIDATION OF THE OCA RESPONSE SCORE (ORS)
HEPATOLOGY (2023)
Univ Campus Biomed Rome | Virgen del Rocio Univ Hosp | Univ Turin | Univ Milano Bicocca | Univ Milan | Sapienza Univ Rome | Humanitas Univ | Fdn Irccs Ca Granda Osped Maggiore Policlin | Asst GOM Niguarda | Univ Genoa | Osped Policlin San Martino | Santa Maria Croci Hosp | Natl Inst Gastroenterol S De Bellis | Fdn Gemelli Hosp Irccs | Fdn Casa Sollievo Della Sofferenza Irccs San Giov | Fdn Casa Sollievo Della Sofferenza Irccs | Asst Lecco Hosp | Univ Cagliari | Osped Barletta | Pescara Gen Hosp | Arnas Garibaldi | Policlin Vittorio Emanuele | Hosp Univ 12 Octubre | Hosp Univ Reina Sofia | Complexo Hosp Univ Santiago | Hosp Univ La Fe | Hosp Univ Lozano Blesa | Ctr Hosp Univ S Joao | Hosp Badalona Germans Trias & Pujol | Ctr Hosp Tras Os Montes & Alto Douro | Ctr Hosp Univ Coimbra | Hosp Univ La Paz | Hosp Univ Canarias | Ctr Hosp & Univ Coimbra | Hosp Univ Fdn Alcorcon | Univ Hosp Pisa | Univ Padua | Univ Tor Vergata | Osped Cotugno | Univ Florence | Univ Hosp St Anna | Univ Perugia | Magna Graecia Univ Catanzaro | Univ Cattolica Sacro Cuore | San Paolo Hosp | Cardinal Massaia Hosp | Univ Piemonte Orientale | Osped Regina Apostolorum Albano Laziale | Azienda Osped S Andrea | Univ Palermo | Policlin Bari Hosp | Ist Cura Citta Pavia | Univ Politecn Marche | Azienda Sanit Locale Biella | Policlin Gemelli Sapienza Univ | Spedali Civili Gardone Val Trompia | Valtellina & Alto Lario Hosp | Guido Salvini Hosp | Osped SS Annunziata Sassari | Spedali Civil Brescia | Azienda Osped Univ Maggiore Carita
- Pretraining has recently greatly advanced the development of natural language processing (NLP)
- We show that M6 outperforms the baselines in multimodal downstream tasks, and that the large M6 with 10 billion parameters reaches even better performance
- We propose a method called M6 that can process information from multiple modalities and perform both single-modal and cross-modal understanding and generation (see the sketch after this list)
- The model is scaled up to 10 billion parameters with sophisticated deployment, and this 10-billion-parameter M6-large is the largest pretrained model in Chinese
- Experimental results show that our proposed M6 outperforms the baselines in a number of downstream tasks involving both single and multiple modalities. We will continue pretraining extremely large models on more data to explore the limits of their performance
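
A minimal sketch of the single-stream design the highlights describe: image-patch features and text tokens are projected into one shared embedding space and processed by a single Transformer backbone, so the same model can serve single-modal and cross-modal understanding and generation. This is a hypothetical illustration, not the authors' code; the class name `UnifiedMultimodalEncoder`, the patch projection, and all dimensions are assumptions made for the example.

```python
# Hypothetical sketch of a unified multimodal Transformer (not the
# authors' implementation): one backbone over image patches + text.
import torch
import torch.nn as nn

class UnifiedMultimodalEncoder(nn.Module):
    def __init__(self, vocab_size=30000, patch_dim=768, d_model=512,
                 nhead=8, num_layers=6):
        super().__init__()
        self.text_embed = nn.Embedding(vocab_size, d_model)
        # Assumption: image-patch features are linearly projected into
        # the same embedding space as text tokens.
        self.patch_proj = nn.Linear(patch_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)  # text generation head

    def forward(self, patch_feats, text_ids):
        # patch_feats: (B, P, patch_dim); text_ids: (B, T)
        seq = torch.cat([self.patch_proj(patch_feats),
                         self.text_embed(text_ids)], dim=1)
        hidden = self.encoder(seq)
        # Return vocabulary logits for the text positions only.
        return self.lm_head(hidden[:, patch_feats.size(1):])

model = UnifiedMultimodalEncoder()
logits = model(torch.randn(2, 16, 768), torch.randint(0, 30000, (2, 12)))
print(logits.shape)  # torch.Size([2, 12, 30000])
```

Concatenating both modalities into one sequence is what lets a single pretrained backbone be reused across single-modal and cross-modal tasks; task-specific behavior then comes from the inputs and heads rather than separate per-modality models.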
