Atmospheric Muons Measured with the KM3NeT Detectors in Comparison with Updated Numeric Predictions
The European Physical Journal C (2024)
INFN | IPHC UMR 7178 | Aix Marseille Univ | IFIC-Instituto de Física Corpuscular (CSIC-Universitat de València) | Complesso Universitario di Monte S. Angelo | Universitat Politècnica de Catalunya | NCSR Demokritos | University of Granada | Nantes Université | Universitat Politècnica de València | University Mohammed V in Rabat | Université Paris Cité | LPC CAEN | Czech Technical University in Prague | Nikhef | University of Hull | North-West University | University Mohammed I | ISS | Cadi Ayyad University | University of the Witwatersrand | Julius-Maximilians-Universität Würzburg | Comenius University in Bratislava | Western Sydney University | NIOZ (Royal Netherlands Institute for Sea Research) | AstroCeNT | Tbilisi State University | The University of Georgia | University of Johannesburg | Laboratoire Univers et Particules de Montpellier | Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU)
- Pretraining has recently driven major advances in natural language processing (NLP)
- We show that M6 outperforms the baselines on multimodal downstream tasks, and that the large M6 with 10 billion parameters reaches even better performance
- We propose M6, a model that processes information from multiple modalities and performs both single-modal and cross-modal understanding and generation
- The model is scaled up to 10 billion parameters with sophisticated deployment, and the 10-billion-parameter M6-large is the largest pretrained model in Chinese
- Experimental results show that our proposed M6 outperforms the baselines in a number of downstream tasks involving both single and multiple modalities. We will continue pretraining extremely large models with more data to explore the limits of their performance

Search for Neutrino Emission from GRB 221009A Using the KM3NeT ARCA and ORCA Detectors
Cited by 1