Extended Pan-Spectrum Fitting of Energetic Particle Spectra
Earth and Planetary Physics (2025)
School of Earth and Space Sciences
Abstract
The energy spectrum of energetic particles in space often shows a non-thermal shape with two spectral transitions (breaks) over a wide energy range, carrying crucial information about the particles' acceleration, release, and transport processes. To characterize these spectral features self-consistently, we propose a novel extended pan-spectrum (EPS) formula for fitting the particle energy-flux spectrum. The formula has the merit of incorporating many commonly used spectral functions with one spectral transition, including the pan-spectrum, double-power-law, Kappa, and Ellison-Ramaty (ER) functions. It can also describe spectral shapes with two spectral transitions, such as the triple-power-law function, a Kappa distribution (at low energies) plus a power law (at high energies), and a power law (at low energies) plus an ER function. Considering the uncertainties in both the flux J and the energy E, the EPS formula fits well the representative energy spectra of various particle phenomena in space, including solar energetic particles (electrons, protons, 3He, and heavier ions), anomalous cosmic rays, and solar wind suprathermal particles (halo and superhalo electrons; pick-up ions and the suprathermal tail). Therefore, EPS fitting can help determine the spectral features of different particle phenomena self-consistently and improve our understanding of the physical nature of the origin, acceleration, and transport of energetic particles in space.
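As a minimal sketch of the kind of fitting described above, the example below fits a smoothly broken double power law (one spectral transition, one of the special cases the EPS formula is said to subsume) to a synthetic spectrum using orthogonal distance regression, which accounts for uncertainties in both E and J. The functional form, parameter values, and uncertainty levels are illustrative assumptions, not the authors' exact EPS formula.

```python
import numpy as np
from scipy import odr

def broken_power_law(params, logE):
    # Smoothly broken double power law, in log10(flux) vs log10(energy):
    #   J(E) = A * E^(-g1) * [1 + (E/Eb)^alpha]^((g1 - g2)/alpha)
    # params = [logA, g1, g2, logEb]; smoothness alpha fixed at 2 (assumed).
    logA, g1, g2, logEb = params
    alpha = 2.0
    x = 10.0 ** (logE - logEb)
    return logA - g1 * logE + (g1 - g2) / alpha * np.log10(1.0 + x ** alpha)

# Synthetic spectrum with known parameters: g1 = 1.5, g2 = 3.5, break at Eb = 1.
true_params = [2.0, 1.5, 3.5, 0.0]
logE = np.linspace(-2, 2, 40)
logJ = broken_power_law(true_params, logE)

# Mimic measurement uncertainty in both energy and flux (log space).
rng = np.random.default_rng(0)
logE_obs = logE + rng.normal(0.0, 0.01, logE.size)
logJ_obs = logJ + rng.normal(0.0, 0.02, logJ.size)

# Orthogonal distance regression: sx and sy give the x and y uncertainties.
model = odr.Model(broken_power_law)
data = odr.RealData(logE_obs, logJ_obs, sx=0.01, sy=0.02)
fit = odr.ODR(data, model, beta0=[1.0, 1.0, 3.0, 0.5]).run()
logA_fit, g1_fit, g2_fit, logEb_fit = fit.beta
```

Fitting in log-log space keeps the power-law segments linear and the dynamic range manageable; the recovered slopes and break location should track the input values when the data bracket the break on both sides.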
Key words
energy spectrum fitting, solar energetic particle, solar wind suprathermal particle, anomalous cosmic ray, pick-up ion