Numerical Model Generation of Test Frames for Pre-launch Studies of EarthCARE’s Retrieval Algorithms and Data Management System
Atmospheric Measurement Techniques (2023)
Environment and Climate Change Canada
Abstract
The Earth Cloud, Aerosol and Radiation Explorer (EarthCARE) satellite carries active and passive sensors whose observations will be processed by an array of retrieval algorithms. EarthCARE's retrieval algorithms have undergone pre-launch verification within a virtual observing system that consists of 3D atmosphere-surface data produced by the Global Environmental Multiscale (GEM) numerical weather prediction (NWP) model, together with instrument simulators that, when applied to the NWP data, yield synthetic observations for EarthCARE's four sensors. Retrieval algorithms operate on the synthetic observations, and their estimates feed radiative transfer models that produce top-of-atmosphere solar and thermal broadband radiative quantities; these are compared to synthetic broadband measurements, thus mimicking EarthCARE's radiative closure assessment. Three high-resolution test frames were simulated; each measures approximately 6200 km along-track by 200 km across-track. Horizontal grid spacing is 250 m, and there are 57 atmospheric layers up to 10 mbar. The frames span wide ranges of conditions and extend over (i) Greenland to the Caribbean, crossing a cold front off Nova Scotia; (ii) Nunavut to Baja California, crossing over Colorado's Rocky Mountains; and (iii) the central equatorial Pacific Ocean, which includes a mesoscale convective system. This report discusses how the test frames were produced and presents their key geophysical features. All data are publicly available and, owing to their high resolution, could be used to simulate observations for other measurement systems.
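The abstract describes an end-to-end chain: NWP truth fields drive instrument simulators, retrieval algorithms invert the resulting synthetic observations, and broadband radiative transfer applied to the retrieved state is checked against synthetic broadband measurements (radiative closure). The Python sketch below is a minimal, hypothetical illustration of the shape of that loop only; every function name, the toy lidar-like forward model, and all numbers are assumptions for illustration and do not correspond to the actual GEM output, EarthCARE simulators, or EarthCARE processors.

```python
import numpy as np

# Toy stand-ins for the pipeline stages described in the abstract.
# All names and physics here are hypothetical placeholders, not the
# real GEM fields, instrument simulators, or retrieval algorithms.

def gem_test_frame(n_columns=100, n_layers=57, seed=0):
    """Mock 'NWP truth': an extinction profile (km^-1) per column."""
    rng = np.random.default_rng(seed)
    ext = 0.01 * np.ones((n_columns, n_layers))  # clear-sky background
    for i in range(n_columns):                   # embed random cloud layers
        top = rng.integers(10, 40)
        ext[i, top:top + 5] += rng.uniform(0.5, 2.0)
    return ext

def instrument_simulator(extinction, dz_km=0.25):
    """Toy lidar-like forward model: attenuated signal from extinction."""
    tau = np.cumsum(extinction * dz_km, axis=1)  # cumulative optical depth
    return extinction * np.exp(-2.0 * tau)       # two-way attenuation

def retrieval(signal, dz_km=0.25):
    """Invert the toy forward model to re-estimate extinction."""
    # Integrating ext * exp(-2*tau) gives (1 - T^2) / 2, so the two-way
    # transmission T^2 can be recovered and the signal unattenuated.
    cum = np.cumsum(signal * dz_km, axis=1)
    transmission2 = np.clip(1.0 - 2.0 * cum, 1e-6, None)
    return signal / transmission2

def broadband_flux(extinction, dz_km=0.25, solar_in=340.0):
    """Crude radiative transfer: reflected flux grows with optical depth."""
    tau = extinction.sum(axis=1) * dz_km
    albedo = tau / (tau + 2.0)                   # toy two-stream-like law
    return solar_in * albedo

# Radiative closure assessment: compare broadband fluxes computed from
# the retrieved state against "measurements" computed from model truth.
truth = gem_test_frame()
obs = instrument_simulator(truth)
retrieved = retrieval(obs)
flux_measured = broadband_flux(truth)       # synthetic broadband measurement
flux_retrieved = broadband_flux(retrieved)  # flux implied by the retrieval
residual = flux_retrieved - flux_measured
print(f"mean closure residual: {residual.mean():+.3f} W m^-2")
```

In the actual study each of the four sensors has its own simulator and retrieval chain, and the closure test compares retrieval-driven broadband quantities against simulated broadband radiometer measurements; the toy example above preserves only the structure of that comparison.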
Key words
Satellite Observations, Remote Sensing, Atmospheric