Improving Urban Temperature Measurements and Two Applications
City and Environment Interactions (2024)
ZHAW School of Engineering
Abstract
More extreme, more frequent and longer heat waves negatively affect people all around the world, especially inhabitants of urban areas, who face even higher temperatures due to the urban heat island effect. A precondition for developing adaptation strategies that counteract the adverse effects of heat in cities is knowledge of the urban temperature distribution. One approach that has been applied in various cities is the implementation of dense urban temperature measurement networks. Since financial resources are usually limited, such networks consist of cost-effective measurement devices whose (daytime) data quality is prone to errors caused by radiative influences. This was also the case in Zürich, Switzerland, where an urban temperature network with 272 measurement stations was operated from 2019 to 2021. In this study, we present a radiation correction method to enhance the data quality for practical use. Applying the proposed correction reduced the mean RMSE from 1.47 K to 0.57 K and the overall mean bias from +0.88 K to +0.04 K. Building on this, we use the corrected database for two application cases: i) as a spatially and temporally high-resolution validation dataset for the physics-based large eddy simulation model PALM, and ii) as input data for a geostatistical land use regression model. The analysis shows that the daytime radiation correction is crucial for detecting the negative bias of the PALM model, which is most pronounced in the highly built-up area of Zürich, and for enhancing the quality of the daytime land use regression. The radiation correction presented in this study can also be applied to other urban temperature networks facing similar challenges.
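The evaluation metrics reported in the abstract (mean RMSE and overall mean bias of the low-cost station temperatures relative to reference measurements) are standard aggregate error statistics. As an illustration only, the minimal sketch below shows how such metrics are typically computed against a co-located reference sensor; the temperature values, variable names, and the assumption of a simple co-located comparison are hypothetical and are not taken from the paper's actual correction or evaluation procedure.

```python
import numpy as np

def rmse(estimate, reference):
    """Root-mean-square error of station estimates against a reference sensor (K)."""
    diff = np.asarray(estimate) - np.asarray(reference)
    return float(np.sqrt(np.mean(diff ** 2)))

def mean_bias(estimate, reference):
    """Mean bias (K); positive values indicate that the station overestimates temperature."""
    diff = np.asarray(estimate) - np.asarray(reference)
    return float(np.mean(diff))

# Hypothetical example: raw and radiation-corrected daytime temperatures (°C) from one
# low-cost station, compared against a co-located reference measurement.
reference = np.array([24.1, 25.3, 26.0, 26.8, 27.5])
raw       = np.array([25.4, 26.6, 27.1, 27.9, 28.3])   # inflated by radiative heating of the sensor
corrected = np.array([24.2, 25.5, 25.9, 26.9, 27.4])   # after applying some radiation correction

print(f"raw:       RMSE = {rmse(raw, reference):.2f} K, bias = {mean_bias(raw, reference):+.2f} K")
print(f"corrected: RMSE = {rmse(corrected, reference):.2f} K, bias = {mean_bias(corrected, reference):+.2f} K")
```

In a network evaluation, such per-station errors would typically be averaged over all stations and the study period to obtain figures comparable to the mean RMSE and overall mean bias quoted above.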
Keywords
Urban heat island, Low-cost measurement network, Radiation correction, Land use regression, Large eddy simulation