Dipole Molecule-Mediated Modulating Residual PbI2 Clusters in Two-Step-Processing Inverted Perovskite Photovoltaics.
Nano Letters (2024)
Abstract
The precise modulation of PbI2 presence is of paramount importance in the domain of perovskite solar cell fabrication, particularly when employing the two-step method. The distinct crystallization trajectory inherent to this method often leaves unreacted PbI2 at the buried interface, which can create a large number of defect states. To address this challenge, we have introduced a strategic predeposition of the dipole molecule, 3-(decyldimethylammonio)propane sulfonate inner salt (3DPSI). This intervention serves to regulate residual PbI2 clusters and quash the emergence of associated derivative defects, such as metallic lead (Pb0), iodine vacancies (VI), and formamidinium vacancies (VFA). Through a synergistic approach combining experimental precision with theoretical rigor, we gained profound insights into the enhancement of crystal quality and the effective suppression of defects. The predeposition of the dipole molecule has yielded a remarkable power conversion efficiency of 24.62% in two-step-processing inverted perovskite photovoltaics and significantly improved the stability under continuous illumination.
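For context on the headline figure, the sketch below gives the standard definition of solar-cell power conversion efficiency (PCE). It is the general formula only; the short-circuit current density, open-circuit voltage, fill factor, and illumination conditions behind the 24.62% value are not stated in this abstract.

```latex
% Standard solar-cell power conversion efficiency (general definition,
% not device parameters reported in this paper):
%   J_sc : short-circuit current density
%   V_oc : open-circuit voltage
%   FF   : fill factor
%   P_in : incident power density (typically AM1.5G, 100 mW/cm^2)
\[
  \mathrm{PCE} = \frac{J_{\mathrm{sc}}\, V_{\mathrm{oc}}\, \mathrm{FF}}{P_{\mathrm{in}}}
\]
```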
Key words
dipole molecule, PbI2 clusters, perovskite solar cell, buried interface, two-step method