Modulating Binding Strength and Acidity of Benzene-Derivative Ligands Enables Efficient and Hysteresis-Free Perovskite/Silicon Tandem Solar Cells.
Angewandte Chemie (International Edition in English), 2025
Abstract
Passivating defects at the wide-bandgap perovskite/C60 interface without impeding interfacial charge transport can effectively enhance the efficiency of perovskite/silicon tandem solar cells (TSCs). Herein, we study the impact of benzene-derivative ligands with carefully modulated binding strength and acidity on wide-bandgap perovskites for high-performance perovskite/silicon TSCs. Specifically, the acidity/alkalinity and binding strength are coarsely tuned by the choice of functional group (-PO₃H₂, -COOH, or -NH₂) and then finely adjusted by varying the chain length between the benzene ring and the functional group. The results show that strong binding is indispensable for effectively suppressing voltage loss. However, benzylphosphonic acid (BPPA), commonly used for firm surface binding, is so acidic that it etches the perovskite surface, generating halide-vacancy defects and pronounced hysteresis. Extending the side chain of BPPA to give (2-phenylethyl)phosphonic acid not only yields a suitable acid dissociation constant (pKa) that avoids acid-induced etching but also achieves robust anchoring to the perovskite surface in a parallel adsorption orientation, which lowers the charge-transport barrier at the interface. These properties enable strong-adsorption surface termination (SAST) of the perovskite surface while preventing acid-induced etching. As a result, the SAST strategy achieves a remarkable efficiency of 32.13% (certified 31.72%) for hysteresis-free perovskite/silicon TSCs.
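As background for the pKa argument (standard acid-base chemistry, not a result of the paper itself): the acid dissociation constant quantifies how readily the phosphonic acid head group releases a proton, and a larger pKa corresponds to a weaker acid and hence a gentler interaction with the perovskite surface:

$$K_\mathrm{a} = \frac{[\mathrm{A^-}][\mathrm{H^+}]}{[\mathrm{HA}]}, \qquad \mathrm{p}K_\mathrm{a} = -\log_{10} K_\mathrm{a}$$

As a general trend, inserting an additional -CH₂- spacer between the benzene ring and the -PO₃H₂ group attenuates the ring's inductive electron withdrawal, raising the pKa and weakening the acid, which is consistent with the reduced etching the abstract reports for (2-phenylethyl)phosphonic acid relative to BPPA.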