
Towards Fast Bayesian Inference of Equivalent Circuit Parameters of Perovskite Solar Cell

Jian Sun, Jintian Pan, Qing Song, Yang Liu, Yue Wang, Yonghua Chen, Deli Li

Solar Energy (2025)

Nanjing Tech University (NanjingTech)

Abstract
Interpreting current-voltage data through an equivalent circuit model offers a rapid, real-time method for analyzing the internal changes in perovskite solar cells. Recent studies have utilized Bayesian inference to solve this inverse problem, but the inference time remains a bottleneck for practical applications. In this work, we propose a modified approach that simplifies the likelihood function within the Bayesian inference process. By leveraging the inherent characteristics of the equivalent circuit, we reduce the five-parameter model to a more efficient three-parameter approximation, greatly enhancing parameter estimation efficiency. Our results demonstrate that the simplified model retains the same level of computational accuracy. In addition, it reduces inference time by a factor of over 15, from hundreds of seconds to just a few seconds on Google Colaboratory using a CPU with two Xeon cores (2.2 GHz). The number of inference steps also decreases from tens of thousands to only a few hundred. This significant improvement accelerates the overall inference process, enabling faster, real-time monitoring and aging analysis of perovskite solar cells. Our method offers an intelligent solution for efficiently analyzing solar cell performance through Bayesian inference, advancing both research and engineering applications.
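The abstract refers to a five-parameter equivalent circuit. For solar cells this is conventionally the single-diode model with photocurrent Iph, diode saturation current I0, ideality factor n, series resistance Rs and shunt resistance Rsh; the listing does not give the paper's exact formulation, so the Python sketch below of the standard implicit single-diode equation, solved numerically per voltage point, is an assumption-based illustration, and the parameter values at the end are made up for demonstration only.

import numpy as np
from scipy.optimize import brentq

K_B, Q = 1.380649e-23, 1.602176634e-19   # Boltzmann constant, elementary charge


def single_diode_current(V, Iph, I0, n, Rs, Rsh, T=298.15):
    """Solve the implicit single-diode equation for the cell current I(V):
    I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
    """
    Vt = K_B * T / Q  # thermal voltage, ~25.7 mV at 298 K

    def residual(I, v):
        return (Iph - I0 * (np.exp((v + I * Rs) / (n * Vt)) - 1.0)
                - (v + I * Rs) / Rsh - I)

    currents = []
    for v in np.atleast_1d(V):
        # Brackets chosen so the residual is guaranteed to change sign:
        # at I_lo the diode term is negligible, at I_hi the -I term dominates.
        I_lo, I_hi = -(v / Rs + 10 * Iph), 10 * Iph
        currents.append(brentq(residual, I_lo, I_hi, args=(v,)))
    return np.asarray(currents)


# Illustrative parameter values for a small-area cell (not from the paper).
V = np.linspace(0.0, 1.1, 60)
I = single_diode_current(V, Iph=22e-3, I0=1e-12, n=1.5, Rs=5.0, Rsh=2e3)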
Key words
Metal halide perovskite, Bayesian inference, Equivalent circuit model, Solar cell

Key points: This paper proposes a simplified Bayesian inference approach that reduces the five-parameter equivalent circuit model to three parameters, greatly improving the efficiency of estimating the equivalent circuit parameters of perovskite solar cells while preserving computational accuracy, and enabling real-time monitoring and aging analysis.

Methods: By exploiting the inherent characteristics of the equivalent circuit, the authors simplify the likelihood function in the Bayesian inference process, reducing the five-parameter model to a three-parameter model (see the sketch after this summary).

Experiments: The experiments were run on Google Colaboratory using a CPU with two Xeon cores (2.2 GHz). The results show that, at comparable computational accuracy, the simplified model cuts the inference time from hundreds of seconds to a few seconds and the number of inference steps from tens of thousands to only a few hundred. The paper does not name a specific dataset.
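Continuing from the forward-model sketch given after the abstract (it reuses single_diode_current, V and I defined there), the following is a hedged illustration of the kind of Bayesian estimation the summary describes, written with the emcee sampler. Neither the sampler, the Gaussian likelihood, the noise level, the priors, nor the particular choice of which three parameters remain free is taken from the paper: purely for illustration, Iph is fixed to the measured short-circuit current and Rsh to the inverse I-V slope near V = 0, leaving (log10 I0, n, Rs) to be inferred, which is one common way such a reduction can be set up.

import numpy as np
import emcee  # pip install emcee


def log_likelihood(theta, V, I_meas, Iph, Rsh, sigma=1e-4):
    """Gaussian likelihood comparing measured and modelled currents (sigma is an assumed noise level)."""
    log10_I0, n, Rs = theta
    I_model = single_diode_current(V, Iph, 10.0 ** log10_I0, n, Rs, Rsh)
    return -0.5 * np.sum(((I_meas - I_model) / sigma) ** 2)


def log_prior(theta):
    """Flat priors over physically plausible ranges (illustrative bounds)."""
    log10_I0, n, Rs = theta
    if -15.0 < log10_I0 < -6.0 and 1.0 < n < 3.0 and 0.1 < Rs < 50.0:
        return 0.0
    return -np.inf


def log_posterior(theta, *args):
    lp = log_prior(theta)
    return lp + log_likelihood(theta, *args) if np.isfinite(lp) else -np.inf


# Fix Iph and Rsh from the data, then sample the remaining three parameters.
V_meas, I_meas = V, I                                   # simulated curve from the sketch above
Iph_fix = I_meas[0]                                     # ~ short-circuit current
Rsh_fix = -(V_meas[1] - V_meas[0]) / (I_meas[1] - I_meas[0])  # ~ inverse slope near V = 0

ndim, nwalkers = 3, 16
p0 = np.array([-12.0, 1.5, 5.0]) + 1e-3 * np.random.randn(nwalkers, ndim)
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_posterior,
                                args=(V_meas, I_meas, Iph_fix, Rsh_fix))
sampler.run_mcmc(p0, 500, progress=True)                # a few hundred steps

With only three free parameters, running the walkers for a few hundred steps and inspecting sampler.get_chain() is usually enough to check convergence in this toy setup; the timings and step counts reported in the abstract refer to the authors' own simplified likelihood, not to this sketch.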