
Role of Repulsive Forces in Determining the Equilibrium Structure of Simple Liquids

Journal of Chemical Physics (1971)

University of California, San Diego

Cited 5955 | Views 18
Abstract
The different roles the attractive and repulsive forces play in forming the equilibrium structure of a Lennard-Jones liquid are discussed. It is found that the effects of these forces are most easily separated by considering the structure factor (or equivalently, the Fourier transform of the pair-correlation function) rather than the pair-correlation function itself. At intermediate and large wave vectors, the repulsive forces dominate the quantitative behavior of the liquid structure factor. The attractions are manifested primarily in the small wave vector part of the structure factor; but this effect decreases as the density increases and is almost negligible at reduced densities higher than 0.65. These conclusions are established by considering the structure factor of a hypothetical reference system in which the intermolecular forces are entirely repulsive and identical to the repulsive forces in a Lennard-Jones fluid. This reference system structure factor is calculated with the aid of a simple but accurate approximation described herein. The conclusions lead to a very simple prescription for calculating the radial distribution function of dense liquids which is more accurate than that obtained by any previously reported theory. The thermodynamic ramifications of the conclusions are presented in the form of calculations of the free energy, the internal energy (from the energy equation), and the pressure (from the virial equation). The implications of our conclusions to perturbation theories for liquids and to the interpretation of x-ray scattering experiments are discussed.
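
For reference, the quantities named in the abstract have standard definitions; the relations below are a sketch in conventional notation (not quoted from the paper), with pair potential u(r), pair-correlation function g(r), number density ρ, inverse temperature β = 1/k_BT, and wave vector k. The purely repulsive reference system described in the abstract corresponds to splitting the Lennard-Jones potential at its minimum r = 2^{1/6}σ:

$$
u(r) = 4\epsilon\left[\left(\frac{\sigma}{r}\right)^{12} - \left(\frac{\sigma}{r}\right)^{6}\right],
\qquad
u_0(r) =
\begin{cases}
u(r) + \epsilon, & r < 2^{1/6}\sigma \\
0, & r \ge 2^{1/6}\sigma
\end{cases},
\qquad
u_1(r) = u(r) - u_0(r),
$$

so that u_0 carries all of the repulsion and u_1 all of the attraction. The structure factor discussed in the abstract, and the energy and virial equations used for the thermodynamic calculations, take the usual forms

$$
S(k) = 1 + \rho \int d\mathbf{r}\, e^{-i\mathbf{k}\cdot\mathbf{r}}\left[g(r) - 1\right],
\qquad
\frac{U}{N} = \frac{3}{2}k_B T + 2\pi\rho \int_0^\infty dr\, r^2\, u(r)\, g(r),
\qquad
\frac{\beta P}{\rho} = 1 - \frac{2\pi\beta\rho}{3}\int_0^\infty dr\, r^3\, u'(r)\, g(r).
$$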