
Computational Fluid Dynamics Analysis of Small Airway Deformation in COPD

AMERICAN JOURNAL OF RESPIRATORY AND CRITICAL CARE MEDICINE(2024)

Katholieke Univ Leuven

Abstract
Small airway disease, including obstruction, is an important contributor to the decline in lung function observed in COPD, yet the mechanisms underlying small airway disease are not fully understood. We investigated the fluid dynamic alterations resulting from small airway deformation and obstruction in the lower airway zones, down to the functional terminal bronchioles (TB), using computational fluid dynamics (CFD). Small cylinders (1.4 cm diameter) obtained from an explanted donor lung (n=1) and a COPD lung (n=1), both frozen at TLC, were µCT scanned at 10 µm resolution. Samples were matched for number of generations and segments. The small airways were segmented and modeled for CFD simulation. Under identical inlet flow rate and outlet pressure conditions, the COPD small airways showed a significantly higher pressure drop and wall shear stress than the donor airways. Additionally, to understand the mechanism contributing to TB obstruction in the COPD sample, a new model was created that allowed flow out of the occluded regions, simulating the flow before obstruction occurred. This model showed a 27% higher pressure drop than the donor, while wall shear stress reached its peak values. The results demonstrate (Fig. 1) that airflow in COPD experiences higher resistance due to downstream deformations and occlusions. The corrected model suggests that flow parameters, particularly wall shear stress, could be a contributing factor to TB obstruction in COPD.
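The pressure-drop and wall-shear-stress behavior described above can be illustrated with a simple analytical sketch. The paper's CFD resolves the full flow field in µCT-derived geometry; as a rough first-order intuition only, laminar flow in a rigid cylindrical airway segment follows the Hagen–Poiseuille relations, where pressure drop scales with 1/r⁴ and wall shear stress with 1/r³. All dimensions and flow rates below are hypothetical round numbers for a terminal bronchiole, not the paper's measured values:

```python
import math

MU_AIR = 1.8e-5  # dynamic viscosity of air, approx. value near body temperature [Pa·s]

def poiseuille_pressure_drop(q, radius, length, mu=MU_AIR):
    """Pressure drop across a rigid cylinder in laminar flow: dP = 8*mu*L*Q / (pi*r^4)."""
    return 8.0 * mu * length * q / (math.pi * radius ** 4)

def wall_shear_stress(q, radius, mu=MU_AIR):
    """Wall shear stress for Poiseuille flow: tau = 4*mu*Q / (pi*r^3)."""
    return 4.0 * mu * q / (math.pi * radius ** 3)

# Hypothetical terminal-bronchiole parameters (illustrative only):
q = 1.0e-8           # volumetric flow per terminal bronchiole [m^3/s]
length = 1.5e-3      # segment length [m]
r_healthy = 2.5e-4   # nominal radius [m]
r_narrowed = 1.5e-4  # deformed/narrowed radius [m]

dp_h = poiseuille_pressure_drop(q, r_healthy, length)
dp_n = poiseuille_pressure_drop(q, r_narrowed, length)
tau_h = wall_shear_stress(q, r_healthy)
tau_n = wall_shear_stress(q, r_narrowed)
```

Because resistance grows as 1/r⁴, even a modest narrowing of a terminal bronchiole multiplies the pressure drop several-fold, consistent with the higher resistance and elevated wall shear stress the CFD simulations report for the deformed COPD airways.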
Key words
COPD, lung function, Respiratory Tract Deposition, pulmonary disease, Respiratory