
Role of Hypertension on the Severity of COVID-19: A Review.

Journal of Cardiovascular Pharmacology (2021)

Central South University

Abstract
The novel coronavirus disease (COVID-19), caused by the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), has rapidly evolved into a global pandemic. The substantial morbidity and mortality associated with the infection have prompted efforts to identify risk factors that can predict patient outcomes. Hypertension has been identified as the most prevalent cardiovascular comorbidity in patients infected with COVID-19 and demonstrably increases the risk of hospitalization and death. Initial studies implied that renin–angiotensin–aldosterone system inhibitors might increase the risk of viral infection and aggravate disease severity, causing alarm given the high global prevalence of hypertension. Nonetheless, subsequent evidence supported the continued administration of antihypertensive drugs, indicating that they do not increase the severity of COVID-19 in patients with hypertension and may even have a beneficial effect. To date, the precise mechanism by which hypertension predisposes patients infected with COVID-19 to unfavorable outcomes remains unknown. In this mini review, we elaborate on the pathology of SARS-CoV-2 infection coexisting with hypertension and summarize potential mechanisms, focusing on the dual roles of angiotensin-converting enzyme 2 and disorders of the renin–angiotensin–aldosterone system in COVID-19 and hypertension. The effects of proinflammatory factors released during the immune response and of gastrointestinal dysfunction in COVID-19 are also discussed.
Key words
SARS-CoV-2, severe COVID-19, hypertension, angiotensin-converting enzyme 2, renin-angiotensin-aldosterone system, antihypertensive drugs