AaRgs2 Regulates Growth and Virulence of Alternaria alternata by Maintaining Reactive Oxygen Species (ROS) Balance
Postharvest Biology and Technology (2025)
College of Food Science and Engineering
Abstract
The regulator of G protein signaling AaRgs2 plays an important role in vegetative growth, melanin production and secretion, stress response, and appressorium-like formation in Alternaria alternata, the causal agent of pear black spot. In particular, AaRgs2 is involved in the response of A. alternata to oxidative stress; however, its detailed regulatory mechanism remains unclear. In this study, the regulatory function of AaRgs2 in ROS metabolism of A. alternata was systematically evaluated through targeted gene knockout. The results showed that deletion of AaRgs2 upregulated the expression of NADPH oxidase-related genes, increased intracellular ROS content, and inhibited the growth of A. alternata; exogenous addition of the NADPH oxidase inhibitor DPI partially restored the growth of the ΔAaRgs2 mutant. In addition, the expression of ROS-detoxifying enzyme genes, including AaCAT and genes of the glutathione cycle and the thioredoxin system, was down-regulated in the ΔAaRgs2 mutant. Correspondingly, deletion of AaRgs2 also decreased the GSH/GSSG ratio and the thioredoxin reductase (TRXR) content. The high ROS level in the ΔAaRgs2 mutant damaged the cell membrane by disrupting ergosterol synthesis and increasing malondialdehyde (MDA) content. Production of the Alternaria mycotoxin TEN was also decreased in the ΔAaRgs2 mutant. The virulence of the ΔAaRgs2 mutant on pear and tomato fruit was reduced, and virulence was partially restored by exogenous DPI treatment. These findings suggest that AaRgs2 may regulate the growth and pathogenicity of A. alternata by maintaining ROS balance.
Key words
Regulator of G protein, Fungi, ROS homeostasis, Cell membrane, Pathogenicity