Expression Profile, Subcellular Localization of MARCH4 and Transcriptome Analysis of Its Potential Regulatory Signaling Pathway in Large Yellow Croaker (Larimichthys crocea)
Fish & Shellfish Immunology (2022)
Key Laboratory of Healthy Mariculture for the East China Sea
Abstract
Membrane-associated RING-CH (MARCH) family proteins, as RING-type E3 ligases, have attracted extensive attention for their immune functions. MARCH4 plays an essential role in regulating the immune response in mammals. The present study is the first to report the characteristics and signaling pathway of MARCH4 in fish. MARCH4 in the large yellow croaker Larimichthys crocea (named LcMARCH4) encodes a RING-CH domain and two transmembrane (TM) domains, as well as other functional domains, including an N-terminal proline-rich domain, an AxxxG motif in TM1, a tyrosine-based YXXØ motif, and a C-terminal PDZ-binding domain. LcMARCH4 is expressed in a tissue-specific manner, with the highest expression in brain. The mRNA transcripts of LcMARCH4 were significantly induced in the main organs (skin, gill, spleen, and head-kidney) by Cryptocaryon irritans infection. Consistently, significant increases were observed in spleen and head-kidney after LPS and poly I:C stimulation and after Vibrio parahaemolyticus infection. Subcellular localization analysis showed that LcMARCH4 localized to the cytoplasm and membrane. Moreover, a comparative transcriptome analysis between the LcMARCH4 overexpression group and the control vector group identified 46 differentially expressed genes (DEGs). The analysis suggested that HSPA6, HSPA1B, and DNAJB1 might play important regulatory roles with respect to MARCH4 in fish. Notably, the expression levels of two noncoding RNAs, RN7SL1 and RN7SL2, increased in MARCH4-overexpressing cells. Taken together, this study provides new insights into finfish MARCH4 and its potential regulatory signaling pathway.
Key words
Large yellow croaker, MARCH4, Immune response, Subcellular localization, Transcriptome analysis