Integrating Bulk and Single-Cell Sequencing Data to Construct a Scissor+ Dendritic Cells Prognostic Model for Predicting Prognosis and Immune Responses in ESCC.
Cancer Immunology, Immunotherapy (2024)
The First Affiliated Hospital of Sun Yat-Sen University | Sun Yat-Sen University Cancer Center | Sun Yat-Sen University
Abstract
Esophageal squamous cell carcinoma (ESCC) is characterized by molecular heterogeneity with diverse immune cell infiltration patterns, which have been associated with therapeutic sensitivity and resistance. In particular, dendritic cells (DCs) have recently been linked to prognosis and survival in cancer. However, how DCs differ among ESCC patients has not been fully characterized. The advance of single-cell RNA sequencing (scRNA-seq) now enables us to profile the cell types, states, and lineages in heterogeneous ESCC tissues. Here, we dissect the ESCC tumor microenvironment at high resolution by integrating 192,078 single cells from 60 patients, including 4,379 DCs. We then used Scissor, a method that identifies cell subpopulations in single-cell data that are associated with bulk samples carrying genomic and clinical information, to stratify DCs into Scissor-high and Scissor-low subtypes. Applying the Scissor-high gene signature to stratify ESCC scRNA-seq patients, we found that PD-L1-, TIGIT-, PVR-, and IL6-mediated ligand-receptor cell interactions occurred mainly in Scissor-high patients. Finally, based on the Scissor results, we developed a prognostic risk model for ESCC and validated its reliability in a cohort of 40 recruited ESCC clinical patients. These findings highlight the importance of these genes in assessing patient prognosis and may aid the development of targeted or personalized therapies for ESCC.
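Prognostic risk models of the kind described above are typically linear combinations of signature-gene expression values weighted by Cox regression coefficients, with patients split into high- and low-risk groups at the cohort median score. The sketch below illustrates that general scheme in Python; the gene names and coefficients are hypothetical placeholders, not the actual signature reported in the paper.

```python
import numpy as np

def risk_score(expr, coefs):
    """Linear prognostic score: sum of gene expression values
    weighted by (hypothetical) Cox regression coefficients."""
    genes = sorted(coefs)
    x = np.array([expr[g] for g in genes], dtype=float)
    b = np.array([coefs[g] for g in genes], dtype=float)
    return float(x @ b)

def stratify(scores):
    """Split patients into high/low risk at the cohort median score."""
    med = np.median(scores)
    return ["high" if s > med else "low" for s in scores]

# Illustrative coefficients only -- not the published signature.
COEFS = {"GENE_A": 0.8, "GENE_B": -0.5}
```

In practice the coefficients would come from fitting a Cox proportional hazards model on the training cohort, and the high/low cutoff would be fixed on training data before applying it to validation patients.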
Key words
ESCC, DC, Scissor