Keratinocyte Integrin α3β1 Induces Expression of the Macrophage-Stimulating Factor CSF-1 Through a YAP/TEAD-Dependent Mechanism
Matrix Biology: Journal of the International Society for Matrix Biology (2024)
Department of Surgery
Abstract
The development of wound therapies targeting integrins is hampered by an inadequate understanding of integrin function in cutaneous wound healing and the wound microenvironment. Following cutaneous injury, keratinocytes migrate to restore the skin barrier, and macrophages aid in debris clearance. Thus, both keratinocytes and macrophages are critical to the coordination of tissue repair. Keratinocyte integrins have been shown to participate in this coordinated effort by regulating secreted factors, some of which crosstalk to distinct cells in the wound microenvironment. Epidermal integrin α3β1 is a receptor for laminin-332 in the cutaneous basement membrane. Here we show that wounds deficient in epidermal α3β1 express less epidermal-derived macrophage colony-stimulating factor 1 (CSF-1), the primary macrophage-stimulating growth factor. α3β1-deficient wounds also have fewer wound-proximal macrophages, suggesting that keratinocyte α3β1 may stimulate wound macrophages through the regulation of CSF-1. Indeed, using a set of immortalized keratinocytes, we demonstrate that keratinocyte-derived CSF-1 supports macrophage growth, and that α3β1 regulates Csf1 expression through Src-dependent stimulation of Yes-associated protein (YAP)/transcriptional enhanced associate domain (TEAD)-mediated transcription. Consistently, α3β1-deficient wounds in vivo display a substantially reduced number of keratinocytes with YAP-positive nuclei. Overall, our findings identify a novel role for epidermal integrin α3β1 in regulating the cutaneous wound microenvironment by mediating paracrine crosstalk from keratinocytes to wound macrophages, implicating α3β1 as a potential target of wound therapy.