
Phenotypic Characterization of the Wheat Temperature-Sensitive Leaf Color Mutant and Physical Mapping of Mutant Gene by Reduced-Representation Sequencing

Plant Science (2023)

Key Laboratory of Plant Genetics and Breeding at Sichuan Agricultural University of Sichuan Province

Cited 7 | Views 17
Abstract
The scarcity of available leaf color mutants in crops has greatly limited our understanding of photosynthesis mechanisms, resulting in few gains in crop yield from enhanced photosynthetic efficiency. Here, a noticeable albino mutant, CN19M06, was identified. A comparison between CN19M06 and the wild type CN19 at different temperatures showed that the albino mutant was temperature-sensitive and produced leaves with a decreased chlorophyll content at temperatures below 10 °C. Genetic analysis suggested that the albinism was controlled by one recessive nuclear gene, named TSCA1, which was putatively assigned to the 718.1-729.8 Mb region on chromosome 2AL using bulked-segregant analysis and double-digest restriction site-associated DNA sequencing. Finally, molecular linkage analysis physically anchored TSCA1 to a narrowed 6.5 Mb region of 718.8-725.3 Mb on 2AL, flanked by InDel 18 and InDel 25 within a 0.7 cM genetic interval. Among the 111 annotated functional genes in the corresponding chromosomal region, only TraesCS2A01G487900 of the PAP fibrillin family was related to both chlorophyll metabolism and temperature sensitivity; therefore, it was considered the putative candidate gene of TSCA1. Overall, CN19M06 has great potential for exploring the molecular mechanism of photosynthesis and monitoring temperature changes in wheat production.
Key words
Wheat (Triticum aestivum L.), Leaf color mutant, Temperature-sensitive, Physical mapping, Reduced-representation sequencing