Lactam Strategy Using Amide-Selective Nucleophilic Addition for Quick Access to Complex Amines: Unified Total Synthesis of Stemoamide-Type Alkaloids

Bulletin of the Chemical Society of Japan (2022)

Keio University

Abstract
Our research group has been exploring a lactam strategy for the concise total synthesis of complex alkaloids. In this article, we report full details of the unified total synthesis of stemoamide-type alkaloids by chemoselective assembly of five-membered rings based on the lactam strategy. First, a concise, gram-scale synthesis of tricyclic stemoamide was achieved by a vinylogous Michael addition–reduction sequence of an unsaturated γ-lactam with an unsaturated γ-lactone, followed by N-alkylation to form the seven-membered ring. From stemoamide as a common intermediate, chemoselective nucleophilic addition of unsaturated lactone derivatives provides the tetracyclic natural products. While stemonine is obtained by an Ir-catalyzed lactam-selective reductive Mannich reaction, saxorumamide and isosaxorumamide are produced through lactone-selective nucleophilic addition of a lithiated 2-silylfuran. The developed conditions for the lactam-selective nucleophilic reactions are highly general and proved applicable to the total synthesis of pentacyclic stemocochinin and isostemocochinin. The strategy enables the concise, unified total synthesis of tricyclic, tetracyclic, and pentacyclic stemoamide-type alkaloids within 12 steps from a commercially available compound.
Key words
Amide, Stemona alkaloid, Total synthesis