[Research on Enhancement of Mental Rotation Ability Based on Transcranial Direct Current Stimulation].
Department of Graduate School
Abstract
Transcranial direct current stimulation (tDCS) is a non-invasive, low-current brain stimulation technique: it modulates the excitability of the cerebral cortex by shifting neuronal activation thresholds according to the polarity of the stimulating electrode. In this study, healthy subjects were randomly divided into three groups: an anodal stimulation group, a cathodal stimulation group, and a sham stimulation group, with 5 subjects in each group. Behavioral performance on a mental rotation task was recorded for all three groups before and after stimulation, and resting-state and task-state electroencephalogram (EEG) data were collected. By comparing the behavioral and EEG data across the three groups, the effect of stimulation polarity on three-dimensional mental rotation ability was examined. The results showed that both the accuracy and the accuracy-to-response-time ratio of the anodal stimulation group were higher than those of the cathodal and sham stimulation groups, and the difference was significant (P < 0.05). Alpha-band power analysis indicated that mental rotation mainly activates the frontal lobe, central area, parietal lobe, and occipital lobe; in the anodal stimulation group, alpha power changed significantly over the frontal and occipital lobes (P < 0.05). These results show that anodal stimulation can improve subjects' mental rotation ability to a certain extent and provide theoretical support for further research on the mechanism by which tDCS affects mental rotation ability.
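The abstract reports its EEG findings in terms of alpha-band power. The paper does not describe its analysis pipeline, but a common way to obtain band power is to estimate the power spectral density with Welch's method and integrate it over the 8–12 Hz alpha band. The sketch below illustrates this on a synthetic signal; the sampling rate, band edges, and windowing choices are assumptions, not the authors' settings.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band=(8.0, 12.0)):
    """Estimate absolute power in a frequency band via Welch's PSD."""
    nperseg = min(len(signal), 2 * int(fs))  # ~2 s windows
    freqs, psd = welch(signal, fs=fs, nperseg=nperseg)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    # Integrate the PSD over the band (rectangle rule on the frequency grid)
    df = freqs[1] - freqs[0]
    return float(np.sum(psd[mask]) * df)

# Synthetic "EEG": a 10 Hz alpha-like oscillation plus white noise
fs = 250  # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

p_alpha = band_power(eeg, fs, band=(8.0, 12.0))
p_beta = band_power(eeg, fs, band=(13.0, 30.0))
print(p_alpha > p_beta)  # the 10 Hz component concentrates power in alpha
```

In a real pre/post comparison such as the one in the abstract, this computation would be run per channel and per condition, and the resulting band powers compared statistically (e.g., across frontal and occipital electrodes before and after stimulation).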