F63. “FOCUS LESS ON THE WHERE AND MORE ON THE GOING”: CREATING SPACES TO NURTURE (DIVERSE) EARLY CAREER RESEARCHERS
European Neuropsychopharmacology (2024)
QIMR Berghofer Medical Research Institute
Abstract
Background
The need to increase inclusion and diversity in psychiatric genetics, and in genetics in general, is commonly discussed. However, these discussions typically centre on working with non-European-ancestry populations to improve representation within data structures. In parallel, but less frequently discussed, several ongoing programs work towards improving representation within the research community, including the GINGER program and the SING consortium. Our capacity-building program, funded by two Australian Medical Research Future Fund grants (focusing on pharmacogenomics and polygenic prediction), aims to provide paid research-trainee positions (one day a week for one university semester) to undergraduate-level university students. Within our program, research trainees work in groups of four with mentors to collectively research topics relating to the research processes used in the studies funded by these grants. In the first round of this program, we offered eight traineeships (two groups of four).

Methods
To attract diverse applicants, we advertised these positions on the most commonly used job-advertisement website in Australia, and we also directly emailed relevant university departments at three local universities and asked them to share the information with their students. We included the following text in the advertisement: "Our team recognises the value brought by supporting diverse voices in the research process and within the research community. We sincerely believe this improves the quality of research we conduct, to ensure our work helps the entire community. From this perspective, we would like to encourage applications from people of Aboriginal and/or Torres Strait Islander descent. We would also like to welcome applications from people with culturally and linguistically diverse backgrounds, people with lived experience of mental-health challenges and people with LGBTQIA+ identities."
Results
We received over 300 applications for these positions. As we will discuss, there was marked diversity among the applicants, with many sharing information about themselves and their identities within their applications. Information about the structure and format of the program will also be presented.

Discussion
Our experience has shown that although no universities in Australia offer programs in complex-trait or psychiatric genetics, there is marked interest in these areas among current students. Secondly, by including explicit statements about valuing diversity and inviting applications from people with diverse backgrounds and identities, we are likely to receive applications that are more representative of the communities in which we are based. Through working with the trainees, we hope to further develop the program, from optimising recruitment to improving the research experience itself.