
    A guide for many authors: Writing manuscripts in large collaborations

    Writing manuscripts collaboratively affords both opportunities and challenges: Collaborative papers can benefit from the expertise, perspectives, and collective effort of the group but can lack coherence or be produced inefficiently. When collaborations are large, involving tens or hundreds of researchers, there are more and different opportunities and challenges, like appropriately crediting the contributions of many people. This paper is a practical guide for authors writing collaborative manuscripts, particularly those working in large collaborations. We emphasize the importance of deliberate leadership and describe five general strategies that lead authors can employ to maximize opportunities and navigate challenges: care in recruiting the author team, care in crediting the author team, clear and frequent communication, organized materials, and deliberate and early decision-making. For each, we offer specific tips in line with these strategies (e.g., use collaboration agreements, leverage Open Science practices). We then suggest how lead authors can structure the writing and revising process to produce a coherent manuscript and offer tips for submitting papers and responding to peer reviews. A repository of resources for people writing manuscripts in collaborations is available at osf.io/dzwcn.

    Psychological Science Accelerator: A Promising Resource for Clinical Psychological Science

    The Psychological Science Accelerator (PSA) is an international collaborative network of psychological scientists that facilitates rigorous and generalizable research. In this chapter, we describe how the PSA can help clinical psychologists and clinical psychological science more broadly. We first describe the PSA and outline how individual clinical psychologists can use the PSA as a helpful resource in numerous capacities: leading or contributing to clinical research or research with clinical relevance, building collaborative relationships, obtaining experience and expertise, and learning about systems and tools, particularly those related to open science practices, that they can adapt to their own research. We then describe how the PSA supports rigor and transparency at each stage of the research process. Finally, we discuss the challenges of the PSA’s large, collaborative approach to research.

    The Psychological Science Accelerator: Advancing Psychology through a Distributed Collaborative Network

    Concerns have been growing about the veracity of psychological research. Many findings in psychological science are based on studies with insufficient statistical power and nonrepresentative samples, or may otherwise be limited to specific, ungeneralizable settings or populations. Crowdsourced research, a type of large-scale collaboration in which one or more research projects are conducted across multiple lab sites, offers a pragmatic solution to these and other current methodological challenges. The Psychological Science Accelerator (PSA) is a distributed network of laboratories designed to enable and support crowdsourced research projects. These projects can focus on novel research questions, or attempt to replicate prior research, in large, diverse samples. The PSA’s mission is to accelerate the accumulation of reliable and generalizable evidence in psychological science. Here, we describe the background, structure, principles, procedures, benefits, and challenges of the PSA. In contrast to other crowdsourced research networks, the PSA is ongoing (as opposed to time-limited), efficient (in terms of re-using structures and principles for different projects), decentralized, diverse (in terms of participants and researchers), and inclusive (of proposals, contributions, and other relevant input from anyone inside or outside of the network). The PSA and other approaches to crowdsourced psychological science will advance our understanding of mental processes and behaviors by enabling rigorous research and systematically examining its generalizability.

    Replicability, Robustness, and Reproducibility in Psychological Science

    Replication—an important, uncommon, and misunderstood practice—is gaining appreciation in psychology. Achieving replicability is important for making research progress. If findings are not replicable, then prediction and theory development are stifled. If findings are replicable, then interrogation of their meaning and validity can advance knowledge. Assessing replicability can be productive for generating and testing hypotheses by actively confronting current understandings to identify weaknesses and spur innovation. For psychology, the 2010s might be characterized as a decade of active confrontation. Systematic and multi-site replication projects assessed current understandings and observed surprising failures to replicate many published findings. Replication efforts highlighted sociocultural challenges such as disincentives to conduct replications and a tendency to frame replication as a personal attack rather than a healthy scientific practice, and they raised awareness that replication contributes to self-correction. Nevertheless, innovation in doing and understanding replication and its cousins, reproducibility and robustness, has positioned psychology to improve research practices and accelerate progress.

    The Psychological Science Accelerator: Advancing Psychology Through a Distributed Collaborative Network

    Source at https://doi.org/10.1177/2515245918797607.
    Concerns about the veracity of psychological research have been growing. Many findings in psychological science are based on studies with insufficient statistical power and nonrepresentative samples, or may otherwise be limited to specific, ungeneralizable settings or populations. Crowdsourced research, a type of large-scale collaboration in which one or more research projects are conducted across multiple lab sites, offers a pragmatic solution to these and other current methodological challenges. The Psychological Science Accelerator (PSA) is a distributed network of laboratories designed to enable and support crowdsourced research projects. These projects can focus on novel research questions or replicate prior research in large, diverse samples. The PSA’s mission is to accelerate the accumulation of reliable and generalizable evidence in psychological science. Here, we describe the background, structure, principles, procedures, benefits, and challenges of the PSA. In contrast to other crowdsourced research networks, the PSA is ongoing (as opposed to time limited), efficient (in that structures and principles are reused for different projects), decentralized, diverse (in both subjects and researchers), and inclusive (of proposals, contributions, and other relevant input from anyone inside or outside the network). The PSA and other approaches to crowdsourced psychological science will advance understanding of mental processes and behaviors by enabling rigorous research and systematic examination of its generalizability.

    The Psychological Science Accelerator’s COVID-19 rapid-response dataset

    In response to the COVID-19 pandemic, the Psychological Science Accelerator coordinated three large-scale psychological studies to examine the effects of loss-gain framing, cognitive reappraisals, and autonomy framing manipulations on behavioral intentions and affective measures. The data collected (April to October 2020) included specific measures for each experimental study, a general questionnaire examining health prevention behaviors and COVID-19 experience, geographical and cultural context characterization, and demographic information for each participant. Each participant started the study with the same general questions and then was randomized to complete either one longer experiment or two shorter experiments. Data were provided by 73,223 participants with varying completion rates. Participants completed the survey from 111 geopolitical regions in 44 unique languages/dialects. The anonymized dataset described here is provided in both raw and processed formats to facilitate re-use and further analyses. The dataset offers secondary analytic opportunities to explore coping, framing, and self-determination across a diverse, global sample obtained at the onset of the COVID-19 pandemic, which can be merged with other time-sampled or geographic data.