    A multicohort, longitudinal study of cerebellar development in attention deficit hyperactivity disorder

    Background: The cerebellum supports many cognitive functions disrupted in attention deficit hyperactivity disorder (ADHD). Prior neuroanatomic studies have often been limited by small sample sizes, inconsistent findings, and a reliance on cross-sectional data, limiting inferences about cerebellar development. Here, we conduct a multicohort study using longitudinal data to characterize cerebellar development.
    Methods: Growth trajectories of the cerebellar vermis, hemispheres, and white matter were estimated using piecewise linear regression from 1,656 youth, of whom 63% had longitudinal data, totaling 2,914 scans. Four cohorts participated; all contained childhood data (ages 4–12 years), and two had adolescent data (12–25 years). Growth parameters were combined using random-effects meta-analysis.
    Results: Diagnostic differences in growth were confined to the corpus medullare (cerebellar white matter). Here, the ADHD group showed slower growth in early childhood compared to the typically developing group (left corpus medullare z = 2.49, p = .01; right z = 2.03, p = .04). This reversed in late childhood, with faster growth in ADHD in the left corpus medullare (z = 2.06, p = .04). Findings held when gender, intelligence, comorbidity, and psychostimulant medication were considered.
    Discussion: Across four independent cohorts, containing predominantly longitudinal data, we found diagnostic differences in the growth of cerebellar white matter. In ADHD, slower white matter growth in early childhood was followed by faster growth in late childhood. The findings are consistent with the concept of ADHD as a disorder of the brain's structural connections, formed partly by developing cortico-cerebellar white matter tracts.
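    The two statistical tools named in the Methods, piecewise linear regression over age and random-effects meta-analysis of per-cohort growth parameters, can be illustrated with a minimal sketch. Everything below (the knot age, the simulated data, the cohort values) is hypothetical and invented for illustration; it is not taken from the study.

    ```python
    import numpy as np

    def fit_piecewise(age, volume, knot=12.0):
        """Least-squares fit of volume ~ b0 + b1*age + b2*max(age - knot, 0).
        b1 is the early slope; b2 is the change in slope after the knot."""
        X = np.column_stack([np.ones_like(age), age, np.maximum(age - knot, 0.0)])
        coef, *_ = np.linalg.lstsq(X, volume, rcond=None)
        return coef

    def random_effects_meta(estimates, variances):
        """DerSimonian-Laird random-effects pooling of per-cohort estimates."""
        est = np.asarray(estimates, float)
        var = np.asarray(variances, float)
        w = 1.0 / var                               # fixed-effect weights
        mu_fe = np.sum(w * est) / np.sum(w)
        q = np.sum(w * (est - mu_fe) ** 2)          # Cochran's Q
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(est) - 1)) / c)   # between-cohort variance
        w_re = 1.0 / (var + tau2)
        mu = np.sum(w_re * est) / np.sum(w_re)      # pooled growth parameter
        se = np.sqrt(1.0 / np.sum(w_re))
        return mu, se

    # Simulated single-cohort data: slope 0.5 before age 12, 0.8 after.
    rng = np.random.default_rng(0)
    age = rng.uniform(4, 25, 300)
    vol = 10 + 0.5 * age + 0.3 * np.maximum(age - 12, 0) + rng.normal(0, 1, 300)
    print(fit_piecewise(age, vol))

    # Pooling hypothetical early-childhood slopes from four cohorts.
    print(random_effects_meta([0.4, 0.55, 0.5, 0.6], [0.01, 0.02, 0.015, 0.01]))
    ```

    In the study the growth models were fit per structure and per cohort, and it is the fitted growth parameters (not the raw scans) that enter the meta-analysis, which is what the second function mimics.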

    Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis

    In this crowdsourced initiative, independent analysts used the same dataset to test two hypotheses regarding the effects of scientists’ gender and professional status on verbosity during group meetings. Not only the analytic approach but also the operationalizations of key variables were left unconstrained and up to individual analysts. For instance, analysts could choose to operationalize status as job title, institutional ranking, citation counts, or some combination. To maximize transparency regarding the process by which analytic choices are made, the analysts used a platform we developed called DataExplained to justify both preferred and rejected analytic paths in real time. Analyses lacking sufficient detail, reproducible code, or with statistical errors were excluded, resulting in 29 analyses in the final sample. Researchers reported radically different analyses and dispersed empirical outcomes, in a number of cases obtaining significant effects in opposite directions for the same research question. A Boba multiverse analysis demonstrates that decisions about how to operationalize variables explain variability in outcomes above and beyond statistical choices (e.g., covariates). Subjective researcher decisions play a critical role in driving the reported empirical results, underscoring the need for open data, systematic robustness checks, and transparency regarding both analytic paths taken and not taken. Implications for organizations and leaders, whose decision making relies in part on scientific findings, consulting reports, and internal analyses by data scientists, are discussed.
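    The multiverse idea described above can be sketched in a few lines: each "universe" pairs one operationalization of status with one covariate choice, and the coefficient of interest is recorded for every combination. The dataset, variable names, and choices below are invented for illustration; this does not reproduce the study's Boba analysis or its data.

    ```python
    import itertools
    import numpy as np

    # Hypothetical dataset: two candidate operationalizations of "status"
    # (job rank, citation count) and one optional covariate (gender).
    rng = np.random.default_rng(1)
    n = 200
    data = {
        "job_rank": rng.integers(1, 5, n).astype(float),
        "citations": rng.lognormal(3, 1, n),
        "gender": rng.integers(0, 2, n).astype(float),
    }
    # Simulated outcome: verbosity depends on job rank, not citations.
    data["verbosity"] = (0.3 * data["job_rank"]
                         + 0.1 * data["gender"]
                         + rng.normal(0, 1, n))

    def status_coefficient(status_var, covariates):
        """OLS coefficient on the chosen status measure, given chosen covariates."""
        cols = [np.ones(n), data[status_var]] + [data[c] for c in covariates]
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, data["verbosity"], rcond=None)
        return beta[1]  # coefficient on the status operationalization

    status_choices = ["job_rank", "citations"]   # operationalization decisions
    covariate_choices = [(), ("gender",)]        # statistical decisions
    results = {(s, c): status_coefficient(s, c)
               for s, c in itertools.product(status_choices, covariate_choices)}
    for universe, coef in sorted(results.items()):
        print(universe, round(coef, 3))
    ```

    Even in this toy version the estimated "status effect" differs across universes that share the same data, which is the dispersion phenomenon the paper documents at scale across 29 independent analyses.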
