A new tool for equating lexical stimuli across experimental conditions
In cognitive psychology and psycholinguistics, lexical characteristics can drive large effects, creating confounds when word stimuli are intended to be unrelated to the effect of interest. It is therefore critical to control for these potential confounds. As an alternative to randomly assigning word-bank items to stimulus lists, we present LIBRA (Lexical Item Balancing & Resampling Algorithm), a MATLAB-based toolbox for quickly generating stimulus lists of user-determined length and number that can be closely equated on any number of lexical properties. The toolbox comprises two scripts: a genetic algorithm that performs the inter-list balancing, and a tool for filtering/trimming long omnibus word lists on simple criteria prior to balancing. Relying on randomized procedures often results in substantially unbalanced experimental conditions, whereas our method ensures that the lists used for each experimental condition contain no meaningful differences. The lexical characteristics of the specific words used thus add a minimum of bias/noise to the experiment in which they are applied.
• Our toolbox balances word lists on arbitrary lexical properties to control confounds in cognitive psychology research.
• Our toolbox performs more efficiently than pure randomization or manual balancing.
• A graphical user interface is provided for ease of use.
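The genetic-algorithm idea behind this kind of inter-list balancing can be sketched in a few lines. The following is a minimal illustrative sketch, not LIBRA itself (LIBRA is a MATLAB toolbox; all function names, the fitness definition, and the mutation scheme here are assumptions): candidate assignments of words to lists are evolved so that the lists' mean values on each lexical property converge.

```python
import random

# Hypothetical sketch of genetic-algorithm list balancing (details assumed,
# not taken from the LIBRA toolbox). `props` maps each word to a vector of
# lexical property values (e.g. frequency, length, concreteness).

def fitness(lists, props):
    """Cost = summed spread of per-list property means (lower is better)."""
    cost = 0.0
    n_props = len(next(iter(props.values())))
    for p in range(n_props):
        means = [sum(props[w][p] for w in lst) / len(lst) for lst in lists]
        cost += max(means) - min(means)
    return cost

def mutate(lists):
    """Swap one randomly chosen word between two randomly chosen lists."""
    new = [lst[:] for lst in lists]
    a, b = random.sample(range(len(new)), 2)
    i, j = random.randrange(len(new[a])), random.randrange(len(new[b]))
    new[a][i], new[b][j] = new[b][j], new[a][i]
    return new

def balance(words, props, n_lists, generations=2000, pop_size=20, seed=0):
    """Evolve a population of random assignments toward balanced lists."""
    random.seed(seed)
    per = len(words) // n_lists

    def random_assignment():
        shuffled = random.sample(words, per * n_lists)
        return [shuffled[k * per:(k + 1) * per] for k in range(n_lists)]

    pop = [random_assignment() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ls: fitness(ls, props))
        survivors = pop[:pop_size // 2]  # elitist selection: keep the better half
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=lambda ls: fitness(ls, props))
```

Because selection is elitist, the best assignment never degrades across generations, which mirrors the abstract's claim that the procedure reliably outperforms pure randomization.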
Variability in the analysis of a single neuroimaging dataset by many teams
Data analysis workflows in many scientific domains have become increasingly complex and flexible. Here we assess the effect of this flexibility on the results of functional magnetic resonance imaging by asking 70 independent teams to analyse the same dataset, testing the same 9 ex-ante hypotheses [1]. The flexibility of analytical approaches is exemplified by the fact that no two teams chose identical workflows to analyse the data. This flexibility resulted in sizeable variation in the results of hypothesis tests, even for teams whose statistical maps were highly correlated at intermediate stages of the analysis pipeline. Variation in reported results was related to several aspects of analysis methodology. Notably, a meta-analytical approach that aggregated information across teams yielded a significant consensus in activated regions. Furthermore, prediction markets of researchers in the field revealed an overestimation of the likelihood of significant findings, even by researchers with direct knowledge of the dataset [2-5]. Our findings show that analytical flexibility can have substantial effects on scientific conclusions, and identify factors that may be related to variability in the analysis of functional magnetic resonance imaging. The results emphasize the importance of validating and sharing complex analysis workflows, and demonstrate the need for performing and reporting multiple analyses of the same data. Potential approaches that could be used to mitigate issues related to analytical variability are discussed.
Depto. de Psicobiología y Metodología en Ciencias del Comportamiento, Fac. de Psicología
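One common way to aggregate statistical maps across teams, sketched below purely for illustration (this is a standard technique, not necessarily the specific meta-analytic pipeline used in the study), is Stouffer's method: the per-voxel z-scores from k independent analyses are combined into a single consensus z-score.

```python
import math

# Illustrative sketch of image-based meta-analysis via Stouffer's method
# (assumed for illustration; not claimed to be the study's exact pipeline).

def stouffer(z_scores):
    """Combine k independent z-scores: Z = sum(z_i) / sqrt(k)."""
    k = len(z_scores)
    return sum(z_scores) / math.sqrt(k)

def combine_maps(team_maps):
    """team_maps: list of equal-length per-voxel z-score lists from each
    team; returns one consensus z-map, combining voxel-wise."""
    return [stouffer(voxel) for voxel in zip(*team_maps)]
```

Intuitively, a region where most teams report moderately positive z-scores accumulates a large combined Z, which is how aggregation can reveal a significant consensus even when individual teams' thresholded results disagree.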