94 research outputs found

    The past, present, and future of the Brain Imaging Data Structure (BIDS)

    The Brain Imaging Data Structure (BIDS) is a community-driven standard for the organization of data and metadata from a growing range of neuroscience modalities. This paper presents a history of how the standard has developed and grown over time. We outline the principles behind the project, the mechanisms by which it has been extended, and some of the challenges being addressed as it evolves. We also discuss the lessons learned through the project, with the aim of enabling researchers in other domains to learn from the success of BIDS.

    Imaging Monoaminergic Systems and their Pharmacological Control

    The natural sciences always and only enhance the human condition by producing knowledge which empowers us to control the natural world, where otherwise it would control us. Nowhere is the power of uncontrollable natural phenomena to curb or limit human well-being more pervasive than in the human mind itself. Monoamines are a class of neurotransmitters consistently implicated in the etiology of nonvolitional neuropsychological phenomena. They are a cornerstone of those mental functions which humans most desire, but are least able to control. Unsurprisingly, drugs targeting these neurotransmitter systems are widely used in clinical, therapeutic, performance-enhancing, and recreational contexts. To the detriment of patients and users, however, currently available drugs fall short in effect amplitude, reliability, and persistence, as well as in suitability for long-term use. We present novel research which advances the descriptive understanding of drug-naïve monoaminergic function and of monoaminergic drug effects. Our work includes methods development, the investigation of functional monoaminergic neurophenotypes, and the imaging-based profiling of longitudinal drug treatment. The neurobiological representations we put forward are instrumental to refining the understanding of psychopharmacological intervention profiles and the phenomena which they are able to modulate. Methodologically, we tackle technological impediments to large-scale (longitudinal, multi-cohort, and multi-center) preclinical brain imaging. Our first article deals with the challenge of automatically and reliably preparing preclinical magnetic resonance imaging (MRI) data for sharing and analysis. Our second article deals with the improvement of mouse brain registration and the definition of a reference space. Both of the above, as well as further relevant data analysis, rely heavily on high-level software tools.
We consequently make an excursion into neuroscientific software management, which needs to be as accessible, reproducible, and transparent as the research it supports. In our third article we present the first whole-brain read-out of ventral tegmental area (VTA) dopaminergic signalling in the mouse. We perform a multivariate analysis of experiment parameters, and formulate specific guidelines for assay reuse or refinement. In our fourth article we apply a previously established serotonergic activity read-out to a longitudinal selective serotonin reuptake inhibitor (SSRI) drug treatment. We produce the first functional neuroimaging profile of longitudinal serotonergic drug effects, and we identify distinct brain clusters based on longitudinal activation trajectories. Our findings both support the autoinhibition down-regulation theory of the SSRI action mechanism and complement it by suggesting a prominent role for brainstem involvement. As all trajectories show significant treatment but no post-treatment effects, we also provide neuroimaging evidence strongly suggesting that the intervention fails to elicit persistent homeostatic shifts in healthy subjects. We openly share all acquired data, and all code required to reproduce our analyses. We suggest that the novel methods and the neurophenotypical profiling concept which we put forward may revitalize psychopharmacological research. Our work ultimately serves to advance the understanding of monoaminergic function and its manipulation, as needed to meet the growing demand for the betterment of the human mind, both in and outside of the clinical context.

    An Optimized Registration Workflow and Standard Geometric Space for Small Animal Brain Imaging

    The reliability of scientific results critically depends on reproducible and transparent data processing. Cross-subject and cross-study comparability of imaging data in general, and magnetic resonance imaging (MRI) data in particular, is contingent on the quality of registration to a standard reference space. In small animal MRI this is not adequately provided by currently used processing workflows, which utilize high-level scripts optimized for human data, and adapt animal data to fit the scripts, rather than vice versa. In this fully reproducible article we showcase a generic workflow optimized for the mouse brain, alongside a standard reference space suited to harmonize data between analysis and operation. We present four separate metrics for automated quality control (QC), and a visualization method to aid operator inspection. Benchmarking this workflow against common legacy practices reveals that it performs more consistently, better preserves variance across subjects while minimizing variance across sessions, and improves both volume and smoothness conservation RMSE approximately 3-fold. We propose this open source workflow and the QC metrics as a new standard for small animal MRI registration, ensuring workflow robustness, data comparability, and region assignment validity, all of which are important criteria for the comparability of scientific results across experiments and centers.
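The volume and smoothness conservation comparison reported above reduces to a root-mean-square-error computation over a per-subject feature. The sketch below, with a hypothetical function name and made-up per-subject brain volumes, illustrates the kind of metric involved; it is not the article's actual implementation.

```python
import numpy as np

def conservation_rmse(before, after):
    """RMSE of a per-subject feature (e.g. brain volume or image
    smoothness) between unprocessed and registered data; lower values
    mean the workflow better conserves that feature."""
    before = np.asarray(before, dtype=float)
    after = np.asarray(after, dtype=float)
    return float(np.sqrt(np.mean((after - before) ** 2)))

# Hypothetical per-subject brain volumes (mm^3) before and after
# registration with a legacy workflow and with an optimized one.
raw = [465.2, 448.9, 471.0, 452.3]
legacy = [512.7, 503.1, 520.4, 498.9]
optimized = [467.0, 450.1, 469.8, 453.5]

# Ratio > 1 means the optimized workflow conserves volume better.
print(conservation_rmse(raw, legacy) / conservation_rmse(raw, optimized))
```

A workflow that inflates or deflates brain volumes during registration scores a high RMSE, whereas a conservative one scores near zero.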

    LabbookDB Presentation - A Relational Framework for Laboratory Metadata

    LabbookDB is a relational database application framework for life sciences—providing an extendable schema and functions to conveniently add and retrieve information, and generate summaries. The core concept of LabbookDB is that wet work metadata commonly tracked in lab books or spreadsheets is more efficiently and more reliably stored in a relational database, and more flexibly queried. We overcome the flexibility limitations of designed-for-analysis spreadsheets and databases by building our schema around atomized physical object interactions in the laboratory (and providing plotting- and/or analysis-ready dataframes as a compatibility layer). We keep our database schema more easily extendable and adaptable by using joined table inheritance to manage polymorphic objects and their relationships. LabbookDB thus provides a wet work metadata storage model excellently suited for exploratory ex-post reporting and analysis, as well as a potential infrastructure for automated wet work tracking.
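Joined table inheritance, the mechanism named above, can be illustrated at the SQL level: each polymorphic subtype gets its own table keyed to a shared parent table, so new record types extend the schema without altering existing tables. The sketch below uses Python's built-in sqlite3 with hypothetical table and column names; it is not LabbookDB's actual schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Parent table holds the fields shared by all measurement types,
# plus a discriminator column identifying the subtype.
cur.execute("""CREATE TABLE measurements (
    id INTEGER PRIMARY KEY, type TEXT, date TEXT)""")

# Each subtype adds its own table, joined to the parent on id.
cur.execute("""CREATE TABLE weight_measurements (
    id INTEGER PRIMARY KEY REFERENCES measurements(id), grams REAL)""")

cur.execute("INSERT INTO measurements VALUES (1, 'weight', '2016-11-24')")
cur.execute("INSERT INTO weight_measurements VALUES (1, 23.5)")

# A polymorphic query joins the subtype table onto the shared parent.
row = cur.execute("""SELECT m.date, w.grams FROM measurements m
    JOIN weight_measurements w ON w.id = m.id
    WHERE m.type = 'weight'""").fetchone()
print(row)  # → ('2016-11-24', 23.5)
```

Adding, say, a drug-administration record type would mean creating one new joined table, leaving the parent table and all existing subtype tables untouched.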

    An Automated Open-Source Workflow for Standards-Compliant Integration of Small Animal Magnetic Resonance Imaging Data

    Large-scale research integration is contingent on seamless access to data in standardized formats. Standards enable researchers to understand external experiment structures, pool results, and apply homogeneous preprocessing and analysis workflows. Notably, they provide these capabilities without the need for numerous potentially confounding compatibility add-ons. In small animal magnetic resonance imaging, an overwhelming proportion of data is acquired via the ParaVision software of the Bruker Corporation. The original data structure is predominantly transparent, but fundamentally incompatible with modern pipelines. Additionally, it sources metadata from free-field operator input, which diverges strongly between laboratories and researchers. In this article we present an open-source workflow which automatically converts and reposits data from the ParaVision structure into the widely supported and openly documented Brain Imaging Data Structure (BIDS). Complementing this workflow we also present operator guidelines for appropriate ParaVision data input, and a programmatic walk-through detailing how preexisting scans with uninterpretable metadata records can easily be made compliant after acquisition.
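The target of such a conversion is BIDS's key-value file naming scheme. A minimal sketch of how entities assemble into a compliant relative path follows; the helper function and its example values are illustrative assumptions, not the actual converter's API.

```python
def bids_path(subject, session, suffix, task=None, datatype="func"):
    """Assemble a BIDS-style relative path from key-value entities.
    Entities appear in the standard order: sub-, ses-, task-, then
    the modality suffix (e.g. bold, T2w)."""
    name = f"sub-{subject}_ses-{session}"
    if task is not None:
        name += f"_task-{task}"
    name += f"_{suffix}.nii.gz"
    return f"sub-{subject}/ses-{session}/{datatype}/{name}"

print(bids_path("1", "ofM", "bold", task="rest"))
# → sub-1/ses-ofM/func/sub-1_ses-ofM_task-rest_bold.nii.gz
```

Because the directory hierarchy and the file name encode the same entities redundantly, any BIDS-aware tool can locate and interpret a scan without consulting lab-specific documentation.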

    Whole-brain opto-fMRI map of mouse VTA dopaminergic activation reflects structural projections with small but significant deviations

    Ascending dopaminergic projections from neurons located in the Ventral Tegmental Area (VTA) are key to the etiology, dysfunction, and control of motivation, learning, and addiction. Due to the evolutionary conservation of this nucleus and the extensive use of mice as disease models, establishing an assay for VTA dopaminergic signaling in the mouse brain is crucial for the translational investigation of motivational control as well as of neuronal function phenotypes for diseases and interventions. In this article we use optogenetic stimulation directed at VTA dopaminergic neurons in combination with functional Magnetic Resonance Imaging (fMRI), a method widely used in human deep brain imaging. We present a comprehensive assay producing the first whole-brain opto-fMRI map of dopaminergic activation in the mouse, and show that VTA dopaminergic system function is consistent with its structural projections, diverging only in a few key aspects. While the activation map predominantly highlights target areas according to their relative projection densities (e.g., strong activation of the nucleus accumbens and low activation of the hippocampus), it also includes areas for which a structural connection is not well established (such as the dorsomedial striatum). We further detail the variability of the assay with regard to multiple experimental parameters, including stimulation protocol and implant position, and provide evidence-based recommendations for assay reuse, publishing both reference results and a reference analysis workflow implementation.
    ISSN: 2158-318

    The Software Effect on the Quality of the Financial Activity of the Companies

    Organizations now have the opportunity to fulfill most of their information system requirements through software packages, rather than building bespoke solutions. The advantages of this approach, particularly the perceived cost and time savings, appear self-evident. However, there are disadvantages, which must be carefully understood and evaluated before selecting and purchasing a software package. The organization must also be aware that it is probably entering a long-term commercial relationship with a supplier. It may be costly and difficult to end such a relationship, as converting data from one product to another may be prohibitively expensive. Hence the risks of the software package approach must be identified and appropriate risk avoidance and mitigation actions developed.
    Keywords: software packages, time savings, risk, financial activity