32 research outputs found

    Improving accuracy and power with transfer learning using a meta-analytic database

    Typical cohorts in brain imaging studies are not large enough for systematic testing of all the information contained in the images. To build testable working hypotheses, investigators therefore rely on analyses of previous work, sometimes formalized in a so-called meta-analysis. In brain imaging, this approach underlies the specification of regions of interest (ROIs), which are usually selected on the basis of the coordinates of previously detected effects. In this paper, we propose to use a database of images, rather than coordinates, and frame the problem as transfer learning: learning a discriminant model on a reference task in order to apply it to a different but related new task. To facilitate statistical analysis of small cohorts, we use a sparse discriminant model that selects predictive voxels on the reference task and thus provides a principled procedure to define ROIs. The benefits of our approach are twofold. First, it uses the reference database for prediction, i.e. to provide potential biomarkers in a clinical setting. Second, it increases statistical power on the new task. We demonstrate on a set of 18 pairs of functional MRI experimental conditions that our approach gives good prediction. In addition, on a specific transfer situation involving different scanners at different locations, we show that voxel selection based on transfer learning leads to higher detection power on small cohorts. (MICCAI, Nice, France, 2012.)
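    As an illustration of the voxel-selection idea, here is a minimal Python sketch (not the authors' code) that fits an L1-penalized logistic regression on a simulated reference task, keeps the voxels with non-zero weights as an ROI, and runs a single ROI-level test on a small new cohort; the data, penalty strength, and test are all assumptions made for the example.

    # Hypothetical sketch: sparse voxel selection on a simulated reference task,
    # then reuse of the selected voxels as a single ROI for a small new cohort.
    import numpy as np
    from scipy import stats
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n_ref, n_new, n_voxels = 200, 16, 5000

    # Simulated data: rows are scans, columns are voxels; the first 20 voxels carry signal.
    y_ref = rng.integers(0, 2, n_ref)
    X_ref = rng.standard_normal((n_ref, n_voxels))
    X_ref[:, :20] += y_ref[:, None]
    X_new = rng.standard_normal((n_new, n_voxels))
    X_new[:, :20] += 0.5                       # small effect in the new cohort

    # 1) Sparse discriminant model on the reference task (L1 penalty -> voxel selection).
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    clf.fit(X_ref, y_ref)
    roi = np.flatnonzero(clf.coef_.ravel())    # voxels with non-zero weights define the ROI
    print(f"{roi.size} voxels selected")

    # 2) On the new cohort, test only the mean ROI signal, which avoids a
    #    whole-brain multiple-comparison correction on 16 subjects.
    t, p = stats.ttest_1samp(X_new[:, roi].mean(axis=1), 0.0)
    print(f"ROI-level test on the new cohort: t={t:.2f}, p={p:.3f}")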

    Analysis of a large fMRI cohort: Statistical and methodological issues for group analyses.

    The aim of group fMRI studies is to relate contrasts of tasks or stimuli to regional increases in brain activity. These studies typically involve 10 to 16 subjects. The statistical significance of the average regional activity is assessed using the subject-to-subject variability of the effect (random-effects analyses). Because of the relatively small number of subjects included, the sensitivity and reliability of these analyses are questionable and hard to investigate. In this work, we use a very large number of subjects (more than 80) to investigate this issue. We take advantage of this large cohort to study the statistical properties of the inter-subject activity and focus on the notion of reproducibility by bootstrapping. We asked simple but important methodological questions: Is there, from the point of view of reliability, an optimal statistical threshold for activity maps? How many subjects should be included in group studies? What method should be preferred for inference? Our results suggest that i) optimal thresholds can indeed be found, and are rather lower than the usual thresholds corrected for multiple comparisons; ii) 20 subjects or more should be included in functional neuroimaging studies in order to achieve sufficient reliability; iii) non-parametric significance assessment should be preferred to parametric methods; iv) cluster-level thresholding is more reliable than voxel-based thresholding; and v) mixed-effects tests are much more reliable than random-effects tests. Moreover, our study shows that inter-subject variability plays a prominent role in the relatively low sensitivity and reliability of group studies.
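    To make the bootstrap notion of reproducibility concrete, the following Python sketch (an illustration on simulated data, not the paper's pipeline) resamples subjects with replacement, thresholds each group t-map, and scores the agreement of the detections with a mean pairwise Dice coefficient; the thresholds, sample sizes, and the Dice criterion are assumptions of the example.

    # Hypothetical sketch of reproducibility-by-bootstrap for group maps:
    # resample subjects, threshold each group map, and measure overlap of detections.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_subjects, n_voxels = 80, 2000
    effects = rng.standard_normal((n_subjects, n_voxels))
    effects[:, :100] += 0.5                      # a block of truly active voxels

    def group_detections(sample, threshold):
        """One-sample t-test across subjects, thresholded on |t|."""
        t, _ = stats.ttest_1samp(sample, 0.0, axis=0)
        return np.abs(t) > threshold

    def reproducibility(threshold, n_boot=50):
        maps = []
        for _ in range(n_boot):
            idx = rng.integers(0, n_subjects, n_subjects)   # bootstrap over subjects
            maps.append(group_detections(effects[idx], threshold))
        maps = np.array(maps)
        # Mean pairwise Dice overlap between bootstrap detection maps.
        dices = []
        for i in range(n_boot):
            for j in range(i + 1, n_boot):
                inter = np.logical_and(maps[i], maps[j]).sum()
                size = maps[i].sum() + maps[j].sum()
                if size:
                    dices.append(2 * inter / size)
        return np.mean(dices) if dices else 0.0

    for thr in (2.0, 3.0, 4.0, 5.0):
        print(f"threshold {thr:.1f}: mean Dice = {reproducibility(thr):.2f}")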

    Structural analysis of fMRI data revisited: improving the sensitivity and reliability of fMRI group studies.

    Group studies of functional magnetic resonance imaging datasets are usually based on the computation of the mean signal across subjects at each voxel (random-effects analyses), assuming that all subjects have been mapped to the same anatomical space (normalization). Although this approach allows for correct specificity (rate of false detections), it is not very efficient, for three reasons: i) its underlying hypotheses (perfect coregistration of the individual datasets and normality of the measured signal at the group level) are frequently violated; ii) the group size is generally small, so that asymptotic approximations of the parameter distributions do not hold; iii) the large size of the images requires conservative strategies to control the false detection rate, at the risk of increasing the number of false negatives. Given that it is still very challenging to build generative or parametric models of inter-subject variability, we rely on a rule-based, bottom-up approach: we present a set of procedures that detect structures of interest from each subject's data, then search for correspondences across subjects and outline the most reproducible activation regions in the group studied. This framework enables strict control of the number of false detections. We show that this analysis increases validity and improves both the sensitivity and reliability of group analyses compared with standard methods. Moreover, it directly provides information on the spatial correspondence and variability of the activated regions across subjects, which is difficult to obtain in standard voxel-based analyses.
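    The following toy Python sketch (a one-dimensional stand-in, not the procedure of the paper) conveys the bottom-up idea: threshold each subject's map to detect candidate structures, then keep the positions supported by a large fraction of the group; the threshold, the 50% support criterion, and the simulated signal are assumptions of the example.

    # Hypothetical 1-D toy of a bottom-up group analysis: detect supra-threshold
    # positions in each subject, then keep positions reproduced across subjects.
    import numpy as np

    rng = np.random.default_rng(1)
    n_subjects, n_positions = 20, 500
    maps = rng.standard_normal((n_subjects, n_positions))
    maps[:, 200:220] += 2.0                   # a shared active region, plus noise

    threshold = 1.5
    support = (maps > threshold).sum(axis=0)  # how many subjects detect each position

    # Call a position reproducible if at least half of the group detects it.
    reproducible = np.flatnonzero(support >= n_subjects / 2)
    print("reproducible positions:", reproducible)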

    Brainomics: A management system for exploring and merging heterogeneous brain mapping data

    We propose an open-source solution to manage brain imaging datasets and their associated metadata. This framework provides a powerful querying and reporting tool, customized for the needs of the emerging imaging-genetics field. A demonstration website and more details are available at http://brainomics.cea.fr.

    Real-time control of photobleaching trajectory during photodynamic therapy

    Introduction: obstacles and challenges to the clinical use of photodynamic therapy (PDT) are numerous: large inter-individual variability, heterogeneity of therapeutic predictability, lack of in vivo monitoring of reactive oxygen species (ROS) production, etc. All of these factors affect the therapeutic response and can lead to wide uncertainty about the efficacy of the treatment. Objective: to deal with these sources of variability, we have designed and developed an innovative technology able to adapt the width of the light pulses in real time during photodynamic therapy. The primary objective is to accurately control the photobleaching trajectory of the photosensitizer during treatment, with the ultimate goal of improving the efficacy and reproducibility of this therapy. Methods: in this approach, the physician defines a priori the expected trajectory to be tracked by the photosensitizer photobleaching during the treatment. The photobleaching state of the photosensitizer is measured regularly during the treatment session and is used to adjust the illumination signal in real time. This adaptive scheme of photodynamic therapy has been implemented, tested and validated in in vitro tests. Results: these tests show that controlling the photobleaching trajectory is possible, confirming the technical feasibility of such an approach to deal with inter-individual variability in PDT. These results open new perspectives, since the illumination signal can differ from one patient to another according to the individual response. Conclusions: this study shows promising results in an in vitro context, which remain to be confirmed by the ongoing in vivo experiments. In the near future, the proposed solution could ultimately lead to an optimized and personalized PDT.
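    A minimal Python sketch of the feedback idea (not the device's controller) is shown below: at each control step the measured photobleaching is compared with the prescribed trajectory and the illumination pulse width is adjusted accordingly; the exponential target, the toy bleaching model, and the gain are all assumptions of the example.

    # Hypothetical sketch of trajectory-tracking control of photobleaching:
    # compare the measured photosensitizer level with the prescribed trajectory
    # and adjust the pulse width (a simple integral-style feedback rule).
    import numpy as np

    def target(t, tau=300.0):
        """Prescribed photobleaching trajectory: fraction of photosensitizer left."""
        return np.exp(-t / tau)

    def bleach(level, pulse_width, rate=0.005):
        """Toy photobleaching model: more light per step means more bleaching."""
        return level * np.exp(-rate * pulse_width)

    dt, n_steps = 5.0, 120            # 5 s control steps, a 10 min session
    gain, pulse_width = 50.0, 1.0     # feedback gain, initial pulse width (s)
    level = 1.0                       # measured photosensitizer level (normalized)

    for k in range(n_steps):
        error = level - target(k * dt)      # ahead of (>0) or behind (<0) the trajectory
        pulse_width = float(np.clip(pulse_width + gain * error, 0.0, dt))
        level = bleach(level, pulse_width)

    print(f"final level {level:.3f} vs target {target(n_steps * dt):.3f}")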

    Fast reproducible identification and large-scale databasing of individual functional cognitive networks

    Background: Although cognitive processes such as reading and calculation are associated with reproducible cerebral networks, inter-individual variability is considerable. Understanding the origins of this variability will require the elaboration of large multimodal databases compiling behavioral, anatomical, genetic and functional neuroimaging data over hundreds of subjects. With this goal in mind, we designed a simple and fast acquisition procedure based on a 5-minute functional magnetic resonance imaging (fMRI) sequence that can be run as easily and as systematically as an anatomical scan, and is therefore used in every subject undergoing fMRI in our laboratory. This protocol captures the cerebral bases of auditory and visual perception, motor actions, reading, language comprehension and mental calculation at an individual level. Results: 81 subjects were successfully scanned. Before describing inter-individual variability, we demonstrate in the present study the reliability of individual functional data obtained with this short protocol. Given the anatomical variability, we then needed to describe individual functional networks correctly in a voxel-free space. We therefore applied non-voxel-based methods that automatically extract the main features of individual activation patterns: group analyses performed on these individual data not only converge with those reported by a more conventional voxel-based random-effects analysis, but also retain information about the variance in location and degree of activation across subjects. Conclusion: This collection of individual fMRI data will help to describe the inter-subject variability of the cerebral correlates of some language, calculation and sensorimotor tasks. In association with demographic, anatomical, behavioral and genetic data, this protocol will serve as the cornerstone for establishing a hybrid database of hundreds of subjects suitable for studying the range and causes of variation in the cerebral bases of numerous mental processes.
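    As an illustration of what non-voxel-based features of an individual activation pattern might look like, here is a small Python sketch (not the paper's method) that thresholds a simulated 2-D contrast map and records the peak position and size of each supra-threshold cluster; the threshold, minimum cluster size, and simulated maps are assumptions of the example.

    # Hypothetical sketch: summarize each subject's activation pattern by the
    # peak position and size of its supra-threshold clusters.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(2)

    def subject_features(contrast_map, threshold=2.0, min_size=5):
        """Return (peak position, cluster size) for each sufficiently large blob."""
        labels, n_blobs = ndimage.label(contrast_map > threshold)
        features = []
        for blob in range(1, n_blobs + 1):
            mask = labels == blob
            if mask.sum() < min_size:
                continue
            peak = np.unravel_index(
                np.argmax(np.where(mask, contrast_map, -np.inf)), contrast_map.shape)
            features.append((peak, int(mask.sum())))
        return features

    # Toy 2-D contrast maps sharing an activation near (10, 10), spatially jittered.
    for subject in range(3):
        cmap = rng.standard_normal((32, 32))
        di, dj = rng.integers(-2, 3, 2)
        cmap[8 + di:13 + di, 8 + dj:13 + dj] += 3.0
        print(f"subject {subject}: {subject_features(cmap)}")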

    Probabilistic anatomo-functional parcellation of the cortex: how many regions?

    Understanding brain structure and function entails bringing anatomical and functional information into a common space, in order to study how these different types of information relate to each other in a population of subjects. In this paper, we revisit the parcellation model and explicitly combine anatomical features, i.e. a segmentation of the cortex into gyri, with functional information in the form of several cortical maps, which are used to further subdivide the gyri into functionally consistent regions. A probabilistic model is introduced, and the parcellation is estimated using a Variational Bayes approach. The number of regions in the model is selected by cross-validation. We find that about 250 patches of cortex can be delineated in each of the left and right hemispheres with this procedure.
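    To illustrate the model-selection step, the following Python sketch (a stand-in for the paper's Variational Bayes parcellation, using a plain Gaussian mixture) picks the number of parcels within one simulated gyrus by cross-validated held-out log-likelihood; the mixture model, the data, and the candidate numbers of parcels are assumptions of the example.

    # Hypothetical sketch: choose the number of functional parcels inside one gyrus
    # by cross-validated model fit (Gaussian mixture as a simple surrogate model).
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.model_selection import KFold

    rng = np.random.default_rng(0)

    # Toy data: 600 voxels of one gyrus, each described by 5 functional map values,
    # generated from 4 underlying parcels.
    centers = rng.standard_normal((4, 5)) * 3
    profiles = np.vstack([center + rng.standard_normal((150, 5)) for center in centers])

    def cv_score(n_parcels, n_splits=5):
        scores = []
        for train, test in KFold(n_splits, shuffle=True, random_state=0).split(profiles):
            gmm = GaussianMixture(n_components=n_parcels, random_state=0)
            gmm.fit(profiles[train])
            scores.append(gmm.score(profiles[test]))   # held-out log-likelihood
        return np.mean(scores)

    for k in (2, 3, 4, 6, 8, 12):
        print(f"{k:2d} parcels: held-out log-likelihood = {cv_score(k):.2f}")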

    Early subclinical left ventricular dysfunction post radiotherapy in breast cancer: speckle-tracking echocardiography detects dose-related longitudinal strain changes after irradiation (BACCARAT study)

    Background: Breast cancer radiotherapy (BC RT) can be associated with long-term, silent and late-detected cardiotoxicity. Longitudinal strain (LS), based on 2D speckle-tracking echocardiography (2DSTE), can detect early subclinical left ventricular dysfunction. Little is known about the association between RT-induced cardiac doses and LS changes. Purpose: To analyze the relationship between cardiac doses and LS changes after RT in breast cancer patients. Methods: BACCARAT is a monocentric prospective cohort study that included BC patients treated with RT without chemotherapy. Global LS, in particular in the mid-layer of the myocardium (GLS-mid), was measured with 2DSTE before RT and 6 months after RT. We evaluated radiation doses to the whole heart, the left ventricle (LV) and the left anterior descending coronary artery (LAD). A subclinical cardiotoxicity event, "decreased GLS-mid ≥ 10%", was defined as a relative decrease of at least 10%. Results: We included 94 patients (18 right-sided BC, 76 left-sided BC) aged 59±8 years. Mean doses to the heart, LV and LAD were, respectively, 3.0±1.3 Gy, 6.6±3.2 Gy and 16.5±7.5 Gy for the left-sided patients, and 0.6±0.5 Gy, 0.2±0.2 Gy and 0.3±0.5 Gy for the right-sided patients. Left-sided BC patients had a significant decrease of GLS-mid (relative change = 9.0%, p=0.02); "decreased GLS-mid ≥ 10%" was detected in 50% of left-sided and 38% of right-sided BC patients. We observed dose-response relationships with the mean doses to the heart, the LV and the LAD in univariate analysis, which remained marginally significant in the adjusted analysis for the mean LV dose, with the odds of the subclinical event increasing by 11% for each 1 Gy increase (OR=1.11, 95% CI [0.98–1.26]). Conclusion: This is the first study to establish dose-response relationships between BC RT-induced cardiac doses and decreased LS 6 months after RT. With longer follow-up, it remains to be determined whether these dose-related changes are confirmed and whether detection of this subclinical LV dysfunction relates to clinical outcomes.
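    The dose-response estimate can be illustrated with a short Python sketch (simulated data, not the BACCARAT dataset) that fits a logistic regression of the binary "decreased GLS-mid ≥ 10%" event on the mean LV dose and reports the odds ratio per Gy; the dose distribution, baseline risk, and the use of statsmodels are assumptions of the example, and the real analysis also adjusted for covariates.

    # Hypothetical sketch of an odds-ratio-per-Gy dose-response model,
    # fitted on simulated data (not the study cohort).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 94
    lv_dose = np.clip(rng.normal(5.5, 3.5, n), 0.1, None)   # mean LV dose (Gy), simulated
    logit_p = -1.0 + np.log(1.11) * lv_dose                 # assume OR ~ 1.11 per Gy
    event = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))     # decreased GLS-mid >= 10%

    X = sm.add_constant(pd.DataFrame({"lv_dose": lv_dose}))
    fit = sm.Logit(event, X).fit(disp=0)
    or_per_gy = np.exp(fit.params["lv_dose"])
    ci_low, ci_high = np.exp(fit.conf_int().loc["lv_dose"])
    print(f"OR per Gy = {or_per_gy:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")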