
    A formal architecture-centric and model driven approach for the engineering of science gateways

    From n-tier client/server applications to more complex academic Grids, and even the most recent and promising industrial Clouds, the last decade has witnessed significant developments in distributed computing. In spite of this conceptual heterogeneity, Service-Oriented Architecture (SOA) seems to have emerged as the common underlying abstraction paradigm, even though different standards and technologies are applied across application domains. Suitable access to data and algorithms resident in SOAs via so-called ‘Science Gateways’ has thus become a pressing need in order to realize the benefits of distributed computing infrastructures. In an attempt to inform service-oriented systems design and development in Grid-based biomedical research infrastructures, the applicant has consolidated work from three complementary experiences in European projects, which have developed and deployed large-scale, production-quality infrastructures and, more recently, Science Gateways to support research in breast cancer, pediatric diseases and neurodegenerative pathologies respectively. In analyzing the requirements of these biomedical applications, the applicant was able to elaborate on commonly faced issues in Grid development and deployment, while proposing an adapted and extensible engineering framework. Grids implement a number of protocols, applications and standards, and attempt to virtualize and harmonize access to them. Most Grid implementations are therefore instantiated as superposed software layers, often resulting in low quality of service and low application quality, making design and development increasingly complex and rendering classical software engineering approaches unsuitable for Grid development. The applicant proposes the application of a formal Model-Driven Engineering (MDE) approach to service-oriented development, making it possible to define Grid-based architectures and Science Gateways that satisfy quality-of-service requirements, execution platform and distribution criteria at design time. A novel investigation is thus presented on the applicability of the resulting grid MDE (gMDE) to specific examples, and conclusions are drawn on the benefits of this approach and its possible application to other areas, in particular Distributed Computing Infrastructure (DCI) interoperability, Science Gateways and Cloud architecture development.
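
    As a rough, hypothetical illustration of the model-driven idea described above (capturing quality-of-service and distribution criteria in a platform-independent model and transforming it into a platform-specific descriptor at design time), and not the author's actual gMDE notation, a sketch might look like this; all names and fields are placeholders:

```python
# Hypothetical sketch of a model-driven transformation: a platform-independent
# service model annotated with QoS requirements is turned into a
# platform-specific deployment description before any code is deployed.
from dataclasses import dataclass


@dataclass
class ServiceModel:                 # platform-independent model (PIM)
    name: str
    max_latency_ms: int             # QoS requirement captured in the model
    replicas: int                   # distribution criterion


def to_grid_deployment(svc: ServiceModel) -> dict:
    """Example model transformation into a fictional grid descriptor (PSM)."""
    return {
        "service": svc.name,
        "qos": {"latency_ms": svc.max_latency_ms},
        "placement": [f"worker-{i}" for i in range(svc.replicas)],
    }


gateway = ServiceModel("image-analysis", max_latency_ms=500, replicas=4)
print(to_grid_deployment(gateway))
```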

    High-Density Diffuse Optical Tomography During Passive Movie Viewing: A Platform for Naturalistic Functional Brain Mapping

    Human neuroimaging techniques enable researchers and clinicians to non-invasively study brain function across the lifespan in both healthy and clinical populations. However, functional brain imaging methods such as functional magnetic resonance imaging (fMRI) are expensive, resource-intensive, and require dedicated facilities, making these powerful imaging tools generally unavailable for assessing brain function in settings demanding open, unconstrained, and portable neuroimaging assessments. Tools such as functional near-infrared spectroscopy (fNIRS) afford greater portability and wearability, but at the expense of cortical field-of-view and spatial resolution. High-Density Diffuse Optical Tomography (HD-DOT) is an optical neuroimaging modality that directly addresses the image quality limitations associated with traditional fNIRS techniques through densely overlapping optical measurements. This thesis aims to establish the feasibility of using HD-DOT in a novel application demanding exceptional portability and flexibility: mapping disrupted cortical activity in chronically malnourished children. I first motivate the need for dense optical measurements of brain tissue to achieve fMRI-comparable localization of brain function (Chapter 2). Then, I present imaging work completed in Cali, Colombia, where a cohort of chronically malnourished children was imaged using a custom HD-DOT instrument to establish the feasibility of performing field-based neuroimaging in this population (Chapter 3). Finally, to meet the need for age-appropriate imaging paradigms in this population, I develop passive movie viewing paradigms for use in optical neuroimaging, a flexible and rich form of stimulation suitable for both adults and children (Chapter 4).
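
    For background, HD-DOT image reconstruction is commonly posed as a regularized linear inverse problem over a sensitivity (Jacobian) matrix built from many overlapping source-detector measurements; the NumPy sketch below illustrates that general formulation with synthetic data and arbitrary dimensions, not the instrument-specific pipeline used in this thesis:

```python
# Minimal sketch of DOT-style reconstruction: with densely overlapping
# measurements, the sensitivity matrix J is better conditioned and a
# regularized least-squares inverse recovers focal absorption changes.
# All dimensions and data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_meas, n_vox = 1200, 500                       # overlapping channels vs. voxels
J = rng.normal(size=(n_meas, n_vox))            # sensitivity (Jacobian) matrix
x_true = np.zeros(n_vox)
x_true[200:210] = 1.0                           # focal "activation"
y = J @ x_true + 0.01 * rng.normal(size=n_meas)  # simulated optical measurements

# Tikhonov-regularized minimum-norm inverse
lam = 0.1 * np.trace(J @ J.T) / n_meas
x_hat = J.T @ np.linalg.solve(J @ J.T + lam * np.eye(n_meas), y)
print(np.argmax(x_hat))                         # peak should typically fall in 200-209
```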

    Machine Learning Patterns for Neuroimaging-Genetic Studies in the Cloud

    Brain imaging is a natural intermediate phenotype to understand the link between genetic information and behavior or brain pathology risk factors. Massive efforts have been made in the last few years to acquire high-dimensional neuroimaging and genetic data on large cohorts of subjects. The statistical analysis of such data is carried out with increasingly sophisticated techniques and represents a great computational challenge. Fortunately, increasing computational power in distributed architectures can be harnessed, if new neuroinformatics infrastructures are designed and training to use these new tools is provided. Combining a MapReduce framework (TomusBLOB) with machine learning algorithms (the Scikit-learn library), we design a scalable analysis tool that can deal with non-parametric statistics on high-dimensional data. End-users describe the statistical procedure to perform and can then test the model on their own computers before running the very same code in the cloud at a larger scale. We illustrate the potential of our approach on real data with an experiment showing how the functional signal in subcortical brain regions can be significantly fit with genome-wide genotypes. This experiment demonstrates the scalability and the reliability of our framework in the cloud, with a two-week deployment on hundreds of virtual machines.
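
    As an illustration of the described pattern (permutation-based, non-parametric statistics distributed as map tasks and pooled in a reduce step, with scikit-learn doing the fitting), the sketch below scores how well genotypes fit a brain signal and derives a permutation p-value; it is not the TomusBLOB implementation, and all data are synthetic:

```python
# Sketch of the map/reduce permutation-testing pattern: each "map" task fits
# the (permuted) brain signal from genotypes with scikit-learn, and the
# "reduce" step pools the permutation scores into an empirical p-value.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_subj, n_snps = 200, 1000
genotypes = rng.integers(0, 3, size=(n_subj, n_snps)).astype(float)
brain_signal = genotypes[:, :5] @ rng.normal(size=5) + rng.normal(size=n_subj)


def map_task(seed: int) -> float:
    """One worker: cross-validated fit score under a permuted phenotype."""
    perm = np.random.default_rng(seed).permutation(n_subj)
    model = RidgeCV(alphas=[0.1, 1.0, 10.0])
    return cross_val_score(model, genotypes, brain_signal[perm], cv=5).mean()


observed = cross_val_score(RidgeCV(alphas=[0.1, 1.0, 10.0]),
                           genotypes, brain_signal, cv=5).mean()
null_scores = np.array([map_task(s) for s in range(100)])        # map phase
p_value = (np.sum(null_scores >= observed) + 1) / (len(null_scores) + 1)  # reduce
print(f"observed R^2={observed:.3f}, permutation p={p_value:.3f}")
```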

    Efficient, Distributed and Interactive Neuroimaging Data Analysis Using the LONI Pipeline

    The LONI Pipeline is a graphical environment for construction, validation and execution of advanced neuroimaging data analysis protocols (Rex et al., 2003). It enables automated data format conversion, allows Grid utilization, facilitates data provenance, and provides a significant library of computational tools. There are two main advantages of the LONI Pipeline over other graphical analysis workflow architectures. It is built as a distributed Grid computing environment and permits efficient tool integration, protocol validation and broad resource distribution. To integrate existing data and computational tools within the LONI Pipeline environment, no modification of the resources themselves is required. The LONI Pipeline provides several types of process submissions based on the underlying server hardware infrastructure. Only workflow instructions and references to data, executable scripts and binary instructions are stored within the LONI Pipeline environment. This makes it portable, computationally efficient, distributed and independent of the individual binary processes involved in pipeline data-analysis workflows. We have expanded the LONI Pipeline (V.4.2) to include server-to-server (peer-to-peer) communication and a 3-tier failover infrastructure (Grid hardware, Sun Grid Engine/Distributed Resource Management Application API middleware, and the Pipeline server). Additionally, the LONI Pipeline provides three layers of background-server executions for all users/sites/systems. These new LONI Pipeline features facilitate resource-interoperability, decentralized computing, construction and validation of efficient and robust neuroimaging data-analysis workflows. Using brain imaging data from the Alzheimer's Disease Neuroimaging Initiative (Mueller et al., 2005), we demonstrate integration of disparate resources, graphical construction of complex neuroimaging analysis protocols and distributed parallel computing. The LONI Pipeline, its features, specifications, documentation and usage are available online (http://Pipeline.loni.ucla.edu).
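
    The key design point above is that only workflow instructions and references to data and executables are persisted, never the tools or data themselves. The toy sketch below illustrates that idea; the tool paths, data URL, and JSON layout are illustrative placeholders and not the Pipeline's actual .pipe format:

```python
# Toy illustration: workflow nodes reference executables and data by path/URL
# rather than embedding them, so the stored artifact is only the instructions.
import json

workflow = {
    "name": "skull-strip-then-register",
    "nodes": [
        {"id": "strip",    "exec": "/usr/local/bin/bet",
         "inputs": ["gridftp://data.example.org/subj01/T1.nii"]},
        {"id": "register", "exec": "/usr/local/bin/flirt",
         "inputs": ["{strip.output}", "/atlases/MNI152_T1_1mm.nii"]},
    ],
    "edges": [["strip", "register"]],   # execution order / data dependency
}

# A server would resolve the references and hand nodes to a grid scheduler;
# here we simply serialize the instructions, which is all that is persisted.
print(json.dumps(workflow, indent=2))
```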

    A web-based system for statistical shape analysis in temporomandibular joint osteoarthritis

    This study presents a web-based repository system, Data Storage for Computation and Integration (DSCI), for osteoarthritis of the temporomandibular joint (TMJ OA). The environment aims to maintain the database, accept contributions from multiple clinical centers, and compute novel statistics for disease classification. For this purpose, the imaging datasets stored in the DSCI consisted of three-dimensional (3D) surface meshes of condyles from CBCT, together with clinical and biological markers, in healthy and TMJ OA subjects. The clusterpost package was included in the web platform to execute jobs on remote computing grids. The DSCI application allowed running statistical packages, such as the Multivariate Functional Shape Data Analysis, to compute global correlations between covariates and morphological variability, as well as local p-values on the 3D condylar morphology. In conclusion, the DSCI provides non-statisticians with interactive access to advanced statistical tools.
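
    As a rough illustration of the kind of "local p-value" statistic mentioned above, the sketch below computes per-vertex correlations between a covariate and a scalar shape feature with permutation p-values. It is a simplified stand-in for, not a reproduction of, the Multivariate Functional Shape Data Analysis package, and all data are synthetic:

```python
# Per-vertex association between a clinical covariate and condylar shape,
# with a permutation p-value at each vertex (synthetic data throughout).
import numpy as np

rng = np.random.default_rng(1)
n_subj, n_vertices = 60, 1000
covariate = rng.normal(size=n_subj)                    # e.g. a clinical score
shape = rng.normal(size=(n_subj, n_vertices))          # scalar feature per vertex
shape[:, :50] += 0.8 * covariate[:, None]              # locally associated region


def vertexwise_r(cov: np.ndarray, shp: np.ndarray) -> np.ndarray:
    """Pearson correlation between the covariate and each vertex feature."""
    cov_c = (cov - cov.mean()) / cov.std()
    shp_c = (shp - shp.mean(0)) / shp.std(0)
    return cov_c @ shp_c / len(cov)


r_obs = vertexwise_r(covariate, shape)
null = np.stack([vertexwise_r(rng.permutation(covariate), shape)
                 for _ in range(500)])
local_p = (np.sum(np.abs(null) >= np.abs(r_obs), axis=0) + 1) / (len(null) + 1)
print("vertices with p < 0.05:", int((local_p < 0.05).sum()))
```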

    1st INCF Workshop on Sustainability of Neuroscience Databases

    The goal of the workshop was to discuss issues related to the sustainability of neuroscience databases, identify problems and propose solutions, and formulate recommendations to the INCF. The report summarizes the discussions of invited participants from the neuroinformatics community as well as from other disciplines where sustainability issues have already been approached. The recommendations for the INCF involve rating, ranking, and supporting database sustainability.

    The MNI data-sharing and processing ecosystem

    Neuroimaging has been facing a data deluge characterized by the exponential growth of both raw and processed data. As a result, mining the massive quantities of digital data collected in these studies offers unprecedented opportunities and has become paramount for today's research. As the neuroimaging community enters the world of “Big Data”, there has been a concerted push for enhanced sharing initiatives, whether within a multisite study, across studies, or federated and shared publicly. This article will focus on the database and processing ecosystem developed at the Montreal Neurological Institute (MNI) to support multicenter data acquisition both nationally and internationally, create database repositories, facilitate data-sharing initiatives, and leverage existing software toolkits for large-scale data processing.

    Hybrid ant colony system and genetic algorithm approach for scheduling of jobs in computational grid

    Metaheuristic algorithms have been used to solve scheduling problems in grid computing. However, stand-alone metaheuristic algorithms do not always show good performance on every problem instance. This study proposes a high-level hybrid approach between an ant colony system and a genetic algorithm for job scheduling in grid computing. The proposed approach is based on a high-level hybridization. The proposed hybrid approach is evaluated using static benchmark problems known as the ETC (expected time to compute) matrix. Experimental results show that the proposed hybridization of the two algorithms outperforms the stand-alone algorithms in terms of best and average makespan values.
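
    For context, the sketch below shows the ETC-matrix makespan objective and one common form of high-level hybridization, in which constructively built (ACS-style) schedules seed a GA population; the exact coupling used in this study may differ, and all values are synthetic:

```python
# ETC-matrix scheduling objective plus a high-level hybridization skeleton:
# etc[j, m] is the expected time of job j on machine m, makespan is the
# maximum machine load, and constructive solutions seed the GA population.
import numpy as np

rng = np.random.default_rng(7)
n_jobs, n_machines = 50, 8
etc = rng.uniform(1, 100, size=(n_jobs, n_machines))   # expected time to compute


def makespan(schedule: np.ndarray) -> float:
    """schedule[j] = machine assigned to job j; makespan = max machine load."""
    loads = np.zeros(n_machines)
    for job, machine in enumerate(schedule):
        loads[machine] += etc[job, machine]
    return loads.max()


def constructive_solution() -> np.ndarray:
    """Greedy stand-in for an ACS construction step: assign each job to the
    machine with the smallest resulting completion time. A real ACS would
    bias these choices stochastically with pheromone and heuristic values."""
    loads = np.zeros(n_machines)
    schedule = np.empty(n_jobs, dtype=int)
    for job in range(n_jobs):
        m = int(np.argmin(loads + etc[job]))
        schedule[job] = m
        loads[m] += etc[job, m]
    return schedule


# High-level hybridization: constructive solutions seed the GA population;
# a GA would then apply selection, crossover and mutation to these schedules.
population = [constructive_solution() for _ in range(5)] + \
             [rng.integers(0, n_machines, n_jobs) for _ in range(15)]
print("best seeded makespan:", round(min(makespan(s) for s in population), 2))
```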