
    Analysis framework for the J-PET scanner

    The J-PET analysis framework is a flexible, lightweight, ROOT-based software package which provides the tools to develop reconstruction and calibration procedures for PET tomography. In this article we present the implementation of the full data-processing chain in the J-PET framework, which is used for the data analysis of the J-PET tomography scanner. The framework incorporates automated handling of the PET setup parameters database as well as high-level tools for building data reconstruction procedures. Each of these components is briefly discussed. Comment: 6 pages, 1 figure
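
    As a rough illustration of the task-chain pattern such a framework is built around, the sketch below is a minimal, hypothetical Python analogue (the real J-PET framework is C++/ROOT, and none of the names here are its actual API): calibration and reconstruction steps are registered and run in order over each event, with constants coming from a parameters store.

```python
# Hypothetical sketch, not the J-PET API: a minimal task-chain where each
# processing step is applied to every event in sequence.

class Task:
    """Base class for one processing step in the chain."""
    def run(self, event, params):
        raise NotImplementedError

class Calibrate(Task):
    def run(self, event, params):
        # apply per-channel calibration constants taken from the parameters store
        event["energy"] = [e * params["gain"] for e in event["raw"]]
        return event

class Reconstruct(Task):
    def run(self, event, params):
        # toy reconstruction: keep hits above a threshold
        event["hits"] = [e for e in event["energy"] if e > params["threshold"]]
        return event

def process(events, tasks, params):
    """Run every registered task on every event, in order."""
    for event in events:
        for task in tasks:
            event = task.run(event, params)
        yield event

if __name__ == "__main__":
    params = {"gain": 1.02, "threshold": 0.5}   # would come from the setup database
    events = [{"raw": [0.4, 0.9, 1.3]}, {"raw": [0.2, 0.7]}]
    for out in process(events, [Calibrate(), Reconstruct()], params):
        print(out["hits"])
```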

    A Validation Framework for the Long Term Preservation of High Energy Physics Data

    The study group on data preservation in high energy physics, DPHEP, is moving to a new collaboration structure, which will focus on the implementation of preservation projects, such as those described in the group's large-scale report published in 2012. One such project is the development of a validation framework, which checks the compatibility of evolving computing environments and technologies with the experiments' software for as long as possible, with the aim of substantially extending the lifetime of the analysis software, and hence the usability of the data. The framework is designed to automatically test and validate the software and data of an experiment against changes and upgrades to the computing environment, as well as changes to the experiment software itself. Technically, this is realised using a framework capable of hosting a number of virtual machine images, built with different configurations of operating systems and the relevant software, including any necessary external dependencies. Comment: Proceedings of a poster presented at CHEP 2013, Amsterdam, October 14-18, 2013
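
    A minimal sketch of the idea, assuming a hypothetical `vm-run` launcher and made-up image and test-suite names (none of this is the DPHEP framework's actual interface): each test suite is run inside each virtual machine image and the pass/fail matrix is reported.

```python
# Illustrative only: validate experiment software against a matrix of VM images.
import subprocess

VM_IMAGES = ["slc5-gcc4.4", "slc6-gcc4.8", "centos7-gcc8"]          # assumed names
TEST_SUITES = ["build_release", "run_reco_sample", "compare_reference_plots"]

def run_in_vm(image, test):
    """Launch one test inside one VM image; 'vm-run' is a placeholder launcher."""
    result = subprocess.run(["vm-run", "--image", image, "--", test],
                            capture_output=True, text=True)
    return result.returncode == 0

def validate():
    report = {}
    for image in VM_IMAGES:
        report[image] = {test: run_in_vm(image, test) for test in TEST_SUITES}
    return report

if __name__ == "__main__":
    for image, results in validate().items():
        failed = [t for t, ok in results.items() if not ok]
        print(image, "OK" if not failed else f"FAILED: {failed}")
```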

    Azorean agriculture efficiency by PAR

    Producers always aspire to increase the efficiency of their production process; however, they do not always succeed in optimizing their production. In recent years, interest in Data Envelopment Analysis (DEA) as a powerful tool for measuring efficiency has increased. This is due to the large amount of data collected to better understand the phenomena under study and, at the same time, to the need for timely and inexpensive information. The “Productivity Analysis with R” (PAR) framework establishes a user-friendly data envelopment analysis environment with special emphasis on variable selection and aggregation, and on summarization and interpretation of the results. The starting point is the following R packages: DEA (Diaz-Martinez and Fernandez-Menendez, 2008) and FEAR (Wilson, 2007). The DEA package implements several of the Data Envelopment Analysis models presented in Cooper et al. (2007). FEAR is a software package for computing nonparametric efficiency estimates and testing hypotheses in frontier models; it implements the bootstrap methods described in Simar and Wilson (2000). PAR is a software framework using a portfolio of models for efficiency estimation and also providing results-explanation functionality. The PAR framework has been developed to distinguish between efficient and inefficient observations and to explicitly advise producers about possibilities for production optimization. It offers several R functions for a reasonable interpretation of the data analysis results and for text presentation of the obtained information, so the output of an efficiency study with the PAR software is self-explanatory. We are applying the PAR framework to estimate the efficiency of the agricultural system in the Azores (Mendes et al., 2009). All Azorean farms will be clustered into homogeneous groups according to their efficiency measurements, to define clusters of “good” practices and clusters of “less good” practices. This makes PAR appropriate to support public policies in the agricultural sector in the Azores.
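
    For readers unfamiliar with DEA, the sketch below shows what an efficiency score means in practice: an input-oriented CCR model solved as one linear program per farm (decision-making unit). It is a generic Python illustration with made-up data, not the PAR/DEA/FEAR R code itself.

```python
# Input-oriented CCR DEA: one LP per DMU, constant returns to scale.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (e.g. farms); columns = inputs / outputs (illustrative numbers)
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 2.0]])   # inputs
Y = np.array([[1.0], [1.0], [1.5], [2.0]])                        # outputs

def ccr_input_efficiency(X, Y, o):
    n, m = X.shape                     # n DMUs, m inputs
    s = Y.shape[1]                     # s outputs
    c = np.zeros(n + 1)                # decision variables: theta, lambda_1..lambda_n
    c[0] = 1.0                         # minimise theta
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    for i in range(m):                 # sum_j lambda_j x_ij <= theta * x_io
        A_ub[i, 0] = -X[o, i]
        A_ub[i, 1:] = X[:, i]
    for r in range(s):                 # sum_j lambda_j y_rj >= y_ro
        A_ub[m + r, 1:] = -Y[:, r]
        b_ub[m + r] = -Y[o, r]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun                     # efficiency score in (0, 1]

scores = [ccr_input_efficiency(X, Y, o) for o in range(len(X))]
print(np.round(scores, 3))             # 1.0 marks farms on the efficient frontier
```

    Farms scoring below 1.0 could, in principle, produce the same output with proportionally fewer inputs, which is the kind of observation-level advice the abstract refers to.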

    MetaboLab - advanced NMR data processing and analysis for metabolomics

    Background: Despite widespread use of Nuclear Magnetic Resonance (NMR) in metabolomics for the analysis of biological samples, there is a lack of graphically driven, publicly available software to process large one- and two-dimensional NMR data sets for statistical analysis.
    Results: Here we present MetaboLab, a MATLAB-based software package that facilitates NMR data processing by providing automated algorithms for processing series of spectra in a reproducible fashion. A graphical user interface provides easy access to all steps of data processing via a script builder that generates MATLAB scripts, with an option to alter the code manually. The analysis of two-dimensional spectra (1H,13C-HSQC spectra) is facilitated by a spectral library derived from publicly available databases, which can be extended readily. The software allows specific metabolites to be displayed in small regions of interest where signals can be picked. To facilitate the analysis of series of two-dimensional spectra, different spectra can be overlaid and assignments can be transferred between spectra. The software includes mechanisms to account for overlapping signals by highlighting neighboring and ambiguous assignments.
    Conclusions: The MetaboLab software is an integrated software package for NMR data processing and analysis, closely linked to the previously developed NMRLab software. It includes tools for batch processing and gives access to a wealth of algorithms available in the MATLAB framework. Algorithms within MetaboLab help to optimize the flow of metabolomics data preparation for statistical analysis. The combination of an intuitive graphical user interface with advanced data processing algorithms facilitates the use of MetaboLab in a broader metabolomics context.
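
    The batch-processing idea is easy to picture with a small sketch. The Python code below (MetaboLab itself is MATLAB; the parameters and processing choices here are illustrative only) applies the same apodisation, Fourier transform and normalisation to every FID in a series, so the resulting spectra remain directly comparable for statistics.

```python
# Generic 1D NMR batch-processing sketch, not MetaboLab's actual pipeline.
import numpy as np

def process_fid(fid, lb_hz=0.3, sw_hz=12000.0):
    """Exponential line broadening, zero-fill, FT, magnitude, area normalisation."""
    t = np.arange(fid.size) / sw_hz
    apodised = fid * np.exp(-np.pi * lb_hz * t)            # exponential window
    zf = np.zeros(2 * fid.size, dtype=complex)
    zf[:fid.size] = apodised                               # zero filling
    spectrum = np.abs(np.fft.fftshift(np.fft.fft(zf)))     # magnitude spectrum
    return spectrum / spectrum.sum()                       # normalise for statistics

def process_series(fids, **kwargs):
    """Apply identical parameters to every spectrum so the series stays comparable."""
    return np.vstack([process_fid(f, **kwargs) for f in fids])

# usage with synthetic data standing in for real FIDs
rng = np.random.default_rng(0)
fids = [rng.normal(size=4096) + 1j * rng.normal(size=4096) for _ in range(8)]
matrix = process_series(fids)          # rows = spectra, ready for PCA/PLS etc.
```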

    Monitoring the CMS strip tracker readout system

    The CMS Silicon Strip Tracker at the LHC comprises a sensitive area of approximately 200 m² and 10 million readout channels. Its data acquisition system is based around a custom analogue front-end chip. Both the control and the readout of the front-end electronics are performed by off-detector VME boards in the counting room, which digitise the raw event data and perform zero-suppression and formatting. The data acquisition system uses the CMS online software framework to configure, control and monitor the hardware components and to steer the data acquisition. The first data analysis is performed online within the official CMS reconstruction framework, which provides many services, such as distributed analysis, access to geometry and conditions data, and a Data Quality Monitoring tool based on the online physics reconstruction. The data acquisition monitoring of the Strip Tracker uses both the data acquisition and the reconstruction software frameworks in order to provide real-time feedback to shifters on the operational state of the detector, to archive data for later analysis, and possibly to trigger automatic recovery actions in case of errors. Here we review the proposed architecture of the monitoring system and describe its software components, which are already in place, the various monitoring streams available, and our experiences of operating and monitoring a large-scale system.
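
    Reduced to its core loop, the monitoring pattern described above might look like the hypothetical sketch below (not the CMS online software; `read_feds` and `recover` are placeholders): read status flags periodically, publish and archive a summary, and trigger a recovery action when errors appear.

```python
# Toy monitoring loop: status polling, archiving, and an automatic-recovery hook.
import time

def read_feds():
    """Placeholder for reading readout-board status; returns {fed_id: error_flag}."""
    return {fed: False for fed in range(50)}

def recover(feds):
    print(f"resetting FEDs {feds}")              # e.g. reconfigure the affected boards

def monitor(cycles=3, max_errors=0):
    history = []
    for _ in range(cycles):
        status = read_feds()
        errors = [fed for fed, bad in status.items() if bad]
        summary = {"time": time.time(), "n_errors": len(errors), "errors": errors}
        history.append(summary)                  # archive for later analysis
        print(f"errors: {len(errors)}")          # real system would publish to shifters
        if len(errors) > max_errors:
            recover(errors)                      # automatic recovery hook
        time.sleep(1)
    return history

if __name__ == "__main__":
    monitor()
```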

    The new object oriented analysis framework for H1

    During the years 2000 and 2001 the HERA machine and the H1 experiment performed substantial luminosity upgrades. To cope with the increased demands on data handling, an effort was made to redesign and modernize the analysis software. The main goals were to lower the turn-around time for physics analysis by providing a single framework for data storage, event selection, physics analysis and event display. The new object-oriented analysis environment, based on the RooT framework, provides a data access front-end for the new data storage scheme and a new event display. The analysis data is stored in four different layers of separate files. Each layer represents a different level of abstraction, i.e. reconstruction output, physics particles, event summary information and user-specific information. Links between the layers allow correlating quantities of different layers. Currently, this framework is used for data analyses of the previously collected data and for standard data production of the currently collected data. Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics conference (CHEP 03), La Jolla, CA, USA, March 2003, 3 pages, 1 eps figure, PSN THLT 00
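
    The layered storage scheme can be pictured with a small sketch. The Python dataclasses below are purely illustrative (the H1 framework is C++/RooT, and these names are invented): each layer keeps a link to the more detailed one below it, so a selection can run on the compact summary layer and only follow the link down for events that pass.

```python
# Illustrative four-layer event model with cross-layer links.
from dataclasses import dataclass, field

@dataclass
class RecoOutput:              # layer 1: reconstruction output (largest)
    tracks: list

@dataclass
class PhysicsParticles:        # layer 2: physics particles, links to layer 1
    particles: list
    reco: RecoOutput

@dataclass
class EventSummary:            # layer 3: compact per-event summary, links to layer 2
    n_particles: int
    total_pt: float
    physics: PhysicsParticles

@dataclass
class UserInfo:                # layer 4: user-specific quantities, links to layer 3
    tags: dict = field(default_factory=dict)
    summary: EventSummary = None

# A selection runs on the small summary layer and only drills down when needed.
reco = RecoOutput(tracks=[{"pt": 5.1}, {"pt": 2.3}])
phys = PhysicsParticles(particles=[{"pt": 5.0}], reco=reco)
summ = EventSummary(n_particles=1, total_pt=5.0, physics=phys)
if summ.total_pt > 4.0:
    print(len(summ.physics.reco.tracks))
```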

    Multimodal analysis of synchronization data from patients with dementia

    Little is known about the abilities of people with dementia to synchronize bodily movements to music. The lack of non-intrusive tools that do not hinder patients, and the absence of appropriate analysis methods, may explain why such investigations remain challenging. This paper discusses the development of an analysis framework for processing sensorimotor synchronization data obtained from multiple measuring devices. The data was collected during an explorative study, carried out at the University Hospital of Reims (France), involving 16 individuals with dementia. The study aimed at testing new methods and measurement tools developed to investigate sensorimotor synchronization capacities in people with dementia. An analysis framework was established for the extraction of quantity of motion and synchronization parameters from the multimodal dataset composed of sensor, audio, and video data. A user-friendly monitoring tool and analysis framework has been established and tested which holds potential to respond to the needs of handling complex movement data. The study enabled improvements in the robustness of the hardware and software. It provides a strong framework for future experiments involving people with dementia interacting with music.
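
    To make the two extracted quantities concrete, the following sketch (synthetic data and illustrative formulas, not the study's actual pipeline) computes a crude quantity-of-motion estimate from 3-axis sensor data and a simple synchronization measure, the mean asynchrony between participant taps and musical beats.

```python
# Illustrative quantity-of-motion and asynchrony measures on synthetic data.
import numpy as np

def quantity_of_motion(accel, fs):
    """Crude QoM estimate: mean magnitude of acceleration change per second."""
    magnitude = np.linalg.norm(accel, axis=1)
    return np.mean(np.abs(np.diff(magnitude))) * fs

def mean_asynchrony(tap_times, beat_times):
    """For each beat, take the nearest tap and average the signed offsets (s)."""
    offsets = [tap_times[np.argmin(np.abs(tap_times - b))] - b for b in beat_times]
    return float(np.mean(offsets))

rng = np.random.default_rng(1)
accel = rng.normal(size=(500, 3))                    # 3-axis sensor at fs = 50 Hz
beats = np.arange(0, 10, 0.6)                        # music/metronome beat times
taps = beats + rng.normal(0.03, 0.05, beats.size)    # participant responses
print(quantity_of_motion(accel, fs=50), mean_asynchrony(taps, beats))
```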

    MITK-ModelFit: A generic open-source framework for model fits and their exploration in medical imaging -- design, implementation and application on the example of DCE-MRI

    Many medical imaging techniques utilize fitting approaches for quantitative parameter estimation and analysis. Common examples are pharmacokinetic modeling in DCE MRI/CT, ADC calculation and IVIM modeling in diffusion-weighted MRI, and Z-spectra analysis in chemical exchange saturation transfer MRI. Most available software tools are limited to a special purpose and do not allow for custom developments and extensions. Furthermore, they are mostly designed as stand-alone solutions using external frameworks and thus cannot be easily incorporated natively into an analysis workflow. We present a framework for medical image fitting tasks that is included in MITK, following a rigorous open-source, well-integrated and operating-system-independent policy. From a software engineering perspective, the local models, the fitting infrastructure and the results representation are abstracted and can thus be easily adapted to any model fitting task on image data, independent of image modality or model. Several ready-to-use libraries for model fitting and use cases, including fit evaluation and visualization, were implemented. Their embedding into MITK allows for easy data loading, pre- and post-processing, and thus a natural inclusion of model fitting into an overarching workflow. As an example, we present a comprehensive set of plug-ins for the analysis of DCE MRI data, which we validated on existing and novel digital phantoms, yielding competitive deviations between fit and ground truth. Providing a very flexible environment, our software mainly addresses developers of medical imaging software that includes model fitting algorithms and tools. Additionally, the framework is of high interest to users in the domain of perfusion MRI, as it offers feature-rich, freely available, validated tools to perform pharmacokinetic analysis on DCE MRI data, with both interactive and automated batch processing workflows. Comment: 31 pages, 11 figures URL: http://mitk.org/wiki/MITK-ModelFi
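
    As a concrete picture of the kind of fit the DCE-MRI plug-ins perform, the sketch below fits the standard Tofts model to a single voxel's concentration curve using a toy arterial input function. It is a minimal Python illustration under simplifying assumptions, not MITK-ModelFit's C++ API.

```python
# Standard Tofts model fit for one voxel: Ct(t) = Ktrans * (AIF conv exp(-Ktrans/ve * t)).
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 300, 100)                          # time in seconds
aif = 5.0 * (t / 60.0) * np.exp(-t / 80.0)            # toy arterial input function

def tofts(t, ktrans, ve):
    """Discrete convolution approximation of the standard Tofts model."""
    dt = t[1] - t[0]
    kernel = np.exp(-(ktrans / ve) * t)
    return ktrans * np.convolve(aif, kernel)[: t.size] * dt

# synthetic "measured" voxel curve with noise, then fit Ktrans and ve back
truth = tofts(t, ktrans=0.0025, ve=0.3)
noisy = truth + np.random.default_rng(2).normal(0, 0.01 * truth.max(), t.size)
(ktrans_fit, ve_fit), _ = curve_fit(tofts, t, noisy, p0=[0.001, 0.2],
                                    bounds=([1e-5, 0.01], [0.1, 1.0]))
print(ktrans_fit, ve_fit)                              # should recover ~0.0025 and ~0.3
```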